Home

The Web

I was at a library today and happened across the December 2015 issue of Linux Magazine. Its feature article is about the change from HTTP/1.1 to HTTP/2.

Let me put in a quote from that article: "The top 1000 sites in 1999 contained only an average of about 10 objects. In the years between 2009 and 2012, the number of objects doubled from 50 to 100, and as a result the average load time increased by 48% between 2010 and 2012 - despite faster systems and higher bandwidths. The steady rise in website complexity led to a need for web standards that use network resources more efficiently."

Right there in the quote, they say that faster systems and higher bandwidths did not offset the cost of complexity. And yet in the very next sentence they say "let's build a system that can handle more objects more efficiently". In essence, this is exactly the same as a faster system or more bandwidth. Nothing fundamental has changed; they're just looking for a way to stuff more bits through a network pipe. From my layman's perspective, all the change to HTTP/2 amounts to is a reduction of overhead. If websites continue to increase in complexity, no matter how small you make the overhead, your systems will still be slow.
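To make that concrete, here is a back-of-envelope sketch (my own numbers and simplifications, not the article's). It uses a crude model of HTTP/1.1, where each object costs a round trip on one of roughly six parallel connections, against a crude model of HTTP/2, where requests are multiplexed over a single connection and the round-trip overhead is paid roughly once:

```typescript
// Crude load-time model. RTT_MS and TRANSFER_MS are assumed values,
// chosen only to illustrate the shape of the curves.
const RTT_MS = 50;      // assumed round-trip time per request batch
const TRANSFER_MS = 10; // assumed transfer time per object

// HTTP/1.1: objects queue on ~6 parallel connections, and every
// request-response pair on a connection pays a full round trip.
function http1LoadMs(objects: number, connections = 6): number {
  return RTT_MS * Math.ceil(objects / connections) + objects * TRANSFER_MS;
}

// HTTP/2: one connection, requests multiplexed, so the round-trip
// overhead is paid once; the bytes themselves still take time.
function http2LoadMs(objects: number): number {
  return RTT_MS + objects * TRANSFER_MS;
}

for (const n of [10, 100, 1000]) {
  console.log(`${n} objects: HTTP/1.1 ~${http1LoadMs(n)} ms, HTTP/2 ~${http2LoadMs(n)} ms`);
}
```

HTTP/2 wins, but notice that both estimates still grow linearly with the number of objects: shrink the overhead all you like, and a page with ten times the objects still takes roughly ten times as long to load.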
So what is the solution?

The Solution

No-one has stopped to analyze the problem from a non-technical perspective. If you look at a website from the '90s, you'll see something like this:

If you look at a similar modern website, you see things like this:

Notice the difference? While M4040 contains content and only content, the more 'modern' ones have huge amounts of complexity unassociated with what the website is actually about.

Can you see the solution yet? The solution is to ... make websites contain more content and less complexity. But many people like websites that look fancy and have lots of flashy moving things and complex menus (not to mention web-based games). So what do we do there?

The long term solution

HTML and HTTP were designed around the transfer of text. Everything on top of that is patching a system far, far beyond its original design. To truly make a web 2.0, we need a complete paradigm shift.

Actually, the paradigm shift has been coming slowly, and a soon-to-be problem is that it will arrive with the baggage of the previous systems. Modern web browsers are becoming more like a sandboxed environment for running remote code on your computer. The web browser handles access to your computer's hardware, providing a standardized platform for a website's code to run on. In essence, the web browser becomes a kernel, and a website becomes software. Have a think about modern websites: modern games have access to hardware graphics cards through WebGL, and to your webcam and microphone. They have access to a small amount of storage, and can create network connections.
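As a minimal sketch of what that kernel-like interface looks like from a website's side (the URL and the names here are placeholders of mine, but every API called is a real, standard web API):

```typescript
// A page asks the browser, not the operating system, for resources.
async function requestResources() {
  // Graphics hardware, exposed through WebGL.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl");

  // Webcam and microphone: the browser mediates access with a
  // permission prompt, much like a kernel gating a device driver.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  // A small amount of persistent storage.
  localStorage.setItem("save-game", JSON.stringify({ level: 3 }));

  // An outbound network connection.
  const socket = new WebSocket("wss://example.com/game");

  return { gl, stream, socket };
}
```

The page never touches the camera or the graphics card directly; every call goes through an interface the browser defines and polices, which is exactly the job a kernel does for native software.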

This is not without its own problems. Any high-school student can open a text file and type HTML, and through that, they can make a website. If the web browser is to succeed as a kernel, then creating content needs to be just as easy, if not easier.

Notes:

I can't take credit for all of this. Obviously there are the websites I took screenshots from, but I also drew ideas from Alan Kay's many talks (such as this one).