Rickard wrote:If anything, modem users would benefit from a client side solution. Only sending raw data to the client and then doing all the processing on the client would drastically reduce the bandwidth use. The initial page view might take a bit longer as the client side scripts are downloaded, but then the browser cache takes over.
True indeed, up to the point where the user gives up because the wait is too long. Below that point, all good. Hence the "loader on demand".
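To give an idea of what I mean by "loader on demand" (just a rough sketch, all the names and URLs are invented for the example): fetch a feature's script the first time someone actually uses it, and let the browser cache pay for the next visits.

```typescript
// Rough "loader on demand" sketch; loadScriptOnDemand and the URLs are invented for the example.
// A feature's script is only downloaded the first time it is actually needed;
// after that the browser cache serves it, so repeat uses cost almost nothing.
const alreadyLoaded: { [url: string]: boolean } = {};

function loadScriptOnDemand(url: string, onReady: () => void): void {
  if (alreadyLoaded[url]) {
    onReady();                       // already fetched once, the cached copy is enough
    return;
  }
  const script = document.createElement("script");
  script.src = url;
  script.onload = () => {            // run the feature only once the script has arrived
    alreadyLoaded[url] = true;
    onReady();
  };
  document.head.appendChild(script);
}

// Usage: the heavy "preview" code is only downloaded when someone asks for a preview.
// loadScriptOnDemand("/js/preview.js", () => showPreview());
```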
On that "point", on real everyday use of the web, it's very hard to determine, especially from your point of view (delivering software). You have to take into account the things a "basic webmaster' want to add, the others broad basic pre-requisites scripts (like IE7). Not much margin to work with.
Rickard wrote:It's very much possible. Why wouldn't it be? You could use e.g. Javascript to process data supplied by a webserver.
First of all, because some users deactivate javascript, or use a client without it. If you make javascript a prerequisite of casual surfing, you have to forget about them.
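One compromise, and this is only a sketch of what I have in mind (the "a.quick-action" selector and the "done" class are my own made-up example): keep the plain link working for everyone, and let javascript only *enhance* it when it is available.

```typescript
// Progressive-enhancement sketch; "a.quick-action" and the "done" class are invented for the example.
// The plain <a href="..."> still works with javascript turned off (normal full page round trip);
// with javascript on, we intercept the click and ask the server for a small response instead.
function enhanceQuickActions(): void {
  document.querySelectorAll<HTMLAnchorElement>("a.quick-action").forEach(link => {
    link.addEventListener("click", event => {
      event.preventDefault();                                      // only runs when javascript exists
      const xhr = new XMLHttpRequest();
      xhr.open("GET", link.href);
      xhr.setRequestHeader("X-Requested-With", "XMLHttpRequest");  // common hint a server could check to skip the full page
      xhr.onload = () => {
        link.classList.add("done");                                // update the page in place
      };
      xhr.send();
    });
  });
}
```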
But that's for basic apps; on the other hand, the recent proliferation of PHP web games would gain a *lot* from more javascript (for some time now I've been playing a hilarious little French game, a satirical dungeon monster-bashing thing, maybe 5 minutes a day. Every action in the game results in a full page reload, 100% server side? That's a huge waste).
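Concretely, for that kind of game, something like the sketch below would already help ("/action.php", "combat-log" and the form fields are invented, I don't know how the game is actually built): send the action in the background and only update the part of the page that changed, instead of fetching the whole page again.

```typescript
// Sketch of one game action without a full page reload.
// "/action.php", "combat-log" and the form fields are invented names for the example.
function attackMonster(monsterId: number): void {
  const xhr = new XMLHttpRequest();
  xhr.open("POST", "/action.php");
  xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  xhr.onload = () => {
    // The server only sends back the outcome of this one action, not the whole page.
    const log = document.getElementById("combat-log");
    if (log) {
      log.innerHTML += xhr.responseText;
    }
  };
  xhr.send("do=attack&target=" + monsterId);
}
```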
But for some forum features, like the thread read/not-read tracking I was referring to, I'm really not sure javascript is the answer (and in that case I'm fairly sure it's not so much a feature as part of the definition of the tool, i.e. a forum). I may be wrong, that's why I'm testing it, but ...
Rickard wrote:The biggest performance problem on the web today is insufficient server performance. Have a look at your CPU usage while browsing the web. It hardly ever goes over 10%. If we could utilize this idle processing time, webservers could go back to doing what they do best and that is to serve content.
Again, I agree with the theory, but not with the idea that it can be done with today's protocols and standards. But I could be wrong; for example, I don't know the bandwidth and server consumption of XUL.
And there is another problem with a wide, heavy client side: new kinds of client hardware. Take common forum features and say you put half of them client side. Are you sure a PDA or a pocket phone can handle that much? They will, someday. But right now you add one more item of worry for webmasters. They already have to worry about browser bugs, platform bugs, end-user needs (as in real, actual, reasonable ones) and desires (as in futile, useless gadgets), faulty or cryptic wording of standards, server bugs, user hardware, user bandwidth, server bandwidth, and I'm forgetting some. Adding user CPU power on top of that? Ouch ^^
Jérémie wrote:Who would be reading your e-mails? Google?
That's what they do to target ads. Email is cheap, storage too; I don't see the need for Gmail. I admire their implementation of mail threading (something Qualcomm did iirc but never implemented in Eudora, I still don't know why), but that's it. I don't use webmail or IMAP, so I don't need 1 GB, and if I did use IMAP that would be quite ridiculous (I have around 600 MB of emails, not counting attachments; several GB with them). But that's me; if some people need to save $2 or $3 on a decent email provider, that's their choice :-)