This seems like a really bad idea. If every user running a browser is doing
HTTP communications X% of the time their browser is running, this means
that each browser will now be performing HTTP communications X+n% of the
time. So in one fell swoop, this technology increases each browser's
contribution to net congestion.
Well, that all depends on the 'hit rate' of its prefetching. I can imagine
a very smart prefetcher that would not increase X but would dramatically
decrease latency: it would see that I never follow ad links, but always
follow cringely when I get to InfoWorld. It would know to prefetch
cybertimes and never NY Sports.
Finally, if there is a caching proxy intermediate, then the prefetching
results are really being shared organization-wide, and the load never hits
the public Internet either.
All in all, a potential win.
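The hit-rate trade-off above can be put in rough numbers. A minimal back-of-envelope sketch, where the latency figures and prefetch count are made-up assumptions, not measurements:

```python
# Back-of-envelope model of prefetching. The numbers below are
# illustrative assumptions, not measurements.
def expected_latency_ms(hit_rate, cached_ms=5, fetch_ms=500):
    """Average wait per followed link when some links were prefetched."""
    return hit_rate * cached_ms + (1 - hit_rate) * fetch_ms

def wasted_fetches(hit_rate, prefetched_per_page=3):
    """Prefetched links the user never follows -- pure extra traffic."""
    return prefetched_per_page * (1 - hit_rate)

# A dumb prefetcher barely helps latency and floods the net; a smart
# one (it knows I always follow cringely, never the ads) is a clear win.
for rate in (0.1, 0.9):
    print(f"hit rate {rate}: {expected_latency_ms(rate):.1f} ms avg, "
          f"{wasted_fetches(rate):.1f} wasted fetches/page")
```

With a 10% hit rate the average latency barely moves while most prefetches are wasted; at 90% the average wait collapses and the extra traffic is a fraction of a fetch per page.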
> Page-streaming HTTP loads an entire Web page all in one shot rather
>than in intermittent connections.
Is this just a persistent connection? If a web page is larger than the
packet sizes between the client and server, you certainly can't load a web
page in one shot (whatever a shot is).
Just persistent connections. Marketing BS.
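For what it's worth, the persistent connection in question is easy to demonstrate: one TCP connection serves several requests in a row, which is all "page-streaming" amounts to. A sketch using the Python standard library, with a throwaway local server standing in for the site (the paths are illustrative placeholders):

```python
# A persistent HTTP/1.1 connection: several requests over a single
# TCP connection, instead of one connection per request. The local
# server and paths here are placeholders for illustration.
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # enables keep-alive by default
    def do_GET(self):
        body = f"contents of {self.path}".encode()
        self.send_response(200)
        # Content-Length lets the client reuse the socket afterward.
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):   # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One connection, three requests: a single TCP handshake in total.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
results = []
for path in ("/", "/style.css", "/logo.gif"):
    conn.request("GET", path)
    resp = conn.getresponse()
    results.append((path, resp.status, resp.read().decode()))
conn.close()
server.shutdown()

for path, status, body in results:
    print(path, status, body)
```

Each request still travels in however many packets the path requires; what is saved is the connection setup per request, not the transfer itself.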