I haven't noticed it in widespread use either, and I think the reasons are:
Firstly, it's probably too processor-intensive to run on dynamic content (e.g.
request-generated HTML pages).
Secondly, it's not much use when applied to the most bandwidth-intensive static
content (GIFs, JPEGs), since those formats are already compressed.
Thirdly, it doesn't benefit modem users at all on the ISP->modem->user leg
(they already have compression on that link).
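The second point is easy to check for yourself: gzip collapses repetitive HTML-like text but gains nothing on data that's already compressed (random bytes stand in for a JPEG payload here). A quick sketch in Python:

```python
import gzip
import os

# Repetitive text -- a stand-in for HTML -- compresses dramatically.
html_like = b"<p>Hello, world!</p>" * 500   # 10,000 bytes
# Random bytes -- a stand-in for JPEG/GIF payloads, which are already
# compressed -- don't shrink at all; the gzip framing actually adds a little.
jpeg_like = os.urandom(10_000)

print(len(gzip.compress(html_like)))   # a tiny fraction of 10,000
print(len(gzip.compress(jpeg_like)))   # slightly more than 10,000
```

So spending CPU to gzip image files is pure loss, while text gets an order-of-magnitude win.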
On the other hand, I've been finding that rich-media static content benefits
greatly (it seems to work a treat on certain kinds of Flash files).
It would probably be most useful implemented in a proxy cache.
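The server side of that is just the standard HTTP/1.1 negotiation: compress only when the client advertises gzip in Accept-Encoding, and emit Vary: Accept-Encoding so any proxy cache keeps the compressed and plain variants separately. A minimal sketch (`maybe_gzip` is a made-up helper name, not any server's real API):

```python
import gzip

def maybe_gzip(body: bytes, accept_encoding: str) -> tuple[bytes, dict]:
    """Gzip the response body only if the client advertised support for it.

    Vary: Accept-Encoding tells a proxy cache in the path to store the
    compressed and uncompressed responses as separate cache entries.
    """
    headers = {"Vary": "Accept-Encoding"}
    if "gzip" in accept_encoding.lower():
        headers["Content-Encoding"] = "gzip"
        return gzip.compress(body), headers
    return body, headers
```

A browser sending "Accept-Encoding: gzip, deflate" gets the compressed variant; anything else falls through untouched.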
> -----Original Message-----
> From: Gordon Mohr [mailto:firstname.lastname@example.org]
> > IIS 5.0 supports http/1.1 compression of selected file
> > types out of the box.
> > > netscape has supported transparent gzipping since..um, a long
> > > time ago.
> > > and I've seen sites use it.
> Yes -- I knew this stuff was possible, I just haven't noticed it
> in widespread use.
> That suggests to me that either (1) it's so transparent it's easy
> to miss; or (2) it doesn't save enough in bandwidth costs to be
> worth the trouble.
> If (2), Curl's business case is really in trouble, because if
> people can't be convinced to use free off-the-shelf compression
> to save on bandwidth, why would they spend for a whole new
> active-content system to save on bandwidth?
> - Gordon
This archive was generated by hypermail 2b29 : Fri Apr 27 2001 - 23:14:15 PDT