Re: HTTP serving compressed content (was Re: Perl "competitor" Curl raises $52M)

From: Andy Armstrong (andy@tagish.com)
Date: Thu Mar 15 2001 - 15:44:21 PST


Damien Morton wrote:
>
> I haven't noticed it in widespread use either, and I think the reasons are
> simple.
>
> Firstly, it's probably too processor-intensive to run on dynamic content (e.g.
> request-generated HTML pages).

Not really -- I don't have specific figures to hand, but the processing
cost tends to be fairly insignificant compared with the bandwidth
saving.
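
For a rough feel for that trade-off, here's a quick Python sketch (the
page content is made up, and this is not a benchmark of any real
server) that prints the CPU time spent compressing a page next to the
bytes saved:

    import gzip
    import time

    # Stand-in for a dynamically generated HTML page -- substitute
    # a real response body to measure something meaningful.
    html = ("<html><body>"
            + "<p>Some repetitive page text.</p>" * 500
            + "</body></html>").encode("utf-8")

    start = time.perf_counter()
    compressed = gzip.compress(html, compresslevel=6)
    elapsed = time.perf_counter() - start

    print(f"original:   {len(html)} bytes")
    print(f"compressed: {len(compressed)} bytes")
    print(f"ratio:      {len(html) / len(compressed):.1f}:1")
    print(f"CPU time:   {elapsed * 1000:.2f} ms")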

> Secondly, it's not really much use when applied to the most bandwidth-intensive
> static content (GIFs, JPEGs)

No indeed, but there are plenty of sites that re-use the same graphics
from page to page, changing only the text -- news sites being an obvious
case in point.

> Thirdly, it doesn't benefit modem users (on the ISP->modem->user leg) at all
> (they already have compression)

It does benefit modem users quite considerably: HTTP-level compression
tends to compress text a lot better than the modem's own link-level
compression does (we've seen 6:1 ratios on typical HTML content).
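
If you want to check the ratio on your own pages, something along these
lines will do (the URL below is only a placeholder):

    import gzip
    import urllib.request

    # Point this at any text-heavy page you serve.
    url = "http://www.example.com/"
    with urllib.request.urlopen(url) as resp:
        body = resp.read()

    compressed = gzip.compress(body)
    print(f"{len(body)} -> {len(compressed)} bytes "
          f"({len(body) / len(compressed):.1f}:1)")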

> On the other hand, I've been finding that rich-media static content benefits
> a great deal (it seems to work a treat on certain kinds of Flash files).
>
> It would probably be most useful implemented in a proxy cache.

mod_gzip works pretty transparently with Apache.
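
If you want to verify from the outside that the negotiation is actually
happening, a request with Accept-Encoding set will tell you (the
hostname below is illustrative only):

    import gzip
    import urllib.request

    # Ask for gzip and see whether the server actually sends it.
    req = urllib.request.Request(
        "http://www.example.com/",
        headers={"Accept-Encoding": "gzip"},
    )
    with urllib.request.urlopen(req) as resp:
        encoding = resp.headers.get("Content-Encoding", "identity")
        body = resp.read()

    print("Content-Encoding:", encoding)
    if encoding == "gzip":
        print("decompressed size:", len(gzip.decompress(body)))
    else:
        print("size:", len(body))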

I can't actually understand why more people aren't using compression on
their servers -- in the right circumstances it's extremely effective.

-- 
Andy Armstrong, Tagish
