Plucker Email Reflector (fwd)

Meltsner, Kenneth <Kenneth.Meltsner@ca.com>
Sun, 19 Jan 2003 12:43:40 -0500


I like WinHTTrack, a good (and free) offline copier.
(http://www.httrack.org)

Ken

-----Original Message-----
From:	Eugen Leitl [mailto:eugen@leitl.org]
Sent:	Sun 1/19/2003 4:25 AM
To:	Tom
Cc:	fork@xent.com
Subject:	Re: Plucker Email Reflector (fwd)

Speaking of which, what do you people use to pull up a number of defined
pages (both free and requiring authentication) into your notebook, to read
at leisure on the way to work? Ideally some crontabbable Perl LWP
thingy which doesn't heed robots.txt and grabs a shallow (2-3 links deep,
local server only) copy of everything in a dailynewsurls.txt?
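
For the curious, a minimal sketch of such a crontabbable fetcher in Perl
follows. It assumes the CPAN modules LWP::UserAgent, HTML::LinkExtor and
URI; the list file name (dailynewsurls.txt), the mirror/ output directory
and the depth limit are illustrative guesses at the wish above, not a
finished tool. Plain LWP::UserAgent never consults robots.txt (only
LWP::RobotUA does), and basic-auth pages could be handled by adding
$ua->credentials(...).

#!/usr/bin/perl
# Sketch: shallow same-host mirror of every URL listed in dailynewsurls.txt.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;
use File::Path qw(mkpath);

my $depth_max = 2;              # follow links at most 2 levels deep
my $outdir    = 'mirror';       # where fetched pages land
my $ua = LWP::UserAgent->new(agent => 'Mozilla/5.0');
my %seen;                       # don't fetch the same URL twice

sub save {
    my ($uri, $content) = @_;
    my $path = $uri->host . ($uri->path || '/');
    $path .= 'index.html' if $path =~ m{/$};
    $path = "$outdir/$path";
    (my $dir = $path) =~ s{/[^/]*$}{};
    mkpath($dir);
    open my $fh, '>', $path or return;
    binmode $fh;
    print $fh $content;
    close $fh;
}

sub fetch {
    my ($url, $depth) = @_;
    return if $seen{$url}++ or $depth > $depth_max;

    my $resp = $ua->get($url);
    return unless $resp->is_success;

    my $uri = URI->new($url);
    save($uri, $resp->content);

    # Recurse only into HTML, and only on the same host ("local server").
    return unless $resp->content_type eq 'text/html';
    my $extor = HTML::LinkExtor->new(undef, $url);  # base URL => absolute links
    $extor->parse($resp->content);
    for my $link ($extor->links) {
        my ($tag, %attr) = @$link;
        next unless $tag eq 'a' and defined $attr{href};
        my $next = URI->new($attr{href});
        next unless $next->scheme and $next->scheme =~ /^https?$/;
        next unless $next->host eq $uri->host;
        fetch($next->as_string, $depth + 1);
    }
}

open my $list, '<', 'dailynewsurls.txt' or die "dailynewsurls.txt: $!";
while (my $url = <$list>) {
    chomp $url;
    next if $url =~ /^\s*(?:#|$)/;      # skip comments and blank lines
    fetch($url, 0);
}

Dropped into a crontab entry along the lines of
"0 6 * * * perl /home/you/fetchnews.pl" (path hypothetical), it would
leave a browsable tree under mirror/ before the morning commute.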