Re: distributed spiders


From: Sandor Spruit (aspruit@acm.org)
Date: Tue Feb 08 2000 - 01:47:01 PST


Eugene,

On Tuesday, February 08, 2000, 12:19:53 AM, you wrote:

Eugene> How could one motivate people to install warez which scan their
Eugene> network vicinity (all addresses within a few hops) for indexable
Eugene> sites, locally build an index, and submit it to a central (where the
Eugene> main search engine resides) location?
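The scheme Eugene describes could be sketched loosely like this: walk
the host graph out to a few hops, index what you find, and ship the
result to the central collector. Everything here (the function name,
the toy TOPOLOGY map) is hypothetical illustration, not any real tool.

```python
# Loose sketch of "scan your network vicinity": a breadth-first walk
# over a host adjacency map, limited to max_hops hops from the origin.
# The indexing/submission steps are left out; this only finds targets.

def hosts_within_hops(origin, topology, max_hops=2):
    """Return the set of hosts reachable within max_hops of origin."""
    seen, frontier = {origin}, [origin]
    for _ in range(max_hops):
        # Expand one hop: neighbours of the current frontier we have
        # not visited yet.
        frontier = [n for h in frontier for n in topology.get(h, [])
                    if n not in seen]
        seen.update(frontier)
    return seen

# Tiny example topology: each host lists its direct neighbours.
TOPOLOGY = {
    "a": ["b"],
    "b": ["c"],
    "c": ["d"],
}
```

With max_hops=2 a spider on "a" would index "a", "b" and "c" but stop
short of "d", keeping each spider's workload local.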

Why would you submit the information to a central engine? I thought
we'd given up on "central-location-I-control-it-all" and were moving
into the distributed arena. Why not distribute the search engine
itself and gradually escalate a search attempt from "local" to "far
away"?
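The escalation idea above might look something like this: query the
local index first, then rings of progressively more distant peers,
stopping as soon as enough hits turn up. All names here (PEER_RINGS,
escalating_search) are made up for illustration.

```python
# Each ring holds the indexes of peers at increasing network distance:
# ring 0 is the local index, ring 1 the nearby peers, and so on.
PEER_RINGS = [
    [{"python tutorial": ["local.example/py"]}],             # local
    [{"python tutorial": ["nearby.example/learn-py"]}],      # nearby
    [{"python tutorial": ["far.example/py-course"]}],        # distant
]

def escalating_search(query, rings, want=2):
    """Collect hits ring by ring; stop once `want` hits are found."""
    hits = []
    for ring in rings:
        for index in ring:
            hits.extend(index.get(query, []))
        if len(hits) >= want:
            break  # enough local answers; no need to go further away
    return hits
```

A query that local and nearby peers can answer never touches the
distant ring, which is exactly why local engines stay fast and fresh.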

I've been wondering for a long time now why this isn't happening in
some structured way already. No one can keep up with the growth of the
Net. The results of global search engines grow ever more useless. On
the other hand, local search engines often prove to be very
up-to-date.

Our "local" and - incidentally - very clueful Dutch research network
SURFnet, for example, runs a private engine that indexes all the
websites of the Dutch institutions connected to it. The obvious
result: very fast, up-to-date and helpful responses.

Eugene> I mean motivation other than paying them and/or giving them priority
Eugene> in using the search engine.

My response would be: the search engines would once again be useful,
as back in the good old days when the Web was just starting to take
off.

Regards,
Sandor

-- 
ir A.G.L. Spruit, Department of mathematics and computer science
Utrecht University, the Netherlands



This archive was generated by hypermail 2b29 : Tue Feb 08 2000 - 01:47:17 PST