Date: Thu Jul 27 2000 - 13:27:12 PDT
Full text of the article included below. I don't think a DoS
against all search engines plus link hubs (Yahoo et al.) will
somehow take out every user's bookmarks, though.
Assuming that the Net is not a randomly meshed network, where each
node maintains several physical links to its neighbours (as it
should be), one could try a DoS (doubtful, though, since these
things are designed to process a lot of packets) or corruption
of a bottleneck router linking two large network islands. This
would fragment the net, and thus reduce its usability somewhat,
but not take it out altogether.
Anyway, we need a web which interconnects local users with multiple
local links, instead of the typical tree topology (where trunks get
fatter and fatter, until they turn into a few backbones), which can
be most easily and cheaply done with meshed networks implemented as
cellular wireless packet-switched networks.
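The tree-vs-mesh point is easy to demonstrate on a toy graph (a hypothetical six-node sketch, not any real topology): cutting a single trunk link splits a tree into pieces, while a mesh with a few extra local cross-links stays connected.

```python
from collections import deque

def components(nodes, edges, cut=frozenset()):
    """Count connected components after deleting the links in `cut`."""
    adj = {v: [] for v in nodes}
    for a, b in edges:
        if frozenset((a, b)) in cut:
            continue  # this link has been severed
        adj[a].append(b)
        adj[b].append(a)
    seen, comps = set(), 0
    for start in nodes:
        if start in seen:
            continue
        comps += 1  # found a new component; flood-fill it with BFS
        queue = deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
    return comps

nodes = range(6)
tree = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5)]   # trunk-and-branch
mesh = tree + [(3, 4), (4, 5), (5, 3), (2, 4)]    # plus local cross-links
cut = {frozenset((0, 1))}                         # sever one trunk link

print(components(nodes, tree, cut))   # 2: the tree splits in two
print(components(nodes, mesh, cut))   # 1: the mesh stays connected
```

The same single-link failure that partitions the tree is routed around in the mesh, which is the argument for multiple local links per node.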
We also need distributed search engines. Each website (ideally,
even users with static IP) should be able to offer a fully
searchable index of itself plus its neighbours.
From: Jesse Davis <email@example.com>
Off-topic, but if anyone's curiosity was piqued by our discussion a few
months ago about taking down the 'net--topology, strategy,
analysis--there's an article on topological vulnerabilities in the *web*.
They take an interesting tack--they analyze the web for *linking*
topology, rather than routing topology, and find that the diameter is
about 19 clickable links. If one disabled the most-linked-through sites,
the article claims you could disable the web without taking down any
routers, because users wouldn't be able to click from site to site. This
is naive--it focusses on linking among static pages, which is increasingly
irrelevant--but it's quite interesting. There's also a discussion of the
net as an organism.
|| A. Jesse Davis || firstname.lastname@example.org || www.cs.oberlin.edu/~ajdavis/ ||
"I think what an abrupt precipice cleaves asunder the male intelligence,
and how they pride themselves upon a point of view which much resembles
stupidity." --Virginia Woolf, Diaries 1919.
Beowulf mailing list
'Self-Assembly' Leaves Large Networks Vulnerable to Attack, Study Finds
By FLORENCE OLSEN
A report in today's issue of the journal Nature describes a
fundamental principle of self-organizing systems that helps explain
why the Internet and the Web, like some other communications networks,
are sitting ducks for saboteurs.
Researchers led by Albert-László Barabási, an associate professor of
physics at the University of Notre Dame, present evidence that the
Internet is highly tolerant of random failures among the millions of
routers and servers that make up the network. Even the Web, comprising
millions of documents and links, can tolerate random failures quite well.
But using computer-modeling and experimental data-visualization
techniques, Mr. Barabási and his colleagues demonstrate why the Web
and the Internet are highly vulnerable to attacks that use malicious
software agents. To render the networks unusable, such agents would
have to attack only the routers and servers -- or the Web pages --
that provide the most connections to the rest of the network.
"Understanding the structure of these networks is the first step
towards designing tools that could, in the long term, help us,"
Mr. Barabási says. "However, this is not an easy task," he adds,
because the networks' vulnerability is not the result of engineering
design. Each institution adds its own links and routers as needed, he says.
The Internet and the Web are the result of "a self-assembly process,"
Mr. Barabási says. "There is no central engineering design that is
flawed." If people could grasp that notion, he says, they would not
expect that "a silver bullet" could fix the networks' structural flaws.
Proponents of new spending on protection against cyber-terrorism may
see immediate practical value in Mr. Barabási's research. But he sees
much larger implications for his theoretical model of large networks.
Even though most people think of the Internet and Web as one network,
the Notre Dame researchers analyzed them as separate -- and tried to
explain why they are vulnerable to cyber-attack but resilient in other respects.
For instance: Mr. Barabási and his colleagues -- Hawoong Jeong, a
postdoctoral research associate, and Réka Albert, a doctoral candidate
in physics at the university -- write that the hidden structures and
growth patterns of the Web appear to be similar to those of complex
living systems, including the metabolic networks that operate inside
cells and the social networks that make up societies.
The Nature article builds on earlier research in which Mr. Barabási
described the Web, which now comprises nearly one billion pages, as
having a surprisingly small "diameter." The Web's diameter, as he
describes it, is a measure of the average number of clicks required to
navigate between any two Web pages. Any randomly selected Web page is
separated by an average of only 19 links, or mouse clicks, from any
other randomly selected Web page. (See an article from The Chronicle,
September 9, 1999.)
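The small-diameter claim is easy to reproduce on a synthetic graph. The sketch below grows a scale-free network by preferential attachment (the Barabási-Albert mechanism) and estimates the mean shortest-path length by breadth-first search; the sizes and parameters here are illustrative, not the paper's actual measurements of the Web.

```python
import random
from collections import deque

def barabasi_albert(n, m, seed=42):
    """Grow an n-node graph by preferential attachment: each new node
    adds m links, favouring nodes that already have many links."""
    rng = random.Random(seed)
    # start from a small fully connected core of m+1 nodes
    adj = {i: set(range(m + 1)) - {i} for i in range(m + 1)}
    # nodes appear here once per incident edge, so a uniform pick
    # from this list is degree-weighted (preferential) selection
    targets = [v for v in adj for _ in adj[v]]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        adj[new] = set()
        for t in chosen:
            adj[new].add(t)
            adj[t].add(new)
            targets.extend((new, t))
    return adj

def avg_path_length(adj, samples=100, seed=0):
    """Estimate the mean shortest-path length via BFS from sampled sources."""
    rng = random.Random(seed)
    total, count = 0, 0
    for src in rng.sample(list(adj), samples):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

g = barabasi_albert(2000, 2)
print(f"{len(g)} nodes, mean path length {avg_path_length(g):.2f}")
```

Even with thousands of nodes, the mean path length stays in single digits, which is the same "small diameter" effect the article reports for the billion-page Web.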
In today's Nature, Mr. Barabási's group writes that a well-aimed
attack could expand the diameter of the Web to such an extent that it
would no longer be practical to follow links. In modeling attacks on
both the Internet and Web, they found that "connectivity is maintained
by a few highly connected nodes," and that destroying only 4 percent
of those nodes would effectively disable the Web.
"The diameter of these networks increases rapidly, and they break into
many isolated fragments when the most connected nodes are targeted,"
the group reports.
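That asymmetry, fragile under targeted attack but robust under random failure, can be reproduced on the same kind of synthetic scale-free graph. The sketch below removes 4 percent of the nodes either by highest degree ("targeted") or uniformly at random, then measures the largest surviving connected component; the graph and parameters are illustrative stand-ins, not the study's data.

```python
import random
from collections import deque

def barabasi_albert(n, m, seed=42):
    """Scale-free graph grown by preferential attachment (a stand-in
    for the hub-dominated topology the article describes)."""
    rng = random.Random(seed)
    adj = {i: set(range(m + 1)) - {i} for i in range(m + 1)}
    targets = [v for v in adj for _ in adj[v]]   # degree-weighted pool
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        adj[new] = set()
        for t in chosen:
            adj[new].add(t)
            adj[t].add(new)
            targets.extend((new, t))
    return adj

def largest_component(adj, removed):
    """Size of the largest connected component once `removed` nodes are gone."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        queue = deque([start])
        seen.add(start)
        size = 0
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

g = barabasi_albert(2000, 2)
k = int(0.04 * len(g))                                  # remove 4% of nodes
hubs = sorted(g, key=lambda v: len(g[v]), reverse=True)[:k]
randoms = random.Random(1).sample(list(g), k)
print("after targeted attack:", largest_component(g, hubs))
print("after random failure: ", largest_component(g, randoms))
```

Removing the same number of nodes at random barely dents the giant component, while taking out the highest-degree hubs shrinks it sharply, matching the group's finding that connectivity hinges on a few highly connected nodes.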
Could that knowledge help malicious people launch more efficient and
devastating attacks on the Web? "I hope not, but my suspicion is that
it might," Mr. Barabási says.
Yuhai Tu, a scientist at the International Business Machines
Corporation's Thomas J. Watson Research Center, says the Notre Dame
researchers present a useful approach to analyzing many varieties of
"self-organized" networks, including gene-regulatory networks.
"Perhaps if we could understand why a certain network topology is
preferred and selected by nature," he writes in the same issue of
Nature, "such knowledge could ultimately help us design more-robust networks."
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:37:21 PDT