In honor of the P2P conference, so I can rant by proxy. :-) Maybe I
should have sent this to the conference folks, and gotten myself an
invite too. I could turn this into a half hour talk no problem, with a
title like "Essential Qualities and Emergent Properties of P2P". Ah
well. It's that lurker DNA somewhere in my codon set.
I wrote this several months ago, and thought I had posted it to fork,
but hadn't, just to a small private list. I got a suggestion from
several folks that I fork this, including the list owner, but hadn't
gotten around to sanitizing it enough. I get too cranky sometimes when
I write, and don't take care to buffer against potential ego-damage.
The adjustable spray nozzle of personal opinion has now been adjusted
from "needle massage" to "fine spray". :-)
-----microrant, copyright 2000 Strata R Chalup, please don't forward----
Yet another "vision" of the Internet. This one is dangerous, because it
sounds like it could be so right, but it's SO fundamentally wrong!
The guy is standing right next to the car, and all he's thinking is
"wheel, wheel, wheel".
I had a suspicion, took a look at [author]'s homepage, and indeed found
the thing that I suspected: "Since discovering the net in 1993, (blah blah)..."
The "PC People" of the 90's have a disadvantage when it comes to the
Internet, and in some ways, so do folks who were lucky enough to have
access to personal computers in the 80's. That disadvantage is this:
they do not ASSUME the network, and network services.
I had one week-long experience with Fortran programming on a dumb
terminal, during a summer program in 1980. After that, *all* of my
experience of *any* computers, including further high school stuff at
UMaine Orono, was of network-connected systems. Even the IBM system at
UMO had some kind of link such that the most compelling app (for me!)
was a chat system.
When I got to MIT in 1981, my first account there was on XX (later
XX.LCS.MIT.EDU). I had accounts on CCC, EDDIE, DEEP-THOUGHT, MC, AI,
OZ, and so on. Every one of those systems had some kind of chat,
filesharing, network services, and the like. Remember Chaosnet?
MC/AI/MX/DM had file sharing! Remember TOPS-20? Job batching to other
systems! and so on. Putting up a new machine meant *joining the
community of services*, and folks measured the contribution of a machine
by what services it provided. The old MIT/UCB/SAIL "turist economy" is
where the whole "P2P ethic" stuff of Napsterism comes from! But even
more important, there was the assumption that *every machine was a server*.
To some of the P2P crowd, actually being a *server* (ooh!) instead of
just a glorified terminal with an ethernet card up its butt is new and magical,
and the essence of P2P. DOH! I've seen claims like "DNS was designed
to be updated rarely". Hee hee! Has the gentleman even looked at the
innards of DNS, or has he been busy being a PC/Web/P2P guy rather than
doing infrastructure? Does he perhaps remember the daily HOSTS.TXT
update, done precisely because new servers were coming online, and old
ones might change as universities' networks grew? I don't think so!
These poor guys have *always* been stuck out there in DHCP land, where
the net infrastructure indeed breaks down because you can't *find* things.
[Which is by design, for reasons I understand, but don't necessarily
agree with and won't go into here, mostly address conservation (fine)
and attempted bandwidth conservation (less so).]
And other folks are eating this up, because at heart they are PC guys
too. I don't mean whether somebody bleeds 4 colors or 6 colors, I mean, "is
your fundamental experience all about The Box rather than The Network?"
Think about most of the services and apps that you know-- aren't they at
heart just The Box talks to Another Box?
Now we have P2P, and folks are essentially casting this as "the Orphaned
Box talks to Other Orphaned Boxes". This is where they're so close to
right, but so far away!
Someone said that "Calling this new class of applications peer-to-peer
emphasizes their difference from the dominant client/server model". He
does not seem to realize that until/unless things like Napster start
doing direct broadcast discovery of other nodes on the network, they are
VERY much still client and server. Anyone forgotten how Napster was
almost shut down? Try calling it "not client and server" with Napster
Central shut down!
Napster clients register an IP addr with the server to avoid DNS lookups
and because Napster shifts control of the file transfers to the nodes
once the server hooks them up. Yeah, but that's still server-client,
NOT peer to peer.
Think about Napster searches: I wanna find "I'm Forever Blowing
Bubbles", so I search for it. My client searches the Napster computer
I'm connected to, just like asking the DNS server in resolv.conf where
something is. The Napster server is in that way like a DNS root node.
Unfortunately, as we know with Napster, I can only ask a single Napster
server, and if I log into a different one, I get different yet still
incomplete results. Unlike the root servers of DNS, the Napster servers
are all separate. Why? Because they are brokering IP addrs for each
other, just as described. BUT (very important) there is no reason why
they couldn't do a server to server protocol so that a Napster server
could return the results of searches on ALL the Napster servers, and
pass the client IP info between servers. I don't know what their
backend protocol looks like, but it's fundamentally flawed if they can't
do this. The point is, that Napster is DNS for Music, if you want to
put it in this guy's value system-- it maps a song name, not a machine
name, into an IP address.
Bingo! You've just discovered the TRUE magic of P2P, and why DHCP vs
fixed addressing/DNS is just a big ol' red herring. The specific
essence of P2P is that it maps *something besides machine names* into IP
addresses. Hello!! That's IT. The rest is just UI.
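A toy sketch of that thesis (all class and function names are mine, not Napster's): each server maps song titles, not hostnames, to the IP addresses of live peers, and the federated lookup is the server-to-server protocol the backend apparently lacked.

```python
# Hypothetical "DNS for music": each server maps song titles -> peer IPs.
# Federating the lookup across servers is the missing server-to-server
# protocol described above; nothing here is Napster's actual wire format.

class SongServer:
    def __init__(self):
        self.index = {}          # song title -> set of peer IP addrs

    def register(self, song, peer_ip):
        """A client registers its IP under a song it is sharing."""
        self.index.setdefault(song, set()).add(peer_ip)

    def lookup(self, song):
        """Resolve a song title to the peers that have it, like a DNS query."""
        return set(self.index.get(song, set()))

def federated_lookup(servers, song):
    """Ask every server and merge the results, instead of just the one
    server the client happened to log into."""
    hits = set()
    for server in servers:
        hits |= server.lookup(song)
    return hits

# Two separate "Napster servers" that never talk to each other...
a, b = SongServer(), SongServer()
a.register("I'm Forever Blowing Bubbles", "10.0.0.5")
b.register("I'm Forever Blowing Bubbles", "10.0.0.9")

# ...until a federated query merges both indexes.
print(federated_lookup([a, b], "I'm Forever Blowing Bubbles"))
```

The point the sketch makes: the "resolution" step is identical in shape to DNS, only the key being resolved is a song name rather than a machine name.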
Now there's a whole lot of extra goodies that come out as emergent
properties of the system. Look at all the P2P apps, and think of which
ones implement which things. These are the building blocks.
- community (you are mapping the same metaclass of things into
IP addrs, and within that space there are going to be multiple
individuals mapping the same object-- other folks looking for "I'm
Forever Blowing Bubbles", for instance),
- liveness (you are doing this based on who is live right now,
since they register or are scanned via broadcast),
- directory index construction (results of all searches for
"bubbles", etc; you can build results-oriented or category-oriented
indexes-- what *is* "bubbles"-- and so on),
- knowledge base construction (it's HOMR and clones again: I
like this, this, and this, so if you like that and that, you'll
probably like this too),
- reputation system (no one seems to be doing this except
MojoNation, but any time you have a system of repeatable accesses, you
can proffer an arbitrary unique id string and build reputation on it),
- anonymity at transaction level (ie, we can find out who you
are eventually based on ISP logs, etc, but transaction peers don't
inherently know at time of transaction)
- true decentralization (using directed broadcast onto subnets
or CIDR blocks where you've seen peers before, keeping a file of places
to look, having various levels of authoritative central servers, freenet
on steroids, etc-- or you can cheat like they all do now and use a
central server).
You want a real peer to peer revolution? Just implement a variant of
the old twisty maze of passages all differently alike and alikely
different from the usenet heyday: get a bunch of XML passing
servers out there, and have the P2P discovery stuff be "newsgroups".
Some will be routed to other servers, some won't, on a truly ad hoc basis.
If a client doesn't want to do a directed broadcast to discover other
peers (broadcast being the purist's hallmark of a true P2P system!), they
could just subscribe to a group "myp2papp", post XML into the group to
say "I'm here at 123.456.789.123, anybody else around?", and then be
contacted directly by other apps subscribed around the world.
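The announcement itself could be trivially small. Here's a sketch of what that "post XML into the group" step might look like (the element and attribute names are invented for illustration, not any real protocol):

```python
# Hypothetical presence announcement for the "myp2papp" group:
# a peer posts a tiny XML blob saying where it can be reached, and
# any subscriber can parse it and contact the peer directly.
import xml.etree.ElementTree as ET

def make_announcement(app, ip, port):
    """Build the 'I'm here, anybody else around?' message."""
    peer = ET.Element("peer", app=app, ip=ip, port=str(port))
    return ET.tostring(peer, encoding="unicode")

def parse_announcement(xml_text):
    """A subscriber extracts who to contact from a posted message."""
    peer = ET.fromstring(xml_text)
    return peer.get("app"), peer.get("ip"), int(peer.get("port"))

msg = make_announcement("myp2papp", "192.0.2.7", 6346)
print(parse_announcement(msg))   # ('myp2papp', '192.0.2.7', 6346)
```

Note the group carries only this control traffic-- the actual file transfers (or chats, or whatever the app does) happen peer-to-peer after discovery.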
There would be no monolithic central server that could be shut down--
remember that university trying to shut down usenet because of something
posted to rec.humor.funny? The most they could do is ask sites to
excise one topic, and then you'd just rename the service and pop to
other topics. And usenet eventually became a real propagation chain,
where an article posted in any country would be all around the globe
within minutes (ok, sometimes seconds). The age of "we don't carry that
group" or lossy propagation was overcome. And since we wouldn't be
carrying *content* on the "P2PNews", just control messages, the volume
would not get out of hand as it did with usenet.
Of course, what you wouldn't get is another ICQ, another Napster,
another huge valuation company, because (very important!) nobody would
truly control the infrastructure. It's very ironic that the industry
is racing to embrace P2P and extolling its "peeriness" when REALLY all
they want is to be the next big Keeper of the Central Server That
Connects the Clients. OK, maybe somebody can make money selling
clients, but I think that kind of thing has gone the way of paying for
browsers. Still and all, there's Opera, so who's to say?
I know, I know-- somebody somewhere wants to collect money for this, so
you'd probably rather have the P2P apps communicate their content via
XML events rather than just find each other through your server
network. Maybe some of them will. But to get the server network out
there as something to get excited about using to build P2P, you've got
to set the framework for arbitrary app discovery.
Sure, the meta-connecter would be XML wrapped, but once the technique is
used, the cat's out of the bag and somebody will be writing a
freeplacement (I should jargon that to TBTF!). Of course, if a
commercial distributed-server service spins itself as cheap
enough, reliable enough, etc, they wouldn't need to fight or switch.
And there would still be the opportunity to do branded clients, to do
server nets for big companies, to partner on developing
custom apps, to plug microservers into browsers for the OS vendors' own
apps to talk to each other (to their advantage, since they could then
talk to other hosts and be truly .NET (ugh) enabled), and so on.
There's no reason this all couldn't be done via netnews right now. Just
set up a new hierarchy like the control groups, where instead of sendsys
control messages you post "hey, I'm here!" and just use the ISP's news
server as your aggregation point. Problem is, then your client has to
do a lot of work to figure out if they're *still* here, and clumsy
time-based reaping wouldn't solve that. That's solvable using XML
wrappers, and posting messages with an expiration date, or with a unique
channel-id in the headers of the original message that is repeated in
subsequent messages by the client, creating a virtual session.
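The expiration-plus-channel-id scheme is simple enough to sketch (the message shape here is assumed, not any real netnews header format): a presence post carries an expiry time and a channel-id, a repeat post with the same channel-id renews the virtual session, and reaping drops anyone whose latest post has expired.

```python
# Hypothetical session reaping for presence posts. Each post carries a
# channel-id and a TTL; repeating the channel-id in a later post renews
# the virtual session. Times are passed in explicitly to keep the
# sketch deterministic.

sessions = {}   # channel_id -> (peer_ip, expires_at)

def post_presence(channel_id, peer_ip, ttl, now):
    """Record (or renew) a presence post with an expiration date."""
    sessions[channel_id] = (peer_ip, now + ttl)

def live_peers(now):
    """Reap by expiry: only peers whose latest post is unexpired."""
    return {cid: ip for cid, (ip, exp) in sessions.items() if exp > now}

t = 1000.0
post_presence("chan-1", "192.0.2.7", ttl=300, now=t)
post_presence("chan-2", "192.0.2.8", ttl=60, now=t)
post_presence("chan-1", "192.0.2.7", ttl=300, now=t + 120)  # renewal

print(live_peers(t + 200))   # chan-2 has expired; renewed chan-1 survives
```

This is exactly why crude time-based reaping alone isn't enough: without the channel-id tying posts together, a renewal would look like a brand-new peer rather than the same session continuing.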
Hell, you could do an XML wrapper gateway that responded to DHCP
(remember the part about broadcast?), bidirectional proxied the IP addr
request & lease renewal, and used that as a starting point to post
presence status for all the apps that the client registered with the
request (could be overridden for individual apps by the client itself).
That's what DHCP has to evolve into to support P2P, only there's nothing
for it to register apps with.
You want "vision", I got "vision". This stuff is completely obvious, and
a natural evolution of serviceability on the net. I don't understand
why nobody's doing it-- have they just chosen not to, and want to use
current infrastructure to be the next Napster clone? This was an hour
or so of stream-of-consciousness writing, and I have a hard time
believing that I'm the only one thinking this! The other folks are probably just
not getting around to posting it, like me.
The P2P community is not dumb. All these guys have done great stuff--
great UI's, great apps, written great code, etc. A lot of them are
nonetheless all about The Box and just learning about The Net. The
fundamental property of The Net is not hostnames, it's a mesh of
services. It's never been about nodes, it's been about services between
the nodes. P2P needs to grow up and wash behind the ears.
PS- And this "PCs at the edge are a wasted resource" stuff-- they've got a
point about wasted resources, but how many people get excited about that? Not
a whole lot. Sure, SETI is excited, and the cypherpunks cracking codes,
and the ray-tracers, and so on. But ya know, even together they are
still a small set compared to the ICQ/Napster/etc folks, who just want
to use the net as a true extension of how they already live and what
they do. That's true "virtual reality", when the tools stay tools and
what you do with them is the star of the show.
--
========================================================================
Strata Rose Chalup [KF6NBZ]                      strata "@" virtual.net
VirtualNet Consulting                            http://www.virtual.net/
 ** Project Management & Architecture for ISP/ASP Systems Integration **
=========================================================================
This archive was generated by hypermail 2b29 : Fri Apr 27 2001 - 23:19:00 PDT