Keywords: agoric systems, computational ecologies, resource auctioning,
Mark Miller, K. Eric Drexler, Bernardo Huberman, contracts, distributed
trust, metered usage, software objects, software ICs, superdistribution,
Brad Cox, emergent order.
In physics there are various conservation laws: conservation of energy,
mass, charge, and whatnot. You all know about this... Conservation of mass
says that mass is neither created nor destroyed. (For smart aleck
quibblers, conservation of mass-energy.)
How does this relate to our issues?
"Abuse of Resources": Mail loops, infinite loops, spamming, overloads of
networks, and congestion in general are cases where "unrealistic" models of
costs are implemented in software. In the real physical world, infinite
loops don't occur (at least not in the sense seen with mail loops, as any
process that keeps running must keep consuming real resources).
Conservation laws are related to the "cost model" of the universe. Real
physical objects have costs, or ontological status, or presence.... (Please
don't read too much into this point...I mean to be suggestive, not
rigorous.)
There are no "memory leaks" in the universe which suddenly fill it up with
stuff, no perpetual motion machines, no creation and destruction of
matter out of nothing.
Cyberspace Ontologies: There are several things which need to be done to
make the cyberspatial world more like the spatial world:
* payment for CPU cycles consumed (via contractual, permission-based
access: "If you want access to this machine, here are the terms and
conditions.")
* metering mechanisms, such as e-stamps for e-mail (essentially a special
case of the first point, where a machine says "I'll pass on your message if
you pay me to.")
* digital contracts, agreements on usage and payment (resource auctioning,
or the "smart contracts" that Nick Szabo has written about)
(you can all think of additional examples....)
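The e-stamp idea in the second point can be sketched in a few lines. This is a toy illustration only: the Relay and Message names, the flat price, and the drop-on-nonpayment policy are all my own assumptions, not any deployed protocol.

```python
# Toy sketch of an "e-stamp" relay: "I'll pass on your message if you
# pay me to." Names and pricing here are hypothetical, for illustration.
from dataclasses import dataclass

@dataclass
class Message:
    body: str
    stamp: int  # attached payment, in arbitrary units

class Relay:
    def __init__(self, price):
        self.price = price  # the relay's advertised terms
        self.revenue = 0

    def forward(self, msg):
        """Relay the message only if the stamp covers the price."""
        if msg.stamp < self.price:
            return False  # unpaid mail is simply dropped, not forwarded
        self.revenue += msg.stamp
        return True

relay = Relay(price=2)
print(relay.forward(Message("hello", stamp=2)))  # True: paid, relayed
print(relay.forward(Message("spam!", stamp=0)))  # False: dropped
```

Note that the spammer's message imposes no cost on the relay beyond the check itself; the scarce resource (forwarding) is only spent when paid for.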
Cryptographic protocols have their uses here, but there are also some other
measures which bear looking into. In the LISP community, for example, work
has been done on "engines," which are building blocks that are "fueled up"
with "CPU fuel" and allowed to run for some amount of CPU cycles. Thus, one
could put an engine into a process and it would run for some number of
ticks, then stop.
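The engines idea can be sketched with a generator standing in for the computation and an explicit fuel counter. The Engine class, the fuel unit, and the yield-per-tick convention below are my own illustrative choices, not the Lisp community's actual interface.

```python
# A minimal sketch of the "engines" idea: a computation is "fueled up"
# with some number of ticks and stops when the fuel runs out. The names
# (Engine, fuel, counting_task) are hypothetical, for illustration.

def counting_task():
    """A toy computation that yields once per unit of work."""
    n = 0
    while True:
        n += 1
        yield n  # hand control back to the engine after each tick

class Engine:
    def __init__(self, task, fuel):
        self.task = task
        self.fuel = fuel  # "CPU fuel": ticks this task is allowed to run

    def run(self):
        """Step the task until its fuel is exhausted; return last result."""
        result = None
        while self.fuel > 0:
            result = next(self.task)
            self.fuel -= 1
        return result

engine = Engine(counting_task(), fuel=100)
print(engine.run())  # the task advances exactly 100 ticks, then stops
```

The point of the design is that the limit lives inside the object being run, not in some external supervisor: an infinite loop here is harmless, because it can only burn the fuel it was given.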
(I'm sure there are Unix-level tools which do similar things, in terms of
giving a spawned process so many ticks of the clock. The "engines" concept
is somewhat more semantically clean, in that it's pushed down into the
"ontology" of the thing being simulated or run, and is not at the "God
level" (to use a non-technical term!).)
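One such Unix-level tool is the rlimit mechanism: a parent can cap a child process's CPU time before spawning it, and the kernel kills the child when the limit is hit. The sketch below uses Python's standard `resource` and `subprocess` modules (Unix only); the 1-second cap is an arbitrary choice for illustration.

```python
# Unix-level CPU metering via setrlimit: the child gets a fixed budget
# of CPU seconds and is terminated by signal when it exceeds it.
# Unix only; the 1-second limit is an arbitrary illustrative value.
import resource
import subprocess
import sys

def limit_cpu():
    # Cap soft and hard CPU-time limits at 1 second for the child.
    resource.setrlimit(resource.RLIMIT_CPU, (1, 1))

proc = subprocess.run(
    [sys.executable, "-c", "while True: pass"],  # a deliberate busy loop
    preexec_fn=limit_cpu,  # applied in the child before exec
)
print(proc.returncode)  # negative on Unix: child terminated by a signal
```

This is exactly the "God level" enforcement mentioned above: the constraint is imposed from outside by the operating system, rather than being part of the ontology of the running program itself.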
Now, certainly I support the right of any person or machine to run programs
freely and without charge, to pass on e-mail free of charge, to run
remailers for no charge, to accept spam mail without complaint, and so on.
What I'm suggesting is that many of the problems being seen with overuse of
resources, spam, congestion, and denial of service are really due to a poor
model of resource allocation. Unix and other modern operating systems offer
various tools for helping to constrain such problems, but, I submit, better
methods are needed.
(Especially when multiple machines, networks, and even anonymous sites are
part of the overall system....clearly the constraints must be managed
locally, and via "contract," as part of a computational ecology, and not as
a hierarchical, top down Unix-type operating system.)
Economics is about the "allocation of scarce resources." Many of the
existing models being used treat various scarce resources as _free_. Then,
when the inevitable problems occur, calls for top-down regulation are heard
(e.g., the frequent calls for illegalization of "unwanted mail").
In my view, building a consistent, distributed, "conservative" system is
what Cypherpunks need to be thinking about.
(I used the term "conservative" in the physics sense. A system in which
various conservation laws are obeyed.)
As I said before, this should not be compelled, but voluntary. However,
those who give their resources away for free (choosing not to adopt a
conservative ontology, in other words) should be in no position to complain
or run to the government for top-down regulation because their freely-given
resources are being overused or "abused" (in their thinking).
And closely related to this whole issue--and something I've written about
extensively--is the issue of "building walls in cyberspace." In the real
world, persistent structures are built out of real materials, resulting in
castles, forts, skyscrapers, bridges, houses, highways, etc. These objects
have persistence, have controllable access (gates, doors, locks,...), and
have "structural integrity."
Cryptographic and distributed trust protocols are about the only means I
can think of for constructing the equivalents in cyberspace. (And to a
large extent, this is already happening: the Net and the Web have structure
which cannot be demolished casually, or by top-down orders from any single
national leader. Millions of machines, linked in various ways and
implementing various protocols and "terms of service" with users and other
machines....an early version of the "conservative" system I think we'll
eventually see.)
Well, this gives the flavor of my points. I haven't rigorously argued all
of the points, but the Cypherpunks forum is for presenting informal
arguments.
We got computers, we're tapping phone lines, I know that that ain't allowed.
Timothy C. May | Crypto Anarchy: encryption, digital money,
firstname.lastname@example.org 408-728-0152 | anonymous networks, digital pseudonyms, zero
W.A.S.T.E.: Corralitos, CA | knowledge, reputations, information markets,
Licensed Ontologist | black markets, collapse of governments.
"National borders aren't even speed bumps on the information superhighway."
--- end forwarded text