From: Eugene Leitl (firstname.lastname@example.org)
Date: Fri Aug 18 2000 - 19:44:18 PDT
Eliezer S. Yudkowsky writes:
> > Eugene's not proposing to *do* anything to me with his uberbeing, like
> > subjugate me to some loony set of seed Rules that he thinks are spiffy and
> > fair. Yet, anyway. ;-) (Eugene?)
> As far as I can tell, 'gene wants to turn you loose in an ecology of
> competing superintelligences with no particular motivation to be nice to
Actually, I think that we live in an ecology of competing
intelligences (from the virus to humans and higher-level clusters of
humans) with no particular motivation to be nice to each other, beyond
whatever co-evolution has instilled in them. The "super" thing is only
in the eye of the observer.
My stance is rather that I don't see a mechanism allowing you to
escape that regime sustainably, unless the whole ecology is
sterilized, or has managed to sterilize itself. Of course, everybody
is encouraged to keep looking -- evolutionary algorithms are not
particularly enjoyable at the receiving end.
Of course, if you're lucky enough to be a smart critter, you can
progressively slide over into more benign cooperation algorithms,
since you remember the history of your past dealings with others, can
authenticate them, and can extrapolate.
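The "remember your past dealings" mechanism above is essentially the
iterated prisoner's dilemma: memory of a partner's prior moves is what
lets cooperation outcompete blind defection. A minimal sketch -- the
strategy names and payoff values are my illustrative assumptions, not
anything from this post:

```python
# Sketch: cooperation via memory of past dealings, modeled as an
# iterated prisoner's dilemma. Payoffs are the standard PD values.
PAYOFF = {  # (my move, their move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first; afterwards mirror the partner's last move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """A critter with no interest in cooperation."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b = [], []  # each entry: (my move, their move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

# Two memory-using agents settle into mutual cooperation:
print(play(tit_for_tat, tit_for_tat))    # (30, 30)
# A defector wins only the first round against a memory-using agent:
print(play(tit_for_tat, always_defect))  # (9, 14)
```

The point of the toy model: once agents can recall (and authenticate)
who did what to them, retaliation becomes targeted, and the benign
cooperative equilibrium becomes reachable.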
This archive was generated by hypermail 2b29 : Fri Aug 18 2000 - 20:50:27 PDT