Game Theory and the Golden Rule

Eugen Leitl eugen at leitl.org
Tue Dec 2 01:45:14 PST 2003


On Mon, Dec 01, 2003 at 10:10:26PM -0800, J. Andrew Rogers wrote:
> 
> I believe one can only make this assumption if one assumes rough intellectual
> parity between agents.  If one does not make this assumption, or assumes
> large disparities in practical intelligence, this does not hold.

Right. And the nature of AI emergence indicates it will be a
punctuated-equilibrium event. People just won't know what hit them.
 
> Computational parity between intelligent agents is one of those pervasive
> assumptions in human thinking and institutions based upon historical
> reality, one that more or less asserts that 1) only humans are intelligent
> agents, and that 2) all humans have functionally equivalent intelligence
> (although this illusion frays at the extremes).

It's very obvious, and it's truly remarkable how many people don't get
that the game changes completely once the old rules no longer apply.
 
> Some day, when we play "hamster" to the computer's "human", we'll be in for
> a rude awakening on a great many levels.

The moment you can buy a molecular circuitry desktop fabbing rig for 100 k$,
things are going to get interesting.

-- Eugen* Leitl http://leitl.org
______________________________________________________________
ICBM: 48.07078, 11.61144            http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
http://moleculardevices.org         http://nanomachines.net