RE: Eli and Brian Update [Fwd: [>Htech] "Friendly AI" released for commentary]

From: Lucas Gonze (lucas@worldos.com)
Date: Fri Apr 20 2001 - 10:34:38 PDT


Caveat: IANAB (I am not a biologist).

> > I've always felt that if you build something that acts like it has a
> > sense of self, then you are morally obligated to *treat* it like it has a
> > sense of self.

Yes. There's a kind of Turing test involved: if something gives external
indications of having a sense of self, you can't say it doesn't have one. On an
empirical level it's not enough to say "but I know that computers don't have
selves"; from outside a thing there is no way to know whether it feels like it
has a self.

> > Hmm, but then there's the question of, is it possible
> > to build something that has a sense of self but has been engineered to
> > act like it doesn't? Jeez, what an ethical nightmare.

Can't do it. A sense of self will always manifest in survival mode. Point a
gun at the thing and see what it does. Of course, it's up to you to figure out
what, from its perspective, is analogous to a gun.

> I think SJ Gould would rip you a new one regarding the "if we have it,
> then evolution dictates it must be useful, or at least once was useful"
> argument.

That's not the basis of the argument. The argument is that the reason a self
exists is that the characteristics which cause things to survive over
generations are self-like characteristics. Per the self/Turing test above,
showing self-like characteristics is the same as having a self.

An example is a gene for blue eyes. Why does the gene cause blue eyes to be
made? Because (1) it was created and (2) it had attributes that caused effects
advantageous to its own survival. It doesn't matter whether there was an "it"
that could want to survive. It only matters that the result was the same as if
there were an "it" to have wants.

"if we have it then evolution dictates it must be useful" is dead wrong. Whole
organisms carry tons of microstuff that does nothing for the whole. e.g.
leftover viral DNA, inter-chromosome competition, genes that spread fast enough
to survive killing the organism... The real answer is that if we have it, it
has been successful in its fight for survival. Given how long that fight for
survival has gone on, if we have it then evolution dictates it is likely to be a
stupendendous badass.[1]

Bringing this back to AIs without selves: if you create a varied bunch of
selfless AIs, and some but not all survive, and survival has anything to do
with individual characteristics, you will have culled out the ones with the
weakest sense of self (a minimal sketch of that cull follows below).
Differential survival means units of selection, and units of selection mean
selves. The capacity to mind self-interest is the absolute first thing ever
selected for.
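Here is that cull as a toy simulation (again my own illustration; the scores,
number of hazard rounds, and population size are arbitrary). Each "AI" gets a
random self-preservation score, and survival odds are simply proportional to
that score. Nobody builds a self in; repeated culling raises the average score
on its own.

# Toy cull sketch (purely illustrative): each "AI" gets a random
# self-preservation score in [0, 1], and survival odds in each round are
# proportional to that score.
import random

population = [random.random() for _ in range(10_000)]   # self-preservation scores

def cull(population):
    """Keep each individual with probability equal to its self-preservation score."""
    return [score for score in population if random.random() < score]

print("mean score before selection: %.2f" % (sum(population) / len(population)))
for _ in range(5):                                       # five rounds of hazard
    population = cull(population)
print("mean score after selection:  %.2f" % (sum(population) / len(population)))
print("survivors: %d of 10000" % len(population))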

[1] per _Cryptonomicon_


