[FoRK] Singularity / fear (for Ken)
eugen at leitl.org
Sat Nov 14 03:16:12 PST 2009
On Fri, Nov 13, 2009 at 06:30:39PM -0600, Jeff Bone wrote:
> Fearing a technological Singularity is certainly a rational response
> given certain priors.
Given most priors, actually. The faster it goes, the higher the
probability that nobody squishy makes it.
> I'm not sure, in particular, that fear can be productive or do any good,
It can be productive if you realize that certain starting conditions
are more benign than others, and that clamping down on the kinetics
makes the thing more survivable. No guarantees, of course.
> though. IMHO, the best thing we can hope for is to engineer some
> sustainable symbiosis. Barring that, some acceptable and identity-
That's only possible if there's no longer any difference between
biology and postbiology. So biology has to transform into something else.
> kernel-preserving assimilation process. Something that makes the
> "human" part of posthuman meaningful, and "transhuman" something other
> than an oxymoron.
There's a continuum from an infant you to the adult you. Are you that
infant? Of course not. Does the question even make sense?
> As previously mentioned, I suspect that the very concept of FAI, as
> it's usually batted around, is a non-logical and unrealistic idea.
Of course. Nobody can even formally define a friendliness metric, at
least not one that scales across open-ended system evolution.
> To the specific point, though, automating the "programming" process to
> ever-higher levels of abstraction is just part of the change curve ---
> one we're falling behind on, IMHO. I don't think automating away the
> details or enabling ever-greater "programming in the large" (as in,
> determining what needs to be orchestrated to get something done,
> accomplished, built, some action taken, etc.) actually *reduces* the
> aesthetic or creative aspect; indeed it may empower it. But that's a
> debate for artists, or craftsman, akin to an argument about whether a
> dresser made with hand tools or power tools is "better." It may be an
> interesting debate for some, but you're damn sure not going to build a
> skyscraper with hand tools...
Darwinian processes certainly scale. You don't have to understand
anything, yet you can still optimize. And of course the end result is
not understandable, at least not with our current methods.
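The point that selection plus variation optimizes a black box, with no
understanding of the problem required, can be sketched in a few lines.
(A toy illustration, not from the original post; the names `evolve` and
`mutate` and all parameter values are mine.)

```python
import random

def mutate(rng, genome, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [b ^ 1 if rng.random() < rate else b for b in genome]

def evolve(fitness, genome_len=16, pop_size=30, generations=60, seed=0):
    """Minimal evolutionary search over bit strings.

    Treats `fitness` as a black box: no gradient, no model of the
    problem -- selection plus mutation alone drives the optimization.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        pop = [mutate(rng, rng.choice(parents)) for _ in range(pop_size)]
    return max(pop, key=fitness)

# A toy objective the optimizer never "understands": count of ones.
best = evolve(fitness=sum)
print(sum(best))  # close to the optimum of 16
```

The optimizer reaches a near-optimal bit string without any code that
knows what "count of ones" means, which is the sense in which Darwinian
search trades understanding for iteration.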
Eugen* Leitl http://leitl.org
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE