[FoRK] Brain mapping and the connectome

Jeff Bone jbone at place.org
Sat Nov 7 19:17:55 PST 2009

Ken writes:

> I'm afraid I'm much more selfish than you.

Seems unlikely. (Ask various folks around here.  ;-)  I just think we  
have a different perspective.

> if it's not essentially "us" I'm not interested.

Me either:  problem is, I don't think it's *possible* to have a non- 
trivial, objective, useful classification boundary between "us" and  
"them."  And, in fact, I think it's pretty close to provable (cf. Ship  
of Theseus-type arguments.)

Lots of loaded words get thrown around when this stuff is discussed  
that, upon closer examination, have very little substance (often  
surprising even those who use them.)  You're certainly not alone in  
making these arguments, but --- lots of biases and priors built in.   
I'd take all of this a lot more seriously if you --- or anyone ---  
offered up a non-trivial, objective, internally-consistent, useful  
definition of "human."

Some literal sort of interpretation of "humanism" is fundamentalism  
--- i.e., religion, and of a particularly nasty breed.

> And I haven't seen Consciousness

Capitalized, no less!

I thought you said you weren't a theist (or were anti-theist, or what  
have you.)  Deus ex machina, much?

(Aside:  you aren't by chance a philosophy major, are you?  ;-)

Well, so be it.  *I* haven't seen "Consciousness" either.

> the details of the subject are quite beyond me

The devil's in the details.  Or rather, isn't.   (I.e., there is no  
devil.  Cf. XTC, "Dear God.")  (Seriously, the implications are  
difficult to understand absent the details.)

> Anything I read about AI

Apparently either isn't much or isn't the right stuff. ;-)  We  
*clearly* have learning;  what we don't yet have is generalized  
agency, situation, context, etc.

> It still sounds to me that we're quite a ways away

It's worth noting that whatever it sounds like to you, the various  
subject matter experts quoted in various of the links in this rambling  
pseudo-thread seem to have a different opinion.  (Others disagree.   
YMMV.  My money's on the connectionists --- *very* literally.)

How far we are depends on how you measure distance.  We clearly have  
*nothing* even remotely approximating the order of complexity of a  
human neocortex.  However, that doesn't mean that such is unattainable  
in relatively short human-subjective timeframes (i.e., if accelerating  
change laws hold, for example...  Consider again the lifecycle of the  
Human Genome Project.  We're recycling arguments here, except we  
aren't arguing --- you're just reasserting.)
