[FoRK] Brownback defines science

Lion Kimbro <lionkimbro at gmail.com> on Sat Jun 2 04:20:02 PDT 2007

On 6/2/07, Eugen Leitl <eugen at leitl.org> wrote:
> On Sat, Jun 02, 2007 at 03:28:44AM -0700, Lion Kimbro wrote:
>
> >  As far as I know, our net cosmic effect on the universe as a whole
> >  is pretty much near zero.
>
> So far. You're likely to see first relativistic probes leaving the
> solar system. If we play it right, we have the potential to transform
> at least GLYrs of real estate around us before dark energy pulls
> the final curtain.
>
> Both in quantity and quality we'd be playing the very first league.

  If we can make it out of the solar system, *my* worries are over,
  and the worries of later generations (machine or cyborg or human,
  whatever they be) begin.

  That's the goal.  {:)}=


> >  I think that the human-centric attitude is to save as much life on Earth
> >  as we are presently capable of saving;  Our major sins so far have
>
> I don't think the majority gives a damn about that. The vast
> majority tries to save their own ass, to feed and clothe their
> children, and to give them an education. Everything else is a distant
> fourth and fifth.

  Are you sure you see so very clearly into the hearts of other people?

  I think it's right that people try to save their own ass, feed and
  clothe their children, and give them an education.

  But I think that after they're done doing that, they go, "Oh yeah.
  Someone should ... you know... do something about the penguins."

  And as far as I can see by reason, this is right and proper--
  We can't take care of things beyond our own capabilities until we
  have those capabilities.
  We don't expect kids to take care of the neighborhood first.
  Rather, we make sure that they can take care of themselves, first.
  THEN, they can take care of things beyond themselves.

  I think we, humanity, are only just gaining the information, and the
  ability, to respond to each other and the environment.


  We can read the global situation a number of ways.  It's substantially
  (though not entirely) a matter of *choice,* how we read things.

  We could read it as:

    "History is on a track where humans are going to kill each other.
     The chemicals in the brain that lead people to be religious is
     going to play out in the global media, and people are all going
     to kill each other."

  Or, we could read it as:

    "We are now at a point where all societies are seeing each other.
     Societies that have been backwards technology-wise, are,
     extremely quickly, acclimating to an extremely different world, and
     meeting, for the first time in many cases, people who before lived
     behind a curtain.  This is not an easy time, but it's a time where
     we're learn a lot about each other, and find a new peace with
     each other."

  Whichever way you turn the dial, you can see a lot of evidence for
  that reading.

  Which channel you choose to watch is, in some ways, a creative
  choice, like a choose-your-own-adventure, [*] turning to either page 14
  or page 23.  Some of the facts are all colored in, but about some of
  them, I believe, we have a *choice.*

    [*] http://www.somethingawful.com/d/comedy-goldmine/choose-your-own.php?page=2


> >  been lack of information, and lack of knowledge of how to structure
> >  our governments and relations in order to make sure that we can meet
>
> Our governments reflect the capabilities of their agents. Educated and
> powerful entities can recognise and fix issues bottom-up, thankyouverymuch.

  Mmmr....

  ?

  Okay, I guess if I say something cryptic, I get a cryptic response,
  and that's fair.

  I guess what I mean to say is:

  For example, there are starving people in the world.  There's more than
  enough food to feed everybody.  And yet people starve.

  Now, a lot of activist groups, they beat the ground and lash our minds,
  and shout, "It's because of your moral failings!  It's because you're
  bad people!"  Or, "It's because of our government's failings!  It's
  because the government is rotten!"

  But I don't think that's right.  I've looked into it a little bit (and,
  admittedly, ONLY a little bit), and what I've come away with is that
  it may well be that the logistics of distributing food is actually HARD.
  That it's NOT just a matter of going into town and saying, "Food!
  Clean water!"

  That politics and economics and other systems actually *matter.*
  And that we simply *don't know how* to take all those things into
  account.

  The problem isn't that humans are *bad,* it's that humans are
  *freaking complicated.*  Answers are *not* clear.

  I think that's what I meant to say.


> >  I believe I hold a treasure--  Not uniquely, but rather rarely,
> >  and a treasure it is:  A puzzle piece on how to solve some of
> >  the crazy religion--and-atheist debates that are happening in
> >  our time.
>
> It's easy enough to do -- just build a human-enhancing self-replicating
> system, and dump it into the jetstream. Do you have any of that nanofairy
> dust, perchance?

  Eh... What?

  Okay, I don't think this one was very cryptic of me.
  I'm going to need some explaining from you on this one.

  Nanofairy dust isn't going to do it, alone.

  Machines can be just as religious as humans can be--
  I take it we're all programmers, right?
  We should all know better...!
  (Yes, this is tongue in cheek.)

  But, no, seriously-- if you do a serious inquiry into "reason,"
  you'll find that reason is divergent, and there's good reason to
  believe that AIs would be just as likely to go off into religious
  ways of thinking as humans are.

  sdw and I had a lovely conversation about this (I'm still
  awaiting his response on it), and that's where I'd point you,
  if you're suggesting that AI and nanotech will make
  everyone into religion-free people.

  I mean, I'll grant that it's *possible,* but... we may find
  something different from religion, yet resembling religion,
  or oppressive like religion, on the other end of things.

  Perhaps when we learn to articulate and describe modes
  of reasoning in great detail, we will find ways of reasoning that
  seek to destroy other ways of reasoning, or something like
  that.  We honestly can't know for sure, of course, this side of
  the singularity.  But there's a speculation.


> >  When we destroy the Earth, we destroy ourselves, and when we look
>
> It is possible to detach yourself from the supporting ecology, but
> it's hard work. And nobody's got it done yet, so we're still
> vulnerable.

  Well, it's even harder to detach yourself from your heart.

  It's like Captain Teague says in Pirates III
  (paraphrasing):

    "Living forever's not the hard part,
     It's living with yourself all that time."

  If we can save penguins, and other species (and I'd rather we did,
  regardless of whether we need them or not), I strongly believe
  it's going to require a global effort to do so.

  Some people will balk and complain about how they're being
  forced to care for what they don't care about, "at the point of a
  gun!" like they always say, but honestly, I can live with that.


> >  at it that way, the only course of action to be taken is obvious.
>
> I'm glad you find it obvious -- I don't. There's a number of things
> we can do, and the impact of each is impossible to model accurately yet.

  Oh, well-- the specific course of action is up for debate.

  But the concept of "try to figure it out, and labor towards it,"
  is to me, the obvious choice to make.

  Granted, the devil is in the details.

  Take care,
    Lion =^_^=
