The Meaning of Life

jbone at jbone at
Mon Oct 20 21:51:30 PDT 2003


> Jeff Bone:
> >The most important "war" humanity is waging
> >isn't against Iraq, or terrorism --- it's against entropy.
> Tipler's omega point needs a closed universe.
> Most recent estimates put matter at 10% to 40%
> of what's required to reverse expansion. If
> those estimates are even close, entropy
> ultimately wins.

True but irrelevant.

If you assume there's some value to playing the game at all, then the 
ultimate outcome doesn't matter --- only the actual outcome vs. the 
best possible outcome given whatever constraints exist (or can be 
supposed to exist).  (Note, too, that Tipler doesn't address the 
possibility that the multiverse can be exploited --- that Deutsch is 
right, that all possible universes exist, etc.  What is the best 
personal strategy if indeed every possible life you might have led 
is in fact real somewhere in the multiverse?)

Texas high school football analogy:  just showing up to the game 
doesn't count for anything.  Whatever the team's BCA ranking, anything 
less than the best possible showing is at least somewhat disgraceful.

Basing the strategy of the human race on a 20th-century physical 
eschatologist's admittedly limited theories would be a grave mistake.  
Eschatological physics makes for neither good public policy nor sound 
personal strategy and tactics.

Let's be clear about what the countervailing viewpoint implies:  if, 
Turpin, you're suggesting that ethics be based on Tipler --- given the 
evidence --- then there's no reason for any of us to do anything (in 
the edge case) other than sit on our porches, take whatever the 
government gives us, and wait to die.  To put it less dramatically:  we 
should all adopt a personal strategy that minimizes output in terms of 
effort and maximizes input in terms of satisfaction.

I prefer to think Tipler might be wrong, but this assumption hardly 
matters.

>> A real intelligence is one with the interest...
> See, the point of my previous post is that
> there is a lot hidden under the cover of a
> simple word like "interest."

Let me be more blunt:  a "person" is an intelligence that can say that 
it prefers some other recognized individual's survival over its own 
under some hypothetical circumstance, and can tell you why --- with 
reasoning you could accept, given some rigorous chain of argument from 
agreed-upon, axiomatic premises.

I'm fumbling for a much more significant replacement for Turing's 
famous test here, but I'm not there yet.  When I get there, I'd like 
to write a book w/ you (RT) called "Axiomatic Ethics and Moral 
