[FoRK] Extreme Life Extension: Investing in Cryonics for the Long, Long Term

Jeff Bone jbone at place.org
Mon Jun 21 15:03:32 PDT 2010


Jeb --- can we call you that?  Sweet.  Substance.  Thank you.

On Jun 21, 2010, at 4:28 PM, Jebadiah Moore wrote:

> One major argument against Pascal's Wager is that there are an infinite
> number of possibilities for ways to get into heaven, so that none in
> particular gives any advantage.  You sort of have an analogue to that here
> in the form of multiple extended life technologies--i.e., you might be
> better off investing in some drug company to extend your life *before* you
> die--but that's not a very strong argument.

Agreed.  It's not a strong argument, but it's a real one.  You've got to do some kind of allocation of resources according to whatever probabilities you assign for various things succeeding.  Cryonics is the fallback, the last resort, and possibly not worthy of any investment at all.  But you've got to be fairly confident about lots of other assumptions in order to assign it zero or negative value.

> But consider the next step in the argument.  With the refutation of Pascal's
> Wager, the argument is that since you cannot rationally influence your
> outcome after death due to a lack of information, you should live your life
> ignoring the possibility and simply maximizing your expected value where you
> can.  You certainly shouldn't strive for some particular possibility,
> because doing so likely costs you something on Earth for no expected gain
> afterwards.  In the case of cryonics, however, you do have some information
> about "life after death"--you know that the chance of resuscitation is
> rather small, and that the cost of trying is somewhat high.  The
> expected value is sort of up for grabs--perhaps you'll just get a few more
> years, perhaps you'll live virtually forever.

Given the implied assumptions, you're correct.  But I'm not sure the assumptions are warranted.

First, there may well be ways to influence both the probability of success and the outcome post-resuscitation.  (The latter may depend largely on the former;  the sooner wake-up becomes practical, the more likely it is that actions taken now influence the outcome.)  And "high" is relative;  whether or not the cost is high depends on your definition of high.  Right?

So with the assumption that you may, through careful planning, both increase the odds of your resuscitation and influence the environment and quality-of-life that you experience after the fact, and with the assumption that the cost is not "high" in some relative sense, rationality is restored.  In essence, you are making a bet on the risk-adjusted net present value of whatever investments you make in it.  Like any investment, you rationally invest only if you judge the risk-adjusted payoff to be positive in NPV terms.
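The bet can be sketched numerically.  To be clear, every number below is a hypothetical placeholder I've made up for illustration --- the cost, probability, payoff, horizon, and discount rate are all assumptions, not estimates anyone here is endorsing:

```python
# Illustrative sketch of the cryonics-as-investment bet.
# ALL numbers are hypothetical placeholders, not real estimates.

def npv(cashflow, years, rate):
    """Discount a single future cashflow back to present value."""
    return cashflow / (1 + rate) ** years

# Assumed inputs (all hypothetical):
cost_today = 80_000          # up-front cost of the arrangement, in dollars
p_revival = 0.02             # subjective probability of successful resuscitation
payoff_value = 5_000_000     # dollar-equivalent utility of revival, if it works
years_until_revival = 100    # assumed wait before wake-up
discount_rate = 0.03         # annual discount rate

expected_payoff = p_revival * npv(payoff_value, years_until_revival, discount_rate)
net = expected_payoff - cost_today

# You rationally invest only if the risk-adjusted expected payoff
# exceeds the present cost, i.e. only if net > 0.
print(f"expected payoff today: ${expected_payoff:,.0f}")
print(f"net value of the bet:  ${net:,.0f}")
```

The point isn't the particular output --- it's that the sign of `net` swings entirely on the subjective inputs, which is exactly why careful planning (raising `p_revival`, lowering the effective cost) matters.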

> [various scenarios...]

All agreed.

> So, clearly the choice depends on your particular estimation of the
> probabilities and expected values.  

Exactly.

> So, while there are large classes of possibilities that cancel each other
> out, there are two basic classes you have to worry about; one group, which
> is low probability with a fairly high, though finite reward, and another
> which has high probability, with a moderate negative utility.  Your
> particular estimation of these two classes will tell you whether it is
> worthwhile to go for cryonics, but there are no zeros or infinities here to
> make the estimation easy, so it is clearly not a Pascal's Wager situation.

Mostly agreed in rationale, slight quibble with conclusion.  I'm not assuming binary modes or real, absolute infinities --- you may not have been around long enough, or may not recall, but I'm a mathematical constructivist of a rather peculiar, extreme sort --- I reject the absolute reality of any infinities and/or non-discrete continua in general.  More Markov than Brouwer, but even more so.

Note the word "qualitative" in my statement.  You are correct that any payout, if there is any at all, will be finite.  However, any payoff, if there is one, likely implies at least the possibility of personal utility so much greater than current standards that it cannot reasonably be compared to utility today.  In the spirit of "any sufficiently advanced technology is indistinguishable from magic", it does then resemble Pascal's Wager.

The biggest risk is the one you point out:  that you might be resuscitated but not able or allowed to have the benefit of such increased utility.  The rationale against that line of thinking is:  there would seem to be fairly few scenarios in which it would be to the resuscitators' benefit to actually revive someone while not giving them full advantage of their restored life.  Waking folks up to enslave them, put them in zoos, or what have you doesn't really seem to be compatible with any technological and ethical situation under which revival might occur.  Implausible, kind of like "human batteries" in that movie.

At least, that's the way I see it.

> In fact, the two classes seem to roughly balance each other, making this a
> rather personal choice.  

Minor quibble:  whether things balance out is by no means objectively sure.  Personal choice, yes.  But not one that necessarily occurs absent attempts at rationalism and quantitative thinking.
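To make that quantitative-thinking point concrete, the two classes can be written down as a toy expected-utility sum.  Again, the probabilities and utilities below are invented assumptions for illustration only; plug in your own:

```python
# Toy quantification of the two classes described above.
# The probabilities and utilities are hypothetical assumptions.

p_revive = 0.02        # low-probability class: revival succeeds
u_revive = 1_000.0     # high (but finite) utility if it does
p_fail = 1 - p_revive  # high-probability class: money and effort sunk
u_fail = -10.0         # moderate negative utility of the sunk cost

expected_utility = p_revive * u_revive + p_fail * u_fail

# Whether this comes out positive or negative depends entirely on the
# personal estimates plugged in; there are no zeros or infinities to
# force the answer either way.
print(f"expected utility: {expected_utility:+.2f}")
```

With these made-up numbers the sum happens to come out positive; nudge `p_revive` down or `u_fail` further negative and it flips.  Whether the two classes "balance" is an empirical question about your estimates, not a foregone conclusion.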

> it seems to me that it is quite
> likely that the information in the brain relies on the active pattern of
> firings, not just the structure, and thus it seems unlikely that a frozen
> person would wake up with their memories intact.

That *is* a very significant and serious concern.  There seems to be accumulating evidence that it's mostly structural, based on continuity of certain characteristics across admittedly brief but significant experimental and accidental disruptions of "life" as we judge such things today, but I'll let Eugen and his Powerpoints make that case more thoroughly if necessary.

A similar concern:  Roger Penrose's hocus-pocus quantum origin of consciousness, if true, would also argue against the plausibility of revival to a large extent, as well as against strong AI in general and any form of uploading.


$0.02,


jb




