[FoRK] Great graphs and facts: Energy

Stephen Williams sdw at lig.net
Sat Mar 26 12:43:07 PDT 2011


On 3/26/11 11:53 AM, Gary Stock wrote:
> On 3/26/11 1:34 PM, Stephen Williams wrote:
> ...
>>>> http://www.nucleartourist.com/basics/why.htm
>>>>
>>>> So, where are the corrections to those facts?
>>> How would corrections of facts lead to the requisite acknowledgement that human nature is the primary problem?
>> It is the primary problem to be engineered around, yes.  We can engineer around it, as we've shown many times in the past. 
> D'oh!  There.  Yet again.  Inconceivable.

I don't (completely) lack appreciation for human nature's ability to screw up constantly; I may, however, lack a poverty of 
imagination...  And, aware of many prior cases of "you can't do that" and "you can't do that safely" that were overcome, I see a 
solid pattern of engineering solutions that consistently prove both sentiments wrong.
>
>> One general way of engineering around it is to change the problem and the specific combination of solutions, which has already 
>> been largely done.
> Not if the primary problem is human nature.

See above.
>
>> Here, mostly-irrational fear seems to be the main thing holding us back. 
> Nothing could be more fully-irrational than to imagine that design can supersede human nature -- except, perhaps, to suggest that 
> an understanding of human nature constitutes "fear."

Assuming that we cannot engineer solutions with enough safety, despite the fact that such solutions already exist and appear 
to be safer than necessary and far safer than existing market leaders, is irrational.  It takes in facts, 
doesn't evaluate or refute them, and returns a constant answer.  That is irrational.  In the case of nuclear power, this 
irrationality is obviously driven by the fear of being vaporized / radiation burned / given cancer, and of making substantial or at 
least currently desirable portions of the Earth uninhabitable.  While there are many other things that could cause similar 
disasters, we accept all of those without much concern (beyond a more or less normal risk/reward tradeoff), yet we hold a singular 
cultural fear of nuclear radiation and accidents which is, for many people, dogma.  Holding one fear far out of proportion to 
reality and to other fears is at best ignorance (of real occurrence & consequence probabilities and possibilities) and at 
worst a phobia.  In any case, it is irrational.

"Imagine that design can supercede ..." is never irrational unless it is based on (not yet invented/discovered, or worse already 
refuted) magic.  You can say that it is "probably incorrect/won't work because of...", but it is not irrational: Combining and 
extending facts and ideas is a core purpose of rationality.

Focusing on the fact that humans make mistakes constantly is, from an engineering point of view, like saying that beams crack, steel 
rusts, crystals have flaws, software has bugs, the Sun has spots, etc.  Even the fact that humans screw up, deliberately or not, can 
be accounted for and mitigated.  We do it all the time.  Nothing is perfect.  Demanding actual perfection (rather than in-the-limit 
probabilistic perfection) is irrational, because A) it doesn't exist, B) you don't really want it: there is always some risk/reward 
ratio that could tip the decision, and C) it is a constant answer to an evaluation question, which is no answer at all.
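
To make point B concrete, here is a minimal sketch of what the risk/reward framing implies (all numbers below are illustrative 
placeholders, not real actuarial or energy data): the decision variable is expected harm, probability times consequence, weighed 
against benefit, never "is the risk zero?"

    # Minimal sketch of a risk/reward comparison.  All numbers are purely
    # illustrative placeholders, not real actuarial or energy data.

    def expected_harm(failure_prob, harm_per_failure):
        """Expected harm = probability of failure * consequence of failure."""
        return failure_prob * harm_per_failure

    # Two hypothetical energy options, each with a benefit and a risk profile:
    # one rare-but-catastrophic, one frequent-but-small.
    options = {
        "rare_catastrophic": {"benefit": 100.0, "failure_prob": 1e-6, "harm": 1e6},
        "frequent_small":    {"benefit": 100.0, "failure_prob": 1e-3, "harm": 1e4},
    }

    for name, o in options.items():
        harm = expected_harm(o["failure_prob"], o["harm"])
        # The rational comparison is net expected value, not "is the risk zero?"
        print(name, "net expected value:", o["benefit"] - harm)

Under these made-up numbers the rare-but-catastrophic option comes out ahead of the frequent-but-small one; the point is only that 
the product of probability and consequence decides, not the vividness of the worst case.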

>> I'll also repeat my suspicion: We, at some esoteric long-thinking and "Foundation"al level, don't really want to stop paying the 
>> third world for oil too quickly.
> At this proposition, the mind boggles.  Humans (and/or their organizational systems) are so perversely self-destructive as to 
> motivate such a bizarre suspicion -- yet, 1) other systems designed 2) by humans can 3) overcome such traits.
>
> Let me try it this way:  you are the One -- the Only -- infinitely superior, infallibly omniscient Übermensch.  The rest of us are 
> merely human. Until you are personally prepared to design, construct, and operate these devices without our involvement, we must 
> respectfully decline.

I make mistakes all the time.  Because there are too many details involved, and because I am constantly learning new ones, I rely 
heavily on Google, compilers, Emacs, and Eclipse to keep me efficient: I code and iterate quickly and constantly rather than 
reviewing exhaustively first.  But when something works, it works until it is used outside of its envelope.  Do all of those 
mistakes mean that I A) can't get something working or B) can't solve problems quickly?  Hardly.

This is an old lesson, universally applied.  Each engineering area has a different mix of problem solving, risk evaluation, 
iteration, etc.  But there are few examples (fusion so far, but not fission) where we haven't conquered things fairly well.

Most people who really understand engineering problem solving, and have done it for tough problems, get this to some extent.  
Non-engineers often don't, as engineering is voodoo until they get it.

We'll have other areas where we're likely to be irrational to some degree (self-driving cars, personal robots, non-required 
implants), but nuclear is still the worst bogeyman.

>
> GS 

sdw


