[FoRK] Why Fukushima made me stop worrying and love nuclear power
gstock at nexcerpt.com
Wed Mar 23 11:20:05 PDT 2011
On 3/23/11 12:32 PM, Stephen Williams wrote:
> [boggle at the scope of hubris revealed]
> [boggle at the scope of irrational paranoia revealed]
It's neither irrational nor paranoid to acknowledge and anticipate human
failure.
> The problems have been simple heat, pressure, and water management.
Which we've been doing for centuries -- and still can't do sufficiently
well.
> And management of poisonous chemicals, which we do all the time.
> There is a level of engineering that could do far better than we've
> done so far.
And yet, we haven't. Human beings don't always do their best.
> Of course it is a difficult problem with bad consequences. But hardly
> as difficult, or with consequences as bad, as fixing a well head at
> the deep bottom of the Gulf of Mexico. And more people died on that
> drilling platform.... And more people were negatively affected...
But, gee... that's just simple pressure management. It must be trivial
to get that right -- after drilling tens of thousands of oil wells for
over a hundred years, and millions of water wells for centuries before
that. Right? With such ~terribly~ bad consequences, clearly the
engineers would have planned everything right, and the techs would have
done everything right, and the corporation would have set all the
priorities right.
It's a ~fantasy~. It can't happen while current
(inappropriate|pathetic|poor|weak) modes of assigning value and
responsibility persist.
Nuclear power is not a technological problem: it is an ethical problem;
it is a moral problem; it is a cultural problem. It requires far more
significant evolution in human beings and in their organizations than it
does in technology.
>> Please keep us posted on when such fantasies become ~theoretically~
>> possible.
> It seems that some modern designs are already far, far better. With
> hardly anyone working on creative engineering since the market has
> long been practically non-existent. Contrast: Now that green energy
> is hot, we almost daily have interesting discoveries and leads.
When we have the changed people, and behaviors, and priorities, and
values, and organizations, and economies, and governments, and systems
behind such technology, the technology may be of interest.
>> Then, when someone actually implements such a thing at some
>> meaningful scale.
> Widespread belief that it can't or shouldn't happen is the biggest
> obstacle.
Human nature ~itself~ is the obstacle -- not belief. The belief derives
in part from the fact that we all know we all screw up.
>> Then, when some community exists that actually permits that unit
>> within sight.
> See above.
>> Then, when any corporation exists that is not corrupt, to build the
>> real one.
> All nuclear related companies are corrupt? References?
They are no more corrupt than others; they are no less corrupt. You
don't seriously need references for corrupt corporations... do you?
> People who tend toward corruption are drawn to a dead industry with
> heavy education requirements, deep investment, and years of patience?
No, people who tend toward corruption are drawn to money, and enough
complexity to get away with it. A multi-billion dollar project has
"Come and get your cut!" written all over the RFP.
>> Then, when a government exists that enforces clear standards, to
>> validate it.
> The NRC does not have clear standards? Security and safety knowledge
> has become fairly advanced. (I've been a CISSP for 6 years, and
> someone near me is about to take the exam. Not nuclear related per
> se, however I'm sure they use many of the same risk analysis methods.)
Key word missed: "enforces."
For several years, I did validation for automated systems making potent,
injectable, and lifesaving drugs. FDA had ~extremely~ clear standards.
So clear, in fact, that engineers, plant management, and line operators
sought to avoid getting caught when they cut corners -- which was
constant, daily, and driven by corporate goals. (See "corruption," above.)
What FDA couldn't observe, or prove, they couldn't enforce.
>> Then, when an economy exists that can afford the inevitable costs of
>> failure.
> There will always be some kind of failure. Catastrophic failure?
Yes. That's the kind humans are prone to cause. It's only a matter of
time.
> Recently, I was only 20 miles away from a significant natural gas
> pipeline explosion that killed 4.
Another simple problem: some pressure in a pipe. Clearly understood
150 years ago... easy to avoid... right?
>> Coz, when ~that~ happens -- when most of physics changes, and most of
>> human nature changes, and most of society changes, and most of
>> business changes, and most of government changes, and most of
>> economics changes -- a lot more of us will be a lot more willing to
>> embrace nuclear power.
> For the most part, those are changes that have to happen in people's
> heads. The perception gap with reality is very wide. It is fine to
> set engineering and operating standards high. Just don't require the
> industry to fight too many imaginary dragons.
They're not dragons, and they're not imaginary. They're ~human beings~
and they're all ~over~ the place.
>> People are only human. We're screw-ups. We screw things up. Things
>> we touch get screwed up. Things we do screw us up. Pretending
>> otherwise is utterly delusional.
>> The "foolishness" coefficient of requiring utter perfection far
>> exceeds that of seeking mere consistency.
> I assume both of these things.
It appears in fact that you ignore both of those things; there's a huge
gap between assuming them and accounting for them.
> However, we can overcome impossible complexity and the need for
> near-absolute perfection by the right process.
No. We can't. Maybe ~you~ can personally -- or imagine that you can.
But, elsewhere, there are still human beings involved. You keep
forgetting those pesky human beings.
> The space shuttle is fantastically complex. Each launch cycle
> requires perfection in many details. There were two failures, but
> they were not caused by failures in any of the complex systems.
Right. They were caused by... hmmm... which was it...
Really simple stuff that human beings screwed up by ignoring or claiming
it was harmless?
Or, really simple decisions that human beings screwed up by presuming
they couldn't fail?