[FoRK] Black Belt Bayesian vs. Authority, Fight!
Lion Kimbro
<lionkimbro at gmail.com> on
Fri Aug 10 10:41:38 PDT 2007
I understand Bayes theorem, but I don't understand the infatuation
with Bayes theorem or Bayesian reasoning that I find in the
transhumanist movement.
The idea seems to be that:
1. The only good reasoning is Bayesian reasoning.
2. One day, we'll either build or merge with computers whose
thinking is founded, at its very core, on Bayesian reasoning,
and that this is the most powerful model of thought.
I disagree with this, and will voice a few reasons:
1. Bayesian reasoning is intentional. That is, you have to say,
"I have this problem, and I'm going to solve it." Somewhere, the
problem has been sketched out. Somewhere, the theories of
what the probabilities are have been sketched out, as well.
All of this is hidden, by looking at just Bayes theorem, alone.
But that "hidden stuff" is enormously complex.
2. Errors in probability estimation. Collecting probabilities is not
"easy." It too is enormously complex. The mechanics of
intuition are not obvious to us. Further, there are often lines
of causality that we do not even imagine.
3. Bayes theorem is trivial. I understand it. I can explain
it easily, to anyone. If Bayes theorem were "the essential insight,"
I would expect we would already have the super-powerful AI that
we yearn for.
4. Bayes theorem is not practicable. Nobody can wake up
and say, "OK, I'm going to be a Bayesian today." There's just no
such thing. Again, the roles of desire and imagination, they're
just *crucial* to how lives play out.
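To make the "trivial" point concrete: the whole theorem fits in a few
lines of code. Here is a minimal sketch in Python, using the standard
disease-test example (the numbers — a 1% base rate, 90% sensitivity, 5%
false-positive rate — are my own illustration, not from the argument
above):

```python
def posterior(prior, likelihood, false_positive_rate):
    """Bayes theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    # P(E) expanded by the law of total probability over H and not-H.
    evidence = likelihood * prior + false_positive_rate * (1.0 - prior)
    return likelihood * prior / evidence

# With a 1% base rate, a positive test still leaves the hypothesis
# unlikely (roughly 15%) -- the computation itself is the easy part.
print(posterior(0.01, 0.90, 0.05))
```

The hard part, as the points above argue, is everything the function
signature hides: where the prior and the likelihoods come from in the
first place.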
A better use of our infatuation, I think, if we seek to build AI, would be
in brain modeling.
A better use of our infatuation, I think, if we seek to understand
reason, would be in cognitive neuroscience.
A better use of our infatuation, I think, if we seek a more rational
society, would be in the human potential movement, or evolutionary
spirituality.