MIT Center for Bits and Atoms
jamesr at best.com
Thu Apr 24 14:09:19 PDT 2003
> Digital logic was a bad idea
> - they did some 'atomic computing' using atoms & bonds of a
> molecule as logic gates. The molecule has resonance and
> energy states - use that as a 'machine' and program state
> changes via rf signals. Not quite retrogressing to analog
> circuits - there are 'probability circuits' that carry
> probability along. Don't digitize information early & process
> in the digital domain, keep it as 'probabilities' during
> computation & then digitize on output (like for sound transforms).
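The "keep it as probabilities during computation, digitize on output" idea can be sketched with classic stochastic-computing bitstreams: a value in [0, 1] is encoded as the probability of a 1 in a random bitstream, and a plain AND gate over two independent streams multiplies their values. A minimal Python sketch (the function names are mine for illustration, not from the work described above):

```python
import random

def to_stream(p, n, rng):
    """Encode a value p in [0, 1] as an n-bit stream: each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stream_and(a, b):
    """Bitwise AND of two independent streams multiplies their encoded values."""
    return [x & y for x, y in zip(a, b)]

def to_value(stream):
    """'Digitize on output': the fraction of 1s estimates the encoded value."""
    return sum(stream) / len(stream)

rng = random.Random(42)
n = 100_000
a = to_stream(0.5, n, rng)
b = to_stream(0.4, n, rng)
est = to_value(stream_and(a, b))  # should be close to 0.5 * 0.4 = 0.2
```

Note the trade this makes: a multiply costs one gate but many clock cycles of stream, and the answer is only statistically accurate — which is why it suits probabilistic workloads and is a poor fit for the exact arithmetic digital machines were built for.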
This is sort of missing the point and ignoring history. Whether digital
or stochastic computational machinery is "optimal" depends a great deal
on the kinds of algorithms you plan on running. For what computers
were originally used for and designed to do, you get better computing
performance for a given number of vacuum tubes or transistors using digital
computational machinery. There are also practical fabrication reasons that
generally make digital machinery a better choice.
However, many of the current hard problems and algorithms in computer
science are well suited to stochastic computing machinery, not digital
machinery. But if you went to a purely stochastic machine, you would find
that many classes of algorithms that work very well on digital machinery
would become slow or effectively unusable. So you don't get something for
nothing.
Short of non-trivial nanotech, stochastic computing machinery is not
practical from a computer engineering standpoint. You'd get more bang for
the buck emulating stochastic machinery on top of digital machinery (since
stochastic machines are generally FTMs, just like our digital computers).
> Bugs will have programs
> Natural phenomena (like energy states of a molecule, shapes
> of molecules) don't exactly model a particular problem - but
> so what, use it anyway.
> Engineers will not design complex systems
> Things are getting too complex - system will evolve. You can
> build a perfect system from imperfect parts if you build in
> self-correction along the way.
Stochastic computing machinery works well for both of these cases. One of
its features is that it can compute correctly through local errors or
poor-quality data with a reasonable degree of reliability. The mechanisms
are the same in both cases, just viewed in different scenarios. Note that I
don't think that "evolving" systems is particularly useful in the general
case, just that using stochastic machinery components can greatly simplify
the "gluing" process for components.
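A quick illustration of why this representation computes through local errors: flipping a small fraction of the bits in a stream perturbs the decoded value only slightly, rather than corrupting it the way one flipped bit in a high-order position of a binary word can. Another hypothetical Python sketch, reusing the bitstream encoding above (names are mine):

```python
import random

def to_stream(p, n, rng):
    """Encode a value p in [0, 1] as an n-bit stream: each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def flip_bits(stream, error_rate, rng):
    """Inject local errors: flip each bit independently with probability error_rate."""
    return [(b ^ 1) if rng.random() < error_rate else b for b in stream]

def to_value(stream):
    """Decode: the fraction of 1s estimates the encoded value."""
    return sum(stream) / len(stream)

rng = random.Random(0)
n = 100_000
clean = to_stream(0.7, n, rng)
noisy = flip_bits(clean, 0.01, rng)
# A 1% bit-flip rate shifts the expected decoded value by only
# error_rate * (1 - 2p), a fraction of a percent here -- graceful
# degradation, not catastrophic failure.
drift = abs(to_value(noisy) - to_value(clean))
```

Every bit carries equal (and small) weight, so errors average out instead of compounding — which is the same mechanism whether you frame it as fault tolerance in imperfect parts or as robustness to poor input data.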