[FoRK] Newton Re: why we should stop using brain metaphors when we talk about computing

Dr. Ernie Prabhakar drernie at radicalcentrism.org
Fri Nov 14 05:48:40 PST 2014

> On Nov 13, 2014, at 11:17 AM, Gregory Alan Bolcer <greg at bolcer.org> wrote:
>> what our Theory of Computation would be like if it had been invented by Isaac Newton instead of
>> mathematicians…
>  Buckets of water and state transition phases?

Close: channels of water and stateful gates.  That was how a family friend at Bell Labs explained transistors to me when I was in fifth grade. :-)

The deeper point is that Newton started from one concrete experience (the apple) and figured out a small set of abstract concepts that allowed him to link it to other concrete experiences (e.g., the orbit of the moon).

I have become convinced that the original sin of Computer Science is starting from abstract Boolean logic (operators/math) rather than concrete physical objects (transistors/state).

> On Nov 14, 2014, at 1:47 AM, Dave Long <dave.long at bluewin.ch> wrote:
>> we are no further along with computer vision than we were with physics when Isaac Newton sat under his apple tree.
> Speaking of Newton, I keep expecting someone like Conor McBride to come up with some kind of effective calculus-analogue for informatics.

We’re working on that at The Swan Factory. Check back with me in a month. :-)

>  But with seven years of hindsight (in my case, at least double that for McBride), it doesn't seem that anyone has yet stumbled across a simple model which —like calculus— could replace heavy creative analysis with a bit of plug-and-chug on scratch paper.
> http://stackoverflow.com/questions/25554062/zipper-comonads-generically/25572148#25572148
> http://www.cis.upenn.edu/~byorgey/pub/species-pearl.pdf
> (don't be put off by the large amount of FP machinery used; the underlying ideas are simple enough that one can apply them (and people have been, for ages, eg. buffer-gap editors) even in machine-sympathetic environments. The basic "Midas Touch" problem in programming is that while code can always take data to any isomorphic form (and small amounts of shimmering are harmless, if not actually useful), after composing enough of these transformations together one is no longer dealing with relatively simple atomic behaviors, but instead automata whose intermediate states lead to relative complication)

Exactly.  The problem with the abstractions we currently use in computation is that they aren’t really composable.  This is why our theories are non-intuitive and our programs crash.
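For anyone who hasn't met Dave's zipper idea before, here is a minimal sketch of it (my own illustration in Python, not code from either link): a zipper is a cursor into a sequence, represented so that moving the cursor and editing at it are both O(1) — the same trick buffer-gap editors have used for ages.

```python
# A list zipper: (reversed left context, focus, right context).
# Moving the cursor or editing at the focus is O(1) because we only
# touch the list heads; this is the buffer-gap editor trick.

def from_list(xs):
    """Start with the cursor on the first element."""
    return ([], xs[0], xs[1:])

def right(z):
    """Move the cursor one step right."""
    l, focus, r = z
    return ([focus] + l, r[0], r[1:])

def left(z):
    """Move the cursor one step left."""
    l, focus, r = z
    return (l[1:], l[0], [focus] + r)

def replace(z, x):
    """Edit at the cursor without touching the context."""
    l, _, r = z
    return (l, x, r)

def to_list(z):
    """Forget the cursor and recover the plain sequence."""
    l, focus, r = z
    return list(reversed(l)) + [focus] + r

z = from_list([1, 2, 3, 4])
z = right(z)         # cursor on 2
z = replace(z, 20)   # O(1) edit at the cursor
```

After those moves, `to_list(z)` gives `[1, 20, 3, 4]` — the edit happened in place, cursor-relative, without rebuilding the whole structure.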

Newton’s genius was that he figured out the right primitive metrics for physical systems (space, time, and mass) along with the right rules for combining them.  The result was a scale-independent system that not only applied to everything from electrons to galaxies, but could also tell you *when* and *how* to ignore internal details so you could focus on a higher level of abstraction.
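To make the composability point concrete: dimensional analysis is exactly such a set of combining rules, and it fits in a few lines (a toy sketch of my own, not a real units library):

```python
# Toy dimensional analysis: a quantity is (value, exponents) where the
# exponents are over the primitives (length, time, mass).
# Multiplication adds exponents, division subtracts them, and addition
# demands they match -- these rules are what make the system composable
# and scale-independent.

def mul(a, b):
    (va, da), (vb, db) = a, b
    return (va * vb, tuple(x + y for x, y in zip(da, db)))

def div(a, b):
    (va, da), (vb, db) = a, b
    return (va / vb, tuple(x - y for x, y in zip(da, db)))

def add(a, b):
    (va, da), (vb, db) = a, b
    if da != db:
        raise TypeError("incompatible dimensions")
    return (va + vb, da)

METER  = (1.0, (1, 0, 0))
SECOND = (1.0, (0, 1, 0))
KG     = (1.0, (0, 0, 1))

# F = m * a comes out with dimensions kg*m/s^2 whether you are
# talking about electrons or galaxies; the rules don't care about scale.
accel = div(METER, mul(SECOND, SECOND))
force = mul(KG, accel)
```

Note that `add(METER, SECOND)` raises an error: the combining rules themselves tell you which compositions are meaningless, which is precisely the kind of guardrail our computational abstractions lack.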

>> On Nov 13, 2014, at 2:39 PM, Stephen D. Williams <sdw at lig.net> wrote:
> We are searching for the important essence of things, in this case the fundamental useful properties of neural systems.  We may guess wrong or focus on the wrong aspects or create a model that doesn't work the same way.  We don't build bridges by imitating trees in all details.  It is not patently false that these methods weren't inspired by what we believed at different times about neuroscience.  It is entirely possible, and probable really, that we will use algorithms that work better for our purposes than what actually happens with neural systems.  So what?  There's no guaranteed advantage in being fundamentalist about imitating neural systems exactly.  We will also do that, but few expect the overhead of a more exact simulation to be competitive.  

I believe you — but the very fact that people here don’t see that indicates (to me) a fundamental flaw in how we’re going about it.

The beauty of Newton’s work — which is literally the paradigm for all of physics — is that he *first* started with an explicit conceptual model, and then derived the mathematics from it.  Yes, he got it wrong (hence Einstein), but his model was precise enough that we could critique it intelligently.

What worries me about computation is that we don’t have a culture of explicitly formulating and critiquing the conceptual models, so we end up critiquing specific implementations or comparing them to the good old days.  We have isolated individuals who try, but without a common meta-paradigm for what we are supposed to be doing we never get a virtuous cycle of clarification.  

> On Nov 13, 2014, at 1:34 PM, Stephen D. Williams <sdw at lig.net> wrote:
> https://en.wikipedia.org/wiki/Deep_learning
> I have a lot of detailed opinions about different areas, techniques, trends, etc.  At an abstract level, I think:
>  - Creating machine learning mechanisms that train and work well for certain types of input is just the start as those techniques will be repeatedly applied in new and clever ways.
>  - Trying and failing to apply techniques that worked well in another case is what leads to better understanding and more refined techniques.
> The latter is what I think of as being a "scienteer" - An engineer + scientist, i.e. applies scientific methods to the combination and use of both known engineering principles and new conjectures. I'm @scienteer.  There's art in there somewhere too, but scienteertist

I applaud you for being a scienteertist.  But I’m still shocked that the field of programming has virtually nothing that I (as a physicist) would recognize as a scientific community (versus lone ‘natural philosophers’).

> On Nov 14, 2014, at 1:47 AM, Dave Long <dave.long at bluewin.ch> wrote:
> (It's probably worth mentioning that most biomass doesn't find analytic thought, let alone intelligence, very useful.)
> -Dave

99% of the time it is not useful at all.  But 1% of the time it enables positive virtuous cycles of meme-creation that literally transform the world.

Unfortunately, finding the right balance between analytic thought and practical experience is not something either one can do alone. Which is perhaps why different individuals and communities idolize their particular local maxima…

— Ernie P.
