[FoRK] dropouts in the society of mind

Dave Long dave.long at bluewin.ch
Thu Nov 7 04:03:00 PST 2013


Apparently "dropout training" is an easy way to handle overfitting in  
deep neural networks: the basic idea is that one trains a different,  
randomly selected, subset of the network with each input but queries  
the entire network.   (The more complex idea is that one is  
preventing complex coadaptation, somewhat similar to how two mirrors  
ground against each other are very unlikely to wind up flat, but  
three are)
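
A minimal sketch of that train/query asymmetry, for a single hidden
layer in numpy (the function and parameter names are mine, purely for
illustration):

import numpy as np

rng = np.random.default_rng(0)

def hidden_activations(x, W, train=True, p_drop=0.5):
    # One ReLU hidden layer with dropout.  During training, each
    # unit is kept with probability (1 - p_drop), so every input
    # sees a different random subnetwork; at query time the whole
    # network answers, scaled to match the training-time expectation.
    h = np.maximum(0, x @ W)
    if train:
        mask = rng.random(h.shape) >= p_drop  # random subset of units
        return h * mask
    return h * (1.0 - p_drop)                 # entire network, rescaled

x = rng.standard_normal((1, 8))    # illustrative shapes only
W = rng.standard_normal((8, 16))

h_train = hidden_activations(x, W, train=True)   # sparse random subnet
h_query = hidden_activations(x, W, train=False)  # all units contribute

Note the query path scales activations by the keep probability, so
downstream weights see the same expected magnitudes they saw in
training.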

This concept reminded me of Minsky's "Society of Mind" model; is
there a connection beyond the shared nondeterministic collaboration  
of subsystems?

-Dave


