Peter Latham - Failures, Weight Scaling, and Learning in the Brain.

Abstract

The brain is a very noisy place: when a spike arrives at a presynaptic terminal, about half the time the neurotransmitter fails to be released. Essentially, the brain is running DropConnect on a global scale: at any given time, about half the weights in the brain are set to zero, and this process appears to be totally random and independent across synapses. We argue that the noise induced by these ‘failures’ forces weights to be of order 1/n – much smaller than the order 1/sqrt(n) scaling typically used in deep networks. It also forces the weight matrices to be low rank, something that may help solve the credit assignment problem.
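A minimal sketch (not from the talk) of the setup the abstract describes, under the assumption that each synaptic weight is independently zeroed with probability ~0.5 on every pass, DropConnect-style. It computes the summed input to a single unit under the two weight scalings mentioned above, 1/n and 1/sqrt(n), and prints the trial-to-trial variability induced purely by the random failures; the parameter names (n, p_fail, trials) are illustrative choices, not values from the talk.

```python
# Hypothetical illustration: DropConnect-style synaptic failures, where each
# weight is independently dropped with probability ~0.5, compared under
# 1/n and 1/sqrt(n) weight scaling.
import numpy as np

rng = np.random.default_rng(0)

n = 1000          # number of presynaptic inputs (illustrative value)
p_fail = 0.5      # probability that a synapse fails to transmit
trials = 10000    # number of independent failure patterns to sample

x = rng.standard_normal(n)              # fixed presynaptic activity

for label, scale in [("1/sqrt(n)", 1.0 / np.sqrt(n)),
                     ("1/n", 1.0 / n)]:
    w = scale * rng.standard_normal(n)  # weights drawn at the given scale
    # DropConnect mask: each weight survives with prob (1 - p_fail),
    # redrawn independently on every trial.
    masks = rng.random((trials, n)) > p_fail
    outputs = (masks * w) @ x           # summed input with random synaptic failures
    print(f"{label:>10} scaling: mean input = {outputs.mean():+.5f}, "
          f"std from failures = {outputs.std():.5f}")
```

The printed standard deviation is the noise contributed solely by the random weight failures for a fixed input; how that noise constrains the weight scale and rank is the subject of the talk, not something this sketch demonstrates.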

Location: SEC 1.413