Expander graph architectures for high-capacity neural memory

Colloquium

Speaker: Rishidev Chaudhuri, University of Texas at Austin
Related Webpage: http://www.rchaudhuri.com/
Location: 1147 MSB
Start time: Mon, Mar 19 2018, 4:10PM

Memory networks in the brain must balance two competing demands. On the one hand, they should have high capacity, to store the large number of stimuli an organism must remember over a lifetime. On the other hand, noise is ubiquitous in the brain and memory is typically retrieved from incomplete input. Thus, memories must be encoded with some redundancy, which reduces capacity. Current neural network models of memory storage and error correction manage this tradeoff poorly: they either show suboptimal growth of capacity with network size or exhibit poor robustness to noise. I will show that a canonical model of neural memory, the Hopfield network, can encode a number of states exponential in network size while robustly correcting errors in a finite fraction of nodes, answering a long-standing question in neural network theory.
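To make the canonical model concrete, here is a minimal sketch of a classical Hopfield network with Hebbian weights and asynchronous sign updates (Python with NumPy; the network size, pattern count, and corruption level are illustrative choices, not values from the talk). It stores random +/-1 patterns and recovers one from a corrupted cue. Note that this classical variant stores only on the order of 0.14n patterns, which is precisely the capacity limitation the talk addresses.

    import numpy as np

    rng = np.random.default_rng(0)

    n, p = 200, 10  # neurons and stored patterns (classical capacity is roughly 0.14 * n)

    # Random +/-1 patterns to store.
    patterns = rng.choice([-1, 1], size=(p, n))

    # Hebbian weights: W_ij = (1/n) * sum_mu xi_i^mu * xi_j^mu, with no self-coupling.
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)

    def recall(x, sweeps=10):
        # Asynchronous dynamics: each neuron takes the sign of its summed input.
        x = x.copy()
        for _ in range(sweeps):
            for i in rng.permutation(n):
                x[i] = 1 if W[i] @ x >= 0 else -1
        return x

    # Corrupt 10% of a stored pattern and let the dynamics clean it up.
    cue = patterns[0].copy()
    flipped = rng.choice(n, size=n // 10, replace=False)
    cue[flipped] *= -1
    print("overlap with stored pattern:", recall(cue) @ patterns[0] / n)  # expected ~ 1.0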

These robust exponential-capacity Hopfield networks are constructed using recent results in information theory, which show how dynamical systems on large, sparse graphs (“expander graphs”) can leverage multiple weak constraints to carry out near-optimal error correction. Expander graph network architectures exploit generic properties of large, distributed systems and map naturally to neural dynamics, suggesting appealing theoretical frameworks for understanding computation in the brain. Moreover, they suggest a computational explanation for the sparsity observed in neural responses in many cognitive brain areas. These results link powerful error-correcting frameworks to neuroscience, providing insight into principles that neurons might use and potentially offering new ways to interpret experimental data.
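As a toy illustration of this error-correcting mechanism (a sketch under assumptions, not the construction from the talk), the following snippet runs a parallel bit-flipping decoder in the spirit of Sipser and Spielman's expander codes on a random sparse bipartite parity-check graph: each bit participates in a few weak parity constraints, and repeatedly flipping every bit that violates a majority of its checks removes a small number of errors when the graph expands well. All sizes and degrees below are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(1)

    n, m, d = 1000, 900, 7  # bits, parity checks, and checks per bit (illustrative sizes)

    # Sparse bipartite graph: each bit is wired to d random parity checks.
    H = np.zeros((m, n), dtype=int)
    for j in range(n):
        H[rng.choice(m, size=d, replace=False), j] = 1

    def bit_flip_decode(y, max_iters=50):
        # Flip, in parallel, every bit for which a majority of its checks are violated.
        y = y.copy()
        for _ in range(max_iters):
            syndrome = (H @ y) % 2         # 1 where a parity constraint is violated
            unsatisfied = H.T @ syndrome   # per-bit count of violated checks
            bad = unsatisfied > d // 2
            if not bad.any():
                break
            y[bad] ^= 1
        return y

    # Corrupt a few bits of the all-zeros codeword and decode.
    y = np.zeros(n, dtype=int)
    y[rng.choice(n, size=10, replace=False)] = 1
    print("errors remaining after decoding:", int(bit_flip_decode(y).sum()))  # typically 0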

Refreshments will be served following the seminar.