Finding and using sparse networks in neural computation

Mathematics of Data & Decisions

Speaker: Rishidev Chaudhuri, UC Davis, Math and NPB
Related Webpage: http://www.rchaudhuri.com/
Location: 1147 MSB
Start time: Tue, Nov 12 2019, 4:10PM

Computation in the brain is distributed over very large networks of neurons, yet most of these neurons are not directly connected to each other. This sparse connectivity is metabolically efficient, contributing to the brain's remarkably low energy use, and connection density is disrupted in a number of disease states. Understanding the principles that underlie computation on such sparse networks is therefore of broad theoretical and practical significance.

In the first part of the talk, I will present work on how the brain might decide which network connections are important and which are redundant (and hence can be pruned to maintain sparsity). Noise is ubiquitous in neural systems and is often considered an irritant to be overcome; I will suggest instead that the brain may use noise to probe the structure of its networks and remove redundant connections. I will construct a noise-driven plasticity rule that operates on a dynamical network and prove that, for an important class of linear networks, it preserves useful properties of the original dynamics. I will also show that this noise-driven pruning strategy is closely connected to the spectral sparsification of graph Laplacians, as the sketch below illustrates.
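The abstract does not spell out the plasticity rule, but the stated connection to Laplacian sparsification admits at least one concrete reading, sketched below under strong assumptions: a linear network whose dynamics are generated by a graph Laplacian and driven by white noise. For such a network, the stationary variance of the activity difference across an edge is proportional to that edge's effective resistance, which is exactly the sampling weight used in spectral sparsification (Spielman & Srivastava). The dynamics, the pruning rule, and all parameter choices here are illustrative guesses, not the speaker's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small dense weighted graph and its Laplacian L = D - W.
n = 40
edges = np.array([(u, v) for u in range(n) for v in range(u + 1, n)])
U, V = edges[:, 0], edges[:, 1]
w = rng.uniform(0.5, 1.5, size=len(edges))

def laplacian(w_vec):
    """Graph Laplacian from per-edge weights (zeros = pruned edges)."""
    L = np.zeros((n, n))
    np.add.at(L, (U, U), w_vec)
    np.add.at(L, (V, V), w_vec)
    np.add.at(L, (U, V), -w_vec)
    np.add.at(L, (V, U), -w_vec)
    return L

L = laplacian(w)

# Probe the network with noise: Euler-Maruyama on dx = -L x dt + sigma dW.
# At stationarity, Var(x_u - x_v) = (sigma^2 / 2) * R_eff(u, v), the
# effective resistance of edge (u, v), so purely local statistics of the
# noisy activity reveal each edge's global importance.
sigma, dt, burn, T = 1.0, 1e-3, 10_000, 50_000
x = np.zeros(n)
diff_sq = np.zeros(len(edges))
for t in range(burn + T):
    x += -dt * (L @ x) + sigma * np.sqrt(dt) * rng.standard_normal(n)
    if t >= burn:
        d = x[U] - x[V]
        diff_sq += d * d
reff_hat = (2.0 / sigma**2) * diff_sq / T  # noise-based estimate of R_eff

# Prune by sampling edges with probability proportional to w_e * R_eff(e),
# then reweight survivors so the sparse Laplacian is unbiased -- the
# effective-resistance sampling recipe from spectral sparsification.
p = w * reff_hat
p /= p.sum()
q = 8 * n                        # samples drawn; trades sparsity vs. accuracy
counts = rng.multinomial(q, p)
kept = counts > 0
w_sparse = np.where(kept, counts * w / (q * p), 0.0)
L_sparse = laplacian(w_sparse)

# Crude diagnostic: compare the sorted nonzero spectra of the two Laplacians.
ratios = np.linalg.eigvalsh(L_sparse)[1:] / np.linalg.eigvalsh(L)[1:]
print(f"kept {int(kept.sum())} of {len(edges)} edges; "
      f"eigenvalue ratios in [{ratios.min():.2f}, {ratios.max():.2f}]")
```

With enough samples the reweighted subgraph approximates the original Laplacian's quadratic form, so the pruned linear network retains its slow modes; whether this is the sense in which the talk's rule "preserves useful properties of the original dynamics" is an inference from the abstract.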

Time permitting, in the second part of the talk I will discuss a model of sparse signal recovery through sparse random connectivity in biological neural networks, and I will suggest an architecture for working memory that uses this sparse recovery.
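The abstract leaves this model unspecified; as a generic illustration of sparse signal recovery through sparse random connectivity, here is a minimal compressed-sensing sketch in which each readout unit samples only a handful of input units, and a sparse input pattern is recovered from the readouts by iterative soft thresholding (ISTA). The dimensions, fan-in, penalty, and function names are all assumptions for the demo, not the speaker's model.

```python
import numpy as np

rng = np.random.default_rng(1)

def ista(A, y, lam, n_iters=2000):
    """Iterative soft-thresholding for the lasso:
    minimize 0.5 * ||y - A x||^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / Lipschitz const. of gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x -= step * (A.T @ (A @ x - y))       # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # shrinkage
    return x

# Sparse random "connectivity": each of m readout units samples d of n inputs.
n, m, k, d = 200, 60, 5, 8
A = np.zeros((m, n))
for i in range(m):
    idx = rng.choice(n, size=d, replace=False)
    A[i, idx] = rng.standard_normal(d) / np.sqrt(d)

# A k-sparse input pattern, observed only through the m readouts.
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_hat = ista(A, y, lam=1e-3)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {err:.3f}")
```

One natural reading of the working-memory proposal, again inferred rather than stated, is that such recovery lets a high-dimensional sparse pattern be held in a much lower-dimensional persistent trace and reconstructed on demand.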