
NEURAL NETWORKS

Our brain is composed of large networks of interacting neurons. In total it contains roughly 100 billion neurons, and each of these neurons interacts with about 10 000 others through constantly changing connections. The collective dynamics of these large and plastic networks of neurons underlie the great computational power of the brain. Given this picture, it is not surprising that the physics of complex and disordered systems has contributed significantly to our understanding of information processing in neuronal networks. This contribution has been on two main frontiers, which also constitute the main lines of research in neural networks at Nordita: studying the equilibrium and non-equilibrium properties of model networks, and developing mathematical tools for analyzing experimental data.

One particularly attractive aspect of neuronal networks for physicists is the presence of disorder. Apart from the disorder that comes from randomness in the network architecture, i.e., when neurons are randomly connected or their membrane time constants fluctuate from one neuron to another, learning new information also induces disorder that interferes with previously stored information. In fact, the firing of nerve cells in the cortex is driven more by fluctuations than by the net mean input, and cortical circuitry shares key features with spin glasses: disorder, and competing interactions that roughly balance each other. Motivated by this, a mean-field theory similar to that for spin glasses has been developed. A challenge for us is to use such theories to uncover the properties of neuronal networks.
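As a loose illustration of this picture, the sketch below simulates a randomly connected network of binary neurons in which excitatory and inhibitory connection strengths are scaled so that their mean contributions nearly cancel. All parameters (network sizes, connection probability, synaptic strengths, the stochastic update rule) are illustrative assumptions rather than a model actually studied at Nordita; running it simply shows that the net mean recurrent input stays small compared with its fluctuations across neurons, so that firing is fluctuation-driven.

```python
# Illustrative sketch only: a sparsely, randomly connected network of binary
# neurons with competing excitation and inhibition that roughly balance.
import numpy as np

rng = np.random.default_rng(0)

N_E, N_I = 800, 200                    # excitatory / inhibitory neurons (illustrative sizes)
N = N_E + N_I
p = 0.1                                # connection probability: quenched random architecture
K = p * N                              # average number of inputs per neuron

J_E = 1.0 / np.sqrt(K)                 # excitatory synaptic strength
J_I = -4.0 * J_E                       # inhibition scaled so mean E and I inputs cancel

conn = rng.random((N, N)) < p          # random connectivity matrix (the disorder)
J = np.zeros((N, N))
J[:, :N_E][conn[:, :N_E]] = J_E
J[:, N_E:][conn[:, N_E:]] = J_I
np.fill_diagonal(J, 0.0)

h_ext = -0.5                           # constant external drive slightly below threshold
beta = 4.0                             # sharpness of the stochastic firing rule
s = (rng.random(N) < 0.2).astype(float)   # initial state: ~20% of neurons active

for _ in range(500):                   # parallel stochastic (Glauber-like) updates
    h_rec = J @ s                      # recurrent input to every neuron
    p_fire = 1.0 / (1.0 + np.exp(-beta * (h_rec + h_ext)))
    s = (rng.random(N) < p_fire).astype(float)

# The mean recurrent input nearly cancels, while its fluctuations across
# neurons remain of order one: firing is driven by the fluctuations.
print(f"fraction of active neurons      : {s.mean():.2f}")
print(f"mean recurrent input per neuron : {h_rec.mean():+.2f}")
print(f"std of recurrent input          : {h_rec.std():.2f}")
```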

Another way statistical mechanics can be used in neuroscience is in building statistical models of data. In our recent work, an Ising model has been used to describe the firing correlations in large populations of neurons, with the model's couplings and fields inferred from recorded spike trains.
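The sketch below gives a flavour of this kind of analysis on synthetic data: binary spike patterns are sampled from a known pairwise Ising model, and the couplings are then inferred back from the measured means and correlations using a naive mean-field inversion, one standard approximate solution of the inverse Ising problem. The data, parameter values, and the particular inversion formula are assumptions made for illustration; they stand in for the actual recordings and fitting procedures used in the published work.

```python
# Illustrative sketch: fit a pairwise Ising model to synthetic binarized "spike" data
# using the naive mean-field (nMF) inversion of the inverse Ising problem.
import numpy as np

rng = np.random.default_rng(1)

# --- Synthetic data: spins s_i in {-1, +1} sampled from a known Ising model ---
N, T = 30, 20_000                      # neurons and recorded time bins (illustrative)
J_true = rng.normal(0.0, 0.3 / np.sqrt(N), size=(N, N))
J_true = 0.5 * (J_true + J_true.T)     # symmetric couplings
np.fill_diagonal(J_true, 0.0)
h_true = rng.normal(-0.5, 0.1, size=N) # negative fields -> sparse "firing"

def gibbs_sweep(s):
    """One sequential Glauber update of every spin."""
    for i in range(N):
        field = h_true[i] + J_true[i] @ s
        s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1.0

s = np.where(rng.random(N) < 0.5, 1.0, -1.0)
for _ in range(1_000):                 # burn-in
    gibbs_sweep(s)
samples = np.empty((T, N))
for t in range(T):
    gibbs_sweep(s)
    samples[t] = s

# --- Inference: naive mean-field inversion ---
m = samples.mean(axis=0)               # magnetizations <s_i>
C = np.cov(samples, rowvar=False)      # connected correlation matrix
J_fit = -np.linalg.inv(C)              # nMF estimate: J_ij = -(C^{-1})_ij for i != j
np.fill_diagonal(J_fit, 0.0)
h_fit = np.arctanh(m) - J_fit @ m      # fields consistent with the mean-field equations

off = ~np.eye(N, dtype=bool)
r = np.corrcoef(J_true[off], J_fit[off])[0, 1]
print(f"correlation between true and inferred couplings: {r:.2f}")
```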

Learn more from John Hertz and Yasser Roudi.
