We seek to understand how populations of neurons collectively process sensory input, perform computations and control behaviour. In particular, we are interested in how internal states and processes influence neural activity and perceptual decisions. To this end, we develop statistical models and machine learning algorithms for neural data analysis, and collaborate with experimental laboratories performing measurements of neural activity and behaviour.
How do populations of neurons process information and guide behaviour?
Understanding how neurons collectively represent sensory input, perform computations and guide behaviour is one of the central goals of neuroscience. While the importance of studying populations of neurons, rather than just single neurons, has been recognized for decades, the experimental and theoretical tools needed to empirically investigate the computational properties of neural populations were lacking until recently. It is now becoming increasingly clear that information processing in the brain is highly state-dependent: neural activity and behaviour are not entirely determined by the external stimulus or the task at hand, but can be strongly modulated by internal states and endogenously generated dynamics. For example, cognitive processes such as attention, as well as changes in behavioural state, can lead to widespread modulations of neural excitability. Similarly, intrinsic neuronal properties such as synaptic plasticity can make neuronal responses dependent on the recent activity of the network. Thus, both behaviour and neural activity are strongly influenced by dynamic changes in the internal states of cortical networks, and can differ markedly from one trial to the next. We seek to better understand how such internal states and processes influence both neural activity and behaviour, and to characterize their implications for neural information processing.
How can we make sense of complex data in neuroscience?
At a technical level, we tackle this problem by developing statistical models and machine learning algorithms for neural data analysis. Modern experimental techniques allow unprecedented insights into the structure and function of neural circuits. These advances open the possibility of studying the statistical structure of neural activity in large populations of neurons, and of using these insights in clinical applications such as the development of neural prosthetics or brain-machine interfaces. However, understanding the complex data generated by neurophysiological experiments is a challenging task that requires powerful statistical methods.
For example, recordings of neural population activity yield high-dimensional time series with rich and dynamically changing statistical structure, and can therefore be hard to visualize and interpret. Furthermore, because of factors such as neural plasticity, as well as experimental challenges, these data can be non-stationary and exhibit variability with dependencies across temporal and spatial scales. Although traditional analyses based on single neurons and “off-the-shelf” data-analysis techniques have yielded important insights into neural computation, they are ultimately limited, as they fail to fully capture the rich temporal dynamics of neural population activity. To make real progress towards understanding computation in the brain, we therefore need powerful machine-learning methods that are adapted to the specific characteristics and challenges of neural data.
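As a toy illustration of why such high-dimensional recordings can nevertheless be summarized compactly, the sketch below simulates a population whose activity is driven by a few shared latent signals plus private noise, and recovers that low-dimensional structure with principal component analysis. All numbers (50 neurons, 3 latents, noise level) are hypothetical, and PCA stands in here for the more specialized methods the text alludes to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population recording: 50 neurons, 500 time bins, driven by
# 3 shared latent signals (random walks) plus private Gaussian noise.
n_neurons, n_bins, n_latents = 50, 500, 3
latents = rng.standard_normal((n_latents, n_bins)).cumsum(axis=1)
loading = rng.standard_normal((n_neurons, n_latents))
activity = loading @ latents + 0.5 * rng.standard_normal((n_neurons, n_bins))

# PCA via SVD of the mean-centred data: because the variability is shared
# across the population, a handful of components captures most of it,
# making the 50-dimensional time series far easier to visualize.
centred = activity - activity.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
var_explained = (s ** 2) / (s ** 2).sum()
print(f"variance explained by top {n_latents} PCs: "
      f"{var_explained[:n_latents].sum():.2f}")
```

On data generated this way, the top three components account for nearly all of the variance; real recordings are of course messier, which is precisely why tailored methods are needed.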
We develop, apply and analyze statistical methods for neural data analysis. Using Bayesian statistics as a methodological framework, we build probabilistic models that combine prior knowledge about neural connectivity and response properties with the observed experimental data, and thus lead to more realistic and biologically plausible descriptions of neural population dynamics. In particular, we seek to build statistical methods for inferring internal states as well as functional networks from recordings of neural population activity. More generally, we aim to provide computational tools which have a thorough theoretical grounding, work robustly and efficiently on realistic datasets, and can readily be used in a wide range of scientific and clinical contexts.
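The general idea of combining a prior model with observed data to infer a hidden internal state can be sketched with the simplest possible example: a Kalman filter for a linear-Gaussian state-space model. Here a slowly drifting latent "excitability" variable is observed only through noisy measurements; the filter fuses the dynamics prior with each observation to form a posterior estimate. The model and all parameter values are illustrative assumptions, not the methods used in any particular study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical latent internal state x_t: a slow AR(1) drift observed
# through noisy measurements y_t (all parameters chosen for illustration).
T = 200
a, q, r = 0.99, 0.05, 1.0            # dynamics, process noise, obs. noise
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(T)

# Kalman filter: the prior prediction from the dynamics model is combined
# with each new observation, weighted by their relative uncertainties.
m, P = 0.0, 1.0                      # posterior mean and variance
est = np.zeros(T)
for t in range(T):
    m_pred, P_pred = a * m, a * a * P + q        # predict step
    K = P_pred / (P_pred + r)                    # Kalman gain
    m = m_pred + K * (y[t] - m_pred)             # update with the data
    P = (1.0 - K) * P_pred
    est[t] = m

# The filtered estimate should track the latent state more closely
# than the raw observations do.
err_filt = np.mean((est - x) ** 2)
err_raw = np.mean((y - x) ** 2)
print(f"filtered MSE {err_filt:.2f} vs raw MSE {err_raw:.2f}")
```

Inference of internal states from real population recordings involves richer models (point-process observations, switching or nonlinear dynamics), but the same predict-and-update logic underlies them.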