Neuroscientists at UCL's Sainsbury Wellcome Centre have discovered how sensory input is transformed into motor action across numerous brain regions in mice. The study shows that decision-making is a brain-wide process that is shaped by learning. The findings could inform artificial intelligence research by providing insights into how to build more distributed neural networks.
“This work unifies concepts previously described for individual brain areas into a coherent view that maps onto brain-wide neural dynamics. We now have a complete picture of what is happening in the brain as sensory input is transformed through a decision process into an action,” explained Professor Tom Mrsic-Flogel, Director of the Sainsbury Wellcome Centre at UCL and corresponding author on the paper.
The study, published in Nature, describes how the researchers used Neuropixels probes, cutting-edge devices that allow simultaneous recording from hundreds of neurons across several brain areas, to study mice performing a decision-making task. The task, developed by Dr Ivana Orsolic at SWC, allowed the scientists to distinguish sensory processing from motor control. To demonstrate the role of learning, the researchers also compared trained animals with naïve animals.
“We often make decisions based on ambiguous evidence. For example, when it starts to rain, you have to decide how frequent the raindrops need to be before you open your umbrella. We studied this same ambiguous evidence integration in mice to understand how the brain processes perceptual decisions,” explained Dr Michael Lohse, Sir Henry Wellcome Postdoctoral Fellow at SWC and joint first author on the paper.
Mice were trained to remain still while watching a visual pattern move on a screen. To earn a reward, the mice had to lick a spout when they detected a sustained increase in the speed of the pattern's movement. The task was designed so that the speed was never constant but fluctuated continuously, and the timing of the increase in average speed varied from trial to trial, so the mice could not simply learn when the sustained increase would occur. Instead, they had to pay continuous attention to the stimulus and integrate information over time to determine whether the speed increase had happened.
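The task structure can be illustrated with a minimal simulation: a stimulus whose speed fluctuates around a baseline, steps up in mean at a random change time, and an observer that leakily integrates evidence and responds when it crosses a threshold. This is only an illustrative sketch of the evidence-integration idea; all parameter values, and the leaky-integrator model itself, are assumptions for demonstration and are not taken from the study.

```python
import random

def run_trial(change_time, n_steps=200, baseline=1.0, boost=0.6,
              noise=0.5, leak=0.05, threshold=8.0, seed=None):
    """Simulate one trial of a change-detection task of this kind.

    Stimulus speed fluctuates around `baseline`; its mean steps up by
    `boost` at `change_time`. A leaky integrator accumulates how far the
    speed sits above baseline and 'licks' when the accumulated evidence
    crosses `threshold`. Parameters are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    evidence = 0.0
    for t in range(n_steps):
        mean = baseline + (boost if t >= change_time else 0.0)
        speed = rng.gauss(mean, noise)           # noisy stimulus speed
        evidence = (1 - leak) * evidence + (speed - baseline)
        if evidence > threshold:
            return t                              # time of the response
    return None                                   # miss: no response

# Example: the mean speed steps up at step 100, so responses should
# typically follow shortly after that point.
rt = run_trial(change_time=100, seed=1)
print(rt)
```

Before the change, the integrator's fluctuations stay well below threshold (false alarms are rare); after the change, the steady drift pushes evidence across the threshold within a few tens of steps, mirroring how the animal must accumulate noisy speed information rather than react to any single sample.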
“By training the mice to stand still, the data analysis we could perform was much cleaner, and the task allowed us to look at how neurons track random fluctuations in speed before the mice made an action. In trained mice, we found that there is no single brain region that integrates sensory evidence or orchestrates the process. Instead, we found that neurons that are sparsely but broadly distributed across the brain link sensory evidence and action initiation,” explained Dr Andrei Khilkevich, Senior Research Fellow in the Mrsic-Flogel lab and joint first author on the paper.
The researchers recorded from each mouse multiple times and collected data from over 15,000 cells across 52 brain regions in 15 trained mice. To look at learning, the team also compared the results to recordings from naïve mice.
“We found that when mice don't know what the visual stimulus means, the information is represented only in the brain's visual system and a few midbrain regions. After they have learned the task, cells all over the brain integrate the evidence,” explained Dr Lohse.
In this study, the team looked only at naïve animals and animals that had fully learned the task. Future work aims to uncover how the learning process unfolds by tracking neurons over time to see how their activity changes as mice learn the task. The researchers are also investigating whether specific brain regions act as causal hubs in establishing these links between sensation and action.
The study also raises a number of further questions, such as how the brain incorporates an expectation of when the speed of the visual pattern will increase, so that animals respond to the stimulus only when it is relevant. The team plans to investigate these questions using the data they have already gathered.