Mastermind

Predicting decisions by taking a glimpse into the brain

June 19, 2017

Recent studies have indicated that the brain makes decisions about visual and auditory stimuli by accumulating sensory evidence, and that this process is orchestrated by a network of neurons spanning the front (the prefrontal area) and the back (the parietal area) of the brain. Ksander de Winkel and his colleagues in Prof. Bülthoff’s department at the Max Planck Institute for Biological Cybernetics investigated whether these findings also apply to decisions about self-motion stimuli (passive motion of one’s own body). The results showed that the scientists could predict how well a participant was able to tell different motions apart, indicating that an accumulation of sensory evidence of self-motion was being measured. These findings support the idea that the network of prefrontal and parietal neurons is ‘modality-independent’: the neurons in this network are dedicated to collecting evidence and making decisions using any type of sensory information, rather than being tied to specific senses such as vision or the vestibular system (the sense of balance).

Peeping into the brain with fNIRS: the color of the blobs indicates blood flow in that area of the cortex (image courtesy of the visualization tool QVis, see: http://bit.ly/2q8te2v).

The scientists placed participants in a motion simulator and rotated them around an earth-vertical axis aligned with the spine. More specifically, participants experienced a large number of pairs of such rotations, in which one rotation was always slightly more intense than the other. The order of the smaller and larger rotations was randomized for each pair, and participants had to judge which rotation of each pair was more intense. While the participants performed the task, blood flow in the prefrontal and parietal areas was measured using a novel technique: functional Near-Infrared Spectroscopy (fNIRS). The scientists then used these recordings to test whether it was possible to predict the participants’ judgments for every single pair of rotations.
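The single-trial prediction described above can be sketched as a simple decoding analysis. The snippet below is only an illustration of the general idea, not the study’s actual pipeline: it simulates trial-wise fNIRS features with a hypothetical judgment-related offset, then decodes each trial’s judgment with a leave-one-out nearest-centroid classifier (all numbers are made up for the demonstration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 200 trial pairs, 8 fNIRS channels. On each trial the
# participant judged whether the first or the second rotation was more
# intense; we assume (purely for illustration) that the prefrontal/parietal
# response differs slightly between the two judgments.
n_trials, n_channels = 200, 8
judgment = rng.integers(0, 2, n_trials)            # 0 = "first", 1 = "second"
signal = rng.normal(0.0, 1.0, (n_trials, n_channels))
signal[judgment == 1] += 0.8                       # hypothetical judgment-related offset

def predict_loo(features, labels):
    """Leave-one-out nearest-centroid decoding of single-trial judgments."""
    correct = 0
    for i in range(len(labels)):
        train = np.ones(len(labels), dtype=bool)
        train[i] = False                           # hold out trial i
        c0 = features[train & (labels == 0)].mean(axis=0)
        c1 = features[train & (labels == 1)].mean(axis=0)
        # Predict the class whose centroid is closer to the held-out trial.
        pred = int(np.linalg.norm(features[i] - c1) < np.linalg.norm(features[i] - c0))
        correct += pred == labels[i]
    return correct / len(labels)

accuracy = predict_loo(signal, judgment)
print(f"single-trial decoding accuracy: {accuracy:.2f}")  # well above chance (0.5)
```

Decoding accuracy above chance on held-out trials is the kind of evidence that links the measured brain areas to the judgments; the real study would use proper fNIRS preprocessing and its own classifier.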

Research on the brain activity of participants or patients in motion is scarce, because the readings of common neuroimaging methods, such as electroencephalography (EEG) or functional magnetic resonance imaging (fMRI), are distorted by body motion and electromagnetic interference, such as electrical noise in vehicles. This is not the case with fNIRS. Infrared light is emitted through the scalp into the brain tissue, and the reflected light is measured. Since the intensity of the infrared light is very low, the method is non-invasive and harmless. Blood flow and oxygen levels increase in active brain regions (the haemodynamic response), which is what the method captures. This makes it possible to draw conclusions about activity in these brain areas.
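The conversion from detected infrared light to haemoglobin concentration changes is commonly done with the modified Beer–Lambert law. The sketch below shows that calculation in miniature; the extinction coefficients, wavelengths, path length, and differential pathlength factor are illustrative placeholders, not tabulated values or parameters from the study.

```python
import numpy as np

# Hypothetical extinction coefficients (1/(mM*cm)) for oxy- (HbO2) and
# deoxyhaemoglobin (HHb) at two common fNIRS wavelengths.
# These numbers are illustrative, not real tabulated values.
EPSILON = np.array([[0.6, 1.5],   # ~760 nm: [HbO2, HHb]
                    [1.2, 0.8]])  # ~850 nm: [HbO2, HHb]

def concentration_changes(i0, i, path_cm=3.0, dpf=6.0):
    """Convert detected light intensities at two wavelengths into changes
    in oxy-/deoxyhaemoglobin concentration via the modified Beer-Lambert
    law: delta_OD = EPSILON @ delta_c * path * DPF."""
    delta_od = -np.log10(np.asarray(i) / np.asarray(i0))  # optical density change
    delta_c = np.linalg.solve(EPSILON * path_cm * dpf, delta_od)
    return delta_c  # [dHbO2, dHHb] in mM

# No change in detected light -> no change in concentration.
print(concentration_changes([1.0, 1.0], [1.0, 1.0]))
```

Solving the two-wavelength system separates the oxy- and deoxyhaemoglobin contributions, which is how fNIRS captures the haemodynamic response described above.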

“This method is very promising”, de Winkel says. “Up to now, we had to rely on what participants could tell us about their perception. Now we get to glance directly into the brain.” The results showed that the fNIRS recordings could be used to predict how well a participant could tell the motions apart, indicating that the areas under investigation were indeed involved in decision making on self-motion. The more sensory evidence participants collect, the better they are able to tell two motions apart. “If we know how the brain makes decisions and which areas are involved, we can relate specific behavioral problems and physical trauma to these areas”, de Winkel explains. Moreover, since conventional neuroimaging techniques are not suitable for moving participants, the results are encouraging for the use of fNIRS to perform neuroimaging on participants in moving vehicles and simulators. This might pave the way for a completely new line of research.

Printable images can be obtained at the Public Relations Office. Please send a proof upon publication.
