Office of Communications

Beate Fülle
Media Liaison Officer

Head of Communications and Public Relations

Phone: +49 7071 601-777
Email: presse-kyb[at]tuebingen.mpg.de

 

... what your brain decides for you

When we move through our surroundings, the information that our visual system and other sensory systems provide on self-motion is generally in agreement. When multiple sources provide the same information, this information can be combined to form the most reliable estimate of our motion. However, in motion simulators the information that is presented to the different sensory systems does not have to be in agreement. Hence, in this case the brain can be faced with a problem of Causal Inference: do different sensory signals share a common cause or not? Should the information be combined or not?

Our scientist Ksander de Winkel (together with Mikhail Katliar and Heinrich H. Bülthoff) from the Motion Perception and Simulation research group published a new study in PLOS ONE:

1. What was your study all about?

We investigated how the brain constructs an estimate of the direction in which we are moving, which we refer to as heading. To estimate our heading, the brain can make use of what is seen by the eyes, but it can also use what is felt by the body's acceleration sensors (such as the vestibular system). In everyday life, the physical stimuli to the eyes and acceleration sensors are in agreement, and the heading estimates provided by the sensors will be close, apart from some noise due to the fact that our sensors are not perfect. To minimize the error in the final heading estimate, the brain could merge the estimates provided by the different sensors, in a way similar to averaging. But this strategy only makes sense when what is seen and what is felt are indeed similar; if there is a large difference between the estimates, it makes more sense to rely on either what is seen or what is felt. Processing sensory information differently depending on the inferred causality of the signals is called Causal Inference.

In the experiment, we independently manipulated the heading that our participants saw and the heading that they felt, with discrepancies of up to 90°, and asked the participants to indicate the direction in which they felt they were moving. The results show that the participants indeed behaved in line with our expectations: when the headings that were seen and felt were similar, the responses resembled merging; when the headings were very different, the responses resembled an either/or decision. This means that the brain decides how to process information from different sensory signals, unconsciously and almost instantly.
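The two strategies described above can be sketched in a few lines of Python. The noise levels and the discrepancy threshold below are purely illustrative values, not parameters estimated in the study, and a full Causal Inference model weighs the two hypotheses probabilistically rather than applying a hard threshold; this is only a minimal sketch of the intuition.

```python
# Illustrative noise levels (standard deviations, in degrees) for the two
# senses, and a discrepancy threshold. These are hypothetical values chosen
# for the sketch, not parameters from the study.
SIGMA_SEEN = 5.0
SIGMA_FELT = 10.0
DISCREPANCY_THRESHOLD = 45.0

def heading_estimate(seen, felt,
                     sigma_seen=SIGMA_SEEN,
                     sigma_felt=SIGMA_FELT,
                     threshold=DISCREPANCY_THRESHOLD):
    """Return a heading estimate (degrees) from a seen and a felt heading.

    Small discrepancy: merge the cues, weighting each by its reliability
    (inverse variance), similar to an average of the two senses.
    Large discrepancy: an either/or decision, here simplified to
    falling back on the single more reliable cue.
    """
    if abs(seen - felt) <= threshold:
        # Cues judged to share a common cause: reliability-weighted merge.
        w_seen = 1.0 / sigma_seen ** 2
        w_felt = 1.0 / sigma_felt ** 2
        return (w_seen * seen + w_felt * felt) / (w_seen + w_felt)
    # Cues judged to have separate causes: rely on the more reliable one.
    return seen if sigma_seen <= sigma_felt else felt
```

With equal reliabilities, merging a seen heading of 0° and a felt heading of 10° gives 5°, while a 90° discrepancy exceeds the threshold and the estimate collapses onto a single cue.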

2. Why were you interested in this topic?

I have studied psychology, and during my studies I became intrigued by the question of how to quantify perception: how to turn 'perception', which appears to be fundamentally subjective, into something measurable and objective. During my PhD, I investigated whether our brain forms perceptions of our movement through the world by merging information provided by different sensory systems. However, in that work, I only investigated what happens when the different sensory systems receive information that is supposed to go together. Because in everyday life we are bombarded with information for which we do not know whether it goes together or not, I wanted to investigate how our brain processes multisensory information when it is not certain whether or not it all belongs together.

3. What should the average person take away from your study?

A reader should take away that our brain appears to assess what causes the motion that we see and the motion that we feel, and processes the information in different ways depending on the result of this assessment. And that all this deliberation happens unconsciously, and almost instantly.

4. Are there any major caveats? What questions still need to be addressed?

The results support the idea that the brain assesses the causality of sensory estimates of heading and processes them according to the outcome of this assessment. This finding can explain differences in the results of previous studies, some of which found that heading estimates from different sensors are merged, while others found that they are segregated. Moreover, the findings are in line with both behavioral and neuroimaging studies that investigated how sights and sounds are processed. However, there are possible alternatives to the Causal Inference model, and our data do not allow us to completely rule out those other possibilities. In short: there is more work to be done.

For more information:
journals.plos.org/plosone/article

Personal page of Ksander de Winkel
Last updated: Tuesday, 25.04.2017