Group leader

Betty Mohler, PhD
Phone: +49 7071 601-217
Fax: +49 7071 601-616
Martin Dobricki, Dr. Phil.
Phone: +49 7071 601-215
Fax: +49 7071 601-616
E-mail: martin.dobricki[at]


Five most recent Publications

Mohler B (December-12-2013) Invited Lecture: Perception of indoor and outdoor virtual spaces. 5th Joint Virtual Reality Conference (JVRC 2013), Paris, France.
Mohler B, Raffin B, Saito H and Staadt O (December-2013) 5th Joint Virtual Reality Conference (JVRC '13), 94. Eurographics Association, Aire-la-Ville, Switzerland.
Dobricki M and de la Rosa S (December-2013) The structure of conscious bodily self-perception during full-body illusions. PLoS ONE 8(12): 1-9.
Heydrich L, Dodds TJ, Aspell JE, Herbelin B, Bülthoff HH, Mohler BJ and Blanke O (December-2013) Visual capture and the experience of having two bodies: evidence from two different virtual reality techniques. Frontiers in Psychology 4(946): 1-15.
Meilinger T (November-12-2013) Invited Lecture: Spatial cognition: different processes for different kinds of spaces. Centre National de la Recherche Scientifique: Centre de Recherche Cerveau & Cognition, Toulouse, France.



Perception and Action in Virtual Environments

Betty Mohler seeing a self-animated avatar in a head-mounted display.

In the Perception and Action in Virtual Environments research group, our aim is to investigate human behavior, perception and cognition in ecologically valid, immersive virtual environments. Virtual reality (VR) equipment enables our scientists to deliver sensory stimuli in a controlled virtual world and to manipulate or alter sensory input in ways that would not be possible in the real world. More specifically, VR technology enables us to manipulate the visual body, the contents of the virtual world, and the sensory input (visual, vestibular, kinesthetic, tactile and auditory) while a person performs or views an action.

Our group focuses on several areas, all of which involve measuring human performance in complex everyday tasks such as spatial judgments, walking, driving, communicating and spatial navigation. We investigate the impact of having an animated self-avatar on spatial perception, on the feeling of embodiment or agency, and on the ability of two people to communicate effectively. We are also interested in the impact of other avatars on human performance, emotion perception, and learning and training. Additionally, we are very interested in the visual and bodily control of locomotion and reaching tasks. Finally, we study spatial navigation and memory for learned spatial layouts of small and large spaces.

Our goal is to use state-of-the-art virtual reality technology to better understand how humans perceive sensory information, form an understanding of the surrounding world, remember their experiences in it, and act in it. We use head-mounted displays (HMDs), large-screen displays, motion simulators and sophisticated treadmills, in combination with real-time rendering and control software, to immerse our participants in a virtual world. We use many different experimental methods, including psychophysics, adaptation and dual-task paradigms, well-established and novel performance measures of behavioral tasks, and fMRI.
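As an illustration of one of these methods, a 1-up/2-down adaptive staircase is a standard psychophysical procedure for estimating a detection threshold. The sketch below uses an illustrative simulated observer, not data or code from our experiments:

```python
import random

def two_down_one_up(trial, start=1.0, step=0.1, n_trials=60):
    """Run a 1-up/2-down staircase; converges near 70.7% correct.

    `trial(stimulus)` must return True for a correct response.
    Returns the reversal levels, whose mean estimates the threshold.
    """
    level, correct_streak, reversals = start, 0, []
    direction = -1  # start by making the task harder
    for _ in range(n_trials):
        if trial(level):
            correct_streak += 1
            if correct_streak == 2:       # two correct in a row: step down
                correct_streak = 0
                if direction == +1:       # going up, now down: a reversal
                    reversals.append(level)
                direction = -1
                level = max(level - step, 0.0)
        else:                             # any error: step up
            correct_streak = 0
            if direction == -1:
                reversals.append(level)
            direction = +1
            level += step
    return reversals

# Illustrative simulated observer with a true threshold of 0.35.
random.seed(0)
observer = lambda lvl: lvl + random.gauss(0, 0.05) > 0.35
reversals = two_down_one_up(observer)
threshold = sum(reversals) / len(reversals)
```

The mean of the reversal levels converges near the 70.7%-correct point of the observer's psychometric function.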

Main research areas

Visual body influences perception:
Seeing a virtual avatar in the virtual environment influences egocentric distance estimates, and the influence is even stronger when the avatar is self-animated (Mohler, Presence, 2010). Eye-height influences egocentric space and dimension estimates in virtual environments (Leyrer, APGV 2011). Seeing a virtual character (self or other) impacts subsequent performance of common tasks in virtual environments (McManus, supervised by Mohler, APGV 2011). The size of visual body parts (hand size/arm length) influences size and distance estimates in virtual worlds (Linkenauger, ECVP and VSS 2011). Taken together, these results argue that the body plays a central role in the perception of the surrounding environment.
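One common account of eye-height effects is the angle-of-declination model, in which the distance to a target on the ground plane is recovered from eye height and the angle below the horizon. A minimal sketch of that geometry (the eye heights and angle below are illustrative values, not parameters from the cited studies):

```python
import math

def distance_from_declination(eye_height, declination_deg):
    """Ground-plane distance to a target seen at a given angle of
    declination below the horizon: d = h / tan(alpha)."""
    return eye_height / math.tan(math.radians(declination_deg))

# A target at a fixed angle of declination appears closer when the
# (virtual) eye height is lowered -- illustrative numbers only.
d_normal  = distance_from_declination(1.70, 10.0)
d_lowered = distance_from_declination(1.20, 10.0)
```

Under this model, lowering the virtual eye height makes the same target appear closer, which is one way a manipulated visual viewpoint can shift distance estimates.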
The role of visual body information in human interaction and communication:
The current state of the art in motion-capture tracking enables scientists to animate avatars with multiple participants' body motions in real time. We have used this technology to conduct experiments investigating the role of body language in successful communication and interaction. We have found that body language is important for successful communication in a word-communication task, and that both the speaker's and the listener's body movements (as seen through animated avatars) impact communication (Dodds, CASA, 2010). We have further shown that people move more when they are wearing xSens Moven suits and using large-screen projection technology than when they are wearing Vicon rigid-body tracking objects and viewing the virtual world in a low field-of-view head-mounted display (Dodds, PLoS ONE 2011). We have also investigated the role of visual information about the interaction partner on task performance in a table-tennis paradigm, showing that the social context (competitive or cooperative) mediates the use of visual information about the interaction partner (Streuber, EBR 2011). In addition, we have used motion-capture technology to investigate the use of VR for medical training (Alexandrova, CASA 2011) and the emotional expression of body language (Volkova, IMRF 2011).
Self-motion perception while walking and reaching:
We have conducted studies to investigate the sensory contributions to encoding walking velocity (visual, vestibular, proprioceptive, efference copy) and have found a new measure of self-motion perception: the active pointing trajectory (Campos, PLoS ONE, 2009). We have further demonstrated that imagined walking differs from physical walking: participants point in a way that indicates they are not simulating all of their sensory information for walking when imagining it. Additionally, we have investigated humans' ability to detect when they are walking on a curved path and the influence of walking speed on curvature sensitivity. Walking speed does influence curvature sensitivity: people walking at a slower velocity are less sensitive to walking on a curve. We exploited this perceptual knowledge to design a dynamic gain controller for redirected walking, which enables participants to walk unaided through a virtual city (Neth, IEEE-VR 2011). Finally, we have investigated motor learning for reaching under different viewpoints and different levels of visual realism of the arm and environment, and have made suggestions for the use of VR in rehabilitation and motor-learning experiments (Shomaker, Tesch, Buelthoff & Bresciani, EBR 2011).
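A velocity-dependent gain controller of this kind can be sketched as an interpolation between curvature-detection thresholds at slow and fast walking speeds. The numbers below are illustrative placeholders, not the published thresholds:

```python
def curvature_gain(speed, slow=(0.75, 0.13), fast=(1.25, 0.05)):
    """Velocity-dependent curvature gain for redirected walking.

    Slower walkers tolerate more injected path curvature (in 1/m)
    before noticing it, so the gain is interpolated between a high
    threshold at low speed and a low threshold at high speed.
    Speeds are in m/s; all values are illustrative placeholders.
    """
    s0, k0 = slow
    s1, k1 = fast
    if speed <= s0:
        return k0
    if speed >= s1:
        return k1
    t = (speed - s0) / (s1 - s0)          # linear interpolation
    return k0 + t * (k1 - k0)

# Each frame, rotate the virtual scene by curvature * distance walked:
# delta_heading = curvature_gain(v) * v * dt   (radians)
```

The effect is that the physical path is bent onto a circle tight enough to keep the walker inside the tracking space, but always below the curvature they can detect at their current speed.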
Spatial perception and cognition:
Visiting Prof. Roy Ruddle investigated the role of body-based information in spatial navigation. He found that walking improves humans' cognitive maps of large virtual worlds (Ruddle, ToCHI 2011) and investigated the role of body-based information and landmarks in route knowledge (Ruddle, Memory & Cognition 2011). We have also found that pointing to locations within one's city of residence relies on a single north-oriented reference frame, likely learned from maps (Frankenstein, Psychological Science, in press). Without maps available, navigators primarily memorize a novel space as local interconnected reference frames corresponding to a corridor or street (Meilinger, 2010; Hensen, supervised by Meilinger, Cog Sci 2011). Consistent with these results, entorhinal grid cells in humans quickly remap their grid orientation after a change of the surrounding environment (Pape, supervised by Meilinger, SfN 2011). Additionally, we have found that egocentric distances are also underestimated in large-screen displays, and that the estimates are influenced by the distance to the screen (Alexandrova, APGV 2010).
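Pointing paradigms like these compare a participant's judged direction with the true egocentric direction to the target, which can be computed from map coordinates and the participant's heading. A minimal sketch with illustrative coordinates:

```python
import math

def pointing_angle(observer, target, facing_deg):
    """True egocentric direction to a target, in degrees, positive
    clockwise (to the right of straight ahead), given the observer's
    (east, north) position and heading in degrees clockwise from north."""
    dx = target[0] - observer[0]   # east offset
    dy = target[1] - observer[1]   # north offset
    bearing = math.degrees(math.atan2(dx, dy))          # from north
    return ((bearing - facing_deg + 180.0) % 360.0) - 180.0

# Facing north, a target 100 m due east lies 90 deg to the right;
# facing east, the same target is straight ahead.
a = pointing_angle((0.0, 0.0), (100.0, 0.0), facing_deg=0.0)
b = pointing_angle((0.0, 0.0), (100.0, 0.0), facing_deg=90.0)
```

The signed difference between a participant's pointing response and this true angle gives the pointing error analyzed in such studies.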



Recent Journal Publications (2009-present)

43. Linkenauger S, Lerner MD, Ramenzoni VC and Proffitt D (October-2012) A Perceptual–Motor Deficit Predicts Social and Communicative Impairments in Individuals With Autism Spectrum Disorders. Autism Research 5(5): 352–362.
CiteID: LinkenaugerLRP2012
42. Alaimo SMC, Pollini L, Innocenti M, Bresciani JP and Bülthoff HH (October-2012) Experimental Comparison of Direct and Indirect Haptic Aids in Support of Obstacle Avoidance for Remotely Piloted Vehicles. Journal of Mechanics Engineering and Automation 2(10): 628-637.
CiteID: AlaimoPIBB2012
41. Pretto P, Bresciani J-P, Rainer G and Bülthoff HH (October-2012) Foggy perception slows us down. eLife 1: 1-12.
CiteID: PrettoBRB2012
40. Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (September-2012) Looking for Discriminating Is Different from Looking for Looking's Sake. PLoS ONE 7(9): 1-9.
CiteID: BiegBBC2012
39. Streuber S, Mohler BJ, Bülthoff HH and de la Rosa S (September-2012) The Influence of Visual Information on the Motor Control of Table Tennis Strokes. Presence 21(3): 281-294.
CiteID: StreuberMBd2012
38. Neth CT, Souman JL, Engel D, Kloos U, Bülthoff HH and Mohler BJ (July-2012) Velocity-Dependent Dynamic Curvature Gain for Redirected Walking. IEEE Transactions on Visualization and Computer Graphics 18(7): 1041-1052.
CiteID: NethSEKBM2011_2
37. Linkenauger S (July-2012) You'll golf better if you think Tiger has used your clubs. Harvard Business Review 90(7-8): 32-33.
CiteID: Linkenauger2012
36. Graydon MM, Linkenauger SA, Teachman B and Proffitt DR (June-2012) Scared stiff: The influence of anxiety on the perception of action capabilities. Cognition and Emotion 26(7): 1301-1315.
CiteID: GraydonLTP2011
35. Barnett-Cowan M, Meilinger T, Vidal M, Teufel H and Bülthoff HH (May-2012) MPI CyberMotion Simulator: Implementation of a Novel Motion Simulator to Investigate Multisensory Path Integration in Three Dimensions. Journal of Visualized Experiments (63): 1-6.
CiteID: BarnettCowanMVTB2011
34. Witt JK, Linkenauger SA and Proffitt DR (April-2012) Get Me Out of This Slump! Visual Illusions Improve Sports Performance. Psychological Science 23(4): 397-399.
CiteID: WittLP2012
33. Frankenstein J, Mohler BJ, Bülthoff HH and Meilinger T (February-2012) Is the Map in Our Head Oriented North? Psychological Science 23(2): 120-125.
CiteID: FrankensteinMBM2011
32. Dobricki M and Lips M (January-2012) Communication in Swiss farming cooperatives. Journal of Rural Cooperation 40(1): 29-43.
CiteID: DobrickiL2012

Last updated: Friday, 23.02.2018