Project Leaders

Betty Mohler, PhD
Tel: 07071 601-217
Fax: 07071 601-616
betty.mohler[at]tuebingen.mpg.de
 
Martin Dobricki, Dr. Phil.
Tel: 07071 601-215
Fax: 07071 601-616
martin.dobricki[at]tuebingen.mpg.de
 

PAVE Poster


News


Latest Publications

Wellerdiek AC, Leyrer M, Volkova E, Chang D-S and Mohler B (August 2013): Recognizing your own motions on virtual avatars: is it me or not?, ACM Symposium on Applied Perception (SAP '13), Dublin, Ireland.
Saulton A, Dodds TJ, Tesch J, Mohler BJ and Bülthoff HH (August 2013): The influence of shape and culture on visual volume perception of virtual rooms, ACM Symposium on Applied Perception (SAP '13), Dublin, Ireland.
Dobricki M (July 17, 2013) Invited Lecture: Multisensory self-perception, Universitätsklinik für Psychiatrie und Psychotherapie, Tübingen, Germany.
Linkenauger SA, Leyrer M, Bülthoff HH and Mohler BJ (July 2013) Welcome to Wonderland: The Influence of the Size and Shape of a Virtual Hand On the Perceived Size and Shape of Virtual Objects. PLoS ONE 8(7) 1-16.
Volkova EP, Mohler BJ and Bülthoff HH (May 11, 2013): Display size of biological motion stimulus influences performance in a complex emotional categorisation task, 13th Annual Meeting of the Vision Sciences Society (VSS 2013), Naples, FL, USA. Journal of Vision 13(9) 195.


 

Perception and Action in Virtual Environments

Betty Mohler sees her self-animated avatar in her head-mounted display.
The goal of the research group "Perception and Action in Virtual Environments" is to investigate human perception, cognition and behaviour in natural settings. To this end we use realistic, multisensory virtual worlds (virtual reality, VR). This allows us both to present sensory stimuli in a controlled environment and to manipulate them in ways that would not be possible in the real world.

Specifically, our state-of-the-art VR technology allows us to alter the visible body, the content of the virtual world and the sensory stimuli (visual, vestibular, kinaesthetic, tactile and auditory) during perception or action. Our research group pursues a range of research questions, all of which concern the measurement of human performance in complex, everyday situations, e.g. while walking, driving, communicating or navigating. We investigate how an animated avatar representing the user affects spatial perception, communication and the sense of having a body, or a particular body. We are interested in how other avatars affect performance, emotion perception, learning and training, as well as the visual and bodily control of locomotion. We also study how people find their way in everyday environments such as buildings or cities and how they represent these environments in memory. In short, our research group works towards a better understanding of human behaviour, perception and cognition in complex everyday processes. To this end we use and improve state-of-the-art VR technology.

Main Research Areas

Visual body influences perception:
Seeing a virtual avatar in the virtual environment influences egocentric distance estimates; if the avatar is self-animated, the influence is even stronger (Mohler, Presence, 2010). Eye height influences egocentric space and dimension estimates in virtual environments (Leyrer, APGV 2011). Seeing a virtual character (self or other) affects subsequent performance of common tasks in virtual environments (McManus, supervised by Mohler, APGV 2011). The size of visual body parts (hand size/arm length) influences size and distance estimates in virtual worlds (Linkenauger, ECVP and VSS 2011). Taken together, these results suggest that the body plays a central role in the perception of our surrounding environment.
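One common geometric account of such eye-height effects, often used as a first approximation in this literature (the relation below is an illustrative model, not a formula taken from the cited papers), relates the perceived egocentric distance \(d\) of a point on the ground to the observer's eye height \(h\) and the angle of declination \(\alpha\) below eye level at which that point is seen:

\( d = h / \tan \alpha \)

Under this relation, manipulating the virtual eye height while the angle of declination to a target remains unchanged rescales perceived ground distances, which is one way to interpret the eye-height effects described above.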
 
The role of visual body information in human interaction and communication:
Current state-of-the-art motion capture tracking enables scientists to animate avatars with multiple participants' body motions in real time. We have used this technology to conduct experiments investigating the role of body language in successful communication and interaction. We have found that body language is important for successful communication in a word-communication task and that both the speaker's and the listener's body movements (as seen through animated avatars) impact communication (Dodds, CASA, 2010). We have further shown that people move more when they are wearing the Xsens Moven suits and using large-screen projection technology than when they are wearing Vicon rigid-body tracking objects and viewing the virtual world in a low field-of-view head-mounted display (Dodds, PLoS ONE 2011). We have also investigated the role of visual information about the interaction partner in task performance in a table-tennis paradigm, and have shown that the social context (competitive or cooperative) mediates the use of visual information about the interaction partner (Streuber, EBR 2011). We have also used motion capture technology to investigate the use of VR for medical training (Alexandrova, CASA 2011) and the emotional expression of body language (Volkova, IMRF 2011).
 
Self-motion perception while walking and reaching:
We have conducted studies investigating the sensory contributions to encoding walking velocity (visual, vestibular, proprioceptive, efference copy) and have found a new measure for self-motion perception: the active pointing trajectory (Campos, PLoS ONE, 2009). We have further demonstrated that imagined walking differs from physical walking: participants' pointing indicates that they do not simulate all of the sensory information involved in walking when they merely imagine it. Additionally, we have investigated humans' ability to detect when they are walking on a curved path and the influence of walking speed on curvature sensitivity. We have found that walking speed does influence curvature sensitivity: when walking at a slower velocity, people are less sensitive to walking on a curve. We exploited this perceptual knowledge to design a dynamic gain controller for redirected walking, which enables participants to walk unaided through a virtual city (Neth, IEEE VR 2011); a minimal sketch of the idea follows below. Finally, we have investigated motor learning for reaching under different viewpoints and different levels of visual realism of the arm and environment, and make suggestions for the use of VR for rehabilitation and motor-learning experiments (Schomaker, Tesch, Bülthoff & Bresciani, EBR 2011).
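To make the idea of a dynamic gain controller more concrete, the following minimal Python sketch injects a small virtual-scene rotation each frame, capped by a speed-dependent curvature-detection threshold (slower walking tolerates tighter virtual curvature). All function names, numbers and the interpolation scheme are illustrative assumptions, not the actual controller from Neth et al. (IEEE VR 2011).

import math

def curvature_threshold(speed_m_s):
    # Assumption: the smallest undetectable curvature radius grows with walking
    # speed; interpolate between two hypothetical threshold radii (in metres).
    r_slow, r_fast = 10.0, 25.0
    t = max(0.0, min(1.0, (speed_m_s - 0.4) / (1.4 - 0.4)))
    return r_slow + t * (r_fast - r_slow)

def redirect_rotation(speed_m_s, dt):
    # Rotate the virtual world about the user by an angle small enough to stay
    # below the curvature-detection threshold for the current walking speed.
    radius = curvature_threshold(speed_m_s)
    distance = speed_m_s * dt        # distance walked during this frame
    return distance / radius         # injected rotation in radians

# Example: injected per-frame rotation (degrees) at 0.8 m/s and 60 Hz rendering.
print(math.degrees(redirect_rotation(0.8, 1.0 / 60.0)))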
 
Spatial perception and cognition:
Visiting Prof. Roy Ruddle investigated the role of body-based information in spatial navigation. He found that walking improves humans' cognitive maps of large virtual worlds (Ruddle, ToCHI 2011) and investigated the role of body-based information and landmarks in route knowledge (Ruddle, Memory & Cognition 2011). We have also found that pointing to locations within one's city of residence relies on a single north-oriented reference frame, likely learned from maps (Frankenstein, Psychological Science, in press). Without maps available, navigators primarily memorize a novel space as local interconnected reference frames corresponding to a corridor or street (Meilinger, 2010; Hensen, supervised by Meilinger, CogSci 2011). Consistent with these results, entorhinal grid cells in humans quickly remap their grid orientation after a change of the surrounding environment (Pape, supervised by Meilinger, SfN 2011). Additionally, we have found that egocentric distances are also underestimated in large-screen displays and that the estimates are influenced by the distance to the screen (Alexandrova, APGV 2010).

Selected Publications

166. Mohler BJ (July 19, 2010) Invited Lecture: Self-Avatars in Immersive Virtual Environments as a Tool to investigate Embodied Perception, Westfälische Wilhelms-Universität: Fachbereich 10 Mathematik und Informatik, Münster, Germany.
165. van der Ham IJM, van Zandvoort MJE, Meilinger T, Bosch SE, Kant N and Postma A (July 2010) Spatial and temporal aspects of navigation in two neurological patients. NeuroReport 21(10) 685-689.
164. Alexandrova IV, Teneva PT, de la Rosa S, Kloos U, Bülthoff HH and Mohler BJ (July 2010) Egocentric distance judgments in a large screen display immersive virtual environment. 7th Symposium on Applied Perception in Graphics and Visualization (APGV 2010), ACM Press, New York, NY, USA, 57-60.
163. Alaimo SMC, Pollini L, Magazzù A, Bresciani J-P, Robuffo Giordano P, Innocenti M and Bülthoff HH (July 2010) Preliminary Evaluation of a Haptic Aiding Concept for Remotely Piloted Vehicles. In: Haptics: Generating and Perceiving Tangible Sensations, EuroHaptics 2010, Springer, Berlin, Germany, 418-425.
162. Reichenbach A, Thielscher A, Peer A, Bülthoff HH and Bresciani J-P (July 2010): Neural Correlates of Online Control of Reaching Movements, FENS 2010 Satellite Symposium on Motor Control, Nijmegen, Netherlands.
161. Mohler BJ (June 18, 2010) Invited Lecture: Self-Avatars in Immersive Virtual Environments as a Tool to investigate Embodied Perception, Event Lab: Universitat de Barcelona, Barcelona, Spain.
160. Mohler BJ, Creem-Regehr SH, Thompson WB and Bülthoff HH (June 2010) The Effect of Viewing a Self-Avatar on Distance Judgments in an HMD-Based Virtual Environment. Presence: Teleoperators and Virtual Environments 19(3) 230-242.
159. Dodds TJ, Mohler BJ and Bülthoff HH (June 2010) A Communication Task in HMD Virtual Environments: Speaker and Listener Movement Improves Communication. 23rd Annual Conference on Computer Animation and Social Agents (CASA 2010), 1-4.
158. Volkova EP, Mohler BJ, Meurers D, Gerdemann D and Bülthoff HH (June 2010) Emotional Perception of Fairy Tales: Achieving Agreement in Emotion Annotation of Text. NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text, Association for Computational Linguistics, Morristown, NJ, USA, 98-106.
157. Schomaker J, Tesch J, Bülthoff HH and Bresciani J-P (June 2010): It's All Me: Varying Viewpoints and Motor Learning in a Virtual Reality Environment, 11th International Multisensory Research Forum (IMRF 2010), Liverpool, UK.
156. Reichenbach A, Bresciani J-P, Peer A, Bülthoff HH and Thielscher A (June 2010): Proprioceptive online control of goal-directed reaching: a transcranial magnetic stimulation study, 16th Annual Meeting of the Organisation for Human Brain Mapping (HBM 2010), Barcelona, Spain.
155. Volkova EP (June 2010): Emotional Perception of Fairy Tales: Achieving Agreement in Emotion Annotation of Text, 20. Tagung der Computerlinguistik Studierenden (TaCoS 2010), Zürich, Switzerland.

Last updated: Monday, 10.03.2014