Ivelina Piryankova

Alumni of the Department Human Perception, Cognition and Action
Alumni of the Group Perception and Action in Virtual Environments
Alumni of the Research Group Body and Space Perception

Main Focus

Research group: Space and Body Perception Group

Supervisor: Dr. Betty Mohler

I am currently working as a post-doctoral research scientist in the Space & Body Perception research group at the Max Planck Institute for Biological Cybernetics. I am continuing my research on body perception in immersive virtual environments (VEs). Additionally, I am involved in a body perception research project in collaboration with the University Hospital, Tübingen.

In February 2015 I defended my PhD thesis. During my PhD I investigated body and space perception in immersive virtual environments (VEs). People perceive the virtual world differently than they perceive the real world. Understanding the reasons underlying this effect will help us use VEs more effectively for purposes such as training, prototyping, education, basic science, interactive storytelling or rehabilitation.

My main PhD research projects:

Body perception in virtual reality (VR) using personalized avatars. I investigated women's sensitivity to changes in their perceived weight by altering the body mass index (BMI) of the participants' personalized avatars displayed on a large-screen immersive display. The work on this project was done in collaboration with scientists from the Max Planck Institute for Intelligent Systems (Dr. Javier Romero and Prof. Dr. Michael J. Black) and the University of Utah (Dr. Jeanine K. Stefanucci). More information about this project and the results of this research can be found in Piryankova et al. 2014, ACM TAP.

Body perception in VEs using stylized avatars, in which we were interested in gaining more insight into how visual embodiment affects perception and action. This work was part of a tandem project in collaboration with philosophers (Dr. Hong Yu Wong and Catherine Stinson) from the Centre for Integrative Neuroscience (CIN) Tübingen. More information about this project and the results of this research can be found in Piryankova et al. 2014, PLOS ONE.

Fig.1 Left: The experimenter performing tactile stimulation while the participant wears an nVisor SX111 head-mounted display and views a first-person avatar (overweight or underweight). Middle: The overweight avatar. Right: The underweight avatar. (Piryankova et al. 2014, PLOS ONE.)


Egocentric distance perception in VR, in which we investigated the difference between distance perception in VEs and in the real world. For this study we used a 3D replica of a real-world room and three large-screen immersive displays (flat, curved and semi-spherical). More information about this project and the results of this research can be found in Piryankova et al. 2013, Elsevier Displays; a generic sketch of how such distance judgments can be compared follows this project list.

Fig.2 The three large-screen immersive displays (left: curved, middle: semi-spherical, right: flat) used in Piryankova et al. 2013, Elsevier Displays.
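
As a rough illustration (not the published analysis), distance judgments from such a study are often summarized as the ratio of judged to actual distance per viewing condition, where 1.0 means veridical and values below 1.0 indicate compression relative to the real world. The sketch below uses invented numbers; the actual measures and results are reported in Piryankova et al. 2013, Elsevier Displays.

    # Illustrative only: comparing judged vs. actual egocentric distances per display.
    # All values are invented; see Piryankova et al. (2013), Displays, for the real data.
    import numpy as np

    actual_m = np.array([2.0, 3.0, 4.0, 5.0, 6.0])  # target distances in metres

    judged_m = {  # hypothetical mean judged distances per viewing condition
        "real world":     np.array([1.9, 2.9, 3.9, 4.9, 5.8]),
        "flat screen":    np.array([1.6, 2.3, 3.0, 3.7, 4.3]),
        "curved screen":  np.array([1.7, 2.4, 3.2, 3.9, 4.6]),
        "semi-spherical": np.array([1.8, 2.6, 3.4, 4.2, 5.0]),
    }

    for condition, judged in judged_m.items():
        ratio = np.mean(judged / actual_m)  # 1.0 = veridical, < 1.0 = compressed
        print(f"{condition:>15}: mean judged/actual distance = {ratio:.2f}")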

Other research projects during my PhD:

Animation of medical training scenarios – the aim of this project is to increase the effectiveness of medical training simulations by helping trainees gain a better understanding of the importance of communication and teamwork. To this end we are developing an on-line application which enables trainees to view a scenario from different perspectives or to freely explore the environment, so that we can study the impact of the different perspectives on the users' perception. The results of these studies can be used to improve the effectiveness of training. We created a pipeline for the rapid and realistic reconstruction of medical scenarios, involving virtual humans with realistic motions and the ability to express emotions. We tested their ability to convey specific emotions in a user study published at JVRC 2010, which won the Best Short Paper Award. We further tested and improved our pipeline in a paper published at CASA 2011 and in a paper that appeared at MMVR 2012.

Fig.3 Screenshots from the animated medical scenarios reconstructed from motion capture and video data.

Using realistic virtual reality to influence space and body perception

Ivelina Piryankova, Betty Mohler

Introduction

Immersive virtual environments (VEs) have great potential as interactive media for a variety of applications. The users of VE applications are often represented by avatars in virtual reality (VR). This does not necessarily mean that the person represented by the avatar immediately embodies it. However, for many VR applications to be effective, people need to identify with their self-representing avatar and feel ownership over its virtual body.

Goals

The goal of this research is to investigate whether virtual reality can be used to influence the perception of one's own body size and of the surrounding virtual world.

Methods

We used a body ownership illusion to investigate the conditions required for embodying a virtual avatar and introduced novel measures (body size and affordances) to assess embodiment of a virtual avatar [1]. Further, we investigated women's sensitivity to changes in their perceived weight by altering the body mass index (BMI) of the participants' personalized avatars displayed on a large-screen immersive display [2, 3].
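
As a rough sketch of the kind of sensitivity analysis a weight-discrimination task like this calls for, the example below fits a cumulative-Gaussian psychometric function to hypothetical "the avatar looks heavier than me" responses as a function of the BMI change applied to the avatar, and reads off a point of subjective equality (PSE) and a just-noticeable difference (JND). The data, function names and 75% criterion are illustrative assumptions, not the analysis reported in [2, 3].

    # Hypothetical sketch: fitting a psychometric function to weight-discrimination
    # responses. The data are invented; the actual analyses are described in [2, 3].
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def psychometric(bmi_change, pse, sigma):
        """Cumulative Gaussian: probability of judging the avatar heavier than oneself."""
        return norm.cdf(bmi_change, loc=pse, scale=sigma)

    # Percent change applied to the personalized avatar's BMI (illustrative levels).
    bmi_change = np.array([-20, -15, -10, -5, 0, 5, 10, 15, 20], dtype=float)
    # Hypothetical proportion of "heavier than me" responses at each level.
    p_heavier = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.78, 0.90, 0.97, 0.99])

    (pse, sigma), _ = curve_fit(psychometric, bmi_change, p_heavier, p0=[0.0, 5.0])

    # JND taken here as the BMI change needed to go from 50% to 75% "heavier" responses.
    jnd = sigma * norm.ppf(0.75)
    print(f"PSE: {pse:+.1f}% BMI change, JND: {jnd:.1f}% BMI change")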

References

1. Ivelina V. Piryankova, Hong Yu Wong, Sally A. Linkenauger, Catherine Stinson, Matthew R. Longo, Heinrich H. Bülthoff, Betty J. Mohler (2014) Owning an overweight or underweight body: distinguishing the physical, the experienced and the virtual body, PLoS ONE (accepted)

2. Ivelina V. Piryankova, Jeanine K. Stefanucci, Javier Romero, Stephan de la Rosa, Michael J. Black and Betty J. Mohler (2014) Can I recognize my body’s weight? The influence of shape and texture on the perception of self, ACM TAP (accepted)

3. Ivelina V. Piryankova, Stephan de la Rosa, Uwe Kloos, Heinrich H. Bülthoff, Betty J. Mohler (2013) Egocentric distance perception in large screen immersive displays, Displays, 34(2), 153-164.


Medical Training Using Perspective Taking in Virtual Reality

Ivelina Piryankova, Betty Mohler, Heinrich H. Bülthoff

Introduction

The aim of this work is to increase the effectiveness of medical training simulations by helping trainees gain a better understanding of the importance of communication and teamwork. Therefore we are developing an application that can be used together with real-world (RW) simulations to improve training. Our application allows students to take the perspective of their teammates or the patient while observing realistically reconstructed medical scenarios in a Virtual Environment (VE).

Goals

Our goal is to develop a virtual (on-line or immersive VE) application that can be used as an additional tool alongside RW simulations for effective medical training at the university hospital. To this end we first established a pipeline to rapidly and realistically generate animations of medical scenarios.

Methods

We developed a pipeline which combines body animations extracted from full-body motion capture data with facial animations based on predefined meshes of facial expressions ([1], [2], [3] – JVRC best short paper). We used audio recordings from the RW scenario to synchronize the lip motions with the sound, and video recordings to synchronize the motions of the virtual humans and to set up the VE scene.
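
A minimal sketch of this layering idea, using purely hypothetical names and data structures, is given below: body poses come from motion-capture keyframes, while facial expressions are predefined blend-shape weight sets that are faded in and out at time stamps taken from the audio and video recordings. This is not the actual pipeline's code, only an illustration of the combination step.

    # Hypothetical sketch: layering facial blend-shape animation over body motion capture.
    # All names and values are illustrative, not taken from the actual pipeline.
    from dataclasses import dataclass

    @dataclass
    class BodyKeyframe:
        time: float             # seconds, from the motion-capture take
        joint_rotations: dict   # joint name -> rotation (e.g. Euler angles)

    # Predefined facial expressions as blend-shape weight sets (0..1 per shape).
    EXPRESSIONS = {
        "neutral":  {"smile": 0.0, "brow_raise": 0.0, "jaw_open": 0.0},
        "concern":  {"smile": 0.0, "brow_raise": 0.7, "jaw_open": 0.1},
        "speaking": {"smile": 0.1, "brow_raise": 0.2, "jaw_open": 0.5},
    }

    def face_weights(t, cues, fade=0.2):
        """Blend-shape weights at time t, given (start, end, expression) cues
        derived from the audio/video recordings; linear fade in/out."""
        weights = dict(EXPRESSIONS["neutral"])
        for start, end, name in cues:
            if start <= t <= end:
                ramp = min(1.0, (t - start) / fade, (end - t) / fade)
                for shape, w in EXPRESSIONS[name].items():
                    weights[shape] = max(weights[shape], w * ramp)
        return weights

    # Example: the virtual clinician speaks from 1.0 s to 3.5 s of the recording,
    # while the body pose at the same time stamp comes from the mocap keyframes.
    cues = [(1.0, 3.5, "speaking")]
    print(face_weights(2.0, cues))    # full "speaking" weights
    print(face_weights(1.05, cues))   # partway through the fade-in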

Initial results

Using our pipeline we captured a routine medical scenario with students [2] and a didactic medical scenario with practitioners [1]. The latter was used to improve the pipeline. The two medical scenarios were then used to generate an interactive on-line application [1].

Initial conclusion

Our application can be used as an additional tool alongside medical simulations to give students the possibility to observe and analyze their performance by taking the perspective of their teammates or the patient. Furthermore, we intend to gather feedback from medical professionals and students on the usability of the application's interface.

References

1. Alexandrova IV, Rall M, Breidt M, Kloos U, Tullius G, Bülthoff HH and Mohler BJ (2012) Enhancing Medical Communication Training Using Motion Capture, Perspective Taking and Virtual Reality, 19th MMVR.

2. Alexandrova IV, Rall M, Breidt M, Kloos U, Tullius G, Bülthoff HH and Mohler BJ (2011) Animations of Medical Training Scenarios in Immersive Virtual Environments, 24th CASA 2011, 1-4.

3. Alexandrova IV, Volkova EP, Kloos U, Bülthoff HH and Mohler BJ (2010) Virtual Storyteller in Immersive Virtual Environments Using Fairy Tales Annotated for Emotion States, JVRC 2010, 65-68. (Best Short Paper Award)

4. Alexandrova IV, Teneva PT, de la Rosa S, Kloos U, Bülthoff HH and Mohler BJ (2010) Egocentric distance judgments in a large screen display immersive virtual environment, 7th APGV 2010, ACM Press, New York, NY, USA, 57-60.

Other projects in which I was involved as a research assistant:
  • The EU-sponsored project "POETICON" (with Christian Wallraven as the local coordinator), in which I created the 3D models and helped edit some of the motion capture data used in the kitchen scenarios.
  • The Face Project of Isabelle Bülthoff, in which I helped set up the virtual humans used for the experiment and modeled the room in which they are located.
  • Ekaterina Volkova's experiment on emotion perception with an amateur actor, in which I helped animate a virtual character with the amateur actor's motions.

Curriculum Vitae

IVELINA VESSELINOVA PIRYANKOVA

RESEARCH INTERESTS


Body and space perception in real-time virtual environments; real-time simulations, training, therapy and rehabilitation; modeling of realistic real-time virtual environments; generating realistically animated virtual humans in real-time virtual environments.

EDUCATION


M.Sc., 2011, Reutlingen University

Media and Communication Informatics

Thesis title: “Generating Virtual Humans Using Predefined Bodily and Facial Emotions in Real-Time Immersive Virtual Environments”

B.A., 2006, Eberhard Karls University Tübingen

Computational Linguistics

Thesis title: “LTAG Semantics for Relative Clause”

High School Diploma, 2003 “Academician Kiril Popov” High School of Mathematics, Plovdiv, Bulgaria

Major: Physics, English, Computer Technologies and Programming
