Display Technology

Quarter-sphere Large Screen Projection (PanoLab)

Since 1997 we have employed a large-screen, half-cylindrical virtual reality projection system to study human perception. Studies in a variety of areas have been carried out, including spatial cognition and the perceptual control of action. In 2005, we made a number of fundamental improvements to the virtual reality system. The most noticeable change is the altered screen size and geometry: the screen was extended horizontally (from 180 to 230 degrees), and a floor screen and projector were added. The projection screen curves smoothly from the wall projection to the floor projection, resulting in an overall screen geometry that can be described as a quarter-sphere. Vertically, the screen subtends 125 degrees (25 degrees of visual angle upwards and 100 degrees downwards from the normal observation position).

In 2011, the image generation and projection setup was significantly updated. The existing four JVC SX21 DILA projectors (1400x1050) and curved mirrors were replaced with six EYEVIS LED DLP projectors (1920x1200), thereby simplifying the projection setup and increasing the overall resolution. In order to compensate for the visual distortions caused by the curved projection screen, as well as to achieve soft-edge blending for seamless overlap areas, we have developed a flexible warping solution using the new warp and blend features of the NVIDIA Quadro chipsets. This solution gives us the flexibility of a hardware-based warping solution and the accuracy of a software-based one. The necessary calibration data for the image warping and blending stages is generated by a new camera-based projector auto-calibration system (DOMEPROJECTION.COM). Image generation is handled by a new high-end render cluster consisting of six client image-generation PCs and one master PC. To avoid tearing artifacts in the multi-projector setup, the rendering computers use frame-synchronized graphics cards to synchronize the projected images.
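The soft-edge blending mentioned above works by attenuating each projector's image across the overlap zone so that the two contributions sum to a uniform perceived brightness. Because projectors apply a gamma curve, the linear ramp must be pre-corrected before it is applied to pixel values. A minimal sketch of such a blend ramp (the overlap width and gamma value here are illustrative assumptions, not the lab's calibrated figures):

```python
import numpy as np

def blend_ramp(width_px, overlap_px, gamma=2.2):
    """Per-pixel attenuation for one projector's right-hand overlap zone.

    Inside the overlap, the two projectors' contributions should sum to
    a uniform luminance. The displayed luminance is roughly
    (pixel value) ** gamma, so the linear fall-off is raised to
    1/gamma before being applied to the image.
    """
    alpha = np.ones(width_px)
    ramp = np.linspace(1.0, 0.0, overlap_px)      # linear luminance fall-off
    alpha[-overlap_px:] = ramp ** (1.0 / gamma)   # gamma pre-correction
    return alpha
```

Applying the mirrored ramp to the neighbouring projector's left edge makes the two gamma-space luminances sum to a constant across the seam, which is the condition for an invisible overlap.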

In addition to improving the visual aspects of the system, we increased the quality, number, and type of input devices. Participants in the experiments can, for example, interact with the virtual environment via actuated Wittenstein helicopter controls, joysticks, a space mouse, steering wheels, a go-kart, or a virtual bicycle (VRBike). Furthermore, a Razer Hydra 6DOF joystick can be used for wand navigation and small-volume tracking. Some of the input devices offer force feedback. With the VRBike, for example, one can actively pedal and steer through the virtual environment, and the virtual inertia and incline are reflected in the pedals' resistance.
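The kind of force-feedback mapping described for the VRBike can be illustrated with a simple longitudinal bicycle model: the pedal resistance reflects the gravity component on the virtual slope, rolling resistance, and the inertial force of the simulated mass. This is a hedged sketch of the idea, not the actual VRBike controller; the coefficient values are generic assumptions:

```python
import math

def pedal_resistance(mass_kg, incline_rad, accel_ms2,
                     crr=0.004, g=9.81):
    """Rearward force (N) the pedals would reflect to the rider.

    Illustrative model only: slope term + rolling resistance +
    inertial force for the current virtual acceleration.
    """
    f_gravity = mass_kg * g * math.sin(incline_rad)           # slope
    f_rolling = mass_kg * g * crr * math.cos(incline_rad)     # rolling
    f_inertia = mass_kg * accel_ms2                           # inertia
    return f_gravity + f_rolling + f_inertia
```

On flat ground at constant speed only the small rolling term remains, while a climb or an acceleration immediately stiffens the pedals, which is the behaviour the text describes.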

Back-Projection Large Screen Display (BackproLab)

The quarter-sphere projection setup is complemented by a back-projection setup, which has the advantage that participants do not cast shadows on the screen. This setup consists of a single SXGA+ projector (Christie Mirage S+3K DLP) and a large, flat screen (2.2 m wide by 2 m high). The projector has a high contrast ratio of 1500:1 and can be used for mono or active stereo projection. This space previously used four Vicon V-series cameras for motion tracking. These cameras were replaced in 2014 with a SMARTTRACK system from Advanced Realtime Tracking (ART), which can track up to four rigid-body objects and can directly output the calculated positions and orientations via a UDP network stream.
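Receiving such a UDP pose stream in an experiment application is straightforward. The sketch below assumes a simplified ASCII record of the form `id x y z rx ry rz` per line purely for illustration; the SMARTTRACK's actual DTrack output format is richer and should be taken from ART's protocol documentation:

```python
import socket

def parse_pose_line(line):
    """Parse one simplified pose record: 'id x y z rx ry rz'.

    Hypothetical stripped-down format, not the real DTrack protocol.
    """
    fields = line.split()
    body_id = int(fields[0])
    pose = tuple(float(v) for v in fields[1:7])
    return body_id, pose

def receive_poses(port=5000, bufsize=4096):
    """Blocking UDP receiver yielding (body_id, pose) per record."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(bufsize)
        for line in data.decode("ascii").splitlines():
            if line.strip():
                yield parse_pose_line(line)
```

Because the tracker pushes datagrams, the render loop can poll the latest pose without a request/response round trip, which keeps tracking latency low.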


For stereo projection, NVIDIA 3DVision Pro active shutter glasses are used. These glasses use RF technology for synchronization and can therefore be used in conjunction with the infrared-based optical tracking system. The glasses have been fitted with markers for the optical tracking system and can thus be used for head tracking.
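Head tracking on a flat back-projection screen is typically used to render a perspective-correct image from the viewer's actual eye position, via an asymmetric (off-axis) view frustum. The following sketch shows the standard generalized perspective projection computation for a planar screen; the corner coordinates and function names are ours, not taken from the lab's software:

```python
import numpy as np

def off_axis_frustum(eye, pa, pb, pc, near=0.1, far=100.0):
    """Asymmetric frustum (l, r, b, t, n, f) for a tracked eye.

    pa, pb, pc are the screen's lower-left, lower-right and upper-left
    corners in tracker coordinates. The returned values can be fed to
    a glFrustum-style projection matrix.
    """
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))
    vr = pb - pa; vr = vr / np.linalg.norm(vr)        # screen right axis
    vu = pc - pa; vu = vu / np.linalg.norm(vu)        # screen up axis
    vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn)  # screen normal
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                               # eye-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t, near, far
```

For stereo, the same computation is run twice per frame, once for each eye position derived from the tracked glasses, so that the virtual scene appears stable behind the physical screen as the head moves.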

Last updated: Friday, 14.10.2016