Display Technology

Quarter-sphere Large Screen Projection (PanoLab)

Since 1997, we have employed a large-screen, initially half-cylindrical, virtual reality projection system to study human perception. Studies in a variety of areas have been carried out, including spatial cognition and the perceptual control of action. In 2005, we made a number of fundamental improvements to the virtual reality system. Perhaps the most noticeable change is the altered screen size and geometry: the screen was extended horizontally (from 180 to 230 degrees), and a floor screen and projector were added. The projection screen curves smoothly from the wall projection into the floor projection, so that the overall screen geometry can be described as a quarter-sphere. Vertically, the screen subtends 125 degrees (25 degrees of visual angle upwards and 100 degrees downwards from the normal observation position).

In 2011, the image generation and projection setup was significantly updated. The existing four JVC SX21 DILA projectors (1400x1050) and curved mirrors were replaced with six EYEVIS LED DLP projectors (1920x1200), thereby simplifying the projection setup and increasing the overall resolution. To compensate for the visual distortions caused by the curved projection screen, and to achieve soft-edge blending for seamless overlap areas, we developed a flexible warping solution using the warp and blend features of the NVIDIA Quadro chipsets. This solution combines the flexibility of a hardware-based warping solution with the accuracy of a software-based one. The calibration data required for the image warping and blending stages are generated by a camera-based projector auto-calibration system (DOMEPROJECTION.COM). Image generation is handled by a high-end render cluster consisting of six client image-generation PCs and one master PC. To avoid tearing artifacts in the multi-projector setup, the rendering computers use frame-synchronized graphics cards so that all projected images are swapped in lockstep.
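The per-projector correction step can be illustrated with a short sketch. This is not the production pipeline, only a minimal Python illustration, assuming the auto-calibration exports a dense warp map (the source pixel to sample for every output pixel) and a grayscale blend mask per projector; the file names and array formats below are placeholders.

```python
# Warp-and-blend sketch for one projector channel (illustration only, not the
# production pipeline). Assumed inputs, exported by the auto-calibration step:
#   warp_x.npy, warp_y.npy : float32 maps giving, for every output pixel, the
#                            source-image coordinate to sample from
#   blend.npy              : float32 mask in [0, 1] for soft-edge blending
import cv2
import numpy as np

def warp_and_blend(rendered_frame, warp_x, warp_y, blend_mask):
    """Distortion-correct a rendered frame and attenuate its overlap regions."""
    # Resample the rendered frame through the warp map (bilinear lookup).
    warped = cv2.remap(rendered_frame, warp_x, warp_y, interpolation=cv2.INTER_LINEAR)
    # Scale brightness so that overlapping projectors sum to full intensity.
    return (warped.astype(np.float32) * blend_mask[..., None]).astype(np.uint8)

if __name__ == "__main__":
    frame = cv2.imread("rendered_frame.png")   # placeholder input image
    warp_x = np.load("warp_x.npy")             # placeholder calibration data
    warp_y = np.load("warp_y.npy")
    blend = np.load("blend.npy")
    cv2.imwrite("projector_output.png", warp_and_blend(frame, warp_x, warp_y, blend))
```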

In addition to improving the visual aspects of the system, we increased the quality, number, and type of input devices. Participants in the experiments can, for example, interact with the virtual environment via actuated Wittenstein helicopter controls, joysticks, a space mouse, steering wheels, a Go-Kart, or a virtual bicycle (VRBike). Furthermore, a Razer Hydra 6-DOF joystick can be used for wand navigation and small-volume tracking. Some of the input devices offer force feedback: with the VRBike, for example, one can actively pedal and steer through the virtual environment, and the virtual inertia and incline are reflected in the pedals' resistance.
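As a rough illustration of how virtual inertia and incline can be mapped to pedal resistance, the sketch below evaluates a simple bicycle dynamics model (slope, rolling resistance, aerodynamic drag, and acceleration of the virtual rider). The model and its coefficients are generic textbook assumptions, not the parameters actually used in the VRBike.

```python
# Simple bicycle resistance model (illustration only; the force model and
# coefficients are generic assumptions, not the actual VRBike parameters).
import math

def pedal_resistance_force(speed, acceleration, incline_angle,
                           mass=85.0,    # rider + bike mass [kg]
                           c_rr=0.005,   # rolling resistance coefficient
                           cda=0.4,      # drag area Cd*A [m^2]
                           rho=1.2):     # air density [kg/m^3]
    """Resistive force [N] to render at the pedals for the current virtual state."""
    g = 9.81
    gravity = mass * g * math.sin(incline_angle)          # slope component
    rolling = mass * g * c_rr * math.cos(incline_angle)   # rolling resistance
    drag = 0.5 * rho * cda * speed * speed                # aerodynamic drag
    inertia = mass * acceleration                         # virtual inertia
    return gravity + rolling + drag + inertia

# Example: 5 m/s up a 3% grade while accelerating at 0.2 m/s^2.
print(pedal_resistance_force(5.0, 0.2, math.atan(0.03)))
```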

Back-Projection Large Screen Display (BackproLab)

The quarter-sphere projection setup is complemented by a back-projection setup, which has the advantage that participants do not cast shadows on the screen. This setup consists of a single SXGA+ projector (Christie Mirage S+3K DLP) and a large, flat screen (2.2 m wide by 2 m high). The projector has a high contrast ratio of 1500:1 and can be used for mono or active stereo projection. This space previously used four Vicon V-series cameras for motion tracking; these were replaced in 2014 with a SMARTTRACK system from Advanced Realtime Tracking (ART), which can track up to four rigid-body objects and directly outputs the calculated positions and orientations via a UDP network stream.
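Because the tracker streams its poses over UDP, a client application only needs an ordinary datagram socket to receive them. The snippet below is a minimal receiver sketch; the port number is a placeholder, and the record layout of the ASCII output depends on the configured DTrack output settings, so parsing is deliberately left out.

```python
# Minimal UDP receiver for the tracking stream (sketch only). The port is a
# placeholder; the actual value and the ASCII record layout depend on the
# DTrack output settings configured for the SMARTTRACK.
import socket

TRACKING_PORT = 5000  # placeholder, use the port configured in DTrack

def receive_tracking_frames():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", TRACKING_PORT))  # listen on all local interfaces
    try:
        while True:
            datagram, _addr = sock.recvfrom(8192)
            # One datagram per tracking frame; parsing of the 6DOF records is
            # omitted here because it depends on the configured output format.
            print(datagram.decode("ascii", errors="replace"))
    finally:
        sock.close()

if __name__ == "__main__":
    receive_tracking_frames()
```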


For stereo projection, NVIDIA 3DVision Pro active shutter glasses are used. Because these glasses use RF technology for synchronization, they can be used alongside the infrared-based optical tracking system without interference. The glasses have been modified with markers for the optical tracking system and can thus also be used for head tracking.
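For a head-tracked fixed screen such as this one, the tracked eye positions are typically turned into an asymmetric (off-axis) view frustum per eye. The sketch below shows the widely used generalized perspective projection construction; the screen corners and eye offset are placeholder values rather than the actual lab calibration, and the screen-basis rotation and eye translation still have to be applied on the modelview side.

```python
# Off-axis (asymmetric frustum) projection for a head-tracked fixed screen,
# following the well-known generalized perspective projection construction.
# Screen corners and eye offset below are placeholders, not the real lab
# geometry; the screen-basis rotation and the translation by -eye must still
# be applied to the modelview matrix.
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near, far):
    """Projection matrix for eye position `eye` and a screen spanned by
    pa (lower-left), pb (lower-right) and pc (upper-left), all in metres."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)            # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)     # screen normal
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                                 # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0]])

# Example: 2.2 m x 2.0 m screen centred on the origin, left eye 1.5 m in front,
# shifted by half an (assumed) interocular distance of 6.4 cm.
pa = np.array([-1.1, -1.0, 0.0]); pb = np.array([1.1, -1.0, 0.0]); pc = np.array([-1.1, 1.0, 0.0])
print(off_axis_projection(np.array([-0.032, 0.0, 1.5]), pa, pb, pc, near=0.1, far=100.0))
```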

Last updated: Friday, 14.10.2016