Looking for Participants

The MPI for Biological Cybernetics is looking for participants for some of its research experiments.

Most recent Publications

Göksu C, Hanson LG, Siebner HR, Ehses P, Scheffler K and Thielscher A (May-2018) Human in-vivo brain magnetic resonance current density imaging (MRCDI). NeuroImage 171: 26-39.
Celicanin Z, Manasseh G, Petrusca L, Scheffler K, Auboiroux V, Crowe LA, Hyacinthe JN, Natsuaki Y, Santini F, Becker CD, Terraz S, Bieri O and Salomir R (May-2018) Hybrid ultrasound-MR guided HIFU treatment method with 3D motion compensation. Magnetic Resonance in Medicine 79(5): 2511-2523.
Schindler A and Bartels A (May-2018) Integration of visual and non-visual self-motion cues during voluntary head movements in the human brain. NeuroImage 172: 597-607.
Pracht ED, Feiweier T, Ehses P, Brenner D, Roebroeck A, Weber B and Stöcker T (May-2018) SAR and scan-time optimized 3D whole-brain double inversion recovery imaging at 7T. Magnetic Resonance in Medicine 79(5): 2620-2628.
Dobs K, Schultz J, Bülthoff I and Gardner JL (May-2018) Task-dependent enhancement of facial expression and identity representations in human cortex. NeuroImage 172: 689-702.


Large Tracking Hall (TrackingLab)

TrackingLab (Photo: GEHIRN&GEIST/Manfred Zentsch)
The free-space walking and tracking laboratory in the Cyberneum is a large (12.7 m x 11.9 m x 6.9 m) empty space equipped with 26 high-speed motion-capture cameras from Vicon. In December 2014 the existing setup of 16 Vicon MX13 cameras (1.2 megapixels) was expanded with 10 Vicon T160 cameras (16 megapixels), doubling the tracking volume for aerial robotics research and improving head-tracking quality for immersive VR research.

The tracking system captures the motions of one or more persons by processing images of configurations of multiple infrared-reflective markers in real time. The computed position and orientation are transmitted wirelessly to a high-end mobile graphics system carried in a backpack, which updates the simulated virtual environment according to the person's position and generates a matching egocentric visualization and/or auditory simulation. Participants can navigate freely within the entire tracking hall, either wearing the backpack themselves or having the experimenter wear it and follow them around the walking area.

To suppress interference between the real and simulated environments as far as possible, the laboratory is kept completely dark (its surfaces are black and all outside light can be blocked), and acoustic panels on the walls largely reduce reverberation. The tracking setup also allows multiple objects such as flying quadcopters to be tracked, and supports full-body motion capture, e.g. for the analysis of sports performance such as gymnastics, or for the animation of virtual characters.
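The pipeline described above rests on two core numerical steps: fitting a rigid-body pose to a configuration of reflective markers, and inverting that pose to obtain the egocentric view used for rendering. The Python sketch below illustrates both under simplifying assumptions: the cameras are taken to have already triangulated 3D marker positions, and the marker layout, function names, and example values are hypothetical (this is not the Vicon DataStream API).

```python
# Minimal sketch (assumptions: marker positions already triangulated to 3D;
# marker layout and all names are hypothetical, not the Vicon API).
import numpy as np

def rigid_pose_from_markers(model, observed):
    """Kabsch/Procrustes fit: find rotation R and translation t such that
    observed ~= model @ R.T + t, with both inputs given as (N, 3) arrays."""
    p_mean = model.mean(axis=0)
    q_mean = observed.mean(axis=0)
    H = (model - p_mean).T @ (observed - q_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

def view_matrix(R, t):
    """Invert the head pose (world -> eye) for egocentric rendering."""
    V = np.eye(4)
    V[:3, :3] = R.T
    V[:3, 3] = -R.T @ t
    return V

# Hypothetical 4-marker layout on a head-mounted display (metres, head frame).
model = np.array([[0.08, 0.02, 0.0], [-0.08, 0.02, 0.0],
                  [0.0, 0.10, 0.02], [0.0, 0.04, -0.09]])
# Pretend the cameras saw the same layout 2 m into the hall at head height.
observed = model + np.array([2.0, 1.7, 0.0])
R, t = rigid_pose_from_markers(model, observed)
print(view_matrix(R, t))
```

In the real system this computation would run once per camera frame, with the resulting matrix applied to the virtual camera before each render, so that the displayed scene stays locked to the participant's physical head movements.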
Last updated: Friday, 14.10.2016