Looking for Participants

The MPI for Biological Cybernetics is looking for participants for some of their research experiments.

Most recent Publications

Guest JM, Seetharama MM, Wendel ES, Strick PL and Oberlaender M (January 2018) 3D reconstruction and standardization of the rat facial nucleus for precise mapping of vibrissal motor networks. Neuroscience 368: 171-186.
Zaretskaya N, Fischl B, Reuter M, Renvall V and Polimeni JR (January 2018) Advantages of cortical surface reconstruction using submillimeter 7 T MEMPRAGE. NeuroImage 165: 11-26.
Ardhapure AV, Sanghvi YS, Borozdina Y, Kapdi AR and Schulzke C (January 2018) Crystal structure of 8-(4-methylphenyl)-2′-deoxyadenosine hemihydrate. Acta Crystallographica Section E: Crystallographic Communications 74(1): 1-5.
Meilinger T, Garsoffky B and Schwan S (December 2017) A catch-up illusion arising from a distance-dependent perception bias in judging relative movement. Scientific Reports 7(17037): 1-9.
Venrooij J, Mulder M, Mulder M, Abbink DA, van Paassen MM, van der Helm FCT and Bülthoff HH (December 2017) Admittance-adaptive model-based approach to mitigate biodynamic feedthrough. IEEE Transactions on Cybernetics 47(12): 4169-4181.

Large Tracking Hall (TrackingLab)

TrackingLab photo: GEHIRN&GEIST/Manfred Zentsch
The free-space walking and tracking laboratory in the Cyberneum is a large (12.7 m x 11.9 m x 6.9 m) empty space equipped with 26 high-speed motion capture cameras from Vicon. In December 2014 the existing setup of 16 Vicon MX13 cameras (1.2 megapixels) was expanded with 10 Vicon T160 cameras (16 megapixels), doubling the tracking volume for aerial robotics research and improving head-tracking quality for immersive VR research.

The tracking system captures the motions of one or more persons by processing images of configurations of multiple infrared-reflective markers in real time. The calculated position and orientation information can be transmitted wirelessly to a high-end mobile graphics system, which updates the simulated virtual environment according to the person's position and generates a correct egocentric visualization and/or auditory simulation. Participants can navigate freely within the entire area of the tracking hall, either wearing the backpack themselves or having the experimenter wear it and follow them around the walking room.

To suppress interference between the real and simulated environments as far as possible, the laboratory is completely dark (painted black, with the ability to block out all light), and acoustic panels on the walls largely reduce reverberation. The tracking setup also allows for tracking multiple objects, such as flying quadcopters, as well as for full-body motion capture (e.g. for analysis of sports performance such as gymnastics, or for animation of virtual characters).
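The core computation behind marker-based tracking can be sketched briefly. Vicon's actual reconstruction pipeline is proprietary, so the snippet below is only an illustrative assumption: given the known reference layout of a rigid marker configuration and its observed 3D positions, the standard Kabsch (SVD) least-squares fit recovers the body's rotation and translation. The function name `kabsch_pose` is hypothetical.

```python
import numpy as np

def kabsch_pose(markers_ref, markers_obs):
    """Estimate the rotation R and translation t that best map a rigid
    body's reference marker positions onto their observed positions,
    in the least-squares sense (Kabsch algorithm).

    markers_ref, markers_obs: (N, 3) arrays of corresponding points.
    Returns (R, t) such that markers_obs ~= markers_ref @ R.T + t.
    """
    # Center both point sets on their centroids.
    c_ref = markers_ref.mean(axis=0)
    c_obs = markers_obs.mean(axis=0)
    P = markers_ref - c_ref
    Q = markers_obs - c_obs

    # Cross-covariance matrix and its SVD.
    H = P.T @ Q
    U, S, Vt = np.linalg.svd(H)

    # Correction term to guarantee a proper rotation (det R = +1),
    # avoiding reflections.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])

    R = Vt.T @ D @ U.T
    t = c_obs - R @ c_ref
    return R, t
```

In a tracking loop, this fit would run once per camera frame per rigid body (headset, quadcopter, limb segment), yielding the pose stream that drives the egocentric rendering.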
Last updated: Friday, 14.10.2016