
Large Tracking Hall (TrackingLab)

TrackingLab photo: GEHIRN&GEIST/Manfred Zentsch
The free-space walking and tracking laboratory in the Cyberneum is a large (12.7 m x 11.9 m x 6.9 m) empty space equipped with 26 high-speed motion-capture cameras from Vicon. In December 2014 the existing setup of 16 Vicon MX13 cameras (1.2 megapixels) was expanded with 10 Vicon T160 cameras (16 megapixels) to double the tracking volume for aerial-robotics research and to improve head-tracking quality for immersive VR research.

The tracking system captures the motions of one or more persons by processing images of configurations of multiple infrared-reflective markers in real time. The computed position and orientation data can be transmitted wirelessly to a high-end mobile graphics system, which updates the simulated virtual environment according to the person's position and generates a correct egocentric visualization and/or auditory simulation. Participants can move freely within the entire tracking hall, either wearing the backpack themselves or being followed by an experimenter who carries it for them.

To minimize interference between the real and simulated environments, the laboratory is completely dark (painted black, with all outside light blocked out), and acoustic panels along the walls greatly reduce reverberation. The tracking setup can also follow multiple objects, such as flying quadcopters, and supports full-body motion capture (e.g. for analysis of sports performance such as gymnastics, or for animating virtual characters).
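The rendering step described above — turning the tracked head pose into a correct egocentric view — can be illustrated with a minimal sketch. This is not the lab's actual software or the Vicon SDK; it only assumes, as a simplification, that the tracker delivers a head position and a unit quaternion, from which the view matrix is the inverse of the rigid head pose:

```python
def quat_to_rot(w, x, y, z):
    # Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix.
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def view_matrix(position, quaternion):
    """Invert the rigid head pose (R, t) into the 4x4 view matrix
    [R^T | -R^T t], which maps world coordinates into the tracked
    person's egocentric (camera) frame for rendering."""
    R = quat_to_rot(*quaternion)
    tx, ty, tz = position
    # Transpose of R (inverse rotation for a rigid transform).
    view = [[R[c][r] for c in range(3)] for r in range(3)]
    # Translation part: -R^T t.
    t = [-(view[r][0]*tx + view[r][1]*ty + view[r][2]*tz) for r in range(3)]
    return [view[0] + [t[0]],
            view[1] + [t[1]],
            view[2] + [t[2]],
            [0, 0, 0, 1]]
```

In a live system this matrix would be recomputed from each wirelessly received pose sample and handed to the graphics pipeline every frame; marker fitting, filtering, and prediction are omitted here.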
Last updated: Friday, 14.10.2016