Project Leaders

Dr. Isabelle Bülthoff
Phone: +49 7071 601-611
Fax: +49 7071 601-616

RecCat Overview Poster

Five most recent publications

Chuang LL, Gehring S, Kay J and Schmidt A (April-2017): Ambient Notification Environments, Dagstuhl Seminar 17161, Series: Dagstuhl Reports, Leibniz-Zentrum für Informatik, Schloss Dagstuhl, Germany. In press.
Chuang LL (November-5-2015) Invited Lecture: Beyond Steering in Human-Centered Closed-Loop Control, Institute for Neural Computation: INC Chalk Talk Series, San Diego, CA, USA.
Stangl M, Meilinger T, Pape A-A, Schultz J, Bülthoff HH and Wolbers T (October-19-2015): Triggers of entorhinal grid cell and hippocampal place cell remapping in humans, 45th Annual Meeting of the Society for Neuroscience (Neuroscience 2015), Chicago, IL, USA.
Fademrecht L, Bülthoff I, Barraclough NE and de la Rosa S (October-18-2015): The spatial extent of action sensitive perceptual channels decrease with visual eccentricity, 45th Annual Meeting of the Society for Neuroscience (Neuroscience 2015), Chicago, IL, USA.
Scheer M, Bülthoff HH and Chuang LL (October-2015): On the influence of steering on the orienting response. In: Trends in Neuroergonomics, 11. Berliner Werkstatt Mensch-Maschine-Systeme, Universitätsverlag der TU Berlin, Berlin, Germany, 24.


All RecCat publications

A complete list of publications by RecCat members is available on the group website.


Example stimuli used in our group. A: static image from a conversational facial expression (“don’t know”). B: Asian version of a Caucasian face. C: avatar face animated by the smile of a real person. D: 3D printout of a face from our 3D face database. E: artificial symmetric object. F: trajectory of a simulated fly. G: 3D printout of a parametrically controlled shell object. Stimuli B, C, D, E, F and G can be parametrically modified.
Dynamic faces are recorded or created using the MPI VideoLab, as well as our in-house facial animation software developed by the Cognitive Engineering group [Curio].
Active behavior research uses both body-tracking technology (VICON MX) and immersive head-mounted visual displays.
Real volumetric objects for haptic and vision research are produced with 3D printing technology.
Virtual reality setups are used to present faces and objects embedded in scenes and to allow active manipulation of objects.
Furthermore, classical psychophysical techniques, functional brain imaging [Siemens 3T at the Magnetic Resonance Center] and eye-tracking methods are used in the projects of the group.

Last updated: Thursday, 23.10.2014