Project Leader

Dr. Isabelle Bülthoff
Phone: +49 7071 601-611
Fax: +49 7071 601-616
isabelle.buelthoff[at]tuebingen.mpg.de
 
 

RecCat-Poster


Latest Publications

Danyeli L, Alizadeh S, Surova G, Jamalabadi H, Schultz M and Walter M (June 2017): Effects of Neurexan® on brain responses to deviant stimuli during an auditory oddball task, ISAD LONDON 2017: Perspectives on Mood and Anxiety Disorders: Looking to the future, London, UK, Frontiers in Psychiatry, Conference Abstracts: ISAD LONDON 2017.
CiteID: DanyeliASJSW2017
Chuang LL, Gehring S, Kay J and Schmidt A: Ambient Notification Environments, Dagstuhl Seminar 17161, Leibniz-Zentrum für Informatik, Schloss Dagstuhl, Germany (April 2017), Series: Dagstuhl Reports. In press.
CiteID: ChuangGKS2017
Chuang LL (November 5, 2015) Invited Lecture: Beyond Steering in Human-Centered Closed-Loop Control, Institute for Neural Computation: INC Chalk Talk Series, San Diego, CA, USA.
CiteID: Chuang2015_3
Stangl M, Meilinger T, Pape A-A, Schultz J, Bülthoff HH and Wolbers T (October 19, 2015): Triggers of entorhinal grid cell and hippocampal place cell remapping in humans, 45th Annual Meeting of the Society for Neuroscience (Neuroscience 2015), Chicago, IL, USA.
CiteID: StanglMPSBW2015
Fademrecht L, Bülthoff I, Barraclough NE and de la Rosa S (October 18, 2015): The spatial extent of action sensitive perceptual channels decrease with visual eccentricity, 45th Annual Meeting of the Society for Neuroscience (Neuroscience 2015), Chicago, IL, USA.
CiteID: FademrechtBBd2015_2


All RecCat Publications

For all publications by RecCat members, click here

 

Example stimuli used in our group. A: static image from a conversational facial expression (“don’t know”). B: Asian version of a Caucasian face. C: avatar face animated by the smile of a real person. D: 3D printout of a face from our 3D face database. E: artificial symmetric object. F: trajectory of a simulated fly. G: 3D printout of a parametrically controlled shell object. Stimuli B, C, D, E, F and G can be parametrically modified.
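
Parametric modification means that a stimulus is described by a parameter vector and new stimuli are generated by moving along axes in that parameter space, for example to morph between the Caucasian and Asian versions of a face. The following is a minimal sketch of this idea, assuming faces are represented as numeric parameter vectors (e.g. morphable-model coefficients); the vectors, dimensionality and labels below are illustrative stand-ins, not our actual stimulus format.

import numpy as np

# Hypothetical sketch: each face is a parameter vector, and a morph
# between two faces is a weighted average of their parameter vectors.

def morph(face_a: np.ndarray, face_b: np.ndarray, weight: float) -> np.ndarray:
    """Linearly interpolate between two face parameter vectors.

    weight = 0.0 returns face_a, weight = 1.0 returns face_b,
    intermediate weights give graded morphs along the A-B axis.
    """
    return (1.0 - weight) * face_a + weight * face_b

# Example: a 7-step morph continuum between two (random stand-in) faces.
rng = np.random.default_rng(0)
face_a = rng.normal(size=50)   # stand-in parameter vector for face A
face_b = rng.normal(size=50)   # stand-in parameter vector for face B
continuum = [morph(face_a, face_b, w) for w in np.linspace(0.0, 1.0, 7)]

The same interpolation scheme extends to the other parametric stimulus classes (avatar animations, shell objects, fly trajectories), with the parameter vector encoding the relevant shape or motion dimensions.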
Dynamic faces are recorded with the MPI VideoLab or created with our in-house facial animation software developed by the Cognitive Engineering group [Curio].
 
Active behavior research uses both body-tracking technology (VICON MX) and immersive head-mounted visual displays.
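
As a rough illustration of how tracked head poses drive the immersive displays, the sketch below converts a tracked position and yaw/pitch/roll orientation into a view matrix for rendering each frame. It assumes generic Euler-angle tracker output; it is not the VICON MX interface or our actual rendering pipeline.

import numpy as np

def rotation_from_euler(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Build a rotation matrix from yaw (Z), pitch (Y) and roll (X) angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def view_matrix(position: np.ndarray, yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Invert the tracked head pose to obtain the world-to-eye transform."""
    r = rotation_from_euler(yaw, pitch, roll)
    view = np.eye(4)
    view[:3, :3] = r.T                 # inverse rotation
    view[:3, 3] = -r.T @ position      # inverse translation
    return view

# Example: head 1.7 m above the origin, turned 30 degrees to the left.
v = view_matrix(np.array([0.0, 0.0, 1.7]), yaw=np.radians(30), pitch=0.0, roll=0.0)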
 
Real volumetric objects for haptic and vision research are produced with 3D printing technology.
 
Virtual reality setups are used to present faces and objects embedded in scenes and to allow active manipulation of objects.
 
Furthermore, classical psychophysical techniques, functional brain imaging [Siemens 3T at Magnetic Resonance Center] and eye tracking methods are used in the group's projects.
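
As an example of a classical psychophysical technique, the sketch below implements a generic 2-down/1-up adaptive staircase, which converges on roughly 70.7% correct performance. The simulated observer and all parameter values are illustrative assumptions, not taken from any particular study of the group.

import random

def run_staircase(start=1.0, step=0.1, floor=0.05, n_trials=60, threshold=0.3):
    """2-down/1-up staircase with a simulated observer standing in for a participant."""
    level = start
    correct_streak = 0
    history = []
    for _ in range(n_trials):
        # Toy observer: more likely to respond correctly when the stimulus
        # level is above its hidden threshold (chance performance = 0.5).
        p_correct = 0.5 + 0.5 * min(1.0, max(0.0, (level - threshold) / threshold))
        correct = random.random() < p_correct
        history.append((level, correct))
        if correct:
            correct_streak += 1
            if correct_streak == 2:          # two correct in a row -> make it harder
                level = max(floor, level - step)
                correct_streak = 0
        else:                                # one error -> make it easier
            level += step
            correct_streak = 0
    return history

trials = run_staircase()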
 


Last updated: Friday, 13.10.2017