Contact

Katharina Dobs

Address: Spemannstr. 38
72076 Tübingen
E-Mail: katharina.dobs

 

Katharina Dobs

Position: Guest Scientist  Unit: Bülthoff

Research Group: RECCAT

Supervisors: Isabelle Bülthoff and Johannes Schultz

 

I am interested in the visual perception of dynamic faces and its role in identity processing. The faces we encounter every day are typically in motion, so facial motion is assumed to contribute at least to some extent to identity processing. However, many open questions remain to be answered before we can bridge the gap between the well-studied processing of static faces and the less well-understood processing of dynamic faces.

 

During my PhD, I investigated the role of facial motion and its interaction with facial form in human face processing using psychophysics and fMRI.

Investigating identity information in facial motion


Introduction

The faces we encounter every day typically move. Previous studies have shown that facial motion, in addition to facial form, can carry information about the identity of a person [1,2], yet the exact role of facial motion as a cue to identity is still unclear [3].

 

Goals

The overall goal is to understand when and how facial motion contributes to person recognition. We hypothesize that humans’ sensitivity to identity information in facial motion varies with the type of facial movement (e.g., basic emotions or conversational expressions). The results should further advance our understanding of how we perceive faces in real life.

 

Methods

We assessed human observers’ sensitivity to identity information in different types of facial movements. To separate form from motion cues, we used a recent facial motion capture and animation system [4,5] and animated a single avatar head with facial movements recorded from four different actors. The facial movements occurred in three social contexts: (1) emotional (e.g., anger), (2) emotional in a social interaction (e.g., being angry at someone) and (3) social interaction (e.g., saying goodbye to someone). Using a delayed matching-to-sample task (see Fig. 1), we tested in which of these contexts human observers can discriminate between unfamiliar persons based only on their facial motion.

 

Fig. 1: The trial procedure of the experiment, shown here for the emotional context. Observers first watched an animation of a facial expression (Sample; e.g., angry), followed by two animations displaying a different facial movement (Matching stimuli; e.g., happy). Observers were asked to choose which of the matching stimuli was performed by the same actor as the sample.
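
As a rough illustration of the trial structure in Fig. 1, the sketch below builds a single delayed matching-to-sample trial in Python. The actor labels, movement names, and the number of movements per context are placeholders for illustration only, not the actual stimulus set used in the experiment.

import random

# Placeholder actor labels and example movements per social context
# (illustrative only, not the recorded stimulus set).
ACTORS = ["actor1", "actor2", "actor3", "actor4"]
MOVEMENTS = {
    "emotional": ["angry", "happy"],
    "emotional-social": ["being angry at someone", "being happy for someone"],
    "social": ["saying goodbye", "greeting someone"],
}

def make_trial(context, rng=random):
    """Build one delayed matching-to-sample trial for a given context:
    a sample movement from a target actor, followed by two matching
    stimuli (a different movement) from the target and a distractor."""
    target, distractor = rng.sample(ACTORS, 2)
    sample_movement, match_movement = rng.sample(MOVEMENTS[context], 2)
    matching = [(target, match_movement), (distractor, match_movement)]
    rng.shuffle(matching)
    correct = next(i for i, (actor, _) in enumerate(matching) if actor == target)
    return {"sample": (target, sample_movement),
            "matching": matching,
            "correct": correct}

print(make_trial("emotional"))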

 

Results

Observers were able to discriminate identities based on emotional facial movements occurring in a social interaction (Fig. 2, middle), but not based on basic emotional facial expressions (Fig. 2, left). Sensitivity was highest for non-emotional, speech-related movements occurring in a social interaction (Fig. 2, right).

 

Fig. 2: Behavioral results. Mean sensitivity (d’) across observers (n = 14) as a function of context. A sensitivity of 0 indicates chance level. Error bars indicate 95% confidence interval (CI).
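
For readers unfamiliar with the sensitivity measure in Fig. 2, below is a minimal sketch of one common way to derive d' from proportion correct in a two-alternative matching task, assuming the standard 2AFC relation d' = sqrt(2) * Phi^-1(PC). The trial counts, the correction for extreme proportions, and the normal-approximation confidence interval are illustrative assumptions, not the analysis reported above.

import numpy as np
from scipy.stats import norm

def dprime_2afc(n_correct, n_trials):
    """Convert proportion correct in a two-alternative matching task to d',
    using the standard 2AFC relation d' = sqrt(2) * Phi^-1(PC).
    A log-linear correction keeps PC away from 0 and 1."""
    pc = (n_correct + 0.5) / (n_trials + 1.0)
    return np.sqrt(2) * norm.ppf(pc)

# Hypothetical per-observer counts (illustrative only, not the study's data)
correct = np.array([30, 34, 28, 33, 31])
trials = np.full_like(correct, 48)

d = np.array([dprime_2afc(c, n) for c, n in zip(correct, trials)])
ci = 1.96 * d.std(ddof=1) / np.sqrt(len(d))   # normal-approximation 95% CI
print(f"mean d' = {d.mean():.2f} +/- {ci:.2f} (95% CI); 0 = chance")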

 

Conclusion

Our findings reveal that human observers can recognize unfamiliar persons from conversational and speech-related movements but not from the way they perform basic emotional facial expressions. We hypothesize that these differences are due to how these movements are executed: basic emotions are performed quite stereotypically, whereas conversational and speech-related movements are performed more idiosyncratically.

 

References

[1] Hill H and Johnston A (2001). Categorizing sex and identity from the biological motion of faces. Current Biology 11 880-885.

[2] Knappmeyer B, Thornton IM and Bülthoff HH (2003). The use of facial motion and facial form during the processing of identity. Vision Research 43 1921-1936.

[3] O’Toole AJ, Roark DA and Abdi H (2002). Recognizing moving faces: A psychological and neural synthesis. Trends in Cognitive Sciences 6 261-266.

[4] Curio C, Breidt M, Kleiner M, Vuong QC, Giese MA and Bülthoff HH (2006). Semantic 3D motion retargeting for facial animation. 3rd Symposium on Applied Perception in Graphics and Visualization (APGV '06), ACM Press, New York, NY, USA, 77-84.

[5] Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz J (2014). Quantifying human sensitivity to spatio-temporal information in dynamic faces. Vision Research 100 78-87.

Current Position

since 2015: Postdoctoral Researcher at the Brain and Cognition Research Center (CerCo), CNRS, Toulouse, France. Advisor: Leila Reddy

 

Education

2010 - 2014: Ph.D. Candidate at Max Planck Institute for Biological Cybernetics, Tübingen, Germany (Dept. Human Perception, Cognition and Action). Advisors: Isabelle Bülthoff, Johannes Schultz

2002 - 2008: Diploma Psychology, Philipps-University Marburg, Germany. Advisors: Frank Rösler, Kerstin Jost

2002 - 2007: Diploma Computer Science, Philipps-University Marburg, Germany. Advisors: Manfred Sommer, David Kämpf

 

Research and Teaching Experience

2014 - 2015: Postdoctoral researcher / guest scientist at the Max Planck Institute for Biological Cybernetics. Advisor: Isabelle Bülthoff

2013: JSPS fellow at Gardner Research Team, RIKEN BSI, Japan, conducting an fMRI study on attentional modulation of facial motion and form processing. Advisor: Justin Gardner

2011: Supervised Kathryn Bonnen (Michigan State University) during her internship project "Physical and perceptual analysis of the 3D face database"

2004 – 2008: Student Research Assistant at the Cognitive Psychophysiology Lab, Philipps-University of Marburg, Germany. Advisor: Frank Rösler

2006: Visiting Research Assistant at the Laboratory of Systems Neurodynamics, University of Virginia, USA. Advisor: William B Levy

 

Fellowships, Grants and Awards

2015 - 2017: Postdoctoral Fellowship of the German Research Foundation (DFG)

2015: Best Dissertation Award 2015 from the Max Planck Institute for Biological Cybernetics and Förderverein für neurowissenschaftliche Forschung e.V.

2015: Travel award from the German Academic Exchange Service

2014: Invited speaker at the symposium "The perception of faces" in the English Lake District, UK, funded by the Rank Prize Funds.

2013: JSPS (Japan Society for the Promotion of Science) Research Fellowship

2012: VSS Student Travel Award winner

 

Work Experience

2009 - 2010: IT Consultant / Software Engineer at PRODYNA AG, Frankfurt, Germany

2008 - 2009: Freelancer / Software Engineer in London, UK


Books (1):

Dobs K: Behavioral and Neural Mechanisms Underlying Dynamic Face Perception, 108, Logos Verlag, Berlin, Germany, (2015). ISBN: 978-3-8325-3910-8, Series: MPI Series in Biological Cybernetics; 40

Articles (3):

Dobs K, Schultz J, Bülthoff I and Gardner JL (July-2017) Task-dependent enhancement of facial expression and identity representations in human cortex, NeuroImage, submitted.
Dobs K, Bülthoff I and Schultz J (September-2016) Identity information content depends on the type of facial movement Scientific Reports 6(34301) 1-9.
Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz J (July-2014) Quantifying human sensitivity to spatio-temporal information in dynamic faces Vision Research 100 78–87.

Posters (6):

Dobs K, Bülthoff I and Reddy L (May-16-2016): Optimal integration of facial form and motion during face recognition, 16th Annual Meeting of the Vision Sciences Society (VSS 2016), St. Pete Beach, FL, USA, Journal of Vision, 16(12) 925.
Dobs K, Schultz J, Bülthoff I and Gardner JL (September-2015): Independent control of cortical representations for expression and identity of dynamic faces, 15th Annual Meeting of the Vision Sciences Society (VSS 2015), St. Pete Beach, FL, USA, Journal of Vision, 15(12) 684.
Dobs K, Schultz J, Bülthoff I and Gardner JL (November-10-2013): Attending to expression or identity of dynamic faces engages different cortical areas, 43rd Annual Meeting of the Society for Neuroscience (Neuroscience 2013), San Diego, CA, USA.
Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz JW (August-2013): Quantifying Human Sensitivity to Spatio-Temporal Information in Dynamic Faces, 36th European Conference on Visual Perception (ECVP 2013), Bremen, Germany, Perception, 42(ECVP Abstract Supplement) 197.
Dobs K, Bülthoff I, Curio C and Schultz J (August-2012): Investigating factors influencing the perception of identity from facial motion, 12th Annual Meeting of the Vision Sciences Society (VSS 2012), Naples, FL, USA, Journal of Vision, 12(9) 35.
Dobs K, Kleiner M, Bülthoff I, Schultz J and Curio C (September-2011): Investigating idiosyncratic facial dynamics with motion retargeting, 34th European Conference on Visual Perception, Toulouse, France, Perception, 40(ECVP Abstract Supplement) 115.

Theses (1):

Dobs K: Behavioral and Neural Mechanisms Underlying Dynamic Face Perception, Eberhard-Karls-Universität Tübingen, (December-2014). PhD thesis

Talks (2):

Schultz J, Kaulard K, Pilz P, Dobs K, Bülthoff I, Fernandez-Cruz A, Brockhaus B, Gardner J and Bülthoff HH (March-27-2017) Abstract Talk: Neural processing of facial motion cues about identity and expression, 59th Conference of Experimental Psychologists (TeaP 2017), Dresden, Germany 32-33.
Dobs K and Reddy L (August-29-2016) Abstract Talk: Dynamic reweighting of facial form and motion cues during face recognition, 39th European Conference on Visual Perception (ECVP 2016), Barcelona, Spain, Perception, 45(ECVP Abstract Supplement) 87-88.
