Contact

Dr. Isabelle Bülthoff

Address: Spemannstr. 38
72076 Tübingen
Room number: 001.1
Phone: +49 7071 601 611
Fax: +49 7071 601 616
E-Mail: isabelle.buelthoff

 


Isabelle Bülthoff

Position: Project Leader
Unit: Bülthoff

I lead the Recognition and Categorization group in the Department of Human Perception, Cognition and Action.

 

My research concerns human face recognition. To that end, I primarily use psychophysical methods, eye tracking, immersive virtual environments, and face images derived from our face database.

 

Faces are the most fascinating objects for human beings. We never tire of looking at faces, a fact heavily exploited by advertising companies. Over the course of childhood, we develop a remarkable expertise at deciphering the most subtle aspects of a face: not only do we recognize identity or sex, but we also notice, for example, signs of tiredness, sadness or age.

 

Main projects:

While we are experts in face recognition in general, we process and retrieve information about familiar and unfamiliar faces differently. We preferentially use the inner features to recognize familiar faces, while for unfamiliar faces we pay more attention to, and are more likely to remember, extra-facial information such as hairstyle, glasses or beards. In one project, I investigate the recognition of personally familiar faces, as they are the faces that we remember best. With these faces, we can test how precisely facial information related to sex, race or identity is memorized. I concentrate on those aspects of faces, as we use those attributes most often to describe or classify faces. Our results give insight into how very familiar faces are represented in memory. They reveal that facial information regarding sex and race is represented only very coarsely in memory, while information linked to identity is encoded very precisely.

 

Another line of study (in collaboration with the Space and Body Perception group) uses the advantages of virtual reality to investigate face recognition under more natural conditions. Most studies so far have tested isolated, static faces. In our project, observers physically moved through a virtual room to look at the faces of life-size avatars. We compared the recognition performance of this active group to that of other groups trained under different learning conditions. Overall, the active group performed better than the other groups.

 

In collaboration with Mintao Zhao and other colleagues, additional projects investigate, among other topics, holistic processing of faces, the other-race effect, and the influence of voices on face recognition.

 

Projects currently carried out in collaboration with PhD students of the Recognition and Categorization group include:

Other projects:

  • What gives a face its ethnicity? We can quickly and easily judge faces in terms of their ethnicity. In a series of studies, we investigate whether particular parts of the face (eyes, mouth, …) have more influence than others on the perceived ethnicity of that face. This work is done in collaboration with Korea University (BioCyb Lab in the Department of Brain and Cognitive Engineering) and involves participants of different cultural backgrounds and with different levels of expertise with faces of various ethnicities.
  • Influence of body size on face recognition. The concept of “Embodied Cognition” implies that our own bodies, the way we act with our bodies, and the way our bodies “fit” into the environment should all have important implications for our mental representation of the world. The question thus arises whether we represent and/or process faces differently depending on our body size. This work is done in collaboration with Ian Thornton (University of Malta, Malta) and Betty Tesch (Mohler).

Current project in more detail:

Personally familiar faces: Higher precision of memory for idiosyncratic than for sex or race facial information

Introduction

We process and retrieve information about familiar and unfamiliar faces differently. We preferentially use the inner features to recognize familiar faces, while for unfamiliar faces we are more likely to remember extra-facial information such as hairstyle, glasses or beards [1]. Testing memory of very familiar faces allows us to test how precisely different types of facial information are memorized.

Goals

We investigate whether facial information related to sex, race or identity is remembered more precisely than the others. We concentrate on those aspects, as they represent attributes that we use most commonly to describe or classify faces. The results will give insight into how very familiar faces are represented.

Methods

The faces of members of the department were used as personally familiar test faces, and the members of the department were our participants. The veridical faces were manipulated [2] to increasing degrees in four different ways: they were (1) morphed with other identities, (2) caricatured and anti-caricatured, (3) made more feminine looking and more masculine looking, and (4) made more Caucasian looking and more Asian looking. In each test trial, a veridical face was shown together with its distracters (the faces obtained with one of the four manipulations), and participants had to find the veridical face among the distracters.
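
For illustration, a minimal Python sketch of how such a forced-choice trial could be assembled and scored is given below; the manipulation names, strength levels and face labels are assumptions made for the example, not the stimuli or code actually used in the study.

    import random

    # Hypothetical manipulation types and strength levels; these names and values
    # are assumptions for illustration, not the parameters used in the study.
    MANIPULATIONS = {
        "identity_morph": [0.2, 0.4, 0.6],   # morphed towards another identity
        "caricature":     [-0.5, 0.5],       # anti-caricatured / caricatured
        "sex_morph":      [-0.5, 0.5],       # more masculine / more feminine looking
        "race_morph":     [-0.5, 0.5],       # more Caucasian / more Asian looking
    }

    def make_trial(veridical_face, manipulation):
        """Build one trial: the veridical face shown among distracters derived
        from it with one of the four manipulations."""
        distracters = [f"{veridical_face}_{manipulation}_{level:+.1f}"
                       for level in MANIPULATIONS[manipulation]]
        choices = distracters + [veridical_face]
        random.shuffle(choices)              # randomize the position of the target
        return {"target": veridical_face, "choices": choices}

    def is_veridical_choice(trial, chosen_face):
        """A response counts as a veridical-face choice when the target was picked."""
        return chosen_face == trial["target"]

    trial = make_trial("face_017", "identity_morph")
    print(trial["choices"], is_veridical_choice(trial, trial["choices"][0]))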

Initial results

Figure 1 shows that participants chose the veridical face most frequently when the distracters were identity morphs, while this was not the case for the other manipulations.

Initial conclusion

Our results reveal that, for personally familiar faces, facial information regarding sex and race is represented only very coarsely in memory, while information linked to identity is encoded very precisely.

Figure 1

Left: Mean choice frequency for veridical faces and their identity morphs. Right: Mean choice frequency for veridical faces and their race morphs. Stars denote values differing significantly from chance level.
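
As a worked example of the chance-level comparison mentioned in the caption: with one veridical face shown among three distracters, chance level is 1/4, and the observed choice frequency can be compared against it, for instance with a binomial test. The numbers below, and the choice of test, are illustrative assumptions; the actual analysis is not specified here.

    from scipy.stats import binomtest

    # Illustrative numbers only, not the study's data: with one veridical face
    # shown among three distracters, chance level for choosing it is 1/4.
    n_trials = 40            # hypothetical number of trials in one condition
    n_veridical_chosen = 25  # hypothetical number of veridical-face choices
    chance_level = 1 / 4

    result = binomtest(n_veridical_chosen, n_trials, chance_level, alternative="greater")
    print(f"choice frequency = {n_veridical_chosen / n_trials:.2f}, "
          f"p = {result.pvalue:.4f} (chance = {chance_level})")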

References

1. Johnston, R. A., & Edmonds, A. J. (2009). Familiar and unfamiliar face recognition: A review. Memory, 17(5), 577–596.

2. Blanz, V., & Vetter, T. (1999). A morphable model for the synthesis of 3D faces. In Proceedings of the 26th annual conference on Computer graphics and interactive techniques - SIGGRAPH ’99 (pp. 187–194). New York, New York, USA: ACM Press.

Education

 

1979 Licence ès Sciences naturelles (equivalent to an MA in Natural Sciences in the US), University of Lausanne, Switzerland.

 

1983 Ph.D. in Zoology, University of Lausanne, Switzerland. Doctoral dissertation carried out at the Max Planck Institute for Biological Cybernetics, Tübingen, Germany.

 

 

Academic and Research Experience

 

1977-1978            Teaching assistant in Zoology, University of Lausanne, Switzerland

 

1979-1983            Doctoral work. Doctoral dissertation: “Visual mutants of Drosophila melanogaster: functional neuroanatomical mapping of nervous activity by the 3H-deoxyglucose method”. Max Planck Institute for Biological Cybernetics, Tübingen, Germany

 

1983-1985            Postdoctoral fellow, Max-Planck-Institut für biologische Kybernetik, Tübingen, Germany, funded by the Swiss Research Foundation

 

1986-1991            Child rearing period (2 children)

 

1991-1993            Research assistant, Neuroscience Department (Prof. Barry Connors), Brown University, RI, USA

 

Since 09/1993       Researcher at the Max-Planck-Institut für biologische Kybernetik, Tübingen, Germany

 

Since 01/2009       Project leader at the Max-Planck-Institut für biologische Kybernetik, Tübingen, Germany

 

 

Major Research Interests

 

Investigating the mechanisms underlying face recognition. At present, my focus is on the following themes:

  • The interplay between gender and identity information in face recognition
  • The impact of voice distinctiveness on face recognition
  • The influence of context and task on face recognition
  • Cross-cultural differences in face and object recognition
  • The role of idiosyncratic viewing history in face recognition


Talks (28):

Bülthoff I (June-2010) Abstract Talk: Die Wechselwirkung von Identität und Geschlecht bei der Gesichtswahrnehmung [The interplay of identity and gender in face perception], 36. Tagung Psychologie und Gehirn (PuG 2010), Greifswald, Germany 16.
Armann R and Bülthoff I (April-2009) Abstract Talk: Categorical perception of male and female faces depends on familiarity, 2009 Australian Psychology Conferences: 36th Australasian Experimental Psychology Conference, Wollongong, Australia 3.
Armann R and Bülthoff I (October-2008) Abstract Talk: Categorical Perception of Male and Female Faces and the Single-Route Hypothesis, 9th Conference of the Junior Neuroscientists of Tübingen (NeNa 2008), Ellwangen, Germany 13.
Gaissert N, Wallraven C and Bülthoff I (August-2008) Abstract Talk: Visual and haptic perceptual representations of complex 3-D objects, 31st European Conference on Visual Perception, Utrecht, Netherlands, Perception, 37(ECVP Abstract Supplement) 125.
Bülthoff I (January-2006): Investigating face recognition with voices and face morphs, Face Mini-Symposium: Georg-August-Universität Göttingen, Zentrum für Neurobiologie des Verhaltens, Göttingen, Germany.
Bülthoff I and Newell F (September-2005) Abstract Talk: Accuracy in face recognition: Better performance for face identification with changes in identity and caricature but not with changes in sex, Fifth Annual Meeting of the Vision Sciences Society (VSS 2005), Sarasota, FL, USA, Journal of Vision, 5(8) 379.
Bülthoff I (August-2005): Shape perception for object recognition and face categorization, 28th European Conference on Visual Perception, A Coruña, Spain, Perception, 34(ECVP Abstract Supplement) 21.
Bülthoff I and Newell FN (August-2004) Abstract Talk: Distinctive auditory information improves visual face recognition, Fourth Annual Meeting of the Vision Sciences Society (VSS 2004), Sarasota, FL, USA, Journal of Vision, 4(8) 139.
Bülthoff HH, Edelman S and Bülthoff I (September-1996) Abstract Talk: Features of the representation space for 3D objects, 19th European Conference of Visual Perception, Strasbourg, France, Perception, 25(ECVP Abstract Supplement) 49-50.
Last updated: Monday, 22.05.2017