Social Cognition

Humans often have little difficulty inferring the actions, emotions, and cognitive states of another person. Visual information is critical for these inferences. Our work aims to elucidate the relatively unknown cognitive processes that support these social inferences from visual information. Moreover, we are interested in how this information facilitates social interaction. For this reason, we seek to understand social-cognitive processes under more natural social interaction conditions.

Previous action recognition research has focused on explaining how the visual system might use local visual features to derive a global action percept. However, an important aspect of action understanding has received little attention: how an action percept is filled with semantic meaning. This process of mapping visual information onto semantic meaning is important for everyday social functioning because it allows the observer to interpret the same visual pattern in different ways. For example, seeing someone laughing after another person has told a joke has the meaning of 'laughing with someone'; in contrast, seeing someone laugh after another person has fallen has the meaning of 'laughing about someone'. The cognitive structure that allows this flexible recognition of actions is largely unknown. We examine this cognitive structure using computational models of actions that allow action morphing (illustrated below).

Influential theories suggest that 'perception for action' and 'perception for recognition' might rely on different cognitive-perceptual mechanisms, and that interactive experimental scenarios ('second-person perspective') are essential for understanding the social-cognitive processes of social interactions. These theoretical accounts stand in contrast to many studies in which participants are passive observers of actions rather than active agents. To better understand action recognition in real-life situations, we examine action recognition in interactive scenarios. By means of virtual reality we create virtual interaction partners (avatars) with which participants interact. This approach affords high experimental control over the stimulus while at the same time enabling participants to interact as naturally as possible with another person.
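To make the idea of action morphing mentioned above more concrete, the minimal Python sketch below blends two time-aligned pose trajectories with a single weight parameter. This is an illustrative assumption rather than the lab's actual model: the array shapes and the morph_actions helper are hypothetical, and real action morphing typically also requires temporal and spatial alignment of the motion-capture data.

```python
import numpy as np

def morph_actions(action_a, action_b, weight):
    """Linearly blend two time-aligned pose sequences.

    action_a, action_b : arrays of shape (n_frames, n_joints)
        Joint-angle trajectories of two recorded actions.
    weight : float in [0, 1]
        0 returns action_a, 1 returns action_b; intermediate values
        produce graded "morphs" along the action continuum.
    """
    assert action_a.shape == action_b.shape, "sequences must be time-aligned"
    return (1.0 - weight) * action_a + weight * action_b

# Toy example: two 100-frame actions described by 20 joint angles each.
rng = np.random.default_rng(0)
waving = rng.standard_normal((100, 20))
punching = rng.standard_normal((100, 20))

# A stimulus halfway between the two prototype actions.
halfway = morph_actions(waving, punching, 0.5)
print(halfway.shape)  # (100, 20)
```

Varying the weight in small steps can yield a continuum of intermediate actions that serve as graded stimuli in recognition experiments.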


Face recognition

Humans are social beings, and recognizing socially relevant information from other humans is critical for social functioning, e.g. when judging the emotional state of another person during a discussion. Facial expressions provide powerful non-verbal cues about the cognitive and emotional state of another person. Although humans have seemingly little difficulty using these visual cues, the underlying psychological processes are complex and far from fully understood. In this project we examine social information processing of faces.

One key feature of facial expressions is that they are inherently dynamic. Surprisingly, much of the previous research has examined the visual recognition of static facial expressions. In this project we focus on dynamic facial expressions and on how visual and motor cues contribute to their recognition.

People
Kathrin Kaulard, Stephan de la Rosa

Collaborators
Cristobal Curio, Martin Giese, Johannes Schulz, Christian Wallraven

Publications

  • de la Rosa, S., Giese, M., Bülthoff, H. H., & Curio, C. (2013). The contribution of different cues of facial movement to the emotional facial expression adaptation aftereffect. Journal of Vision, 13(1), 23. doi:10.1167/13.1.23


Action recognition

Actions provide another source of information about the emotional and cognitive state of another person. In this project we examine how observers are able to understand an action based on its visual information. Several variables are deemed important for the recognition of an action, namely dynamic action information, the observer's motor system, and the social context. We examine these and other factors using behavioral paradigms and fMRI.

People
Stephan de la Rosa, Dong-Seon Chang

Collaborators
Cristobal Curio, Nick Barraclough, Martin Giese, Johannes Schulz

Publications

  • de la Rosa, S., & Bülthoff, H. H. (in press). Motor-visual neurons and action recognition in social interactions. Commentary on "Mirror neurons: From origin to function". Behavioral and Brain Sciences.


Cognition in social interactions

Humans are social beings, and physically interacting with other people (social interactions, e.g. when shaking hands) is part of everyone's daily routine. Surprisingly little behavioral research has examined how humans use visual information to gain knowledge about the social actions of other people. In this project we aim to further our knowledge in this relatively novel field. Among other approaches, we use virtual reality to examine the processes involved in the visual recognition of social interactions and the processes at work when people actively engage in a social interaction.

People
Stephan de la Rosa, Stephan Streuber, Sarah Mieskes

Collaborators
Günther Knoblich, Nathalie Sebanz, Betty Mohler, Shimon Ullman, Liav Asif, Hong Yu Wong, Cristobal Curio

Publications

  • de la Rosa, S., Mieskes, S., Bülthoff, H. H., & Curio, C. (2013). View dependencies in the visual recognition of social interactions. Frontiers in Psychology, 4. doi:10.3389/fpsyg.2013.00752
  • Streuber, S., Knoblich, G., Sebanz, N., Bülthoff, H. H., & de la Rosa, S. (2011). The effect of social context on the use of visual information. Experimental Brain Research, 214(2), 273–84. doi:10.1007/s00221-011-2830-9
  • Streuber, S., Mohler, B. J., Bülthoff, H. H., & de la Rosa, S. (2012). The influence of visual information on the motor control of table tennis strokes. Presence, 21, 281–294.
  • de la Rosa, S., Choudhery, R. N., Bülthoff, H. H., Asif, L., Ullman, S., & Curio, C. (under review). Visual recognition of social interactions.