Human Perception, Cognition & Action

Prof. Dr. Heinrich Bülthoff

The department is concerned with the fundamental processes of human perception. The primary focus is how the information from different sense organs is integrated to create a consistent representation of the “world in our heads”.

Traditional psychophysical methods are combined with state-of-the-art computer graphics and Virtual Reality systems to understand the “algorithms of perception”. Psychophysics provides a mathematical description of the relationship between physical stimuli and the perceptions they trigger in humans. Using computer simulations and realistic virtual environments to carry out psychophysical experiments allows a high degree of dynamic feedback and interactivity. At the same time, this approach gives complete control over every aspect of the simulation, so that accurate conclusions about human perception can be drawn.
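As an illustration of the psychophysical approach described above, the following is a minimal sketch of an adaptive staircase procedure, a standard method for estimating a perceptual threshold. All parameter values and the simulated observer are hypothetical stand-ins for a real experiment:

```python
import math
import random

def simulated_observer(intensity, threshold=0.5, slope=10.0):
    """Hypothetical observer: probability of a correct response rises with
    stimulus intensity following a logistic psychometric function."""
    p = 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))
    # Floor at chance level (0.5) for a two-alternative forced-choice task.
    return 0.5 + 0.5 * p

def staircase(trials=200, start=1.0, step=0.05, seed=0):
    """1-up/2-down staircase: two correct responses in a row make the
    stimulus harder, one error makes it easier. The procedure converges
    near the 70.7%-correct point of the psychometric function."""
    rng = random.Random(seed)
    intensity = start
    correct_streak = 0
    reversals = []          # intensities where the staircase changed direction
    last_direction = 0
    for _ in range(trials):
        correct = rng.random() < simulated_observer(intensity)
        if correct:
            correct_streak += 1
            if correct_streak < 2:
                continue
            correct_streak = 0
            direction = -1  # two correct in a row -> decrease intensity
        else:
            correct_streak = 0
            direction = +1  # one error -> increase intensity
        if last_direction and direction != last_direction:
            reversals.append(intensity)
        last_direction = direction
        intensity = max(0.0, intensity + direction * step)
    # Average the last few reversal points as the threshold estimate.
    return sum(reversals[-6:]) / len(reversals[-6:])
```

In a real experiment the simulated observer would be replaced by a participant's responses to rendered stimuli; the same adaptive logic applies.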

The research focuses on the integration of information from the visual, haptic, and vestibular (balance) senses, and on the development of efficient algorithms for assistance systems that help an aging society cope with the age-related decline in perceptual and cognitive capabilities.


Former Groups within the Department

We can easily recognize and categorize objects at different levels depending on task requirements. An animal can be recognized as belonging to a category such as “a dog” (categorization) or as “my dog Bashi” (identification). Among all categories of objects, faces constitute a very special class because of their social importance and their high intra-group similarity. The RECCAT group therefore mainly focuses on the perception of faces, with an additional interest in the perception of human bodies and other objects.  
In the CAPA group, we investigate human manual control behavior in order to increase our understanding of how humans use information perceived from their environment to generate control actions. This knowledge can be used to better support humans when performing control tasks, such as steering a vehicle.  
The subjective experience of locomotion, i.e. the displacement of a human observer through the environment, is what we call self-motion. To fully comprehend this pervasive experience, we take a two-fold approach: we carry out fundamental research on the human perception of self-motion, and we develop state-of-the-art motion-simulation technologies and algorithms. Ultimately, these two research directions build upon each other.  
We investigate how humans process information relevant to the effective control of machine systems. Machines extend our physical capacity to sense and interact with our environments. For example, collision avoidance systems in an aircraft make the pilot aware of fast-moving traffic before it even comes into visual range. The pilot, in turn, selectively relies on the information provided by the system to determine and execute the appropriate combination of actions.  
Interactions with the social and spatial environment are fundamental components of daily life. Previous research has mainly examined the two domains separately. However, social and spatial processes interact: joint actions happen within space, and social factors such as personal distance are expressed spatially. We examine social and spatial cognition and the relation between the two.  
We aim to study novel algorithms for autonomous machines (robots) that can sense the environment, reason about it, and take actions to perform tasks in cooperation with humans. The driving vision behind our research is to enable robots to aid humans in the future, for example by reducing physical effort and risk in industrial environments, supporting prevention of and response to emergencies such as natural disasters, and in general simplifying tasks of everyday life.  
The Cognitive Engineering group develops applications based on computer vision, machine learning, and computer graphics in combination with methods that model human cognitive processes. Highly controllable yet realistic settings, real-world sensor data, and simulations offer the opportunity for advanced experiments and, at the same time, a framework for designing and optimizing human-machine interfaces. Our approach ultimately opens a new window onto modern industries such as entertainment computing (games), communication research (information transfer, multimedia), medical systems, and personal assistance systems (automotive safety).  
In the Perception and Action in Virtual Environments research group, our aim was to investigate human behavior, perception, and cognition using ecologically valid and immersive virtual environments. Virtual reality (VR) equipment enabled our scientists to present sensory stimuli in a controlled virtual world and to manipulate or alter sensory input in ways that would not be possible in the real world. More specifically, VR technology allowed us to manipulate the visual body, the contents of the virtual world, and the sensory stimuli (visual, vestibular, kinesthetic, tactile, and auditory) while a participant performed or viewed an action.  