Although the hands are the most important tool for humans to interact with and explore objects in their environment, surprisingly little is known about haptic object perception. How does the brain process the haptic input about the shape of touched objects? Is this done in a similar fashion to the processing of the visual input? Are objects stored as multisensory or as separate representations? It is these fundamental questions that motivate my line of research.
Recent developments in computer graphics modeling and 3D printing make it possible to generate parametrized models of complex objects with full control over object properties such as shape and texture. These objects can be explored visually and/or haptically, which enables us to study which object features are most important for haptic perception in comparison to visual perception.
Under the supervision of Christian Wallraven I used computer-generated 3D shapes to answer two main questions:
- The human brain is capable of forming a perceptual space that is highly congruent to the underlying physical object space when exploring objects visually. Is the same also true when objects are explored haptically?
- Moreover, previous work on the visual modality has shown that a veridical perceptual space can correctly predict categorization behavior; that is, similarity between objects grouped into one category is perceived as higher than between objects grouped into different categories. Does this theory also hold for the haptic modality?
Next, we wanted to know whether these findings transfer to natural objects, and therefore asked: does shape similarity also account for visual and haptic categorization of natural objects?
Taken together, the results presented in my thesis point to a close connection of visual and haptic object representations with clear evidence that similar processes underlie visual and haptic similarity perception and categorization.
Based on the findings reported in my thesis, I became interested in the question of whether the human brain stores separate unimodal shape representations or forms one representation that combines visual and haptic input. To answer this question, a new set of stimuli was designed. Together with Steffen Waterkamp, I am currently conducting learning experiments and testing for cross-modal transfer to learn more about the visual and haptic shape representations of complex objects.
Further, a project is under way to automate haptic experiments. To this end, Maria Liebsch is programming a robot that can automatically present 3D objects to participants. Most importantly, this robot will make it possible to record haptic exploration patterns and thus to learn more about how humans touch and explore complex objects.
Analyzing Multisensory Perceptual Representations of Complex Shapes
Although the hands are the most important tool for humans to interact with objects in their environment, surprisingly little is known about haptic object perception. Our particular interest in this context lies in examining multisensory perceptual spaces of parametrically defined objects, i.e., topological representations of various object properties. By combining computer graphics modeling and 3D printing techniques, we can generate complex volumetric shapes spanning a multidimensional morph space and study their visual and haptic representations in order to gain insight into the topology and the properties of multisensory object representations.
By visualizing perceptual spaces using multidimensional scaling (MDS), we were able to show that visual and haptic object exploration lead to highly congruent perceptual spaces, indicating that visual and haptic shape information is integrated to form one multisensory perceptual space. We tested this hypothesis further by conducting learning experiments and testing for cross-modal transfer; cross-modal transfer would indicate that one shared representation is formed.
Combining the mathematical seashell model of Fowler, Meinhardt, and Prusinkiewicz [2] with the software ShellyLib, a three-dimensional object space of parametrically defined shells was generated (see Figure 1a). By analyzing participants' similarity ratings using MDS, we first visualized the visual and haptic perceptual spaces and compared them. The high congruency between the two spaces indicates that one shared representation is formed. Next, we generated a new set of stimuli, which were printed at Korea University (Figure 1b). Using these stimuli, we conducted cross-modal learning experiments and analyzed the data using signal detection theory.
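The MDS step above recovers a spatial configuration of objects from pairwise similarity ratings. As a rough illustration only (the studies themselves may have used a non-metric MDS variant), the following sketch implements classical (metric) MDS in NumPy: dissimilarities are double-centered and eigendecomposed, yielding coordinates whose distances approximate the ratings. All data here are synthetic toy values, not experimental results.

```python
import numpy as np

def classical_mds(D, k=3):
    """Embed a symmetric dissimilarity matrix D into k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered "Gram" matrix
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # take the k largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Toy example: 21 hypothetical objects with known 3D coordinates,
# mirroring the 21 printed shell objects only in number.
rng = np.random.default_rng(0)
X = rng.standard_normal((21, 3))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
coords = classical_mds(D, k=3)

# The recovered configuration reproduces the pairwise dissimilarities
# (up to rotation/reflection of the embedding).
D_hat = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
print(np.allclose(D, D_hat, atol=1e-6))
```

In practice, perceptual dissimilarities are noisy rather than exactly Euclidean, so the top eigenvalues capture only part of the variance and the embedding dimensionality is chosen by inspecting the stress or eigenvalue spectrum.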
We showed that visual and haptic shape exploration lead to the formation of highly congruent perceptual spaces, indicating that visual and haptic shape information is integrated to form a single perceptual space. Further, we found that visual learning of shape categories strongly influences haptic shape perception and vice versa. Thus, our experiments show that learning strongly transfers across the senses.
For low-level stimuli, for complex but parametrically defined stimuli, and for natural objects, visual and haptic perceptual spaces show high congruency. Further, visual learning of shape categories strongly influences haptic shape perception and vice versa. This strong cross-modal transfer supports the theory of a single perceptual space integrating visual and haptic shape information. Furthermore, our results [1, 6] indicate that the same cognitive processes underlie visual and haptic shape perception and object categorization.
To analyze perceptual spaces, a three-dimensional object space of parametrically defined objects was generated and 21 objects were selected to be printed as 3D plastic models (a). The objects were explored either visually or haptically, and participants performed similarity ratings. Using MDS, we found that visual and haptic perceptual spaces are highly congruent. To investigate visual and haptic transfer, a new set of stimuli was generated (b). Visual training on shape categories strongly influenced haptic shape perception and vice versa. Taken together, the results indicate that visual and haptic shape information is integrated to form a single multisensory perceptual space.
1. Gaissert, N., Wallraven, C., and Bülthoff, H.H.: Visual and Haptic Perceptual Spaces Show High Similarity in Humans. Journal of Vision 10(11:2), 1-20 (2010).
2. Fowler, D.R., Meinhardt, H., and Prusinkiewicz, P.: Modeling Seashells. In SIGGRAPH '92: Proceedings of the 19th Annual Conference on Computer Graphics and Interactive Techniques, 379-389 (1992).
3. Gaissert, N., and Wallraven, C.: Categorizing Natural Objects: A Comparison of the Visual and the Haptic Modalities. Submitted.
4. Gaissert, N., Waterkamp, S., van Dam, L., Bülthoff, H.H., and Wallraven, C.: Training transfers across the senses in visual and haptic shape categorization. In preparation.
5. Cooke, T., Jäkel, F., Wallraven, C., and Bülthoff, H.H.: Multimodal Similarity and Categorization of Novel, Three-dimensional Objects. Neuropsychologia 45(3), 484-495 (2007).
6. Gaissert, N., Bülthoff, H.H., and Wallraven, C.: Similarity and Categorization: From Vision to Touch. Acta Psychologica (2011).
Date of Birth: January 15th, 1982
Place of Birth: Böblingen, Germany
Ph.D. student at the Max Planck Institute for Biological Cybernetics, Department Bülthoff
Perceiving Complex Objects - A Comparison of the Visual and the Haptic Modalities
Diploma Thesis at the University of Stuttgart, Department of Biological Air Purification
Biological Air Purification of Isophorone within a Biotrickling Filter - Analyzing the Genetics of Iph200
Student Project Thesis at the Massachusetts Institute of Technology, MA, USA, Department of Biology
Screening a cDNA Library of Entamoeba histolytica for a new Endoplasmic Reticulum Oxidase
University of Stuttgart
Field of Study: Technical Biology
Scholarship holder of the Max Planck Society
Recommended for the "Studienstiftung des Deutschen Volkes" (German National Academic Foundation)
Scholarship holder of the "Stiftung der Deutschen Wirtschaft" (sdw, Foundation of German Business)
Best Student Paper Award: Haptics Symposium 2010
Second Place Poster Presentation: Max Planck PhD Net Workshop
Supervision of internships of three students (3 weeks, 6 weeks, and 10 weeks): introduction to ongoing research, supervision of experiments and analysis, grading of reports
Supervision of students for BOGY (school internships) and lab rotations: one- to two-hour introductions to ongoing research
Blockpraktikum (two-week block course): small groups of 4 to 5 students were introduced to ongoing research; experiments were planned, supervised, and analyzed
Supervision of 30 students for 2 weeks of microbiology lab rotation, teaching classes
I wrote articles for Offspring, the magazine of the Max Planck PhD Net Association
For a research collaboration I went to Seoul, Korea for one month
sdw-Alumni-Association, organization of seminars and talks
GYLC - Global Young Leaders Conference
The UN and the US Senate invited students from around the world to participate in a two-week conference in Washington, D.C., and New York
Organizational Unit (Department, Group, Facility):
- Alumni of the Department Human Perception, Cognition & Action
- Alumni of the Group Recognition & Categorization