Project Leader

Dr.-Ing. Cristobal Curio
Phone: +49 7071 601-605
Fax: +49 7071 601-616
cristobal.curio[at]tuebingen.mpg.de
 

People

Group members
 

CE Overview Poster


News

04/11/2013 New HMI detectability concept supporting situational awareness featured in the IEEE Intelligent Transportation Systems Magazine.
27/09/2013 Computational perception article on view dependencies of social interaction understanding published in Frontiers Perception.
26/08/2013 Congratulations to Christian Herdtweck on his successful PhD defense!
29/06/2013 Our approach to 3D driver head pose monitoring will be presented at the TP.CG.2013 conference of Eurographics UK.
24/06/2013 Our cognitive vision architecture for efficient monocular car viewpoint estimation [pdf] has been presented at the Intelligent Vehicles conference.
10/06/2013 We participated in the EU FET-Open (Future and Emerging Technologies) project TANGO. The project has been evaluated as excellent.
20/01/2013 Visual cues determining emotional dynamic facial expression recognition published in the Journal of Vision.
13/09/2012 Best Conference Paper Award at the IEEE conference on Multisensor Fusion and Information Integration (MFI) 2012.
06/03/2012 Enhancing drivers' environment perception presented as a talk (best paper finalist), together with a novel monocular visual odometry approach, at the Intelligent Vehicles conference.
05/10/2011 Public release of medial feature grouping and superpixel segmentation code.
09/2011 HMI system for image retrieval presented at the INTERACT 2011 conference.
03/2011 Paper presentation at the IEEE Automatic Face & Gesture Conference, Santa Barbara. Watch our latest facial analysis video with the Microsoft Kinect sensor on YouTube.
10/2010 Our book Dynamic Faces: Insights from Experiments and Computation has appeared at MIT Press.
10/2010 Two SIGGRAPH Asia 2010 Technical Sketches accepted for presentation.
05/2010 Paper at the CVPR Conference Workshop on Feature Grouping.
09/2009 We won a DAGM paper prize for work on automatic 3D surface tracking for the generation of a 4D morphable face model at the Conference of the German Association for Pattern Recognition (DAGM).

Teaching

WS 2012/13 Medizinische Bildverarbeitung (Medical Image Processing) at the Computer Science Department, Tübingen.
SS 2012 Machine Learning II at the Graduate School of Neural Information Processing, Tübingen.
WS 2010/11 Statistical Methods in Artificial Intelligence at the Computer Science Department, Tübingen.
WS 2009/10 Advanced Topics in Machine Learning at the Computer Science Department, Tübingen.

External activities

KI-2013 From Research to Innovation and Practical Applications (Program Committee)
Intelligent Vehicles 2013 (Program Committee Associate Editor)
WIAF-2012 workshop "What's in a face?" at ECCV (Technical Program Committee)
KI-2012 (Associate Editor)
06/2011 Organization of the workshop on interactive pedestrian behavior analysis and synthesis at the IEEE-sponsored Intelligent Vehicles Symposium, Baden-Baden, June 5 (final program).
11/2011 Program Committee of the 1st IEEE Workshop on Information Theory in Computer Vision and Pattern Recognition at ICCV 2011
KI-2011 (Program Committee)
KI-2010 (Area Chair)
03/2008 COSYNE conference workshop

Collaborators

Center for Integrated Neuroscience, Section Computational Motorics (Prof. Dr. M.A. Giese)
Autonomous Systems Lab ETHZ (Prof. Dr. R. Siegwart)
Zentrum für Neurowissenschaften Zürich (Prof. Dr. med. A. Luft)
MPI for Intelligent Systems (Prof. M. Black)

Five most recent Publications

Herdtweck C and Wallraven C (December 2013) Estimation of the Horizon in Photographed Outdoor Scenes by Human and Machine. PLoS ONE 8(12) 1-14.
Dobricki M and de la Rosa S (December 2013) The structure of conscious bodily self-perception during full-body illusions. PLoS ONE 8(12) 1-9.
Dobs K, Schultz J, Bülthoff I and Gardner JL (10 November 2013): Attending to expression or identity of dynamic faces engages different cortical areas. 43rd Annual Meeting of the Society for Neuroscience (Neuroscience 2013), San Diego, CA, USA.
Nieuwenhuizen FM, Chuang LL and Bülthoff HH (November 2013) myCopter: Enabling Technologies for Personal Aerial Transportation Systems: Project status after 2.5 years. 5. Internationale HELI World Konferenz "HELICOPTER Technologies", "HELICOPTER Operations" at the International Aerospace Supply Fair AIRTEC 2013, 1-3. [pdf]
Redcay E, Dodell-Feder D, Mavros PL, Kleiner M, Pearrow MJ, Triantafyllou C, Gabrieli JD and Saxe R (October 2013) Atypical brain activation patterns during a face-to-face joint attention game in adults with autism spectrum disorder. Human Brain Mapping 34(10) 2511–2523.


 

Intelligent Social Signal Processing

Multimodal body pose recordings of interactions in crowds
Many everyday actions take place in a social context, and affective information forms an important channel of social communication. The goal of the EU project TANGO is to take these two familiar ideas one radical step further by focusing on the essentially interactive nature of social communication, in the domain of non-verbal communication based entirely on facial and bodily expression. We investigate interactions in real-life contexts, showing agents in daily-life situations such as communication or navigation.

A central goal of the project is the development of an exact mathematical theory of emotional communicative behaviour. Based on such a theory, combined with advanced methods from computer vision and computer graphics, emotional interactions can be studied quantitatively in detail and transferred to technical systems that simulate believable emotional interactive behaviour. Current motion synthesis techniques in computer graphics focus mainly on physical factors; the role of other factors, and specifically of psychological variables, is neglected and not well understood. Based on the experimental results obtained and the mathematical theory, a new generation of technical devices establishing emotional communication between humans and machines will be developed. TANGO goes beyond the state of the art in its theoretical scope, its methodological approaches, and the innovative applications that are anticipated.
See also the implications for understanding crowd behavior in driver assistance applications, discussed at our workshop at the Intelligent Vehicles Symposium, Baden-Baden.
 
Curio C, Chiovetto E and Giese MA (August 2013) Abstract Talk: Integration of kinematic components in the perception of emotional facial expressions. 36th European Conference on Visual Perception (ECVP 2013), Bremen, Germany.
Giese MA, Chiovetto E and Curio C (September 2012): Perceptual relevance of kinematic components of facial movements extracted by unsupervised learning. 35th European Conference on Visual Perception, Alghero, Italy, Perception 41(ECVP Abstract Supplement) 150.
de la Rosa S, Miekes S, Bülthoff HH and Curio C (September 2012): View dependencies in the visual recognition of social interactions. 35th European Conference on Visual Perception, Alghero, Italy, Perception 41(ECVP Abstract Supplement) 240.

Virtual Realities

Curio, Breidt et al 2006
Curio, Kleiner et al 2010

We have developed a novel facial expression animation control system that realizes a real-time version of the animation pipeline developed at our institute [Curio et al., 2010; APGV 2006/2008]. The system introduces no noticeable feedback latency between a subject's execution of a dynamic expression and the perception of their own animated face, for example in a mirror setup. Expressions are currently encoded by means of 3D Facial Action Units. This real-time visual feedback enables a variety of on-line dynamic manipulations. For non-expert programmers, an intuitive application programming interface (API) based on the public 'Visual Psychophysics Toolbox Version 3' supports the implementation of experimental protocols for facial movement analysis and graphical feedback: it controls the gains of Action Unit signals, allows Action Unit components to be exchanged or delayed, and provides highly synchronized monitoring of multimodal physiological recordings such as facial EMG, EEG, and sound.
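As a rough illustration of this kind of gain, exchange, and delay control over Action-Unit signals, the following Python sketch re-weights, permutes, and delays captured Action-Unit activations before blending them into a face mesh. It is not the actual toolbox API; every name, dimension, and value below is hypothetical.

import numpy as np

# Hypothetical sketch of the real-time Action-Unit (AU) feedback loop described
# above: captured AU activations are re-weighted (gain control), optionally
# permuted or delayed, and rendered as a linear blend of per-AU morph targets.
N_AU, N_VERTS = 17, 5000                       # illustrative model size
rng = np.random.default_rng(0)
neutral = rng.standard_normal((N_VERTS, 3))    # stand-in for the neutral face mesh
au_targets = 0.01 * rng.standard_normal((N_AU, N_VERTS, 3))  # per-AU displacement fields

gains = np.ones(N_AU)          # per-AU gain, e.g. gains[3] = 0.5 attenuates one component
swap = np.arange(N_AU)         # AU permutation; identity means no component exchange
delay_frames = 0               # feedback delay in frames
history = []                   # buffer of past manipulated activations

def render_frame(au_activation):
    """Apply gain, swap, and delay manipulations and return the animated mesh."""
    history.append(gains * au_activation[swap])
    delayed = history[max(0, len(history) - 1 - delay_frames)]
    # linear blend-shape model: neutral + sum_i w_i * AU_i
    return neutral + np.tensordot(delayed, au_targets, axes=1)

# simulate a few frames of captured AU activations in [0, 1]
for _ in range(5):
    mesh = render_frame(rng.random(N_AU))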

 


Curio C, Kleiner M, Breidt M and Bülthoff HH (January 2010): The Virtual Face Mirror Project: Revealing Dynamic Self-Perception in Humans. 4th International Conference on Cognitive Systems (CogSys 2010), 4(137). [pdf]

Facial Motion Retargeting

Studying facial expression perception based on novel 3D animation

The Facial Action Unit based animation system developed in our group has been perceptually validated against other animation approaches in a user study (Curio et al. 2006). Beyond its development and evaluation, this animation technology continues to be applied to a wide range of research questions in our department.
For example, we found a novel high-level aftereffect in the perception of dynamic facial expressions (APGV 2008). The controllable animation provides us with a novel continuous facial expression morph space. In analogy to the study of Leopold et al. (Nature Neuroscience, 2001, "Prototype-referenced shape encoding revealed by high-level aftereffects"), we defined facial anti-expressions (see image left) to detect expression-specific adaptation aftereffects during perception. As in the continuous face identity morph space of Leopold et al., visual adaptation to an anti-expression biases the recognition of expressions towards the opposite end of the morph axis, that is, towards the original expression.
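A minimal sketch of the anti-expression construction, assuming expressions are represented as displacement vectors from a neutral face (the shape vectors below are purely illustrative): an anti-expression is the expression reflected about the neutral face along the morph axis.

import numpy as np

# Illustrative sketch: an expression is a displacement from the neutral face;
# its anti-expression lies at the same distance on the opposite side of neutral.
rng = np.random.default_rng(1)
neutral = rng.standard_normal(300)                 # stand-in for a neutral face shape vector
happy = neutral + 0.1 * rng.standard_normal(300)   # stand-in for an expression

def morph(expression, alpha, reference):
    """Point on the morph axis: alpha = 1 gives the expression, alpha = -1 its anti-expression."""
    return reference + alpha * (expression - reference)

anti_happy = morph(happy, -1.0, neutral)   # adaptor used to probe aftereffects
weak_happy = morph(happy, 0.3, neutral)    # weaker test stimulus along the same axis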

Interdisciplinary viewpoints: New book on Dynamic Faces

The recognition of faces is a fundamental visual function with importance for social interaction and communication. Scientific interest in facial recognition has increased dramatically over the last decade. Researchers in such fields as psychology, neurophysiology, and functional imaging have published more than 10,000 studies on face processing. Almost all of these studies focus on the processing of static pictures of faces, however, with little attention paid to the recognition of dynamic faces, faces as they change over time—a topic in neuroscience that is also relevant for a variety of technical applications, including robotics, animation, and human-computer interfaces. This volume offers a state-of-the-art, interdisciplinary overview of recent work on dynamic faces from both biological and computational perspectives.
 
Dynamic Faces Book, MIT Press, 2010: The chapters cover a broad range of topics, including the psychophysics of dynamic face perception, results from electrophysiology and imaging, clinical deficits in patients with impairments of dynamic face processing, and computational models that provide insights about the brain mechanisms for the processing of dynamic faces. The book offers neuroscientists and biologists an essential reference for designing new experiments, and provides computer scientists with knowledge that will help them improve technical systems for the recognition, processing, synthesizing, and animating of dynamic faces.

Topics: Computer Vision, Computer Graphics, Human-Computer Interfaces, Computational Modeling, Psychophysics, Physiology and Clinics

Related publications

17. de la Rosa S, Giese MA, Bülthoff HH and Curio C (January 2013) The contribution of different cues of facial movement to the emotional facial expression adaptation aftereffect. Journal of Vision 13(1:23) 1-15.
16. Giese MA, Chiovetto E and Curio C (September 2012): Perceptual relevance of kinematic components of facial movements extracted by unsupervised learning. 35th European Conference on Visual Perception, Alghero, Italy, Perception 41(ECVP Abstract Supplement) 150.
15. de la Rosa S, Miekes S, Bülthoff HH and Curio C (September 2012): View dependencies in the visual recognition of social interactions. 35th European Conference on Visual Perception, Alghero, Italy, Perception 41(ECVP Abstract Supplement) 240.
14. Dobs K, Bülthoff I, Curio C and Schultz J (August 2012): Investigating factors influencing the perception of identity from facial motion. 12th Annual Meeting of the Vision Sciences Society (VSS 2012), Naples, FL, USA, Journal of Vision 12(9) 35.
13. Layher G, Neumann H, Scherer S, Tschechne S, Brosch T and Curio C (October 2011) Social Signal Processing in Companion Systems: Challenges Ahead. In: Informatik 2011: Informatik schafft Communities, Workshop on "Companion-Systeme und Mensch-Companion-Interaktion", Gesellschaft für Informatik, Bonn, Germany, 1-15. [pdf]
12. Dobs K, Kleiner M, Bülthoff I, Schultz J and Curio C (September 2011): Investigating idiosyncratic facial dynamics with motion retargeting. 34th European Conference on Visual Perception, Toulouse, France, Perception 40(ECVP Abstract Supplement) 115.
11. de la Rosa S, Giese MA and Curio C (September 2011): The influence of dynamic and static adaptors on the magnitude of high-level aftereffects for dynamic facial expression. 34th European Conference on Visual Perception, Toulouse, France, Perception 40(ECVP Abstract Supplement) 154.
10. McDonnell R and Breidt M (December 2010) Face Reality: Investigating the Uncanny Valley for virtual faces. 3rd ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia (SIGGRAPH Asia 2010), ACM Press, New York, NY, USA, 1-2. [pdf]
9. Walder C, Breidt M, Bülthoff HH, Schölkopf B and Curio C: Markerless tracking of dynamic 3D scans of faces, 255-276. In: Dynamic Faces: Insights from Experiments and Computation, (Ed) C. Curio, MIT Press, Cambridge, MA, USA (December 2010).
8. Curio C, Giese MA, Breidt M, Kleiner M and Bülthoff HH: Recognition of Dynamic Facial Action Probed by Visual Adaptation, 47-65. In: Dynamic Faces: Insights from Experiments and Computation, (Ed) C. Curio, MIT Press, Cambridge, MA, USA (December 2010).
7. Curio C, Bülthoff HH, Giese MA and Poggio TA: Dynamic Faces: Insights from Experiments and Computation, 288, MIT Press, Cambridge, MA, USA (October 2010). ISBN: 978-0-262-01453-3 [pdf]
6. Garrod O, Yu H, Breidt M, Curio C and Schyns P (May 2010): Reverse Correlation in Temporal FACS Space Reveals Diagnostic Information during Dynamic Emotional Expression Classification. 10th Annual Meeting of the Vision Sciences Society (VSS 2010), Naples, FL, USA, Journal of Vision 10(7) 700.

Last updated: Wednesday, 06.11.2013