Contact

Lewis Chuang

Address: Max-Planck-Ring 8
72076 Tübingen
E-Mail: lewis.chuang

 


Lewis Chuang

Position: Group Leader  Unit: Bülthoff

Research

A process cannot be understood by stopping it. Understanding must
move with the flow of the process, must join and flow with it. - Frank Herbert

My research group, Cognition and Control for Human-Machine Systems, focuses on the human factors of closed-loop control and their underlying psychophysiological bases. In particular, I am interested in how we seek out and process task-relevant information whilst controlling machine systems.

We employ physiological recording and gaze-tracking to unobtrusively evaluate attentional demand and workload in different operational domains, such as flying an aircraft. Our goal is to contribute towards cognition-aware systems that take into account momentary and systematic fluctuations in attention and alertness (or cognitive performance), as well as the demands of the context.

 

We interact actively with dynamic visual environments. For example, we move our gaze across our surroundings as well as manually manipulate objects in order to access task-relevant information. Moreover, we are able to allocate limited attentional resources to relevant tasks, even under high workload and anxiety.


My research involves understanding how humans seek out and process information in order to operate in control environments. To do so, I employ experimental setups that allow human participants to interact with their environments as they are accustomed to in the real world. To this end, I employ non-obtrusive measurement techniques, such as eye- and body-tracking, EEG, ECG and SCA, that allow human behavior to be observed without disrupting performance itself.


Understanding how humans perform in a natural and unrestrained environment can inform the development of human-machine interfaces, allowing for better integration and faster adoption.

 

Examples of Human-Machine Interactions



Current lab members

Christiane Glatz: The influence of auditory warning cues during steering

Nina Flad: Visual information sampling with simultaneous EEG and eyetracking

Tim Schilling (w/ Zeiss Vision Lab): Role of tinted lenses in mitigating affect appraisal

 

Alumni

Alexander Sipatchin: Role of tinted lenses in mitigating affect appraisal

Katharina-Marie Lahmer: Auditory warnings for emergency braking

Katrin Kunz: Driving simulation

Jonas Ditz: Mobile EEG


Menja Scheer: Mental workload during closed-loop control

Evangelia-Regkina Symeonidou: Haptic feedback during closed-loop control

Monika Marsching: Eye-movements during flight training

Marius Klug: A software framework for multimodal user sensing

Silke Wittkowski: The influence of environmental sounds during steering

Jonas Walter: The influence of field-of-view in visuomotor training

Hans-Joachim Bieg (Bosch GmbH): Mid-level eye movements

Björn Browatzki: Methods for mobile gaze tracking

Anne Geuzebroerk: Attentional tunneling during closed-loop control

Riya Paul: EEG signal processing in a moving-base simulator

Jon Allsop: Influence of anxiety on eye-movement planning

Associated projects and funding

SFB-TRR161: Quantitative Methods for Visual Computing (2015-2019)

BW-FIT: Information at your fingertips (2007–2011)

European Union 7th Framework Programme: myCopter

I supervise a research group that investigates how humans seek out and process task-relevant information for the effective control of machine systems, such as vehicles. Machines extend our physical capacity to sense and interact with our environments. For example, collision avoidance systems in an aircraft allow the pilot to be aware of fast-moving traffic before it even comes within range of human sight. Meanwhile, the pilot selectively relies on information provided by the system to determine and execute the combination of actions necessary to maneuver the aircraft effectively.

This continuous interaction between human and machine comprises a closed-loop system. Information is constantly exchanged between the two and is processed and acted on according to their respective cognitive and control processes. Our group employs eye-tracking, motion capture and electroencephalography to characterize the capacity of a human operator to interact in tandem with a responsive machine system. Vehicles are particularly suitable testbeds, because their control dynamics have been well defined and engineered for their intended purpose. We believe that this work will extend our current understanding of attentional processes and motor control. In addition, we are motivated to apply our findings to the development of novel and more effective interfaces for information visualization and shared control.

Main research areas

PHYSIOLOGICAL ESTIMATIONS OF PERCEPTUAL-MOTOR WORKLOAD

The goal of this project is to extract physiological features (e.g., EEG) that can reliably index the amount of workload that the operator is experiencing in the domain of perceptual-motor control. Research into EEG markers of mental workload has tended to focus on aspects such as sustained attention or working memory. Here, we are motivated to estimate perceptual-motor fatigue of the operator before potentially fatal decrements in performance occur.
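As a loose illustration of what a spectral workload feature can look like, the theta/alpha band-power ratio is a widely used heuristic index (frontal theta tends to rise and parietal alpha to fall under load). This sketch is not our actual pipeline, which centres on ERP components such as the novelty-P3; the function names and thresholds are assumptions.

```python
# Illustrative sketch only (not this group's method): a simple
# theta/alpha band-power ratio as a heuristic workload index.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density within a frequency band (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def workload_index(eeg, fs=250.0):
    """Theta/alpha power ratio; higher values suggest higher workload."""
    theta = band_power(eeg, fs, (4.0, 8.0))
    alpha = band_power(eeg, fs, (8.0, 13.0))
    return theta / alpha

# Synthetic demo: an alpha-dominated (10 Hz) signal yields a low index.
fs = 250.0
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
print(workload_index(relaxed, fs))  # well below 1 for this signal
```

In practice such a ratio would be computed per electrode and calibrated against task difficulty per participant, rather than interpreted on an absolute scale.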

 

Scheer M, Bülthoff HH and Chuang LL (September-2014) Is the novelty-P3 suitable for indexing mental workload in steering tasks? 12th Biannual Conference of the German Cognitive Science Society (KogWis 2014), Springer, Berlin, Germany, S135-S136.


Flad N, Nieuwenhuizen FM, Bülthoff HH and Chuang LL (June-2014) System Delay in Flight Simulators Impairs Performance and Increases Physiological Workload In: Engineering Psychology and Cognitive Ergonomics, 11th International Conference on Engineering Psychology and Cognitive Ergonomics (EPCE 2014), Springer, Berlin, Germany, 3-11.


DETECTION AND RECOGNITION DURING STEERING

High perceptual-motor demands can reduce our capacity to attend to secondary tasks. For example, we could fail to notice the sudden appearance of a crossing pedestrian, especially under demanding driving conditions. In this line of research, we seek to understand how our capacity for detecting and recognizing peripheral events varies with increasing demands in the control task (e.g., instability).

 

Glatz C, Bülthoff HH and Chuang LL (September-2014): Looming auditory warnings initiate earlier event-related potentials in a manual steering task, 12th Biannual Conference of the German Cognitive Science Society (KogWis 2014), Tübingen, Germany, Cognitive Processing, 15(Supplement 1) S38.


Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (September-2012) Looking for Discriminating Is Different from Looking for Looking's Sake PLoS ONE 7(9) 1-9.


GAZE CONTROL FOR RELEVANT INFORMATION RETRIEVAL

We move our eyes to actively select and process task-relevant information in real-time. By monitoring how eye-movements are coordinated during control maneuvers, we are able to determine aspects of the visual scene that support the operator’s control capabilities. Our research in this area has two emphases. The first involves developing algorithms for estimating, filtering and analyzing natural gaze in real-time and under challenging scenarios (e.g., cockpit environment). The second targets a fundamental understanding of how eye-movements are coordinated so as to handle shifts in task priorities.
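For illustration, one of the simplest baselines for separating saccades from fixations is a velocity-threshold (I-VT) classifier. The sketch below is an assumption-laden toy, not our real-time pipeline; the 100 deg/s threshold and function name are illustrative choices.

```python
# Minimal velocity-threshold (I-VT) gaze classifier: an illustrative
# baseline, not this group's actual real-time algorithm.
import numpy as np

def classify_ivt(x, y, fs, threshold_deg_s=100.0):
    """Label each gaze sample as saccade (True) or fixation (False).

    x, y : gaze position in degrees of visual angle
    fs   : sampling rate in Hz
    """
    vx = np.gradient(x) * fs           # deg/s along each axis
    vy = np.gradient(y) * fs
    speed = np.hypot(vx, vy)           # angular speed, deg/s
    return speed > threshold_deg_s

# Demo: a fixation, a fast 10-degree jump, then another fixation.
fs = 500.0
fix1 = np.zeros(100)
jump = np.linspace(0.0, 10.0, 10)      # 10 deg in ~20 ms -> >500 deg/s
fix2 = np.full(100, 10.0)
x = np.concatenate([fix1, jump, fix2])
y = np.zeros_like(x)
labels = classify_ivt(x, y, fs)
print(labels.sum())                    # only samples inside the jump flagged
```

Real cockpit data would additionally require denoising, blink handling and adaptive thresholds, which is where most of the algorithmic effort lies.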

 

Bonev B, Chuang LL and Escolano F (May-2013) How do image complexity, task demands and looking biases influence human gaze behavior? Pattern Recognition Letters 34(7) 723-730.


Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (October-2013) Saccade reaction time asymmetries during task-switching in pursuit tracking Experimental Brain Research 230(3) 271-281.


ROBUST EEG MEASUREMENTS IN MOBILE WORKSPACES

EEG signals can suffer from artefacts due to electromagnetic noise or muscle activity. These noise sources can be amplified in settings that involve a heavy use of electrical equipment and voluntary user movements, such as moving-base flight simulators. Here, we seek to enable EEG recordings in such demanding workspaces by developing robust measurement paradigms and filter algorithms.
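As a minimal sketch of the kind of offline cleaning such recordings call for, one might combine a notch filter for mains noise with a broad band-pass. This is an assumption, not our measurement paradigm: real pipelines (e.g., ICA-based artefact removal) are considerably more involved, and all filter settings below are illustrative.

```python
# Illustrative EEG cleaning sketch (assumed settings, not this group's
# pipeline): 50 Hz notch for European line noise + 1-40 Hz band-pass.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, sosfiltfilt

def clean_eeg(signal, fs=250.0, notch_hz=50.0, band=(1.0, 40.0)):
    """Zero-phase notch + band-pass filtering of a single EEG channel."""
    b_notch, a_notch = iirnotch(w0=notch_hz, Q=30.0, fs=fs)
    signal = filtfilt(b_notch, a_notch, signal)
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

# Demo: a 10 Hz "brain" oscillation buried under strong 50 Hz line noise.
fs = 250.0
t = np.arange(0, 5, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 50 * t)
cleaned = clean_eeg(raw, fs)
```

Zero-phase (forward-backward) filtering is used here because phase distortion would shift ERP latencies, which matter in our analyses.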

 

Browatzki B, Bülthoff HH and Chuang LL (April-2014) A comparison of geometric- and regression-based mobile gaze-tracking Frontiers in Human Neuroscience 8(200) 1-12.

Lewis Chuang is a project leader at the Max Planck Institute for Biological Cybernetics for “Cognition and Control in Human-Machine Systems”. In particular, he investigates how humans process information in order to interact with and control complex machines (e.g., vehicles) [1]. In parallel, he collaborates with computer scientists and engineers to select between different designs for human-machine communications. For instance, he evaluated how haptic force-feedback should be provided to teleoperators of swarms of unmanned aerial vehicles in order to help them avoid potential unseen collisions [2]. Lewis Chuang is motivated to observe human behavior without interfering with human-machine interactions. Thus, he has developed novel neuroscientific methods for unrestrained gaze-tracking [3] and for evaluating task engagement during steering [4]. These have proved invaluable in his external collaborations on visual computing [5][6] and in understanding the challenges of a consumer-level flying car [7]. Currently, he is engaged in identifying the potential dangers posed by the transition from manual to autonomous driving, as everyday drivers fail to understand the real capabilities of assisted driving technologies and designers fail to communicate their expectations of driver participation effectively [8][9][10]. A full C.V. is available upon request.

[1] Chuang, L. Error visualization and information-seeking behavior for air-vehicle control. In Schmorrow, D. and Fidopiastis, C., Eds., Foundations of Augmented Cognition. Lecture Notes in Artificial Intelligence, 9183, 3–11, Aug 2015.


[2] Son, H., Franchi, A., Chuang, L. L., Kim, J., Bülthoff, H. H., and Robuffo Giordano, P. Human-centered design and evaluation of haptic cueing for teleoperation of multiple mobile robots. IEEE Transactions on Systems, Man and Cybernetics, 43(2), 597–609, Apr 2013.

[3] Browatzki, B., Bülthoff, H. H., and Chuang, L. L. A comparison of geometric- and regression-based mobile gaze-tracking. Frontiers in Human Neuroscience, 8(200), 1–12, Apr 2014.

[4] Scheer, M., Bülthoff, H. H., and Chuang, L. L. Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds. Frontiers in Human Neuroscience, 10(73), Feb 2016.

[5] Transregional Research Centre for Quantitative Methods for Visual Computing (www.trr161.de), supported by Deutsche Forschungsgemeinschaft, Germany (TRR161-C03).

[6] Burch M, Chuang L, Fischer B, Schmidt A and Weiskopf D: Eye Tracking and Visualization: Foundations, Techniques, and Applications, Springer, Cham, Switzerland, (2017), in press.

[7] myCopter: Enabling Technologies for Personal Aerial Transportation Systems (www.mycopter.eu), supported by European Union’s Seventh Framework Programme (#266470)

[8] Sadeghian, S., Chuang, L. L., Heuten, W., and Boll, S. Assisting drivers with ambient take over requests in highly automated driving. In Proceedings from the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (Auto-UI 2016), 1–8, Oct 2016.

[9] Löcken, A., Borojeni, S. S., Müller, H., Gable, T. M., Triberti, S., Diels, C., Glatz, C., Alvarez, I., Chuang, L. L., and Boll, S. Towards adaptive ambient in-vehicle displays and interactions: Insights and design guidelines from the 2015 Automotive-UI dedicated workshop. In Meixner, G. and Müller, C., Eds., Automotive User Interfaces - Creating Interactive Experiences in the Car. Springer, Berlin, Germany, in press.

[10] Chuang, L. L., Gehring, S., Kay, J., Olivier, P., and Schmidt, A. “Ambient notification environments” (Dagstuhl Seminar 17161). Schloss Dagstuhl - Leibniz-Zentrum für Informatik, Apr 2017.


Books (1):

LL Chuang: Recognizing Objects from Dynamic Visual Experiences, 162, Logos-Verlag, Berlin, Germany, (2011). ISBN: 978-3-8325-2842-3, Series: MPI Series in Biological Cybernetics ; 28

Proceedings (5):

Chuang L, Burch M and Kurzhals K: 3rd Workshop on Eye Tracking and Visualization (ETVIS '18), Symposium on Eye Tracking Research and Applications (ETRA '18), 53, ACM Press, New York, NY, USA, (June-2018). ISBN: 978-1-4503-5787-6

Boll S, Löcken A, Schroeter R, Baumann M, Alvarez I, Chuang L, Feuerstack S, Jeong M, Hooft van Huysduynen H, Broy N, Osswald S, Politis I and Large D: 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings, 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17), 251, ACM Press, New York, NY, USA, (September-2017). ISBN: 978-1-4503-5151-5

Chuang LL, Gehring S, Kay J and Schmidt A: Ambient Notification Environments, Dagstuhl Seminar 17161, 45, Leibniz-Zentrum für Informatik, Schloss Dagstuhl, Germany, (April-2017). Series: Dagstuhl Reports ; 7,4

Brandenburg S, Chuang L and Baumann M: 3rd Berlin Summer School Human Factors, 39, Technische Universität Berlin: Zentrum für Mensch-Maschine-Systeme, Berlin, Germany, (2017). ISSN: 1439-7854, Series: MMI-Interaktiv ; 17

Burch M, Chuang L, Fischer B, Schmidt A and Weiskopf D: Eye Tracking and Visualization: Foundations, Techniques, and Applications: ETVIS 2015, First Workshop on Eye Tracking and Visualization (ETVIS 2015), 258, Springer, Cham, Switzerland, (2017). ISBN: 978-3-319-47024-5, Series: Mathematics and Visualization

Articles (18):

Janssen CP, Boyle LN, Kun AL, Ju W and Chuang L (July-2018) A Hidden Markov Framework to Capture Human-Machine Interaction in Automated Vehicles International Journal of Human-Computer Interaction, in press.
Bonaventura X, Feixas M, Sbert M, Chuang L and Wallraven C (May-2018) A Survey of Viewpoint Selection Methods for Polygonal Models Entropy 20(5:370) 1-22.
Scheer M, Bülthoff HH and Chuang LL (April-2018) Auditory Task Irrelevance: A Basis for Inattentional Deafness Human Factors: The Journal of the Human Factors and Ergonomics Society 60(3) 428-440.
Allsop J, Gray R, Bülthoff HH and Chuang L (December-2017) Eye movement planning on Single-Sensor-Single-Indicator displays is vulnerable to user anxiety and cognitive load Journal of Eye Movement Research 10(5:8) 1-15.
Burch M, Chuang LL, Duchowski A, Weiskopf D and Kroner R (May-2017) Eye Tracking and Visualization: Introduction to the Special Thematic Issue of the Journal of Eye Movement Research Journal of Eye Movement Research 10(5:1) 1-4.
Scheer M, Bülthoff HH and Chuang LL (March-2016) Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds Frontiers in Human Neuroscience 10(73) 1-15.
Bieg H-J, Chuang LL, Bülthoff HH and Bresciani J-P (September-2015) Asymmetric saccade reaction times to smooth pursuit Experimental Brain Research 233(9) 2527-2538.
Browatzki B, Bülthoff HH and Chuang LL (April-2014) A comparison of geometric- and regression-based mobile gaze-tracking Frontiers in Human Neuroscience 8(200) 1-12.
Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (October-2013) Saccade reaction time asymmetries during task-switching in pursuit tracking Experimental Brain Research 230(3) 271-281.
Bonev B, Chuang LL and Escolano F (May-2013) How do image complexity, task demands and looking biases influence human gaze behavior? Pattern Recognition Letters 34(7) 723–730.
Son HI, Franchi A, Chuang LL, Kim J, Bülthoff HH and Robuffo Giordano P (April-2013) Human-Centered Design and Evaluation of Haptic Cueing for Teleoperation of Multiple Mobile Robots IEEE Transactions on Cybernetics 43(2) 597-609.
Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (September-2012) Looking for Discriminating Is Different from Looking for Looking's Sake PLoS ONE 7(9) 1-9.
Chuang LL, Vuong QC and Bülthoff HH (May-2012) Learned non-rigid object motion is a view-invariant cue to recognizing novel objects Frontiers in Computational Neuroscience 6(26) 1-8.
Bülthoff HH and Chuang LL (September-2011) Seeing: The Computational Approach to Biological Vision. Second Edition. By John P. Frisby and James V. Stone. Cambridge (Massachusetts): MIT Press. Quarterly Review of Biology 86(3) 227.
Schultz J, Chuang L and Vuong QC (June-2008) A dynamic object-processing network: Metric shape discrimination of dynamic objects by activation of occipito-temporal, parietal and frontal cortex Cerebral Cortex 18(6) 1302-1313.
Lander K, Chuang L and Wickham L (May-2006) Recognising face identity from natural and morphed smiles Quarterly Journal of Experimental Psychology 59(5) 801-808.
Chuang L, Vuong QC, Thornton IM and Bülthoff HH (May-2006) Recognising novel deforming objects Visual Cognition 14(1) 85-88.
Lander K and Chuang L (April-2005) Why are moving faces easier to recognize? Visual Cognition 12(3) 429-442.

Conference papers (48):

Faltaous S, Baumann M, Schneegass S and Chuang LL (September-2018) Design Guidelines for Reliability Communication in Autonomous Vehicles In: AutomotiveUI '18: Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '18), ACM Press, New York, NY, USA, 258-267.
Lahmer M, Glatz C, Seibold VC and Chuang LL (September-2018) Looming Auditory Collision Warnings for Semi-Automated Driving: An ERP Study In: AutomotiveUI '18: Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '18), ACM Press, New York, NY, USA, 310-319.
Mayer S, Le HV, Nesti A, Henze N, Bülthoff HH and Chuang LL (September-2018) The Effect of Road Bumps on Touch Interaction in Cars In: AutomotiveUI '18: Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '18), ACM Press, New York, NY, USA, 85-93.
Chuang LL, Donker SF, Kun AL and Janssen CP (September-2018) Workshop on The Mobile Office In: AutomotiveUI '18: Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 10th International ACM Conference on Automotive User Interfaces (AutomotiveUI '18), ACM Press, New York, NY, USA, 10-16.
Kosch T, Funk M, Schmidt A and Chuang LL (June-2018) Identifying Cognitive Assistance with Mobile Electroencephalography: A Case Study with In-Situ Projections for Manual Assembly, 10th ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS 2018), ACM Press, New York, NY, USA, 2:11, 1-20, Series: PACMHCI Proceedings of the ACM on Human-Computer Interaction.
Borojeni SS, Boll SCJ, Heuten W, Bülthoff HH and Chuang L (April-2018) Feel the Movement: Real Motion Influences Responses to Take-over Requests in Highly Automated Vehicles, CHI Conference on Human Factors in Computing Systems (CHI ’18), ACM Press, New York, NY, USA, 1-13.
Chuang LL and Pfeil U (April-2018) Transparency and Openness Promotion Guidelines for HCI In: Extended Abstracts, CHI Conference on Human Factors in Computing Systems (CHI EA ’18), 1-4.
Schmidt A and Chuang LL (April-2018) Understanding systems that are designed to support human cognition, Workshop on Rethinking Interaction: From Instrumental Interactions to Human-Computer Partnerships at CHI' 18, 1-5.
Glatz C, Krupenia SS, Bülthoff HH and Chuang LL (April-2018) Use the Right Sound for the Right Job: Verbal Commands and Auditory Icons for a Task-Management System Favor Different Information Processes in the Brain, CHI Conference on Human Factors in Computing Systems (CHI ’18), ACM Press, New York, NY, USA, 1-13.
Glatz C, Ditz J, Kosch T, Schmidt A, Lahmer M and Chuang LL (November-2017) Reading the mobile brain: from laboratory to real-world electroencephalography, 16th International Conference on Mobile and Ubiquitous Multimedia (MUM 2017), ACM Press, New York, NY, USA, 573-579.
Schwind V, Knierim P, Chuang LL and Henze N (October-2017) "Where's Pinky?": The Effects of a Reduced Number of Fingers in Virtual Reality, ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play (CHI PLAY 2017), ACM Press, New York, NY, USA, 507-515.
Chuang LL, Manstetten D, Boll S and Baumann M (September-2017) 1st Workshop on Understanding Automation: Interfaces that Facilitate User Understanding of Vehicle Automation In: Adjunct Proceedings, 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17), ACM Press, New York, NY, USA, 1-8.
Faltaous S, Machulla T, Baumann M and Chuang L (September-2017) Developing a Highly Automated Driving Scenario to Investigate User Intervention: When Things Go Wrong In: Adjunct Proceedings, 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17), ACM Press, New York, NY, USA, 67-71.
Chuang LL, Glatz C and Krupenia S (September-2017) Using EEG to understand why behavior to auditory in-vehicle notifications differs across test environments, 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17), ACM Press, New York, NY, USA, 123-133.
Brandenburg S, Chuang L and Baumann M (July-2017) Editorial: Connecting researchers for pushing an interdisciplinary research field, 3rd Berlin Summer School Human Factors, Technische Universität Berlin: Zentrum für Mensch-Maschine-Systeme, Berlin, Germany, 6-8, Series: MMI-Interaktiv ; 17.
Karolus J, Wozniak PW, Chuang LL and Schmidt A (May-2017) Robust Gaze Features for Enabling Language Proficiency Awareness In: 2017 CHI Conference on Human Factors in Computing Systems, , 35th Annual ACM Conference on Human Factors in Computing Systems (CHI '17), ACM Press, New York, NY, USA, 2998-3010.
Flad N, Ditz JC, Schmidt A, Bülthoff HH and Chuang LL (February-2017) Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering, Second Workshop on Eye Tracking and Visualization (ETVIS 2016), IEEE, Piscataway, NJ, USA, 1-5.
Allsop J, Gray R, Bülthoff HH and Chuang L (February-2017) Effects of Anxiety and Cognitive Load on Instrument Scanning Behavior in a Flight Simulation, Second Workshop on Eye Tracking and Visualization (ETVIS 2016), IEEE, Piscataway, NJ, USA, 55-59.
Flad N, Fomina T, Bülthoff HH and Chuang LL (2017) Unsupervised clustering of EOG as a viable substitute for optical eye-tracking In: Eye Tracking and Visualization: Foundations, Techniques, and Applications: ETVIS 2015, First Workshop on Eye Tracking and Visualization (ETVIS 2015), Springer, Cham, Switzerland, 151-167, Series: Mathematics and Visualization.
Riener A, Jeon MP, Alvarez I, Pfleging B, Mirnig A, Tscheligi M and Chuang L (October-2016) 1st Workshop on Ethically Inspired User Interfaces for Automated Driving In: Adjunct Proceedings, 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '16), ACM Press, New York, NY, USA, 217-220.
McCall R, Baumann M, Politis I, Borojeni SS, Alvarez I, Mirnig A, Meschtcherjakov A, Tscheligi M, Chuang L and Terken J (October-2016) 1st Workshop on Situational Awareness in Semi-Automated Vehicles In: Adjunct Proceedings, 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '16), ACM Press, New York, NY, USA, 233-236.
Borojeni SS, Chuang L, Heuten W and Boll S (October-2016) Assisting Drivers with Ambient Take-Over Requests in Highly Automated Driving, 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '16), ACM Press, New York, NY, USA, 237-244.
Karolus J, Woźniak PW and Chuang LL (October-2016) Towards Using Gaze Properties to Detect Language Proficiency, 9th Nordic Conference on Human-Computer Interaction (NordiCHI '16), ACM Press, New York, NY, USA, 118.
Borojeni SS, Chuang L, Löcken A, Glatz C and Boll S (October-2016) Tutorial on Design and Evaluation Methods for Attention Directing Cues In: Adjunct Proceedings, 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '16), ACM Press, New York, NY, USA, 213-215.
Scheer M, Bülthoff HH and Chuang LL (October-2015) On the influence of steering on the orienting response In: Trends in Neuroergonomics, 11. Berliner Werkstatt Mensch-Maschine-Systeme, Universitätsverlag der TU Berlin, Berlin, Germany, 24.
Glatz C, Bülthoff HH and Chuang LL (September-1-2015) Attention Enhancement During Steering Through Auditory Warning Signals, Workshop on Adaptive Ambient In-Vehicle Displays and Interactions In conjunction with AutomotiveUI 2015 (WAADI'15), 1-5.
Last updated: Monday, 22.05.2017