Lewis Chuang

Alumni of the Department Human Perception, Cognition and Action
Alumni of the Group Cognition and Control in Human-Machine Systems
Alumni of the Group Cognitive Engineering
Alumni of the Group Recognition and Categorization

Main Focus

A process cannot be understood by stopping it. Understanding must move with the flow of the process, must join it and flow with it. - Frank Herbert

My group, Cognition and Control in Human-Machine Systems, focuses on the human factors of closed-loop control and their underlying psychophysiological bases. In particular, I am interested in how we seek out and process task-relevant information whilst controlling machine systems.
We employ physiological and gaze-tracking measures to unobtrusively evaluate attentional demand and workload in different operational domains, such as flying an aircraft. Our goal is to contribute towards cognition-aware systems that take into account momentary and systematic fluctuations in attention and alertness (i.e., cognitive performance), as well as the demands of the context.

We interact actively with dynamic visual environments. For example, we move our gaze across our surroundings and manually manipulate objects in order to access task-relevant information. Moreover, we are able to allocate limited attentional resources to relevant tasks, in spite of varying workload and anxiety levels.

My research involves understanding how humans seek out and process information in order to operate in control environments. To do so, I employ experimental setups that allow human participants to interact with their environments as they are accustomed to in the real world, combined with unobtrusive measurement techniques, such as eye- and body-tracking, EEG, ECG, and skin conductance (SCA), which allow human behavior to be observed without disrupting performance itself.

Understanding how humans perform in a natural and unrestrained environment can inform the development of human-machine interfaces, allowing for better integration and faster adoption.

Examples of Human-Machine Interactions
Active Object Exploration: https://youtu.be/ABpY9eN4CwI
Flight simulator with real-time eyetracking: http://www.kyb.local/nc/employee/details/chuang.html#=0
Gaze tracking with wall-sized displays: https://youtu.be/lSZ9EYgXxig

Associated projects and funding
SFB-TRR161: Quantitative Methods for Visual Computing (2015-2019)
BW-FIT: Information at your fingertips (2007–2011)
European Union 7th Framework Programme: myCopter

I supervise a research group that investigates how humans seek out and process task-relevant information for the effective control of machine systems, such as vehicles. Machines extend our physical capacity to sense and interact with our environments. For example, collision avoidance systems in an aircraft allow the pilot to be aware of fast-moving traffic before it is even within range of human sight. Meanwhile, the pilot selectively relies on information provided by the system to determine and execute the appropriate combination of actions necessary for effectively maneuvering the aircraft.

This continuous interaction between man and machine comprises a closed-loop system. Information is constantly exchanged between man and machine, and is processed and acted on according to their respective cognitive and control processes. Our group employs eye-tracking, motion capture, and electroencephalography to define the capacity of a human operator to interact in tandem with a responsive machine system. In particular, we rely on vehicle models whose control dynamics are well defined and engineered for their intended purpose. We believe that doing so will extend our current understanding of attentional processes and motor control. In addition, we are motivated to apply our findings to the development of novel and more effective interfaces for information visualization and shared control.
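
To make the closed-loop notion concrete, here is a minimal Python sketch of a compensatory tracking loop, in the spirit of the classic crossover model of manual control. The operator gain, the 200 ms reaction delay, and the single-integrator plant are illustrative assumptions for this sketch, not a model we have validated.

import numpy as np

# Minimal closed-loop tracking simulation: a human-like controller
# (a simple gain with a reaction delay) steers a single-integrator
# "vehicle" toward a moving target signal.
dt, T = 0.01, 20.0                       # step size and duration (s)
t = np.arange(0.0, T, dt)
reference = np.sin(0.5 * t)              # target trajectory to be tracked
delay_steps = int(0.2 / dt)              # assumed ~200 ms human reaction delay
gain = 2.0                               # assumed operator gain

position = 0.0
error = np.zeros_like(t)                 # perceived tracking error over time
for i in range(len(t)):
    error[i] = reference[i] - position
    # the operator acts on the error perceived one reaction delay ago
    command = gain * error[i - delay_steps] if i >= delay_steps else 0.0
    position += command * dt             # integrator plant: command sets velocity

print("RMS tracking error:", np.sqrt(np.mean(error ** 2)))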

Main research areas
PHYSIOLOGICAL ESTIMATIONS OF PERCEPTUAL-MOTOR WORKLOAD
The goal of this project is to extract physiological features (e.g., from the EEG) that can reliably index the workload that an operator experiences in the domain of perceptual-motor control. Research into EEG markers of mental workload has tended to focus on aspects such as sustained attention or working memory. Here, we are motivated to estimate the perceptual-motor fatigue of the operator before potentially fatal decrements in performance occur.
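
As a minimal sketch of the kind of feature involved, the following Python fragment estimates the mean amplitude of an ERP component (e.g., a P3-like deflection) from single-channel EEG epochs time-locked to probe stimuli. The array layout, channel choice, and time windows are illustrative assumptions, not our exact pipeline.

import numpy as np

def erp_component_amplitude(epochs, fs, t_min,
                            component=(0.30, 0.45), baseline=(-0.10, 0.0)):
    # epochs: (n_trials, n_samples) array for one channel (e.g., Pz),
    #         each epoch starting at time t_min (s) relative to stimulus onset
    # fs:     sampling rate in Hz
    times = t_min + np.arange(epochs.shape[1]) / fs
    erp = epochs.mean(axis=0)                           # trial-averaged ERP
    base = erp[(times >= baseline[0]) & (times < baseline[1])].mean()
    comp = erp[(times >= component[0]) & (times < component[1])].mean()
    return comp - base                                  # amplitude re: baseline

# A smaller amplitude under high steering demand than under low demand would
# be read as fewer attentional resources remaining, i.e., higher workload.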

Scheer M, Bülthoff HH and Chuang LL (September-2014) Is the novelty-P3 suitable for indexing mental workload in steering tasks? 12th Biannual Conference of the German Cognitive Science Society (KogWis 2014), Springer, Berlin, Germany, S135-S136.

Flad N, Nieuwenhuizen FM, Bülthoff HH and Chuang LL (June-2014) System Delay in Flight Simulators Impairs Performance and Increases Physiological Workload. In: Engineering Psychology and Cognitive Ergonomics, 11th International Conference on Engineering Psychology and Cognitive Ergonomics (EPCE 2014), Springer, Berlin, Germany, 3-11.

DETECTION AND RECOGNITION DURING STEERING
High perceptual-motor demands can reduce our capacity to attend to secondary tasks. For example, we could fail to notice the sudden appearance of a crossing pedestrian, especially under severe driving conditions. In this line of research, we seek to understand how our capacity for detecting and recognizing peripheral events varies with increasing demands in the control task (e.g., instability).

Glatz C, Bülthoff HH and Chuang LL (September-2014) Looming auditory warnings initiate earlier event-related potentials in a manual steering task. 12th Biannual Conference of the German Cognitive Science Society (KogWis 2014), Tübingen, Germany, Cognitive Processing 15(Supplement 1) S38.

Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (September-2012) Looking for Discriminating Is Different from Looking for Looking's Sake. PLoS ONE 7(9) 1-9.

GAZE CONTROL FOR RELEVANT INFORMATION RETRIEVAL
We move our eyes to actively select and process task-relevant information in real time. By monitoring how eye movements are coordinated during control maneuvers, we are able to determine the aspects of the visual scene that support the operator’s control capabilities. Our research in this area has two emphases. The first involves developing algorithms for estimating, filtering, and analyzing natural gaze in real time and under challenging scenarios (e.g., a cockpit environment). The second targets a fundamental understanding of how eye movements are coordinated so as to handle shifts in task priorities.
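
For illustration, the Python sketch below implements the widely used dispersion-threshold (I-DT) algorithm for detecting fixations in a gaze stream. It is a textbook baseline rather than our own real-time pipeline, and the dispersion and duration thresholds are illustrative assumptions.

import numpy as np

def idt_fixations(x, y, t, max_disp=1.0, min_dur=0.10):
    # x, y: gaze position (e.g., degrees of visual angle), numpy arrays
    # t:    sample timestamps in seconds
    # returns a list of (t_start, t_end, centroid_x, centroid_y)
    fixations, i, n = [], 0, len(t)
    while i < n:
        # grow a window spanning at least the minimum fixation duration
        j = i
        while j < n and t[j] - t[i] < min_dur:
            j += 1
        if j >= n:
            break
        disp = (x[i:j + 1].max() - x[i:j + 1].min()
                + y[i:j + 1].max() - y[i:j + 1].min())
        if disp <= max_disp:
            # keep extending the window until dispersion is exceeded
            while j + 1 < n and (x[i:j + 2].max() - x[i:j + 2].min()
                                 + y[i:j + 2].max() - y[i:j + 2].min()) <= max_disp:
                j += 1
            fixations.append((t[i], t[j],
                              float(x[i:j + 1].mean()), float(y[i:j + 1].mean())))
            i = j + 1
        else:
            i += 1    # slide past noisy or saccadic samples
    return fixations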

Bonev B, Chuang LL and Escolano F (May-2013) How do image complexity, task demands and looking biases influence human gaze behavior? Pattern Recognition Letters 34(7) 723–730.

Bieg H-J, Bresciani J-P, Bülthoff HH and Chuang LL (October-2013) Saccade reaction time asymmetries during task-switching in pursuit tracking. Experimental Brain Research 230(3) 271-281.

ROBUST EEG MEASUREMENTS IN MOBILE WORKSPACES
EEG signals can suffer from artefacts due to electromagnetic noise or muscle activity. These noise sources can be amplified in settings that involve heavy use of electrical equipment and voluntary user movements, such as moving-base flight simulators. Here, we seek to enable EEG recordings in such demanding workspaces by developing robust measurement paradigms and filtering algorithms.
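
As a minimal sketch of this kind of cleaning, the Python fragment below notch-filters power-line noise, band-passes the signal, and rejects epochs with implausibly large amplitudes. The cutoff frequencies and rejection threshold are illustrative assumptions, not our published settings.

import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess_eeg(signal, fs, band=(1.0, 40.0), line_freq=50.0):
    # notch out power-line hum, common near electrical equipment
    b, a = iirnotch(line_freq, Q=30.0, fs=fs)
    signal = filtfilt(b, a, signal)
    # band-pass to attenuate slow drift and high-frequency muscle activity
    b, a = butter(4, band, btype="bandpass", fs=fs)
    return filtfilt(b, a, signal)

def reject_artifact_epochs(epochs, max_peak_to_peak=150e-6):
    # drop epochs whose peak-to-peak amplitude (in volts) suggests
    # movement artefacts rather than neural activity
    ptp = epochs.max(axis=1) - epochs.min(axis=1)
    return epochs[ptp <= max_peak_to_peak]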

Browatzki B, Bülthoff HH and Chuang LL (April-2014) A comparison of geometric- and regression-based mobile gaze-tracking. Frontiers in Human Neuroscience 8(200) 1-12.

Former lab members
Christiane Glatz: The influence of auditory warning cues during steering
Nina Flad: Visual information sampling with simultaneous EEG and eyetracking
Tim Schilling (w/ Zeiss Vision Lab): Role of tinted lenses in mitigating affect appraisal
Alexander Sipatchin: Role of tinted lenses in mitigating affect appraisal
Katharina-Marie Lahmer: Auditory warnings for emergency braking
Katrin Kunz: Driving simulation
Jonas Ditz: Mobile EEG
Menja Scheer: Mental workload during closed-loop control
Evangelia-Regkina Symeonidou: Haptic feedback during closed-loop control
Monika Marsching: Eye-movements during flight training
Marius Klug: A software framework for multimodal user sensing
Silke Wittkowski: The influence of environmental sounds during steering
Jonas Walter: The influence of field-of-view in visuomotor training
Hans-Joachim Bieg (Bosch GmbH): Mid-level eye movements
Björn Browatzki: Methods for mobile gaze tracking
Anne Geuzebroerk: Attentional tunneling during closed-loop control
Riya Paul: EEG signal processing in a moving-base simulator
Jon Allsop: Influence of anxiety on eye-movement planning      


Curriculum Vitae

Lewis Chuang is a project leader at the Max Planck Institute for Biological Cybernetics for “Cognition and Control in Human-Machine Systems”. In particular, he investigates how humans process information in order to interact with and control complex machines (e.g., vehicles) [1]. In parallel, he collaborates with computer scientists and engineers to select between different designs for human-machine communications. For instance, he evaluated how haptic force-feedback should be provided to teleoperators of swarms of unmanned aerial vehicles in order to help them avoid potential unseen collisions [2]. Lewis Chuang is motivated to observe human behavior without interfering with human-machine interactions. Thus, he has developed novel neuroscientific methods for unrestrained gaze-tracking [3] and for evaluating task engagement during steering [4]. These have proved invaluable in his external collaborations on visual computing [5][6] and in understanding the challenges of a consumer-level flying car [7]. Currently, he is engaged in identifying the potential dangers that the transition from manual to autonomous driving will pose, as everyday drivers fail to understand the real capabilities of assisted driving technologies and designers fail to communicate their expectations of driver participation effectively [8][9][10]. A full C.V. is available upon request.


[1] Chuang, L. Error visualization and information-seeking behavior for air-vehicle control. In Schmorrow, D. and Fidopiastis, C., Eds., Foundations of Augmented Cognition. Lecture Notes in Artificial Intelligence, 9183, 3–11, Aug 2015.

[2] Son, H., Franchi, A., Chuang, L. L., Kim, J., Bülthoff, H. H., and Robuffo Giordano, P. Human-centered design and evaluation of haptic cueing for teleoperation of multiple mobile robots. IEEE Transactions on Systems, Man and Cybernetics, 43(2), 597–609, Apr 2013.

[3] Browatzki, B., Bülthoff, H. H., and Chuang, L. L. A comparison of geometric- and regression-based mobile gaze-tracking. Frontiers in Human Neuroscience, 8(200), 1–12, Apr 2014.

[4] Scheer, M., Bülthoff, H. H., and Chuang, L. L. Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds. Frontiers in Human Neuroscience, 10(73), Feb 2016.

[5] Transregional Research Centre for Quantitative Methods for Visual Computing (www.trr161.de), supported by Deutsche Forschungsgemeinschaft, Germany (TRR161-C03).

[6] Burch, M., Chuang, L., Fischer, B., Schmidt, A., and Weiskopf, D. Eye Tracking and Visualization: Foundations, Techniques, and Applications. Springer, Cham, Switzerland, 2017, in press.

[7] myCopter: Enabling Technologies for Personal Aerial Transportation Systems (www.mycopter.eu), supported by European Union’s Seventh Framework Programme (#266470)

[8] Sadeghian, S., Chuang, L. L., Heuten, W., and Boll, S. Assisting drivers with ambient take-over requests in highly automated driving. In Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (Auto-UI 2016), 1–8, Oct 2016.

[9] Löcken, A., Borojeni, S. S., Müller, H., Gable, T. M., Triberti, S., Diels, C., Glatz, C., Alvarez, I., Chuang, L. L., and Boll, S. Towards adaptive ambient in-vehicle displays and interactions: Insights and design guidelines from the 2015 Automotive-UI dedicated workshop. In Meixner, G. and Müller, C., Eds., Automotive User Interfaces - Creating Interactive Experiences in the Car. Springer, Berlin, Germany, in press.

[10] Chuang, L. L., Gehring, S., Kay, J., Olivier, P., and Schmidt, A. Ambient notification environments (Dagstuhl Seminar 17161). Schloss Dagstuhl - Leibniz-Zentrum für Informatik, Apr 2017.
