Project Leader

Dr. Lewis Chuang
Phone: +49 7071 601-608
Fax: +49 7071 601-616
lewis.chuang[at]tuebingen.mpg.de

Recent news

2016-Feb-15

[Paper] We have a new paper on how steering decreases involuntary attention [more]
 
2015-June-12

[Paper] We have a new paper on eye-movements to moving targets [more]
 
2015-June-05

[Thesis] Ms. Silke Wittkowski has completed her Master's thesis in Neural and Behavioral Sciences titled, "The influence of environmental sounds during steering." [more]
 
2015-May-21

[Funding] Dr. Lewis Chuang has been jointly awarded, with Konstanz and Stuttgart Universities, a DFG Sonderforschungsbereich grant, 'Quantitative Methods for Visual Computing'. [more]

Five most recent Publications

Chuang LL, Gehring S, Kay J and Schmidt A: Ambient Notification Environments, Dagstuhl Seminar 17161, Leibniz-Zentrum für Informatik, Schloss Dagstuhl, Germany (April 2017), in press. Series: Dagstuhl Reports.
Chuang LL (November-5-2015) Invited Lecture: Beyond Steering in Human-Centered Closed-Loop Control, Institute for Neural Computation: INC Chalk Talk Series, San Diego, CA, USA.
Scheer M, Bülthoff HH and Chuang LL (October-2015) On the influence of steering on the orienting response. In: Trends in Neuroergonomics, 11. Berliner Werkstatt Mensch-Maschine-Systeme, Universitätsverlag der TU Berlin, Berlin, Germany, 24.
Chuang L (September-16-2015) Invited Lecture: Non-obtrusive measurements of attention and workload in steering, DSC 2015 Europe: Driving Simulation Conference & Exhibition, Tübingen, Germany.
Glatz C, Bülthoff HH und Chuang LL (September-1-2015) Attention Enhancement During Steering Through Auditory Warning Signals, Workshop on Adaptive Ambient In-Vehicle Displays and Interactions In conjunction with AutomotiveUI 2015 (WAADI'15), 1-5.

Export as:
BibTeX, XML, pubman, Edoc, RTF

 

[2016-Feb-15] New paper: Steering decreases involuntary distraction

We have just published a new study demonstrating that manipulations of steering difficulty are reflected in the ERPs generated by task-irrelevant environmental sounds. This represents a first step towards evaluating vehicle handling demands without requiring obtrusive secondary tasks.

Scheer M, Bülthoff HH and Chuang LL (2016). Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds. Frontiers in Human Neuroscience, 10:73.
journal.frontiersin.org/article/10.3389/fnhum.2016.00073/abstract


[2015-June-12] New Experimental Brain Research paper

Imagine yourself as a watchful hunter. A boar appears in your field of view and scurries either towards or away from where you are looking. If the boar is running away, your eye will obviously have to "chase" after it in order to take accurate aim before it disappears. In contrast, it is less clear how to move one's eye if the boar is running towards your current line of sight. In this case, you have the choice of either waiting for the boar to intersect your line of sight or moving your eye to the approaching boar. How does the brain compute this decision?
 
Contrary to intuition, this decision does not depend on how far away the moving target is. Instead, our eye-movement planning system is more sophisticated: it estimates the time that the moving target will take to intersect one's current line of sight, based on the target's speed. In the current paper, we find that participants were more likely to move their eyes if a target was expected to take more than 300 ms to intersect with their gaze. Furthermore, the decision to move one's eye to an approaching target takes more time than the decision to move it to a receding target. This work reveals that our brain is capable of making efficient decisions on where to look in a split second.
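The reported decision rule can be illustrated as a simple threshold on estimated time-to-intersection. This is a hypothetical sketch for illustration only, not the authors' actual model; the function name, units, and the handling of non-approaching targets are our assumptions.

```python
def should_saccade(distance_deg: float, approach_speed_deg_s: float,
                   threshold_s: float = 0.3) -> bool:
    """Sketch of the reported decision rule for an approaching target.

    distance_deg: angular distance between target and current gaze (degrees).
    approach_speed_deg_s: target speed towards the line of sight (deg/s).
    threshold_s: time-to-intersection threshold (~300 ms in the paper).
    Returns True if the eyes should move to the target rather than wait.
    """
    if approach_speed_deg_s <= 0:
        # Target is not approaching the line of sight, so waiting
        # would never intersect it: the gaze must chase the target.
        return True
    time_to_intersect = distance_deg / approach_speed_deg_s
    # Move the eyes only if intersection would take longer than ~300 ms.
    return time_to_intersect > threshold_s
```

Under this sketch, a target 2 degrees away approaching at 10 deg/s (200 ms to intersection) would be waited for, whereas one 10 degrees away at the same speed (1 s to intersection) would trigger a saccade.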


Bieg H-J, Chuang LL, Bülthoff HH and Bresciani J-P (June-2015) Asymmetric saccade reaction times to smooth pursuit. Experimental Brain Research, Epub ahead of print.
 
http://link.springer.com/article/10.1007/s00221-015-4323-8?wt_mc=email.event.1.SEM.ArticleAuthorOnlineFirst

[2015-June-05] Ms. Silke Wittkowski has completed her Master's thesis in Neural and Behavioral Sciences

Her research addresses how the human brain orients to novel environmental sounds during steering. Previous work has shown that hearing an infrequent familiar sound, such as a dog bark, can induce a characteristic positive deflection in EEG signals ~300 ms later (i.e., the P3a). This has been treated as an indicator of involuntary attention orienting, or a "call for attention". It has also been employed to index user workload: it has been reasoned that a heavier workload could reduce our tendency to be distracted by task-irrelevant stimuli.
 
However, the repeated presentation of familiar sounds could result in habituation or adaptation. This can be expected to increase the variability of the P3a response, thus diminishing its reliability as an index of user state. Ms. Wittkowski created a new set of novel, unfamiliar sounds that contain the statistical regularities of familiar sounds yet remain unrecognizable. These sounds induce a larger P3a than familiar environmental sounds and show less variability across repetitions. More importantly, the P3a amplitude is reduced when participants are required to perform a manual steering task, suggesting that it can be employed in further studies designed to investigate users' attention during steering.
 
We wish Silke all the best in her future endeavors.

[2015-May-21] SFB funding for Quantitative Methods for Visual Computing

Dr. Lewis Chuang has been jointly awarded, with Konstanz and Stuttgart Universities, funding to establish a special research focus titled 'Quantitative Methods for Visual Computing'. The goal of this project is to derive perception-based approaches for evaluating the quality of visualizations, such as those used in medical imaging, virtual reality, and communication media.
 
Information is predominantly communicated via the visual medium. From maps to meteorological reports to word clouds of popular Twitter hashtags, the diversity of visual properties (i.e., size, color, lines, etc.) to which our visual system is responsive provides designers with a large palette to choose from. Our confidence in our ability to perceive visual information has coined popular phrases such as "I'll believe it when I see it" or, for the internet-savvy, "pics or it didn't happen". Thus, our inability to determine whether a dress is blue or gold can result in unexpected levels of controversy. The numerous visual illusions that never cease to bemuse us attest to our limited visual capabilities.
 
Visual computing refers to a flourishing field in computer science that strives to develop principles and algorithms for generating visual information. The applications that benefit include medical imaging for diagnostic purposes as well as computer graphics for entertainment. Regardless of the application, the ubiquitous challenge lies in transforming abstract data to perceivable information. How should data be represented? How can costs be reduced for real-time computations? What are intuitive interfaces that will support interactive information seeking behavior? These are challenging issues that demand constant innovation and novel algorithms.
 
This DFG-funded initiative constitutes a collaboration of internationally renowned experts in computer science from the Universities of Konstanz and Stuttgart. The motivation is to develop methods for visual computing that are relevant to human perception. It is in this regard that the Max Planck Institute for Biological Cybernetics (MPI-BC) will play a crucial role. Specifically, the MPI-BC will work closely with partners within this framework to apply methods drawn from psychophysics and neuroscience to the development and evaluation of visual computing methods.
Last updated: Monday, 15.02.2016