Frank Nieuwenhuizen

Alumnus of the Department of Human Perception, Cognition and Action
Alumnus of the Group Cybernetics Approach to Perception and Action

Main Focus


Enabling technologies for personal air transport systems

The volume of road transportation continues to increase, and the associated financial and environmental impact fuels public concern. Individual drivers spend considerable time in congested urban agglomerations or on highly frequented highways, which leads to significant losses to the European economy.

An envisioned Personal Aerial Vehicle.

A Personal Air Transport System (PATS) has been proposed as a radical solution to congestion and as an alternative to current transport systems. A PATS would use Personal Aerial Vehicles (PAVs) to bridge the gap between relatively slow cars in a road-based door-to-door system and an air transport system that provides fast, longer journeys between specific locations. Previous projects related to PAVs have focused on the design of the vehicle itself. However, surrounding issues such as the concept of operations, business models and target users have not been comprehensively considered, even though addressing them is a necessary requirement for a PATS to be operated commercially.

The myCopter project approaches the development of a PATS by investigating the technologies needed to enable the operational infrastructure required for the use of PAVs on a large scale. At the Max Planck Institute for Biological Cybernetics, the interaction between the pilot and the vehicle will be investigated. Even though a PAV is likely to be autonomous to a high degree, the pilot will still be expected to interact with the vehicle. Thus, human-machine interfaces (HMI) should take into account the perceptual and cognitive capabilities of the PAV user. We will introduce novel concepts for the interaction between the human and the vehicle, and the benefits of, for example, synthetic vision displays and force feedback will be evaluated in motion simulators.

Multimodal identification of human control behaviour

Human cognitive behaviour can be subdivided into three levels:

1) knowledge-based behaviour that describes high-level problem solving;

2) rule-based behaviour that is determined by rules and behaviour learned in the past; and

3) skill-based behaviour that involves elementary human information processing and basic control tasks.

Considering skill-based behaviour in a simulator environment provides an objective means to assess the influence of various simulator settings on human control behaviour. By taking a cybernetic approach, skill-based behaviour can be assessed in experimental simulator trials. In this approach, a mathematical model is fit to the measured response of a pilot, and changes in the identified parameters serve as a measure for the adaptation of human behaviour. By performing tasks in which a pilot tracks a target while simultaneously rejecting a disturbance, the contributions of the visual and vestibular senses can be distinguished. Observed changes in the performance measures derived from the measured pilot response can then be correlated with changes in identified control behaviour.
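The fitting step described above can be sketched in a few lines. The example below is a simplified, hypothetical illustration (not the identification technique actually used in these experiments): a pilot is modelled as a pure gain with an effective time delay, Yp(jω) = K·e^(−jωτ), and the two parameters are recovered from a measured frequency response. The function names, the gain K and delay τ values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: model the pilot as a gain K with an
# effective time delay tau, and identify both parameters from a
# measured frequency response. Parameter values are made up.

def pilot_model(w, K, tau):
    """Frequency response of a gain-plus-delay pilot model."""
    return K * np.exp(-1j * w * tau)

def fit_pilot(w, measured):
    """Recover K from the magnitude and tau from the phase slope."""
    K = np.mean(np.abs(measured))
    phase = np.unwrap(np.angle(measured))
    tau = -np.polyfit(w, phase, 1)[0]  # phase of a pure delay is -w*tau
    return K, tau

w = np.linspace(0.5, 10.0, 40)   # rad/s, typical manual-control band
measured = pilot_model(w, K=2.0, tau=0.3)  # stands in for measured data

K_hat, tau_hat = fit_pilot(w, measured)
print(K_hat, tau_hat)  # recovers the gain 2.0 and delay 0.3 s
```

In practice the identified model has more parameters (separate visual and vestibular channels, neuromuscular dynamics), but the principle is the same: parameter changes across simulator conditions quantify the adaptation of control behaviour.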

A block diagram representation of a multiloop control task.

Intuitive Control Interfaces for Personal Aerial Vehicles


Personal Aerial Vehicles (PAVs) have been presented as a potential solution to problems associated with predicted future traffic volumes [1]. Control of such vehicles is likely to be highly automated, but automation can also have undesirable effects, especially during control of safety-critical dynamic processes in unpredictable environments. Therefore, users should remain in control of their PAV through a control interface that provides them with continuous feedback.


The goal of this project is to integrate a haptic shared control framework and a Highway-in-the-Sky (HITS) display (see Figure) with a dynamic model of a highly augmented PAV, and to investigate whether this combination results in an easy-to-use control interface and better performance for pilots with limited flight experience [2]. For this purpose, we experimentally evaluated to what extent participants without formal flight training benefit from haptic cues and enhanced visual information.


Participants followed a flight trajectory depicted by a HITS display in a fixed-base flight simulator, with and without haptic guidance. This steering task lasted 70 seconds. Three HITS display configurations were evaluated: a tunnel representation, a wall representation and a highway representation of the flight trajectory (see Figure). The guidance forces were calculated from the geometrical relation between the predicted position of the PAV and the flight trajectory. The prediction time t_pred was varied between 0, 1.5 and 3 seconds.
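The guidance law just described can be illustrated with a minimal sketch: the sidestick force is proportional to the lateral error of the vehicle's predicted position relative to the reference trajectory. The constant-velocity prediction, the gain k_f and all names below are illustrative assumptions, not the values used in the experiment.

```python
# Hypothetical sketch of the haptic guidance law: the force depends on
# the lateral error of the PAV's *predicted* position with respect to
# the reference trajectory. Gain and prediction model are assumptions.

def predicted_lateral_error(y, y_dot, y_ref, t_pred):
    """Lateral error t_pred seconds ahead, assuming constant velocity."""
    y_pred = y + y_dot * t_pred
    return y_pred - y_ref

def guidance_force(y, y_dot, y_ref, t_pred, k_f=2.0):
    """Sidestick force [N] nudging the pilot back toward the trajectory."""
    return -k_f * predicted_lateral_error(y, y_dot, y_ref, t_pred)

# A PAV 5 m right of the trajectory, already drifting back toward it:
# with t_pred = 0 the cue reacts to the instantaneous error only, while
# longer prediction times account for the vehicle's own motion.
for t_pred in (0.0, 1.5, 3.0):
    print(t_pred, guidance_force(y=5.0, y_dot=-1.0, y_ref=0.0, t_pred=t_pred))
```

Note how the commanded force shrinks as t_pred grows in this example: the vehicle is already converging on the trajectory, so a predictive cue demands less correction, which is consistent with the lower pilot effort observed for larger prediction times.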

Initial results

The experimental results show that, when pilots are not supported with haptic cues, the highway representation of the flight trajectory leads to worse performance (highest error) in following the trajectory than the wall or tunnel representation. When pilots have access to haptic guidance forces, performance improves, as they become better at minimizing the error with respect to the flight trajectory. In general, an increase in prediction time leads to better performance. During turns, haptic guidance also resulted in lower pilot effort, as measured through the variance of the pilot's lateral input force on the sidestick. In straight flight segments, however, pilot effort was slightly higher with any haptic guidance, which indicates that the haptic cues provided in this condition might not entirely match the pilot's intentions.

Initial conclusion

Haptic guidance cues on the sidestick allowed pilots to achieve better performance (lower error) with lower control activity. However, pilots had to increase their control effort when the haptic guidance cues were based on the instantaneous error instead of the predicted error of the position of the PAV with respect to the flight trajectory. A tunnel and a wall representation of the flight trajectory led to the best performance (lowest error), whereas a highway representation resulted in worse performance and higher control activity and effort. The combination of a haptic shared control framework and a HITS display can provide pilots with limited flight experience with an easy-to-use control interface for flying a PAV.


1. Nieuwenhuizen, F. M., Jump, M., Perfect, P., White, M. D., Padfield, G. D., Floreano, D., Schill, F., Zufferey, J.-C., Fua, P., Bouabdallah, S., Siegwart, R., Meyer, S., Schippl, J., Decker, M., Gursky, B., Höfinger, M. and Bülthoff, H. H. (2011). myCopter: Enabling Technologies for Personal Aerial Transportation Systems. In: 3rd International HELI World Conference "Helicopter Technologies and Operations" (HeliWorld 2011).

2. Nieuwenhuizen, F. M. and Bülthoff, H. H. (2014). Evaluation of Haptic Shared Control and a Highway-in-the-Sky Display for Personal Aerial Vehicles. In: AIAA Science and Technology Forum and Exposition (SciTech 2014), AIAA 2014-0808.

Curriculum Vitae

Frank is a project leader at the Max Planck Institute for Biological Cybernetics. His interests include identification of human behaviour in closed-loop control tasks, haptics for human-machine interfaces, and human factors. He was the project leader for the EU project "myCopter - Enabling Technologies for Personal Aerial Vehicles", which ran between January 2011 and December 2014. In this project, he was responsible for the management of the project and its dissemination activities.

Frank undertook his Ph.D. research in collaboration with Delft University of Technology. His research focused on investigating the effects of simulator motion system characteristics on pilot control behaviour. He used several techniques to identify pilot behaviour in closed-loop control tasks and to assess the influence of motion system properties on human control behaviour.

Frank was born in Haarlem, The Netherlands, in March 1981. In September 1999 he started studying at the Faculty of Aerospace Engineering in Delft and graduated in December 2005 with a thesis on the development of a novel identification technique for pilot control behaviour and its application in an experiment on the perception of visual and motion cues during control of self-motion in optic flow environments.
