Group leader

Dr. Paolo Stegagno
Phone: +49 7071-601-218
Fax: +49 7071 601-616
Email: paolo.stegagno[at]
Website

Recent Journal Publications

Grabe V, Bülthoff HH, Scaramuzza D and Robuffo Giordano P (July-2015) Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV International Journal of Robotics Research 34(8) 1114-1135.
Ryll M, Bülthoff HH and Robuffo Giordano P (February-2015) A Novel Overactuated Quadrotor Unmanned Aerial Vehicle: Modeling, Control, and Experimental Validation IEEE Transactions on Control Systems Technology 23(2) 540-556.
Zelazo D, Franchi A, Bülthoff HH and Robuffo Giordano P (January-2015) Decentralized rigidity maintenance control with range measurements for multi-robot systems International Journal of Robotics Research 34(1) 105-128.
Franchi A, Oriolo G and Stegagno P (September-2013) Mutual Localization in Multi-Robot Systems using Anonymous Relative Measurements International Journal of Robotics Research 32(11) 1302-1322.
Lee D, Franchi A, Son HI, Ha CS, Bülthoff HH and Robuffo Giordano P (August-2013) Semiautonomous Haptic Teleoperation Control Architecture of Multiple Unmanned Aerial Vehicles IEEE/ASME Transactions on Mechatronics 18(4) 1334-1345.
Son HI, Franchi A, Chuang LL, Kim J, Bülthoff HH and Robuffo Giordano P (April-2013) Human-Centered Design and Evaluation of Haptic Cueing for Teleoperation of Multiple Mobile Robots IEEE Transactions on Cybernetics 43(2) 597-609.
Censi A, Franchi A, Marchionni L and Oriolo G (April-2013) Simultaneous Calibration of Odometry and Sensor Parameters for Mobile Robots IEEE Transactions on Robotics 29(2) 475-492.
Robuffo Giordano P, Franchi A, Secchi C and Bülthoff HH (March-2013) A Passivity-Based Decentralized Strategy for Generalized Connectivity Maintenance International Journal of Robotics Research 32(3) 299-323.
Franchi A, Secchi C, Son HI, Bülthoff HH and Robuffo Giordano P (October-2012) Bilateral Teleoperation of Groups of Mobile Robots with Time-Varying Topology IEEE Transactions on Robotics 28(5) 1019-1033.
Franchi A, Masone C, Grabe V, Ryll M, Bülthoff HH and Robuffo Giordano P (October-2012) Modeling and Control of UAV Bearing-Formations with Bilateral High-Level Steering International Journal of Robotics Research 31(12) 1504-1525.
Franchi A, Secchi C, Ryll M, Bülthoff HH and Robuffo Giordano P (September-2012) Shared Control: Balancing Autonomy and Human Assistance with a Group of Quadrotor UAVs IEEE Robotics & Automation Magazine 19(3) 57-68.



Aerial Robot Estimation and Control with Onboard Sensors

Many recent works on aggressive flight maneuvers rely heavily on precise external tracking systems. This makes them very expensive and greatly limits their flexibility, as they cannot be deployed in unstructured environments such as natural disaster sites. We therefore focus on cameras and on-board processing, in order to be independent of both transmission limitations to ground stations and the aforementioned tracking systems. This requires efficient ways to cope with the restricted on-board computational power.
However, existing visual systems mostly either use prebuilt maps of known environments or build a map as the robot moves, using simultaneous localization and mapping (SLAM) approaches. These methods usually fail once tracking is lost, even for short periods of time.
To avoid this problem, we propose a robust velocity control scheme that requires only an estimate of the velocity rather than of the full 3D configuration. Depending on the onboard sensors and the underlying assumptions, different strategies can be applied to estimate the velocity.
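The core idea can be illustrated with a minimal sketch of a velocity-level controller: given only a velocity estimate (no absolute position), a proportional law on the velocity error yields a desired thrust vector. All gains, the mass value, and the function name are illustrative assumptions, not values from our platforms.

```python
import numpy as np

def velocity_controller(v_est, v_des, kp=1.2, g=9.81, mass=0.9):
    """Toy proportional velocity controller (illustrative gains/mass).

    v_est, v_des: current and desired 3D velocity in the world frame [m/s].
    Returns the scalar thrust command and the desired body z-axis.
    """
    a_des = kp * (np.asarray(v_des, float) - np.asarray(v_est, float))
    # The total thrust must also compensate gravity (world z points up).
    f_des = mass * (a_des + np.array([0.0, 0.0, g]))
    thrust = np.linalg.norm(f_des)   # scalar thrust command
    z_body = f_des / thrust          # desired orientation of the body z-axis
    return thrust, z_body
```

Note that no position term appears anywhere: a drift in the position estimate does not affect this control law, which is why pure velocity estimation suffices.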
For this research, we have implemented two different platforms based on the MikroKopter UAV kit (500g payload).

RGB-D Based Autonomous Velocity Control

In the development of this platform we have not made any particular assumptions, equipping the quadrotor with an RGB-D sensor. The extraction of velocity measurements relies on the integration of the DVO (Dense Visual Odometry) software from TUM, which can in general compute an estimate of the camera position in space without a full map of the environment. Although this position estimate is affected by an unavoidable cumulative error (because the position is not observable), the velocity can be geometrically derived from it, filtered, and fused with IMU information to obtain a reliable estimate.
In addition, the RGB-D sensor is used to compute a local map of the obstacles in the surroundings of the robot, and obstacle avoidance techniques are employed to improve the safety and autonomy of the system. The resulting platform relies only on on-board sensors, makes no particular assumptions on the environment, and is suitable both for autonomous navigation and for teleoperation.
Since our current setup employs an ASUS Xtion Pro Live as RGB-D sensor, the platform is currently suitable only for indoor environments. However, we plan to extend it to outdoor applications by integrating a more conventional stereo camera and a GPS.
Watch a video of the platform here.
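The derivation of a drift-free velocity from drifting pose estimates can be sketched as follows: differentiate consecutive visual-odometry positions and blend the result with an IMU prediction via a complementary filter. The blending weight and function name are illustrative assumptions; our actual pipeline uses a proper filter on top of DVO.

```python
import numpy as np

def fuse_velocity(v_prev, p_prev, p_curr, a_imu, dt, alpha=0.9):
    """Complementary-filter sketch (alpha is an illustrative weight).

    p_prev, p_curr: consecutive position estimates from visual odometry.
    a_imu: gravity-compensated acceleration from the IMU.
    Cumulative drift in the positions cancels in the finite difference.
    """
    v_vo = (np.asarray(p_curr, float) - np.asarray(p_prev, float)) / dt
    v_pred = np.asarray(v_prev, float) + np.asarray(a_imu, float) * dt
    return alpha * v_vo + (1.0 - alpha) * v_pred
```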

Obstacle Tracking and Avoidance

The depth map from the RGB-D sensor can be used efficiently to track the obstacles in the vicinity of the robot using multi-target tracking techniques. In particular, a robot-centered bin-occupancy filter on a limited domain surrounding the robot allows obstacle avoidance in all directions of motion, not only within the field of view of the sensor. Additionally, the computational cost is constant and does not grow over time, while retaining the benefits of a fully probabilistic approach.

Monocular Camera-based Ego-Motion Estimation using the Continuous Homography Constraint

On this platform we have addressed the problem of motion estimation from two consecutive frames by exploiting the continuous homography constraint in the presence of flat environments (as found in most indoor scenarios, as well as when flying at high altitude), which permits a closed-form decomposition of the homography matrix. This reveals both the rotation and the translation of the UAV and allows for robust velocity control.
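The estimation step can be sketched as a linear least-squares fit: for normalized image points x = (x, y, 1) on a planar scene with image velocities u, the continuous homography H satisfies u = Hx − (e3ᵀHx)x, which is linear in the entries of H. The sketch below makes the standard caveat explicit: H is only recoverable up to an additive multiple of the identity, an ambiguity that the subsequent closed-form decomposition must resolve (e.g. using the known plane normal). Function name and data layout are illustrative.

```python
import numpy as np

def estimate_continuous_homography(points, flows):
    """Fit H (3x3) from normalized points x_i = (x, y, 1) and their image
    velocities u_i = (u, v), using u = H x - (e3^T H x) x.
    The least-squares solution is unique only up to H + lambda*I."""
    rows, rhs = [], []
    for x, u in zip(points, flows):
        xt = np.asarray(x, float)
        # u-component equation: row1(H).x - x * row3(H).x = u
        rows.append(np.concatenate([xt, np.zeros(3), -xt[0] * xt]))
        rhs.append(u[0])
        # v-component equation: row2(H).x - y * row3(H).x = v
        rows.append(np.concatenate([np.zeros(3), xt, -xt[1] * xt]))
        rhs.append(u[1])
    # Minimum-norm least squares (system is rank-deficient by one)
    h, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return h.reshape(3, 3)
```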


As the metric scale of the observed velocities cannot be obtained from visual information alone, we fuse the camera measurements with the acceleration readings from the on-board inertial measurement unit, using different scale estimation techniques.
The algorithms were shown to work reliably at around 35 frames per second on our evaluation setup, consisting of:
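One simple batch scheme of this kind can be sketched as follows: the unscaled visual velocity, once multiplied by the true scale and differentiated, should match the gravity-compensated IMU acceleration, so the scale follows from a one-parameter least-squares fit. This is only one of several possible schemes (the referenced papers compare them); the function name and alignment choices are assumptions of this sketch.

```python
import numpy as np

def estimate_scale(v_visual, a_imu, dt):
    """Least-squares metric scale from unscaled visual velocity and IMU
    acceleration (gravity already removed). Minimizes ||s*dv - a||^2."""
    v = np.asarray(v_visual, float)
    a = np.asarray(a_imu, float)
    dv = np.diff(v, axis=0) / dt          # unscaled acceleration from vision
    a_mid = 0.5 * (a[1:] + a[:-1])        # align IMU samples with midpoints
    return np.sum(dv * a_mid) / np.sum(dv * dv)
```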
  • Mikrokopter UAV kit (500g payload)
  • Intel Atom 1.83GHz dual-core CPU
  • Single monochrome camera with VGA resolution
  • Inertial measurement unit (IMU)
Our algorithms make use of the OpenCV library and ROS, the Robot Operating System. Furthermore, they were integrated into our controller framework TeleKyb.
Visit Volker Grabe's page for more information on the topic.

Essential Publications in this Topic

Grabe V, Bülthoff HH and Robuffo Giordano P (November-2013) A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), IEEE, Piscataway, NJ, USA, 5193-5200.
CiteID: GrabeBR2013
Stegagno P, Basile M, Bülthoff HH and Franchi A (November-2013) Vision-based Autonomous Control of a Quadrotor UAV using an Onboard RGB-D Camera and its Application to Haptic Teleoperation, 2nd Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS 2013), Pergamon Press, Oxford, UK, IFAC Proceedings Volumes, 46(30), 87-92.
CiteID: StegagnoBBF2013
Grabe V, Bülthoff HH and Robuffo Giordano P (October-2012) Robust Optical-Flow Based Self-Motion Estimation for a Quadrotor UAV, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2012), IEEE, Piscataway, NJ, USA, 2153-2159.
CiteID: GrabeBR2012

Last updated: Friday, 23.02.2018