Project Leader

Dr. Paolo Stegagno
Phone: +49 7071-601-218
Fax: +49 7071 601-616
E-mail: paolo.stegagno[at]
Website

Recent Journal Publications

Grabe V, Bülthoff HH, Scaramuzza D and Robuffo Giordano P (July-2015) Nonlinear ego-motion estimation from optical flow for online control of a quadrotor UAV International Journal of Robotics Research 34(8) 1114-1135.
Ryll M, Bülthoff HH and Robuffo Giordano P (February-2015) A Novel Overactuated Quadrotor Unmanned Aerial Vehicle: Modeling, Control, and Experimental Validation IEEE Transactions on Control Systems Technology 23(2) 540-556.
Zelazo D, Franchi A, Bülthoff HH and Robuffo Giordano P (January-2015) Decentralized rigidity maintenance control with range measurements for multi-robot systems International Journal of Robotics Research 34(1) 105-128.
Franchi A, Oriolo G and Stegagno P (September-2013) Mutual Localization in Multi-Robot Systems using Anonymous Relative Measurements International Journal of Robotics Research 32(11) 1302-1322.
Lee D, Franchi A, Son HI, Ha CS, Bülthoff HH and Robuffo Giordano P (August-2013) Semiautonomous Haptic Teleoperation Control Architecture of Multiple Unmanned Aerial Vehicles IEEE/ASME Transactions on Mechatronics 18(4) 1334-1345.
Son HI, Franchi A, Chuang LL, Kim J, Bülthoff HH and Robuffo Giordano P (April-2013) Human-Centered Design and Evaluation of Haptic Cueing for Teleoperation of Multiple Mobile Robots IEEE Transactions on Cybernetics 43(2) 597-609.
Censi A, Franchi A, Marchionni L and Oriolo G (April-2013) Simultaneous Calibration of Odometry and Sensor Parameters for Mobile Robots IEEE Transactions on Robotics 29(2) 475-492.
Robuffo Giordano P, Franchi A, Secchi C and Bülthoff HH (March-2013) A Passivity-Based Decentralized Strategy for Generalized Connectivity Maintenance International Journal of Robotics Research 32(3) 299-323.
Franchi A, Secchi C, Son HI, Bülthoff HH and Robuffo Giordano P (October-2012) Bilateral Teleoperation of Groups of Mobile Robots with Time-Varying Topology IEEE Transactions on Robotics 28(5) 1019-1033.
Franchi A, Masone C, Grabe V, Ryll M, Bülthoff HH and Robuffo Giordano P (October-2012) Modeling and Control of UAV Bearing-Formations with Bilateral High-Level Steering International Journal of Robotics Research 31(12) 1504-1525.
Franchi A, Secchi C, Ryll M, Bülthoff HH and Robuffo Giordano P (September-2012) Shared Control: Balancing Autonomy and Human Assistance with a Group of Quadrotor UAVs IEEE Robotics & Automation Magazine 19(3) 57-68.



Environment Sensing and Classification

In order to interact effectively with the environment, an autonomous system must be able to sense and understand its surroundings. The faster and more precisely a robot can identify the target of its action and distinguish among different objects in the environment, the more efficiently it can fulfill its task. However, autonomous sensing still relies on computer vision algorithms that are computationally expensive and often unfeasible online.
One of our research areas concerns the study of new sensing paradigms that can improve the understanding of the environment at low computational cost, thereby increasing the autonomy of robotic systems.

Development of A Low-cost Spectral Camera for Online Vegetation and Water Identification

An example of unsupervised classification of the DR and NIR images
In general, each material reflects a different percentage of electromagnetic (EM) radiation at each wavelength. By plotting these values against wavelength, one obtains what is called the "reflectance" of the inspected material. The reflectance is unique to each material and constitutes a sort of spectral signature.
Knowledge of the spectral signatures of different materials is used in remote sensing to classify objects sensed with spectral cameras, i.e., cameras that can collect multiple images at many wavelengths of interest. This information can be useful onboard robots to identify materials and thereby identify and classify surrounding objects. However, reflectance sensors are usually expensive and heavy, and therefore unsuitable for mounting on Micro Aerial Vehicles (MAVs). Some principles of remote sensing can nevertheless be applied onboard by developing a camera array in which each camera collects images at specific wavelengths of interest.
To this end, we are developing a small camera array consisting of three USB cameras able to collect images in the Dark Red (DR, 660 nm) and Near-Infrared (NIR, 850 nm) bands. The collected images are then suitable for computing the Normalized Difference Vegetation Index (NDVI), which is normally used in remote sensing to identify vegetation and water in satellite or airborne images. An automatic classification algorithm is then applied to the computed NDVI matrix in order to reconstruct the constituent materials of the sensed objects.
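As a rough illustration, the per-pixel NDVI computation and a simple threshold-based labeling can be sketched as below. This is a minimal sketch, not the group's actual code: the thresholds and image values are hypothetical, and the fixed thresholds stand in for the unsupervised classification step described above.

```python
import numpy as np

def ndvi(dr, nir, eps=1e-6):
    """Per-pixel NDVI = (NIR - DR) / (NIR + DR).

    Vegetation reflects strongly in the near-infrared band and absorbs
    red light, so it yields high NDVI; water absorbs NIR, yielding
    negative NDVI. eps avoids division by zero on dark pixels.
    """
    dr = dr.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - dr) / (nir + dr + eps)

def classify(ndvi_map):
    """Illustrative fixed thresholds (hypothetical values)."""
    labels = np.full(ndvi_map.shape, "other", dtype=object)
    labels[ndvi_map > 0.3] = "vegetation"
    labels[ndvi_map < 0.0] = "water"
    return labels

# Toy 2x2 image pair: one vegetation-like and one water-like pixel
dr = np.array([[10.0, 80.0], [50.0, 60.0]])
nir = np.array([[90.0, 20.0], [55.0, 60.0]])
print(classify(ndvi(dr, nir)))
```

In the real system the two input images come from the co-registered DR and NIR cameras of the array, so the index can be computed frame by frame at low cost.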

Object Classification with Robotic Swarms

A simulation with 30 robots, three types of sensors, and communication graph
The ability to identify the target of a common action is fundamental for the development of a multi-robot team able to interact with the environment. In most existing systems, identification is carried out individually, based on color coding, shape identification, or complex vision systems. These methods usually assume a broad point of view over the objects, which are observed in their entirety. This assumption is sometimes difficult to fulfill in practice, particularly in swarm systems consisting of a multitude of small robots with limited sensing and computational capabilities.
We are developing a method for target identification with a heterogeneous swarm of low-informative, spatially distributed sensors, employing a distributed version of the naive Bayes classifier. Despite limited individual sensing capabilities, the recursive application of Bayes' law allows identification, provided the robots cooperate by sharing the information they are able to gather from their limited points of view. In addition, our framework can fuse the information gathered by many different types of sensors, such as cameras, reflectance sensors, and laser scanners. Simulation results (see figure) show the effectiveness of this approach and highlight some properties of the developed algorithm.
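The core of such a scheme, stripped of the communication details, is a recursive Bayes update: each robot contributes a local likelihood over the candidate object classes, and the naive (conditional independence) assumption lets the per-robot contributions be fused by simple multiplication. The classes and likelihood values below are hypothetical, chosen only to illustrate the update.

```python
import numpy as np

CLASSES = ["ball", "box", "wall"]  # hypothetical object classes

def bayes_update(prior, likelihood):
    """One recursive step: posterior ∝ prior * P(measurement | class)."""
    post = prior * likelihood
    return post / post.sum()

# Start from a uniform prior over the candidate classes
belief = np.ones(len(CLASSES)) / len(CLASSES)

# Likelihoods each robot derives from its limited local view
# (illustrative values for three different sensor types)
measurements = [
    np.array([0.6, 0.3, 0.1]),  # camera: sees a curved edge
    np.array([0.5, 0.4, 0.1]),  # laser scanner: short contour segment
    np.array([0.7, 0.2, 0.1]),  # reflectance sensor: plastic-like signature
]
for lik in measurements:
    belief = bayes_update(belief, lik)

print(CLASSES[int(np.argmax(belief))])  # fused class estimate
```

No single measurement above is conclusive on its own; it is the product over all robots' partial views that concentrates the belief, which is why cooperation compensates for each robot's limited sensing.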
Last updated: Friday, 13.02.2015