Marcin Odelga

Alumni of the Department Human Perception, Cognition & Action
Alumni of the Group Autonomous Robotics & Human-Machine Systems

Main Focus

I am a Ph.D. student at the Max Planck Institute for Biological Cybernetics.

My research focuses mainly on the development of a UAV platform for haptic teleoperation with on-board computation and sensing using an RGB-D sensor.

Development of a UAV platform for haptic teleoperation with on-board computation and sensing using an RGB-D camera.

Introduction

An Unmanned Aerial Vehicle (UAV) haptic teleoperation platform is an aerial robot that provides haptic and visual feedback to the human operator. This feedback reduces the operator's effort in performing the desired tasks. Most existing platforms rely on external tracking systems and computational power, which currently limits their possible applications to laboratory/structured environments.

Goals

To increase the range of possible applications, we aim to develop a UAV platform that can be teleoperated with a haptic device independently of external sensors and computational units. This will allow tasks to be performed in most indoor and, subsequently, outdoor spaces. With semi-autonomous behaviors such as obstacle avoidance, the robot will aid, but not limit, the role of the human operator.

Methods

Integrating an on-board computational unit with the RGB-D sensor and an inertial measurement unit (IMU) on board a UAV will provide the testbed platform for developing control and estimation algorithms. Visual and inertial data will be fused through Kalman filtering to estimate the relevant quantities independently of external equipment. Advanced filtering methods, such as the Bin-Occupancy filter, can be used to build a local obstacle map and implement obstacle avoidance. All the estimated information will be used in a robust control scheme to achieve stable flight of the platform.
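As an illustration of the fusion step, the following is a minimal sketch of a linear Kalman filter that predicts with IMU acceleration and corrects with position fixes such as those produced by RGB-D localization. It is a simplified 1-D, constant-timestep example, not the estimator used on the actual platform; all function names, noise parameters, and the state layout are assumptions made for the sketch.

```python
import numpy as np

def kalman_fuse(z_positions, accels, dt=0.02, sigma_a=0.5, sigma_z=0.05):
    """1-D Kalman filter sketch: state x = [position, velocity].

    Prediction uses the inertial (IMU) acceleration as a control input;
    correction uses a visual position fix. Parameters are illustrative.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # only position is observed
    Q = sigma_a**2 * np.outer(B, B)         # process noise covariance
    R = np.array([[sigma_z**2]])            # measurement noise covariance

    x = np.zeros(2)
    P = np.eye(2)
    for z, a in zip(z_positions, accels):
        # predict with the inertial measurement
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # correct with the visual position fix
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x
```

On the real system the state would include full 3-D pose and velocity, and the timing of the visual and inertial streams would differ, but the predict/correct structure is the same.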

Current progress

We have designed and implemented the mechanical platform that integrates all the required equipment (e.g., CPU and sensors). The robot is ready to perform the estimation and control tasks using only the on-board computational unit. We have run the localization software using the RGB-D sensor and integrated it with the low-level flight controller and with the haptic teleoperation interface. We have performed first successful flights and collected initial data.

The obstacle detection and avoidance algorithm is currently under development.
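To illustrate how obstacle avoidance can aid, but not override, the operator, here is a hypothetical potential-field sketch that blends the operator's velocity command with a repulsive term computed from nearby obstacle points (e.g. extracted from the depth image). This is a generic illustration, not the Bin-Occupancy filter approach described above; all names and gains are assumptions.

```python
import math

def avoidance_command(obstacles, v_des, d_safe=1.5, gain=0.8):
    """Blend the operator command with a repulsive velocity term.

    obstacles: (x, y) points in the robot frame, e.g. from the depth image.
    v_des:     operator velocity command (vx, vy).
    Points farther than d_safe have no effect, so the operator stays in
    charge whenever the surroundings are clear.
    """
    rx = ry = 0.0
    for ox, oy in obstacles:
        d = math.hypot(ox, oy)
        if 1e-6 < d < d_safe:
            # repulsion grows as the obstacle gets closer
            w = gain * (1.0 / d - 1.0 / d_safe) / d**2
            rx -= w * ox / d
            ry -= w * oy / d
    return (v_des[0] + rx, v_des[1] + ry)
```

For example, an obstacle directly ahead reduces the forward component of the command, while obstacles beyond the safety radius leave it untouched.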
