Flight autonomy of a micro-drone in indoor environments using a LiDAR flash camera
Abstract
Autonomy starts with awareness of the environment. Robots are given autonomy through sensors that endow them with perceptual capabilities, such as cameras. Recently, a new type of camera operating on the Time-of-Flight (ToF) principle has been developed, capable of acquiring dense depth maps at high frame rates. Its small size and low weight make it suitable for use on board a flying vehicle for indoor localization and mapping. This document outlines the first approaches taken towards using a ToF camera for such tasks under real-time constraints. The camera has been mounted on a flying vehicle that uses the open-source Paparazzi autopilot system developed by the French ENAC (École Nationale de l'Aviation Civile) team. Since indoor environments are predominantly planar, planar patches have been favoured to model the environment and detect the motion of the UAV. A region-growing segmentation algorithm identifies and extracts planes from the scene in real time. Planes are tracked and registered across a sequence of frames to estimate the camera's ego-motion. Initial results of plane-based visual odometry are presented and confirm the device's suitability.
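The abstract does not reproduce the implementation. As an illustration only, the sketch below shows one common way to do region-growing plane segmentation on a dense ToF depth image: back-project the depth map with a pinhole model, estimate per-pixel normals, then grow 4-connected regions whose normals and point-to-plane distances agree with a seed. The intrinsics (fx, fy, cx, cy), thresholds, and function names are hypothetical and hand-picked; this is not the authors' code and does not involve the Paparazzi autopilot.

```python
# Minimal sketch (assumed, not the authors' implementation) of region-growing
# plane segmentation on a dense, valid depth map from a ToF camera.
import numpy as np
from collections import deque

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project an HxW depth map (metres) into an HxWx3 point cloud
    using a pinhole model with hypothetical intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack((x, y, depth))

def estimate_normals(points):
    """Per-pixel normals from finite differences of neighbouring 3D points."""
    du = np.gradient(points, axis=1)
    dv = np.gradient(points, axis=0)
    n = np.cross(du, dv)
    return n / (np.linalg.norm(n, axis=2, keepdims=True) + 1e-12)

def region_grow_planes(points, normals, angle_thresh_deg=8.0,
                       dist_thresh=0.02, min_size=400):
    """Greedy region growing: expand 4-connected regions whose normals agree
    with the seed normal and whose points lie close to the seed plane."""
    h, w, _ = points.shape
    labels = -np.ones((h, w), dtype=int)          # -1 = unvisited
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    next_label = 0
    for sv in range(h):
        for su in range(w):
            if labels[sv, su] != -1:
                continue
            seed_n, seed_p = normals[sv, su], points[sv, su]
            queue, region = deque([(sv, su)]), []
            labels[sv, su] = next_label
            while queue:
                v, u = queue.popleft()
                region.append((v, u))
                for dv_, du_ in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nv, nu = v + dv_, u + du_
                    if not (0 <= nv < h and 0 <= nu < w) or labels[nv, nu] != -1:
                        continue
                    same_normal = np.dot(normals[nv, nu], seed_n) > cos_thresh
                    on_plane = abs(np.dot(points[nv, nu] - seed_p, seed_n)) < dist_thresh
                    if same_normal and on_plane:
                        labels[nv, nu] = next_label
                        queue.append((nv, nu))
            if len(region) < min_size:
                for v, u in region:
                    labels[v, u] = -2             # -2 = too small / non-planar
            else:
                next_label += 1
    return labels

if __name__ == "__main__":
    # Synthetic check: a flat wall 2 m in front of the camera -> one plane.
    depth = np.full((120, 160), 2.0)
    pts = depth_to_points(depth, fx=160.0, fy=160.0, cx=80.0, cy=60.0)
    labels = region_grow_planes(pts, estimate_normals(pts))
    print("planes found:", labels.max() + 1)
```

In the pipeline described by the abstract, such planar patches would then be matched across frames and their relative pose used to estimate the camera's ego-motion; that registration step is not shown here.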
Domains
Automatic Control / Robotics

Origin: Files produced by the author(s)