
Location: IMT Atlantique
Application website: https://euraxess.ec.europa.eu/jobs/337369
Scientific context
The project follows the recent appointment of Lucia Bergantin as Associate Professor at IMT Atlantique in 2024 and complements ongoing research on AI-driven drones within the ANR projects LEASARD (partnering with IRL CROSSING) and VORTEX. In particular, within the VORTEX project, Lucia Bergantin and Panagiotis Papadakis (HDR obtained in 2023) are co-supervising, in collaboration with the LS2N laboratory in Nantes, a PhD thesis focused on rapid indoor navigation of a fully autonomous flying drone equipped with an event camera.
This line of research focuses on leveraging event cameras for semi-autonomous drone perception and control, thanks to their lower energy consumption, higher dynamic range and asynchronous operation. A central question concerns the neural network architecture best suited for a given task (e.g. Spiking and Graph Neural Networks [1] versus conventional, frame-based architectures [2]), further accounting for frugality so as to allow efficient processing. While these questions are increasingly investigated for tasks such as object detection and classification or drone control, the use of event cameras for human-robot interaction has only recently begun to be explored [5]. In line with CROSSING's roadmap, this thesis will pursue coordinated exploration/inspection of outdoor, unstructured environments via human-drone teaming, using event streams as the sensing modality.
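To make the contrast between asynchronous event streams and frame-based processing concrete, the following sketch accumulates individual events into a polarity-separated count image, a common bridge to conventional networks. All names, the sensor resolution and the toy event stream are illustrative assumptions, not tied to any specific camera or library.

```python
import numpy as np

# An event camera emits asynchronous events (x, y, t, polarity) instead of
# full frames. One simple bridge to frame-based networks is to accumulate
# events over a short time window into a 2-channel count image (one channel
# per polarity). Resolution and event values below are hypothetical.

H, W = 128, 128  # assumed sensor resolution

def events_to_frame(events, t_start, t_end):
    """Accumulate events with t in [t_start, t_end) into a (2, H, W) count image."""
    frame = np.zeros((2, H, W), dtype=np.float32)
    for x, y, t, p in events:
        if t_start <= t < t_end:
            frame[p, y, x] += 1.0  # p in {0, 1}: OFF / ON polarity
    return frame

# Toy stream: three events; the last one falls outside the window.
events = [(5, 7, 0.001, 1), (5, 7, 0.002, 1), (60, 40, 0.050, 0)]
frame = events_to_frame(events, 0.0, 0.010)
print(frame[1, 7, 5])  # 2.0: two ON events at the same pixel
print(frame.sum())     # 2.0: the late event is excluded
```

Note that this windowed accumulation trades away the asynchronous, low-latency character of the sensor, which is precisely why event-native architectures such as SNNs and GNNs [1] are of interest.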
Thesis objectives
While flying drones offer greater flexibility for exploring unstructured areas [4], their autonomy can be hampered by diverse obstacles and adverse environmental conditions. In particular, RGB camera-based vision may struggle under varying lighting conditions and incurs high energy consumption. This highlights the need for minimalistic approaches [3] using alternative sensors, notably event cameras.
Instead of a fully autonomous drone, which would require computationally demanding mapping and planning capabilities, this thesis will investigate semi-autonomous, primarily data-driven drone navigation in coordination with a human user, interleaving between two operation modes:
– Following the human operator while avoiding potential obstacles;
– Interpreting the operator's visual instructions/gestures for task-specific manoeuvres (goto, explore/search instructions, etc.).
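The interleaving of the two operation modes can be pictured as a minimal state machine, sketched below under loose assumptions: perception inputs (a recognised gesture, a manoeuvre-completion signal) are taken to come from an upstream event-camera pipeline, and all names are hypothetical placeholders rather than part of the proposed system.

```python
from enum import Enum, auto

class Mode(Enum):
    FOLLOW = auto()     # track the operator while avoiding obstacles
    MANOEUVRE = auto()  # execute a gesture-commanded task (goto, explore, ...)

def step(mode, gesture, manoeuvre_done):
    """Return the next mode given the current gesture and manoeuvre status."""
    if mode is Mode.FOLLOW and gesture is not None:
        return Mode.MANOEUVRE  # operator issued a visual instruction
    if mode is Mode.MANOEUVRE and manoeuvre_done:
        return Mode.FOLLOW     # task finished: resume following
    return mode

mode = Mode.FOLLOW
mode = step(mode, gesture="goto", manoeuvre_done=False)  # -> MANOEUVRE
mode = step(mode, gesture=None, manoeuvre_done=True)     # -> FOLLOW
print(mode)  # Mode.FOLLOW
```

In practice each mode would wrap a full perception-and-control loop; the sketch only illustrates the mode-switching logic described above.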
The targeted applications range from search & rescue missions and surveillance up to site/field inspection and reconnaissance.
References
[1] S. Schaefer, D. Gehrig, D. Scaramuzza, "AEGNN: Asynchronous Event-based Graph Neural Networks", IEEE Conf. on Computer Vision and Pattern Recognition, 2022, doi
[2*] H. Fradi, P. Papadakis, "Advancing Object Detection for Autonomous Vehicles via General Purpose Event-RGB Fusion", IEEE Int. Conf. on Robotic Computing, 2024, doi
[3*] L. Bergantin et al., "Indoor and outdoor in-flight odometry based solely on optic flows with oscillatory trajectories", International Journal of Micro Air Vehicles, 15, 2023, doi
[4*] F. Chen, H.V. Nguyen, D.A. Taggart, K. Falkner, S.H. Rezatofighi, D.C. Ranasinghe, "ConservationBots: Autonomous aerial robot for fast robust wildlife tracking in complex terrains", Journal of Field Robotics, 41, 443–469, 2023, doi
[5] M. Aitsam, S. Davies, A. Di Nuovo, "Event Camera-Based Real-Time Gesture Recognition for Improved Robotic Guidance", International Joint Conference on Neural Networks, 2024
* Articles from the thesis supervisors