TY - GEN
T1 - Dynamic-Obstacle Relative Localization Using Motion Segmentation with Event Cameras
AU - Alkendi, Yusra
AU - Hay, Oussama Abdul
AU - Humais, Muhammad Ahmed
AU - Azzam, Rana
AU - Seneviratne, Lakmal
AU - Zweiri, Yahya
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - The ability to detect and localize dynamic obstacles in a robot's surroundings while navigating low-light environments is crucial for robot safety and mission continuity. Owing to their asynchronous nature, event cameras capture scene motion clearly and without motion blur; they are further distinguished by microsecond temporal resolution, high dynamic range, and low latency. In this work, we introduce E-DoRL, a framework for a drone equipped with an event camera, designed to detect and localize previously unknown dynamic obstacles and thereby ensure safe navigation. E-DoRL processes raw event streams to estimate the relative position between a moving robot and dynamic obstacles. It employs a Graph Transformer Neural Network (GTNN) to extract spatiotemporal correlations from event streams, identifying the active event pixels of moving objects without prior knowledge of scene topology or camera motion. From these identifications, E-DoRL determines the relative position of moving obstacles with respect to a dynamic unmanned aerial vehicle (UAV). E-DoRL outperformed state-of-the-art frame-based object-tracking algorithms under good lighting (100 lux), achieving 59.7% and 25.9% reductions in the mean absolute error (MAE) of the X and Y estimates, respectively. When tested under much lower illuminance (0.8 lux), E-DoRL maintained its performance without degradation, unlike image-based techniques, which are highly sensitive to lighting conditions.
UR - http://www.scopus.com/inward/record.url?scp=85197461854&partnerID=8YFLogxK
U2 - 10.1109/ICUAS60882.2024.10557118
DO - 10.1109/ICUAS60882.2024.10557118
M3 - Conference contribution
AN - SCOPUS:85197461854
T3 - 2024 International Conference on Unmanned Aircraft Systems, ICUAS 2024
SP - 1056
EP - 1063
BT - 2024 International Conference on Unmanned Aircraft Systems, ICUAS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 International Conference on Unmanned Aircraft Systems, ICUAS 2024
Y2 - 4 June 2024 through 7 June 2024
ER -
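
The abstract describes a two-stage pipeline: segment moving-object events out of the raw event stream, then back-project the segmented events to estimate the obstacle's position relative to the UAV. The Python sketch below illustrates that general idea only, under loudly stated assumptions: the GTNN motion-segmentation stage is replaced by a simple spatiotemporal-density heuristic, and the camera intrinsics and obstacle depth are hypothetical placeholders, since none of these details appear in this record. It is a minimal toy sketch, not the authors' E-DoRL implementation.

# Illustrative sketch only -- NOT the authors' E-DoRL implementation.
# Assumptions (not from the record): events as (x, y, t) rows; the GTNN
# classifier is stood in for by a neighborhood-density heuristic; pinhole
# intrinsics (fx, fy, cx, cy) and obstacle depth are made-up placeholders.
import numpy as np

def build_spatiotemporal_graph(events, r_px=3.0, r_t=0.005):
    """Connect events within a spatial radius (pixels) and a temporal
    radius (seconds), mimicking the spatiotemporal neighborhoods a graph
    network would operate on. O(n^2); fine for a toy example."""
    xy, t = events[:, :2], events[:, 2]
    edges = []
    for i in range(len(events)):
        near = (np.abs(t - t[i]) < r_t) & \
               (np.linalg.norm(xy - xy[i], axis=1) < r_px)
        edges.extend((i, j) for j in np.nonzero(near)[0] if j != i)
    return edges

def segment_moving_events(events, edges, degree_thresh=4):
    """Stand-in for the GTNN stage: flag events whose spatiotemporal
    neighborhood is dense, a crude proxy for independent object motion."""
    degree = np.zeros(len(events), dtype=int)
    for i, _ in edges:
        degree[i] += 1
    return degree >= degree_thresh

def relative_position(events, moving_mask, fx=320.0, fy=320.0,
                      cx=120.0, cy=90.0, depth_m=2.0):
    """Back-project the centroid of moving-object events through a pinhole
    model at an assumed depth to get a relative (X, Y) estimate in meters."""
    if not moving_mask.any():
        return None
    u, v = events[moving_mask, :2].mean(axis=0)
    return ((u - cx) * depth_m / fx, (v - cy) * depth_m / fy)

# Toy usage: a dense cluster of "object" events plus sparse background noise.
rng = np.random.default_rng(0)
obj = np.column_stack([rng.normal(150, 2, 60), rng.normal(100, 2, 60),
                       rng.uniform(0, 0.004, 60)])
noise = np.column_stack([rng.uniform(0, 240, 20), rng.uniform(0, 180, 20),
                         rng.uniform(0, 0.004, 20)])
ev = np.vstack([obj, noise])
mask = segment_moving_events(ev, build_spatiotemporal_graph(ev))
print(relative_position(ev, mask))  # roughly (0.19, 0.06) for this seed

The real system learns the segmentation from data rather than thresholding neighborhood density, and it estimates depth rather than assuming it; the sketch only shows how segmented event pixels can be turned into a relative position once those harder problems are solved.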