TY - GEN
T1 - Autonomous Drone-Person Tracking and Following in Uniform Appearance Scenarios
AU - Alansari, Mohamad Yousif Abdulkareem
AU - Hay, Oussama Abdul
AU - Javed, Sajid
AU - Elrefaei, Hazem
AU - Alnuaimi, Khaled
AU - Hassan, Bilal
AU - Dias, Jorge
AU - Zweiri, Yahya
AU - Werghi, Naoufel
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2025
Y1 - 2025
N2 - Drone-based person following has emerged as a promising technique for various surveillance applications, garnering considerable attention from researchers over the years. Despite significant advancements reported in the literature, state-of-the-art (SOTA) methods have struggled to effectively address challenges inherent in real-world scenarios, such as the presence of distractors resembling the target person, all within stringent real-time constraints. In this study, we propose a novel drone-person tracking algorithm aimed at overcoming the challenges of real-time person tracking in a Uniform Appearance (UA) setting. Our framework integrates several components: a face detector (RetinaFace) for person detection and localization, a face recognizer (GhostFaceNets) to identify the target person among others in the frame, a visual object tracker for continuous target tracking across frames, and a PID controller to stabilize, follow, and update the drone’s state based on the target’s state. To ensure robust and synchronized tracking in the presence of similar distractors, we evaluate nine recent SOTA trackers on two publicly available UA tracking datasets, PTUA and D-PTUAC. Extensive real-time person-following experiments conducted within the UA environment demonstrate that these SOTA trackers are applicable and robust enough to deliver satisfactory performance in tracking and following a person via drone in UA scenarios.
AB - Drone-based person following has emerged as a promising technique for various surveillance applications, garnering considerable attention from researchers over the years. Despite significant advancements reported in the literature, state-of-the-art (SOTA) methods have struggled to effectively address challenges inherent in real-world scenarios, such as the presence of distractors resembling the target person, all within stringent real-time constraints. In this study, we propose a novel drone-person tracking algorithm aimed at overcoming the challenges of real-time person tracking in a Uniform Appearance (UA) setting. Our framework integrates several components: a face detector (RetinaFace) for person detection and localization, a face recognizer (GhostFaceNets) to identify the target person among others in the frame, a visual object tracker for continuous target tracking across frames, and a PID controller to stabilize, follow, and update the drone’s state based on the target’s state. To ensure robust and synchronized tracking in the presence of similar distractors, we evaluate nine recent SOTA trackers on two publicly available UA tracking datasets, PTUA and D-PTUAC. Extensive real-time person-following experiments conducted within the UA environment demonstrate that these SOTA trackers are applicable and robust enough to deliver satisfactory performance in tracking and following a person via drone in UA scenarios.
KW - Drone-Person Tracking
KW - Uniform Appearance
KW - Visual Object Tracking (VOT)
UR - https://www.scopus.com/pages/publications/105007224134
U2 - 10.1007/978-3-031-91767-7_3
DO - 10.1007/978-3-031-91767-7_3
M3 - Conference contribution
AN - SCOPUS:105007224134
SN - 9783031917660
T3 - Lecture Notes in Computer Science
SP - 31
EP - 46
BT - Computer Vision – ECCV 2024 Workshops, Proceedings
A2 - Del Bue, Alessio
A2 - Canton, Cristian
A2 - Pont-Tuset, Jordi
A2 - Tommasi, Tatiana
PB - Springer Science and Business Media Deutschland GmbH
T2 - Workshops that were held in conjunction with the 18th European Conference on Computer Vision, ECCV 2024
Y2 - 29 September 2024 through 4 October 2024
ER -