Tools for Detection, Tracking and Autonomous Operations by Unmanned Aerial Vehicles

  • Muhammad A. Humais

Student thesis: Master's Thesis


The utilization of Unmanned Aerial Vehicles (UAVs) in various applications has grown exponentially in the past decade. However, tele-operated UAVs have several limitations. Although autonomous UAVs have been around for quite some time, challenging scenarios such as interacting with or capturing a high-speed moving target in 3D space still require serious research effort. The development of a fast and robust detection and tracking system that can run on board the UAV using existing hardware is therefore critical. NVIDIA's recently released, credit-card-sized Jetson TX2 provides the GPU essential for Deep Learning (DL), enabling state-of-the-art DL-based detectors to run on a micro UAV. In this work we study different solutions for real-time detection and tracking and propose a novel, robust vision system that combines DL-based detection and tracking so that they complement each other. The developed solution achieves 13 FPS on a Jetson TX2 with reliable tracking, recovers well from tracking failures, and also computes a 3D point cloud and estimates the 3D position of the target object using a stereo camera. We further incorporate this target position into our control algorithm to approach the target. A multi-rotor UAV is a non-linear, underactuated system and is therefore very challenging to control; we must also ensure that the target remains in the camera view at all times while it is being followed. Simulations in Gazebo show that our control algorithm works well for both static and dynamic targets. Challenge 1 of the MBZIRC 2020 competition was undertaken as a case study for this work.
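The abstract describes combining a DL-based detector with a tracker so that they complement each other. One common scheme for this is to run the (slow but accurate) detector periodically and whenever the (fast) tracker reports failure, using each detection to re-initialize the tracker. The sketch below illustrates that scheduling idea only; the stub classes and all names are hypothetical stand-ins, not the thesis's actual components.

```python
class StubDetector:
    """Stands in for a DL-based detector (e.g. one running on a Jetson TX2 GPU)."""
    def detect(self, frame):
        # Pretend the target is found at a fixed bounding box (x, y, w, h).
        return (10, 10, 40, 40)

class StubTracker:
    """Stands in for a fast frame-to-frame tracker."""
    def __init__(self):
        self.box = None
    def init(self, frame, box):
        self.box = box
    def update(self, frame):
        # Report failure until initialized; otherwise return the last box.
        return (self.box is not None), self.box

def track_sequence(frames, detector, tracker, redetect_every=5):
    """Tracker runs every frame; detector re-initializes it periodically
    or whenever the tracker reports a failure."""
    boxes = []
    for i, frame in enumerate(frames):
        ok, box = tracker.update(frame)
        if not ok or i % redetect_every == 0:
            box = detector.detect(frame)   # slow path: full detection
            tracker.init(frame, box)       # re-seed the fast tracker
        boxes.append(box)
    return boxes
```

This split is what makes on-board real-time rates (such as the reported 13 FPS) plausible: the expensive detector need not run on every frame, while the detector provides recovery from tracking drift or failure.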
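For the stereo-based 3D position estimate mentioned above, the standard rectified-stereo relation Z = f·B/d (focal length times baseline over disparity) recovers depth, and the pixel coordinates back-project to a 3D point. A minimal sketch of that standard geometry follows; the function and parameter names are illustrative, not taken from the thesis.

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def pixel_to_point(u, v, disparity_px, focal_px, baseline_m, cx, cy):
    """Back-project pixel (u, v) with disparity d to a camera-frame 3D point."""
    z = stereo_depth(disparity_px, focal_px, baseline_m)
    x = (u - cx) * z / focal_px   # lateral offset scales with depth
    y = (v - cy) * z / focal_px
    return (x, y, z)
```

Applying this per pixel over a disparity map yields the 3D point cloud; applying it to the tracked bounding-box center yields the target position that feeds the control algorithm.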
Date of Award: May 2020
Original language: American English


  • Object Detection
  • Object Tracking
  • Vision-Based Control
  • Autonomous UAVs
  • Deep Learning
