Multisensor data fusion for vision-based UAV navigation and guidance

Suraj Bijjahalli, Roberto Sabatini

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review


Significant advances in sensor technology, along with economies of scale due to large production volumes, have supported the miniaturisation of navigation sensors, allowing widespread low-cost integration on unmanned aircraft systems (UAS). In small-size UAS applications, standalone sensors are not a viable option, since the reduction in navigation sensor form factor, weight and cost typically results in lowered accuracy and precision. The fusion of multiple sensor measurements in UAS navigation systems can support greater accuracy, integrity and update rates than is achievable with individual sensors. This chapter introduces the fundamentals of state-estimation methods employed on UAS and presents different sensor integration architectures, along with an assessment of their advantages and trade-offs. Attention is devoted primarily to recursive optimal estimation algorithms such as the Kalman filter and its variants, owing to their widespread use across various classes of UAS. The need to support robust navigation performance in global navigation satellite system (GNSS)-denied environments, together with the proliferation of visual sensors, has led to the development of numerous methods for integrating visual sensor measurements, primarily with inertial sensors. Therefore, the reader is introduced to the most popular system architectures for visual-inertial sensor integration, in order to provide an understanding of the current state of the art and to support the identification of future research pathways.
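The recursive optimal estimation the abstract refers to can be illustrated with the linear Kalman filter's predict-update cycle. The sketch below is not taken from the chapter; it is a minimal, self-contained example with an assumed 1-D constant-velocity state model and assumed noise parameters (`Q`, `R`), fusing noisy position measurements into position and velocity estimates.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict-update cycle of a linear Kalman filter.

    x: state estimate, P: state covariance, z: measurement,
    F: state transition, H: measurement model, Q/R: process/measurement noise.
    """
    # Predict: propagate state and covariance through the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fuse the measurement via the Kalman gain
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy scenario (assumed): target moving at 1 m/s, noisy position fixes
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity model
H = np.array([[1.0, 0.0]])                # only position is observed
Q = 0.01 * np.eye(2)                      # process noise (assumed value)
R = np.array([[0.5]])                     # measurement noise (assumed value)

x = np.array([0.0, 0.0])                  # initial state: position, velocity
P = np.eye(2)                             # initial covariance
rng = np.random.default_rng(0)
for k in range(50):
    true_pos = 1.0 * (k + 1) * dt
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x, P = kalman_step(x, P, z, F, H, Q, R)
```

After 50 noisy fixes the filter's velocity estimate converges toward the true 1 m/s, and the state covariance shrinks well below its initial value — the behaviour that makes the filter attractive for fusing low-cost, noisy UAS sensors. Extended and unscented variants replace the linear `F` and `H` with nonlinear models, which is where visual-inertial architectures typically operate.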

Original language: British English
Title of host publication: Imaging and Sensing for Unmanned Aircraft Systems
Subtitle of host publication: Control and Performance
Publisher: Institution of Engineering and Technology
Number of pages: 22
ISBN (Electronic): 9781785616426
State: Published - 1 Jan 2020


  • Aerospace control
  • Aircraft navigation
  • Autonomous aerial vehicles
  • Collision avoidance
  • Computer vision
  • Computer vision and image processing techniques
  • Data fusion
  • Economies of scale
  • Filtering methods in signal processing
  • Global navigation satellite system
  • Inertial sensors
  • Kalman filters
  • Low-cost integration
  • Mobile robots
  • Multiple sensor measurements
  • Navigation sensor form-factor
  • Navigation sensors
  • Optical, image and video signal processing
  • Optimal estimation algorithms
  • Other topics in statistics
  • Production volumes
  • Remotely operated vehicles
  • Robust navigation performance
  • Sensor fusion
  • Sensor technology
  • Small-size UAS applications
  • Standalone sensors
  • State-estimation methods
  • UAS navigation systems
  • Unmanned aircraft systems
  • Update rates
  • Vision-based UAV navigation
  • Visual sensor measurements
  • Visual sensors
  • Visual-inertial sensor integration


