Bio-Inspired Home Localization Using Event-Based Vision and Spiking Neural Networks in Simulated Environment

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this work, we present a bio-inspired approach for home localization using event-based visual data and spiking convolutional neural networks (S-CNNs) in a simulated environment within NVIDIA Omniverse. Drawing inspiration from the navigational strategies of insects, which utilize visual cues and homing vectors to return to their nests, our approach leverages event cameras that mimic the asynchronous, change-driven visual processing of insect eyes. To process this sparse, event-driven input, we employ spiking neural networks (SNNs), which mirror the spike-based transmission of information in biological neural systems. The proposed system utilizes a quadrotor equipped with an event camera to capture dynamic, asynchronous visual data and processes it using an S-CNN trained to estimate relative home vectors. This vector encoding method is inspired by the unit-circle representation of gaze directions observed in insect homing behavior, as outlined in recent studies. By integrating event-based vision and SNNs, our approach ensures energy-efficient computation, robustness to lighting variations, and adaptability to dynamic scenes. We validate the framework in a 3D environment modeled within Isaac Sim, where the quadrotor autonomously navigates back to a designated "nest" location. Comparative analysis with conventional frame-based neural networks demonstrates the superiority of the proposed system in terms of efficiency, robustness, and accuracy. This work establishes a novel bio-inspired framework for integrating event-based data and spiking neural networks, paving the way for energy-efficient localization in robotics and smart home environments. Future work will explore deploying the system in a multi-quadrotor environment for collaborative tasks, integrating multi-modal sensory inputs such as camera, IMU, gas-sensor, and GPS data, and extending the framework to real-world settings for further validation and scalability.
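The unit-circle encoding of home vectors mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names are hypothetical, and it only shows the common idea of regressing a heading angle θ as the point (cos θ, sin θ) to avoid the 0/2π discontinuity of raw angles.

```python
import math

def encode_home_vector(dx, dy):
    """Encode the planar direction from the agent to the nest as a point
    (cos θ, sin θ) on the unit circle — a continuous regression target
    that avoids the wrap-around discontinuity of raw angles."""
    theta = math.atan2(dy, dx)
    return math.cos(theta), math.sin(theta)

def decode_home_vector(c, s):
    """Recover the heading angle (radians) from a predicted unit-circle point."""
    return math.atan2(s, c)

# Example: the nest lies 3 m east and 4 m north of the quadrotor.
cx, sx = encode_home_vector(3.0, 4.0)
angle = decode_home_vector(cx, sx)
```

In a learning setting, the network would output the pair (c, s) and be trained with a regression loss against the encoded ground-truth direction; the decoded angle then gives the homing heading.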

Original language: British English
Title of host publication: 2025 IEEE International Conference on Simulation, Modeling, and Programming for Autonomous Robots, SIMPAR 2025
Editors: Ignazio Infantino, Valeria Seidita
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331516857
DOIs
State: Published - 2025
Event: 2025 IEEE International Conference on Simulation, Modeling, and Programming for Autonomous Robots, SIMPAR 2025 - Palermo, Italy
Duration: 14 Apr 2025 - 18 Apr 2025

Publication series

Name: 2025 IEEE International Conference on Simulation, Modeling, and Programming for Autonomous Robots, SIMPAR 2025

Conference

Conference: 2025 IEEE International Conference on Simulation, Modeling, and Programming for Autonomous Robots, SIMPAR 2025
Country/Territory: Italy
City: Palermo
Period: 14/04/25 - 18/04/25

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 7 - Affordable and Clean Energy

Keywords

  • Event
  • Insect Inspired
  • Localization
  • Regression
  • Spiking Neural Network
