Pose-Graph Neural Network Classifier for Global Optimality Prediction in 2D SLAM

Rana Azzam, Felix H. Kong, Tarek Taha, Yahya Zweiri

Research output: Contribution to journal › Article › peer-review


Abstract

The ability to decide whether a solution to a pose-graph problem is globally optimal is of high significance for safety-critical applications. Converging to a local minimum may result in severe estimation errors along the estimated trajectory. In this paper, we propose a graph neural network based on a novel implementation of a graph convolutional-like layer, called PoseConv, to classify pose-graphs as optimal or sub-optimal. The operation of PoseConv requires incorporating a new node feature, referred to as cost, to hold the information that the nodes communicate. A training and testing dataset was generated based on publicly available benchmarking pose-graphs. The neural classifier is then trained and extensively tested on several subsets of the pose-graph samples in the dataset. Testing results demonstrate the model's ability to perform classification with 92-98% accuracy across the different partitions of the training and testing dataset. In addition, the model generalizes to previously unseen variants of the pose-graphs in the training dataset. Our method trades a small amount of accuracy for a large improvement in processing time, making it faster than other existing methods by up to three orders of magnitude, which can be of paramount importance when using computationally limited robots overseen by human operators.
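To make the idea concrete, the sketch below illustrates what a graph convolutional-like layer over pose-graph nodes and a graph-level optimality classifier might look like. It is a minimal, hypothetical example in plain PyTorch, not the authors' PoseConv implementation: the layer names, feature layout (pose plus a scalar cost feature per node), hidden sizes, and the sum-aggregation scheme are all assumptions made for illustration.

```python
# Illustrative sketch only -- NOT the paper's PoseConv layer.
# Assumed node features: [x, y, theta, cost] per pose-graph node.
import torch
import torch.nn as nn


class PoseConvLike(nn.Module):
    """Graph convolutional-like layer: each node aggregates messages
    computed from its neighbours' features along pose-graph edges."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim)             # transform neighbour features
        self.upd = nn.Linear(in_dim + out_dim, out_dim)   # combine with own features

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim) node features; edge_index: (2, num_edges) src/dst indices
        src, dst = edge_index
        messages = self.msg(x[src])                       # one message per edge
        agg = torch.zeros(x.size(0), messages.size(1), dtype=x.dtype, device=x.device)
        agg.index_add_(0, dst, messages)                  # sum incoming messages per node
        return torch.relu(self.upd(torch.cat([x, agg], dim=1)))


class PoseGraphClassifier(nn.Module):
    """Stacks two PoseConv-like layers, mean-pools node embeddings, and
    outputs a probability that the pose-graph solution is globally optimal."""

    def __init__(self, in_dim: int = 4, hidden: int = 32):
        super().__init__()
        self.conv1 = PoseConvLike(in_dim, hidden)
        self.conv2 = PoseConvLike(hidden, hidden)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        h = self.conv1(x, edge_index)
        h = self.conv2(h, edge_index)
        g = h.mean(dim=0)                                 # graph-level embedding
        return torch.sigmoid(self.head(g))                # P(globally optimal)
```

Under these assumptions, such a classifier would be trained with a binary cross-entropy loss on pose-graphs labelled optimal or sub-optimal; a single forward pass replaces the verification routines the paper compares against, which is where the reported speed-up comes from.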

Original language: British English
Article number: 9443193
Pages (from-to): 80466-80477
Number of pages: 12
Journal: IEEE Access
Volume: 9
DOIs
State: Published - 2021

Keywords

  • global optimality
  • graph neural network
  • pose-graph optimization
  • simultaneous localization and mapping
