Segmentation of dense depth maps using inertial data. A real-time implementation

Jorge Lobo, Luis Almeida, Jorge Dias

Research output: Contribution to conference › Paper › peer-review


Abstract

In this paper we propose a real-time system that extracts information from dense relative depth maps. This method enables the integration of depth cues into higher-level processes, including segmentation of structures, object recognition, robot navigation or any other task that requires a three-dimensional representation of the physical environment. Inertial sensors coupled to a vision system can provide important inertial cues for ego-motion and system pose. The sensed gravity provides a vertical reference. Depth maps obtained from a stereo camera system can be segmented using this vertical reference, identifying structures such as vertical features and levelled planes. In our work we explore the integration of inertial sensor data in vision systems. Depth maps obtained by vision systems are highly viewpoint dependent, providing discrete layers of detected depth aligned with the camera. In this work we use inertial sensors to recover the camera pose and rectify the maps to a reference ground plane, enabling the segmentation of vertical and horizontal geometric features. The aim of this work is a fast real-time system, so that it can be applied to autonomous robotic systems or to automated car driving systems, for modelling the road and identifying obstacles and roadside features in real-time.
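As a rough illustration of the kind of processing the abstract describes (not the authors' implementation), the sketch below rotates a stereo point cloud into a gravity-aligned frame using the accelerometer-sensed gravity direction and then splits surfaces into levelled and vertical sets by comparing their normals with the vertical reference. The array names, angle threshold and helper function are assumptions made for this example only.

# Minimal sketch, assuming: `points_cam` is an (N, 3) point cloud in the camera
# frame, `normals_cam` holds per-point surface normals from the dense depth map,
# and `gravity_cam` is the gravity direction sensed by the inertial sensors,
# expressed in camera coordinates. None of these names come from the paper.
import numpy as np

def rotation_aligning(a, b):
    """Rotation matrix that takes unit vector a onto unit vector b (Rodrigues formula)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    c = float(np.dot(a, b))
    if np.isclose(c, -1.0):
        # Antipodal case: rotate 180 degrees about any axis orthogonal to a.
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    v = np.cross(a, b)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def segment_by_gravity(points_cam, normals_cam, gravity_cam, ang_thresh_deg=10.0):
    """Rectify points to a gravity-aligned frame and label levelled and vertical surfaces."""
    up = -gravity_cam / np.linalg.norm(gravity_cam)        # 'up' opposes sensed gravity
    R = rotation_aligning(up, np.array([0.0, 0.0, 1.0]))   # make the world z-axis vertical
    pts_world = points_cam @ R.T                            # gravity-aligned coordinates
    nrm_world = normals_cam @ R.T

    vert_component = np.abs(nrm_world[:, 2])                # |normal . up|
    # Levelled planes (ground, tabletops): normals close to the vertical reference.
    horizontal_mask = vert_component > np.cos(np.radians(ang_thresh_deg))
    # Vertical structures (walls, poles): normals nearly orthogonal to gravity.
    vertical_mask = vert_component < np.sin(np.radians(ang_thresh_deg))
    return pts_world, horizontal_mask, vertical_mask

The gravity-aligned cloud can then be binned by height to find level layers (for example a road surface), while the vertical mask collects candidate obstacle and roadside points; the 10 degree threshold here is an illustrative choice, not a value taken from the paper.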

Original language: British English
Pages: 92-97
Number of pages: 6
State: Published - 2002
Event: 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems - Lausanne, Switzerland
Duration: 30 Sep 2002 – 4 Oct 2002

Conference

Conference: 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems
Country/Territory: Switzerland
City: Lausanne
Period: 30/09/02 – 4/10/02
