Fusing of image and inertial sensing for camera calibration

Jorge Lobo, Jorge Dias

Research output: Contribution to conference › Paper › peer-review



This paper explores the integration of inertial sensor data with vision. A method is proposed for estimating the camera focal distance from vanishing points and inertial sensors. Vision and inertial sensing are two sensory modalities that can be exploited to give robust solutions for image segmentation and recovery of 3D structure from images, increasing the capabilities of autonomous vehicles and enlarging the application potential of vision systems. In this paper we show that, using just one vanishing point, obtained from two parallel lines belonging to some levelled plane, together with the camera's attitude taken from the inertial sensors, the unknown scaling factor f in the camera's perspective projection can be estimated. The quality of the estimate of f depends on the quality of the vanishing point used and on the noise level in the accelerometer data. Nevertheless it provides a reasonable estimate for a completely uncalibrated camera. The advantage over using two vanishing points is that the best (i.e. most stable) vanishing point can be chosen, and that in indoor environments the vanishing point can sometimes be obtained from the scene without placing any specific calibration target.
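The abstract does not spell out the estimation formula, but the constraint it describes can be sketched as follows. The back-projected ray (u, v, f) of a vanishing point gives the 3D direction of the corresponding parallel lines in camera coordinates; for lines lying on a levelled plane that direction must be orthogonal to gravity, which the accelerometers measure in the camera frame. This is a hedged illustration of that single orthogonality constraint, not the paper's exact algorithm; the function name and the synthetic values are my own.

```python
import numpy as np

def focal_from_vanishing_point(vp, g_cam):
    """Estimate the focal length f from one vanishing point of levelled-plane
    lines and the gravity direction measured in camera coordinates.

    The vanishing point back-projects to the line direction (u, v, f); for
    horizontal lines this direction is orthogonal to gravity g = (gx, gy, gz):
        u*gx + v*gy + f*gz = 0   =>   f = -(u*gx + v*gy) / gz
    """
    u, v = vp
    gx, gy, gz = g_cam / np.linalg.norm(g_cam)
    if abs(gz) < 1e-9:
        # Gravity perpendicular to the optical axis: f drops out of the
        # constraint and is unobservable from this vanishing point.
        raise ValueError("degenerate attitude: f is unobservable")
    return -(u * gx + v * gy) / gz

# Synthetic check: build a vanishing point from a known f and a horizontal
# line direction, then recover f.
f_true = 500.0
g = np.array([0.2, 0.6, 0.7])            # gravity in camera frame (unnormalised)
g = g / np.linalg.norm(g)
d = np.cross(g, np.array([1.0, 0.0, 0.0]))  # a direction orthogonal to gravity
vp = (f_true * d[0] / d[2], f_true * d[1] / d[2])  # perspective projection
f_est = focal_from_vanishing_point(vp, g)
```

In this noise-free setting the constraint recovers f exactly; with real accelerometer data, as the abstract notes, the estimate degrades with the noise level and with the stability of the extracted vanishing point.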

Original language: British English
Number of pages: 6
State: Published - 2001
Event: International Conference on Multisensor Fusion and Integration for Intelligent Systems - Baden-Baden, Germany
Duration: 20 Aug 2001 – 22 Aug 2001


Conference: International Conference on Multisensor Fusion and Integration for Intelligent Systems


