Relative pose calibration between visual and inertial sensors

Jorge Lobo, Jorge Dias

Research output: Contribution to journal › Article › peer-review

200 Scopus citations


This paper proposes an approach to calibrating off-the-shelf cameras and inertial sensors so that the integrated system can be used in both static and dynamic situations. When the two sensors are combined in one system, their relative pose must be determined. The rotation between the camera and the inertial sensor can be estimated, concurrently with camera calibration, by having both sensors observe the vertical direction in several poses: the camera relies on a vertical chequered planar target, and the inertial sensor on gravity, to obtain a vertical reference. Depending on the setup and the system motion, the translation between the two sensors can also be important. Using a simple passive turntable and static images, this translation can be estimated: the system is placed in several poses and adjusted to turn about the inertial sensor centre, so that the lever arm to the camera can be determined. Simulation and real-data results are presented to show the validity and the simple requirements of the proposed methods.
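The rotation step described above amounts to aligning two sets of paired unit vectors: the vertical as seen by the camera (from the chequered target) and the vertical as sensed by the IMU (from gravity), collected over several poses. One standard way to solve this alignment is an SVD-based least-squares fit (the Kabsch solution to Wahba's problem); the sketch below illustrates that idea and is not the authors' exact formulation. All function and variable names here are hypothetical.

```python
import numpy as np

def estimate_rotation(cam_verticals, imu_verticals):
    """Least-squares rotation R such that cam_verticals[i] ~= R @ imu_verticals[i].

    Both inputs are (N, 3) arrays of unit vectors, one pair per pose.
    SVD-based alignment (Kabsch); at least 3 non-collinear pairs are
    needed for a unique answer.
    """
    A = np.asarray(cam_verticals, dtype=float)  # verticals in the camera frame
    B = np.asarray(imu_verticals, dtype=float)  # verticals in the IMU frame
    # Cross-covariance of the paired observations.
    H = B.T @ A
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (determinant -1) in the noisy case.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T
```

With noise-free pairs generated from a known rotation, the function recovers that rotation exactly; with real measurements it returns the least-squares best fit over all poses.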

Original language: British English
Pages (from-to): 561-575
Number of pages: 15
Journal: International Journal of Robotics Research
Issue number: 6
State: Published - Jun 2007


  • Calibration
  • Computer vision
  • Inertial sensors
  • Sensor fusion

