Visual odometry for velocity estimation of UGVs

Xiaojing Song, Lakmal D. Seneviratne, Kaspar Althoefer, Zibin Song, Yahya H. Zweiri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

An accurate and robust velocity estimation method based on an optical flow technique is presented in this paper. Using image sequences captured by a monocular camera mounted under a UGV (unmanned ground vehicle), image velocities are obtained with the optical flow technique. Combined with a camera model, the velocities of the UGV are estimated directly. The method is validated over various terrain surfaces, such as coarse sand, fine sand, and a mixture of coarse sand and gravel. Experimental results show that the estimated velocities agree closely with the measured velocities. The height between the projection center of the camera and the terrain surface is shown to be a key parameter in velocity estimation, and height compensation is applied to obtain accurate velocity estimates. The proposed velocity estimation method has many potential applications, including localization and slip estimation for UGVs.
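The pipeline described in the abstract can be sketched as follows: dense optical flow yields image-plane velocities in pixels per frame, and a downward-facing pinhole camera model converts these to metric ground velocities by scaling with the camera height over the focal length. The snippet below is a minimal illustrative sketch, not the authors' implementation; it assumes flat terrain, a known focal length in pixels, and uses OpenCV's Farneback optical flow as a stand-in for the paper's unspecified optical flow technique. The function name and parameter values are hypothetical.

    import cv2
    import numpy as np

    def estimate_ugv_velocity(prev_gray, curr_gray, dt, focal_px, height_m):
        """Estimate planar UGV velocity (m/s) from two consecutive grayscale frames.

        Assumes a downward-facing pinhole camera at height `height_m` above
        flat terrain, so ground displacement = pixel displacement * Z / f.
        `height_m` would need compensation if the camera-to-terrain height
        varies, as the paper reports.
        """
        # Dense optical flow; flow[y, x] = (du, dv) in pixels per frame.
        # Positional args: pyr_scale, levels, winsize, iterations,
        # poly_n, poly_sigma, flags.
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, curr_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)

        # Median over the flow field gives a robust average image velocity,
        # rejecting outlier vectors from texture-poor patches.
        du = np.median(flow[..., 0])  # pixels/frame along image x
        dv = np.median(flow[..., 1])  # pixels/frame along image y

        # Pinhole scaling from image velocity to metric ground velocity.
        vx = du * height_m / focal_px / dt
        vy = dv * height_m / focal_px / dt
        return vx, vy

Note the direct dependence on height_m: a fixed-height assumption biases the estimate whenever the vehicle pitches or the terrain undulates, which is why the paper singles out camera height as the key parameter and compensates for it.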

Original language: British English
Title of host publication: Proceedings of the 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007
Pages: 1611-1616
Number of pages: 6
State: Published - 2007
Event: 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007 - Harbin, China
Duration: 5 Aug 2007 - 8 Aug 2007

Publication series

Name: Proceedings of the 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007

Conference

Conference: 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007
Country/Territory: China
City: Harbin
Period: 5/08/07 - 8/08/07

Keywords

  • Camera
  • Optical flow
  • UGV
  • Velocity estimation
  • Visual odometry
