TY - GEN
T1 - Bayesian 3D independent motion segmentation with IMU-aided RGB-D sensor
AU - Lobo, Jorge
AU - Ferreira, João Filipe
AU - Trindade, Pedro
AU - Dias, Jorge
PY - 2012
Y1 - 2012
N2 - In this paper we propose a two-tiered hierarchical Bayesian model to estimate the location of objects moving independently from the observer. Biological vision systems are very successful in motion segmentation, since they efficiently resort to flow analysis and accumulated prior knowledge of the 3D structure of the scene. Artificial perception systems may also build 3D structure maps and use optical flow to provide cues for ego- and independent motion segmentation. Using inertial and magnetic sensors and an image and depth sensor (RGB-D), we propose a method to obtain registered 3D maps, which are subsequently used in a probabilistic model (the bottom tier of the hierarchy) that performs background subtraction across several frames to provide a prior on moving objects. The egomotion of the RGB-D sensor is estimated starting with the angular pose obtained from the filtered accelerometer and magnetic data. The translation is derived from matched points across the images and corresponding 3D points in the rotation-compensated depth maps. A gyro-aided Lucas-Kanade tracker is used to obtain matched points across the images. The tracked points can also be used to refine the initial sensor-based rotation estimation. Having determined the camera egomotion, the estimated optical flow assuming a static scene can be compared with the observed optical flow via a probabilistic model (the top tier of the hierarchy), using the results of the background subtraction process as a prior, in order to identify volumes with independent motion in the corresponding 3D point cloud. To deal with the computational load, CUDA-based solutions on GPUs were used. Experimental results are presented showing the validity of the proposed approach.
UR - http://www.scopus.com/inward/record.url?scp=84870592308&partnerID=8YFLogxK
U2 - 10.1109/MFI.2012.6343023
DO - 10.1109/MFI.2012.6343023
M3 - Conference contribution
AN - SCOPUS:84870592308
SN - 9781467325110
T3 - IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
SP - 445
EP - 450
BT - 2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2012 - Conference Proceedings
T2 - 2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2012
Y2 - 13 September 2012 through 15 September 2012
ER -