Abstract
This letter proposes a framework for 3-D reconstruction using a heterogeneous sensor network, with potential applications in augmented reality, human-behavior understanding, smart rooms, robotics, and many other areas. Orientation measurements from inertial sensors, images from cameras, and depth data from Time-of-Flight sensors are fused synergistically within a fully probabilistic framework, yielding an efficient method for combining the system's multi-modal data into robust reconstructions.
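The abstract does not detail the fusion method, but a common building block for probabilistic multi-modal fusion is precision-weighted (inverse-variance) combination of independent Gaussian estimates of the same quantity. As an illustrative sketch only (the function name and the numeric values are hypothetical, not taken from the paper), fusing a noisy camera-derived depth with a sharper Time-of-Flight depth might look like:

```python
def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Precision-weighted fusion of two independent Gaussian estimates
    of the same quantity (e.g. depth of one scene point seen by a
    camera and by a Time-of-Flight sensor). Hypothetical sketch."""
    precision = 1.0 / var_a + 1.0 / var_b   # precisions add
    var = 1.0 / precision                   # fused variance shrinks
    mu = var * (mu_a / var_a + mu_b / var_b)  # means weighted by precision
    return mu, var

# Hypothetical readings: camera depth 2.10 m (var 0.04), ToF 2.00 m (var 0.01)
mu, var = fuse_gaussian(2.10, 0.04, 2.00, 0.01)
print(mu, var)  # fused estimate is pulled toward the lower-variance ToF reading
```

The fused variance is always smaller than either input variance, which is why combining heterogeneous sensors in a probabilistic framework improves robustness over any single modality.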
| Original language | British English |
|---|---|
| Article number | 7873305 |
| Pages (from-to) | 2640-2641 |
| Number of pages | 2 |
| Journal | IEEE Sensors Journal |
| Volume | 17 |
| Issue number | 9 |
| DOIs | |
| State | Published - 1 May 2017 |
Keywords
- 3D reconstruction
- heterogeneous sensor network
- multi-modal fusion
- probabilistic