Tracking an RGB-D Camera Using Points and Planes

    •  Ataer-Cansizoglu, E., Taguchi, Y., Ramalingam, S., Garaas, T., "Tracking an RGB-D Camera Using Points and Planes", IEEE Workshop on Consumer Depth Cameras for Computer Vision (CDC4CV), December 2013.
      TR2013-106
      @inproceedings{Ataer-Cansizoglu2013dec,
        author = {Ataer-Cansizoglu, E. and Taguchi, Y. and Ramalingam, S. and Garaas, T.},
        title = {Tracking an RGB-D Camera Using Points and Planes},
        booktitle = {IEEE Workshop on Consumer Depth Cameras for Computer Vision (CDC4CV)},
        year = 2013,
        month = dec,
        url = {}
      }
  • Research Areas: Computer Vision, Robotics


Planes are dominant in most indoor and outdoor scenes, and a hybrid algorithm that incorporates both point and plane features offers numerous advantages. We present a tracking algorithm for RGB-D cameras that uses both points and planes as primitives, and we show how to extend the standard prediction-and-correction framework to handle planes in addition to points. By fitting planes, we implicitly account for the noise in the depth data that is typical of many commercially available 3D sensors. Compared with techniques that use only points, our tracking algorithm has fewer failure modes, and our reconstructed model is more compact and more accurate. The tracking algorithm is supported by relocalization and bundle adjustment processes to demonstrate a real-time simultaneous localization and mapping (SLAM) system using a hand-held or robot-mounted RGB-D camera. Our experiments show large-scale indoor reconstruction results as both point-based and plane-based 3D models, and demonstrate an improvement over point-based tracking algorithms on a benchmark for RGB-D cameras.
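Two building blocks the abstract refers to are fitting planes to noisy depth measurements and evaluating point-to-plane residuals inside a prediction-and-correction loop. A minimal sketch of both is given below; the function names and the least-squares SVD formulation are illustrative assumptions, not taken from the paper's released software.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points.

    Returns a unit normal n and offset d such that n . x + d = 0
    for points x on the plane.
    """
    centroid = points.mean(axis=0)
    # The smallest right singular vector of the centered points is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

def point_to_plane_residuals(R, t, points, planes):
    """Signed point-to-plane distances after applying a pose (R, t).

    `planes` is a list of (normal, offset) pairs, one per point; in a
    correction step these residuals would be minimized over (R, t).
    """
    transformed = points @ R.T + t
    return np.array([n @ p + d for p, (n, d) in zip(transformed, planes)])
```

Fitting a plane over many depth samples averages out per-pixel sensor noise, which is one reason plane primitives improve robustness over raw depth points.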

