Title: Live Tracking and Mapping from Both General and Rotation-Only Camera Motion
Publication Type: Conference Paper
Year of Publication: 2012
Authors: Gauglitz, S., C. Sweeney, J. Ventura, M. Turk, and T. Höllerer
Conference Name: IEEE International Symposium on Mixed and Augmented Reality
Date Published: 10/2012
Conference Location: Atlanta, Georgia

We present an approach to real-time tracking and mapping that supports any type of camera motion in 3D environments, that is, general (parallax-inducing) as well as rotation-only (degenerate) motions. Our approach effectively generalizes both a panorama mapping and tracking system and a keyframe-based Simultaneous Localization and Mapping (SLAM) system, behaving like one or the other depending on the camera movement. It seamlessly switches between the two and is thus able to track and map through arbitrary sequences of general and rotation-only camera movements.

Key elements of our approach are the design of each system component to be compatible with both panoramic and Structure-from-Motion data, and the use of the ‘Geometric Robust Information Criterion’ (GRIC) to decide whether the transformation between a given pair of frames is best modeled with an essential matrix E or with a homography H. Further key features are that no separate initialization step is needed, that the reconstruction is unbiased, and that the system continues to collect and map data after tracking failure, thus creating separate tracks which are later merged if they overlap. The latter is in contrast to most existing tracking and mapping systems, which suspend tracking and mapping while trying to relocalize the camera with respect to the initial map, thus discarding valuable data.
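The GRIC-based model selection mentioned above can be sketched as follows. This is an illustrative implementation of Torr's GRIC score in its commonly cited form, not the paper's exact code; the noise level `sigma`, the model dimensions (d = 2 for a homography H, d = 3 for an essential matrix E), the parameter counts (k = 8 for H, k = 5 for E), and the residual inputs are all assumptions for the sketch. The model with the lower score is preferred.

```python
import math

def gric(sq_residuals, sigma, n, d, k, r=4.0):
    """Torr's Geometric Robust Information Criterion (lower is better).

    sq_residuals: squared transfer/reprojection error per correspondence
    sigma: assumed measurement noise std-dev (pixels)
    n: number of correspondences
    d: dimension of the model manifold (2 for H, 3 for E)
    k: number of model parameters (8 for H, 5 for E)
    r: dimension of the data (4 for a pair of 2D correspondences)
    """
    lam1, lam2, lam3 = math.log(r), math.log(r * n), 2.0
    # robust residual term: errors are capped so outliers pay a fixed cost
    rho = sum(min(e2 / sigma**2, lam3 * (r - d)) for e2 in sq_residuals)
    # penalties for the dimension of the fitted points and the model size
    return rho + lam1 * d * n + lam2 * k

def select_model(sq_res_H, sq_res_E, sigma=1.0):
    """Pick 'H' (rotation-only / planar) or 'E' (general motion)."""
    n = len(sq_res_H)
    g_H = gric(sq_res_H, sigma, n, d=2, k=8)
    g_E = gric(sq_res_E, sigma, n, d=3, k=5)
    return "H" if g_H < g_E else "E"
```

When both models fit the correspondences equally well (e.g. during pure rotation), the lower-dimensional homography wins; when the homography leaves large residuals because of parallax, the essential matrix is selected.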

We tested our system on a variety of video sequences, successfully tracking through different camera motions and fully automatically building panoramas as well as 3D structures.