Title: Wide-Area Scene Mapping for Mobile Visual Tracking
Publication Type: Conference Paper
Year of Publication: 2012
Authors: Ventura, J., and T. Höllerer
Conference Name: IEEE International Symposium on Mixed and Augmented Reality
We propose a system for easily preparing arbitrary wide-area environments for subsequent real-time tracking with a handheld device. Our system evaluation shows that minimal user effort is required to initialize a camera tracking session in an unprepared environment. We combine panoramas captured using a handheld omnidirectional camera from several viewpoints to create a point cloud model. After the offline modeling step, live camera pose tracking is initialized by feature point matching, and continuously updated by aligning the point cloud model to the camera image. Given a reconstruction made with less than five minutes of video, we achieve below 25 cm translational error and 0.5 degrees rotational error for over 80% of images tested. In contrast to camera-based simultaneous localization and mapping (SLAM) systems, our methods are suitable for handheld use in large outdoor spaces.
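The accuracy thresholds reported above (25 cm translational error, 0.5 degrees rotational error) correspond to standard camera-pose error metrics. A minimal sketch of those metrics is shown below; the function names and threshold parameters are illustrative assumptions, not taken from the paper, which does not specify its evaluation code.

```python
import math

def translational_error(t_est, t_gt):
    """Euclidean distance (meters) between estimated and ground-truth camera positions."""
    return math.dist(t_est, t_gt)

def rotational_error_deg(R_est, R_gt):
    """Angle (degrees) of the relative rotation between two 3x3 rotation matrices.

    Uses trace(R_gt^T R_est) = 1 + 2*cos(theta).
    """
    # trace(R_gt^T R_est) equals the element-wise product sum of the two matrices
    trace = sum(R_gt[i][j] * R_est[i][j] for i in range(3) for j in range(3))
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    cos_theta = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return math.degrees(math.acos(cos_theta))

def pose_within_tolerance(t_est, t_gt, R_est, R_gt,
                          max_trans_m=0.25, max_rot_deg=0.5):
    """Check a pose estimate against the accuracy thresholds reported in the abstract."""
    return (translational_error(t_est, t_gt) <= max_trans_m
            and rotational_error_deg(R_est, R_gt) <= max_rot_deg)
```

The paper's "over 80% of images tested" figure would then be the fraction of test images for which a check like `pose_within_tolerance` passes.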