Decoupled Localization and Sensing with HMD-based AR for Interactive Scene Acquisition
Research output: Contribution to journal › Conference article › Research › peer-review
Real-time tracking and visual feedback make interactive AR-assisted capture systems a convenient and low-cost alternative to specialized sensor rigs and robotic gantries. We present a simple strategy for decoupling localization and visual feedback in these applications from the primary sensor being used to capture the scene. Our strategy is to use an AR HMD and a 6-DOF controller for tracking and feedback, synchronized with a separate primary sensor that captures the scene. This approach allows convenient real-time localization of sensors that cannot localize themselves (e.g., microphones). In this poster paper, we present a prototype implementation of this strategy and investigate the accuracy of decoupled tracking by mounting a high-resolution camera as the primary sensor and comparing the decoupled runtime pose estimates to the pose estimates of a high-resolution offline structure-from-motion reconstruction.
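The core of such a decoupled pipeline is associating each capture from the primary sensor with the HMD/controller pose logged closest in time. A minimal sketch of that synchronization step, assuming timestamped 6-DOF poses are available as parallel lists (the function name and data below are hypothetical illustrations, not the paper's implementation):

```python
from bisect import bisect_left

def nearest_pose(pose_times, poses, capture_time):
    """Return the tracked 6-DOF pose whose timestamp is closest to capture_time.

    pose_times must be sorted ascending; poses is a parallel list of
    (x, y, z, qx, qy, qz, qw) tuples logged by the HMD/controller.
    """
    i = bisect_left(pose_times, capture_time)
    if i == 0:
        return poses[0]
    if i == len(pose_times):
        return poses[-1]
    before, after = pose_times[i - 1], pose_times[i]
    # Pick whichever logged pose is temporally closer to the capture.
    return poses[i] if after - capture_time < capture_time - before else poses[i - 1]

# Hypothetical controller poses logged at ~60 Hz.
pose_times = [0.000, 0.016, 0.033, 0.050]
poses = [(0.0,) * 7, (0.1,) * 7, (0.2,) * 7, (0.3,) * 7]
print(nearest_pose(pose_times, poses, 0.030))  # closest log entry is at t=0.033
```

In practice one would also apply a fixed hand-eye offset between the controller and the mounted sensor, and possibly interpolate between neighboring poses rather than snapping to the nearest one.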
Original language | English |
---|---|
Journal | Adjunct Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2020 |
Pages (from-to) | 167-171 |
Number of pages | 5 |
Publication status | Published - Nov 2020 |
Externally published | Yes |
Event | 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2020, Virtual, Recife, Brazil, 9 Nov 2020 → 13 Nov 2020 |
Conference
Conference | 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2020 |
---|---|
Country | Brazil |
City | Virtual, Recife |
Period | 09/11/2020 → 13/11/2020 |
Bibliographical note
Funding Information:
This work was supported in part by Stibofonden’s IT Travel Grant for PhD students and by Innovation Fund Denmark.
Publisher Copyright:
© 2020 IEEE.
Research areas

- HCI theory, concepts and models, Human computer interaction (HCI), Human-centered computing, Interaction paradigms, Mixed / augmented reality