Volume IV-2/W4
ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., IV-2/W4, 349-354, 2017
https://doi.org/10.5194/isprs-annals-IV-2-W4-349-2017
© Author(s) 2017. This work is distributed under
the Creative Commons Attribution 4.0 License.

  14 Sep 2017

RELATIVE PANORAMIC CAMERA POSITION ESTIMATION FOR IMAGE-BASED VIRTUAL REALITY NETWORKS IN INDOOR ENVIRONMENTS

M. Nakagawa, K. Akano, T. Kobayashi, and Y. Sekiguchi
  • Dept. of Civil Engineering, Shibaura Institute of Technology, Tokyo, Japan

Keywords: Image-based virtual reality, panoramic image, optical flow, camera network, indoor environment

Abstract. Image-based virtual reality (VR) is a virtual space generated by projecting panoramic images onto a primitive model. In image-based VR, realistic VR scenes can be rendered at lower cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data must be generated without Global Navigation Satellite System (GNSS) positioning data. We therefore focused on image-based VR generation using a panoramic camera in indoor environments, and we propose a methodology to automate network data generation from panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, an elevator hall, a room, and stairs. We confirmed that our methodology can automatically reconstruct network data from panoramic images for image-based VR in indoor environments without GNSS positioning data.
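Estimating relative camera positions from panoramic images typically begins by mapping equirectangular pixel coordinates to unit bearing vectors on the sphere, after which matched bearings between image pairs can feed a relative-pose solver. The following is a minimal illustrative sketch of that first step only; the function name and axis convention (x forward, y left, z up) are assumptions for illustration, not details taken from this paper.

```python
import numpy as np

def equirect_to_bearing(u, v, width, height):
    """Map a pixel (u, v) in an equirectangular panorama of size
    width x height to a unit direction vector in the camera frame
    (assumed convention: x forward, y left, z up)."""
    lon = (u / width - 0.5) * 2.0 * np.pi   # longitude in [-pi, pi]
    lat = (0.5 - v / height) * np.pi        # latitude in [-pi/2, pi/2]
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])
```

For example, the centre pixel of a 4096 x 2048 panorama maps to the forward direction, and every returned vector has unit length by construction.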