3D Visualization of Building Interior using Omnidirectional Images

Alexis Richard C. Claridades, Jiyeong Lee, Ariel C. Blanco

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation


Until recently, most mapping and visualization efforts have been concentrated on the outdoor environment. Nowadays, the development of maps of the indoor environment has been catching up, aided by new technologies for data collection, processing, and modeling. The need for indoor maps is also emphasized by the demand for data and information about indoor spaces and by applications in evacuation, way-finding, and visualization. Omnidirectional images offer a simple yet realistic method for geographically representing indoor space compared with commonly used data such as point clouds or solid object models: because the images may be georeferenced, (x, y, z) coordinates may be obtained for each pixel. In addition, their 360-degree field of view (FOV) gives users a detailed and seamless visualization. This paper presents a workflow for collecting omnidirectional images and generating a 3D visualization of a building interior in the form of a virtual tour. The experiment was conducted in a building on a university campus. CAD files of the building were used as a guide for selecting the Shooting Points, which are locations in the hallway from which image capture is carried out. A DSLR camera with a fisheye lens, mounted on a rotator and tripod, was used at these Shooting Points to acquire at least six fisheye images completing a 360-degree FOV and ensuring sufficient overlap. The images captured at each Shooting Point were processed using PTGui, a panoramic stitching tool. The overlaps ensured that control points common to at least two images in the set could be selected and used for stitching. Each stitched image was checked for misalignment or erroneous stitching, and additional control points were selected where necessary. To generate the 3D virtual tour, the images were linked using PanoTour according to how each Shooting Point connects to its adjacent Shooting Points.
In PanoTour, the location of each adjacent Shooting Point is pinpointed, and the corresponding view from that point in the corresponding images is indicated. The result is a virtual tour that may be opened in any web browser and gives an immersive and seamless visualization of navigating a 3D indoor environment.
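The abstract notes that coordinates may be recovered per pixel once a panorama is georeferenced. A minimal sketch of the underlying geometry, assuming the stitched output is a standard equirectangular panorama (the paper does not specify the projection), maps a pixel to a unit viewing direction in the camera's local frame; combined with a known camera position and depth, this ray yields an (x, y, z) point:

```python
import math

def pixel_to_direction(u, v, width, height):
    """Map pixel (u, v) in a width x height equirectangular panorama
    to a unit direction vector (x, y, z) in the camera's local frame."""
    # Horizontal pixel position -> longitude in [-pi, pi)
    lon = (u / width) * 2.0 * math.pi - math.pi
    # Vertical pixel position -> latitude in [pi/2, -pi/2]
    lat = math.pi / 2.0 - (v / height) * math.pi
    # Spherical -> Cartesian (unit sphere)
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    return (x, y, z)

# The panorama centre looks along the local +x axis.
print(pixel_to_direction(2048, 1024, 4096, 2048))
```

Scaling this direction by a measured range and adding the georeferenced Shooting Point position would give the world coordinate of the imaged surface point.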
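The linking step described above amounts to building an adjacency graph over the Shooting Points and, for each link, computing the heading at which the neighbouring point appears in the panorama so a navigation hotspot can be placed there. A hedged sketch of that idea, with hypothetical point names and hallway coordinates (not taken from the paper):

```python
import math

# Hypothetical Shooting Points with planar hallway coordinates (metres).
points = {
    "SP1": (0.0, 0.0),
    "SP2": (5.0, 0.0),
    "SP3": (5.0, 4.0),
}

# Adjacency: which Shooting Points are directly linked in the tour.
links = {
    "SP1": ["SP2"],
    "SP2": ["SP1", "SP3"],
    "SP3": ["SP2"],
}

def hotspot_heading(src, dst):
    """Bearing in degrees (0 = +x axis, counter-clockwise) from src
    towards dst, used to place the navigation hotspot in src's panorama."""
    (x1, y1), (x2, y2) = points[src], points[dst]
    return math.degrees(math.atan2(y2 - y1, x2 - x1)) % 360.0

for src, neighbours in links.items():
    for dst in neighbours:
        print(f"{src} -> {dst}: hotspot at {hotspot_heading(src, dst):.1f} deg")
```

In a tool such as PanoTour this placement is done interactively by clicking the neighbour's location in each panorama; the computation above only illustrates the geometric relationship being captured.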

Original language: English
Number of pages: 10
State: Published - 2018
Event: 39th Asian Conference on Remote Sensing: Remote Sensing Enabling Prosperity, ACRS 2018 - Kuala Lumpur, Malaysia
Duration: 15 Oct 2018 – 19 Oct 2018


Conference: 39th Asian Conference on Remote Sensing: Remote Sensing Enabling Prosperity, ACRS 2018
City: Kuala Lumpur


  • 3D Visualization
  • Indoor Navigation
  • Omnidirectional Images
  • Virtual Tour


