Developing a Data Model for an Omnidirectional Image-Based Multi-Scale Representation of Space

Alexis Richard C. Claridades, Misun Kim, Jiyeong Lee

Research output: Contribution to journal › Conference article › peer-review

Abstract

One of the major challenges facing existing spatial data is the fragmented representation of indoor and outdoor space. While studies on the use of omnidirectional images for representing space and providing Location-Based Services (LBS) have been increasing, the representation of the different scales of space, both indoors and outdoors, has yet to be addressed. This study aims to develop a data model for generating a multi-scale image-based representation of space using omnidirectional image-based spatial relationships. This paper identifies the different scales of space represented in spatial data and extends previous approaches that use omnidirectional images for indoor LBS towards representing the other scales of space, particularly outdoor space. Using sample data, we present an experimental implementation to demonstrate the potential of the proposed data model. Results show that, apart from the realistic visualization that image data provides, basic spatial functions can be performed on image data constructed according to the proposed data model.
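
The abstract does not spell out the model's internal structure, so the sketch below is an illustrative assumption only, not the authors' actual schema: it supposes a node-link structure in which each omnidirectional image is a node carrying a capture position and a scale level, and links encode the spatial (topological) relationships between images, including indoor-outdoor transitions. All names (ImageNode, ImageNetwork, nearest, route) and the scale taxonomy are hypothetical; the "basic spatial functions" shown are a nearest-image query and a connectivity (routing) query over the links.

```python
from dataclasses import dataclass, field
from math import sqrt

# Hypothetical scale levels; the paper's actual scale taxonomy may differ.
SCALES = ("indoor", "building", "street", "district")

@dataclass
class ImageNode:
    """One omnidirectional image treated as a node in the representation."""
    node_id: str
    x: float                # capture position (planar coordinates, metres)
    y: float
    scale: str               # one of SCALES
    links: set = field(default_factory=set)  # ids of topologically connected images

@dataclass
class ImageNetwork:
    """Node-link structure: images as nodes, spatial relationships as links."""
    nodes: dict = field(default_factory=dict)

    def add_node(self, node):
        self.nodes[node.node_id] = node

    def connect(self, a, b):
        # Undirected link, e.g. a doorway, corridor adjacency, or an
        # indoor-outdoor entrance connecting two image nodes.
        self.nodes[a].links.add(b)
        self.nodes[b].links.add(a)

    def nearest(self, x, y, scale=None):
        """Basic spatial function: nearest image node, optionally within one scale."""
        candidates = [n for n in self.nodes.values() if scale is None or n.scale == scale]
        return min(candidates, key=lambda n: sqrt((n.x - x) ** 2 + (n.y - y) ** 2))

    def route(self, start, goal):
        """Breadth-first path over links, i.e. a connectivity query between images."""
        frontier, seen = [[start]], {start}
        while frontier:
            path = frontier.pop(0)
            if path[-1] == goal:
                return path
            for nxt in self.nodes[path[-1]].links - seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
        return None

# Usage: a small indoor-to-street chain of omnidirectional images.
net = ImageNetwork()
net.add_node(ImageNode("room_101", 2.0, 3.0, "indoor"))
net.add_node(ImageNode("lobby", 10.0, 3.0, "indoor"))
net.add_node(ImageNode("entrance", 15.0, 3.0, "building"))
net.add_node(ImageNode("street_01", 25.0, 3.0, "street"))
net.connect("room_101", "lobby")
net.connect("lobby", "entrance")           # indoor-outdoor transition
net.connect("entrance", "street_01")
print(net.nearest(24.0, 4.0).node_id)      # -> street_01
print(net.route("room_101", "street_01"))  # -> ['room_101', 'lobby', 'entrance', 'street_01']
```

Keeping the scale as a node attribute rather than splitting images into separate indoor and outdoor datasets is one simple way to express a multi-scale representation in a single structure; the paper's model may organize the scales differently.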

Original language: English
Pages (from-to): 95-102
Number of pages: 8
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 10
Issue number: 4/W5-2024
State: Published - 27 Jun 2024
Event: 19th 3D GeoInfo Conference 2024 - Vigo, Spain
Duration: 1 Jul 2024 - 3 Jul 2024

Keywords

  • Image and Topology Integration
  • Indoor-Outdoor Integration
  • Location-based Service
  • Omnidirectional Image

