ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume V-2-2020
ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., V-2-2020, 289–296, 2020
https://doi.org/10.5194/isprs-annals-V-2-2020-289-2020

03 Aug 2020

OBJECT DETECTION AND CLASSIFICATION FROM CLUTTERED LARGE-SCALE INDOOR SCENE VIA ANCHOR-BASED GRAPH

F. Su1, Y. Liang1, Z. Gang1, X. Zuo1, F. Yang1, H. Zhu1, and L. Li1,2
  • 1School of Resource and Environmental Sciences, Wuhan University, 129 Luoyu Road, Wuhan 430079, China
  • 2Collaborative Innovation Centre of Geospatial Technology, Wuhan University, 129 Luoyu Road, Wuhan 430079, China

Keywords: Object detection, Object classification, Graph matching, Geometric Similarity, Point Cloud

Abstract. Indoor object detection and classification from scanned point clouds has recently attracted considerable research interest. However, detecting and classifying objects with arbitrary upward orientations remains a substantial challenge. This paper presents an anchor-based graph method that exploits geometric and topological similarity among indoor objects. The misclassification that typically occurs for objects not placed vertically on the floor is overcome by extracting an anchor in each graph from the nodes' geometric attributes and by matching graphs via the topological relationships between nodes and the anchor, rather than via features along the upward orientation. A region-growing method along the anchor's upward orientation is proposed to classify the unlabeled over-segmented parts. This anchor-based approach ensures both the accuracy of object classification and the geometric integrity of each object. Experiments on three real-world 3D scans of indoor environments demonstrate the effectiveness and feasibility of the proposed method.
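The region-growing step described in the abstract, which attaches unlabeled over-segmented parts to a labeled anchor along its upward orientation, could be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the segment representation (lists of (x, y, z) points), the function names, and the use of 2D footprint overlap as the growth criterion along the upward (z) axis are all assumptions for illustration.

```python
# Hypothetical sketch of region growing along an anchor's upward (z) axis.
# Assumption: each segment is a list of (x, y, z) point tuples; an unlabeled
# part is merged with the anchor when its horizontal (xy) footprint overlaps
# the anchor's, i.e. when it is stacked along the upward orientation.

def footprint(points):
    """Axis-aligned 2D bounding box (xy footprint) of a segment."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def footprints_overlap(a, b):
    """True if two xy bounding boxes intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

def grow_from_anchor(anchor, unlabeled_parts):
    """Iteratively attach unlabeled over-segmented parts whose footprint
    overlaps the growing region; return (merged points, leftover parts)."""
    merged = list(anchor)
    remaining = list(unlabeled_parts)
    changed = True
    while changed:
        changed = False
        region_fp = footprint(merged)
        for seg in list(remaining):
            if footprints_overlap(region_fp, footprint(seg)):
                merged.extend(seg)
                remaining.remove(seg)
                changed = True
    return merged, remaining
```

In this reading, parts of a chair or table split by over-segmentation would share a footprint with the anchor part and be reassembled into one object, preserving geometric integrity; parts elsewhere in the scene remain unmerged.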