ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume V-4-2022
https://doi.org/10.5194/isprs-annals-V-4-2022-219-2022
18 May 2022

IMPROVING 3D PEDESTRIAN DETECTION FOR WEARABLE SENSOR DATA WITH 2D HUMAN POSE

V. Kamalasanan, Y. Feng, and M. Sester

Keywords: 3D pedestrian detection, human pose estimation, augmented reality, shared space, wearable sensor

Abstract. Collisions and safety are central concerns in urban designs such as shared spaces. Since pedestrians (especially elderly and disabled people) are particularly vulnerable to accidents, developing an intelligent wearable mobility aid that helps avoid collisions is a research direction that could improve safety. Moreover, with advances in visualisation technology and the capability to render 3D virtual content, augmented reality (AR) devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., the HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians; this detection can serve as a basis for predicting collisions. To envision an AR device as a safety aid and to demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an extension of the 3D object detection framework Frustum PointNet that incorporates 2D human pose, and we apply it to data from an AR device. Using data recorded with such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach improve the detection of oriented 3D pedestrian instances over Frustum PointNet.
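The abstract does not include code, so the sketch below is only a rough illustration of one way 2D pose could inform a frustum-based pipeline, not the authors' actual method: the function name `pose_augmented_features`, the pose-proximity channel, and all parameter choices are our own assumptions. The idea shown is to project each frustum point into the image and append, as an extra per-point feature, its pixel distance to the nearest detected 2D pose keypoint, so points lying on the person's skeleton are distinguishable from background clutter inside the frustum.

```python
import numpy as np

def project_to_image(points_cam: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    uvw = points_cam @ K.T                # (N, 3) homogeneous image coords
    return uvw[:, :2] / uvw[:, 2:3]       # perspective division

def pose_augmented_features(points_cam: np.ndarray,
                            keypoints_2d: np.ndarray,
                            K: np.ndarray,
                            scale: float = 100.0) -> np.ndarray:
    """
    Hypothetical feature augmentation: append one channel per point giving
    the pixel distance (normalised by `scale`) from the point's image
    projection to the nearest 2D pose keypoint. Points near the detected
    skeleton get low values; frustum clutter gets high values.
    """
    uv = project_to_image(points_cam, K)                            # (N, 2)
    d = np.linalg.norm(uv[:, None, :] - keypoints_2d[None, :, :],
                       axis=-1)                                     # (N, J)
    nearest = d.min(axis=1, keepdims=True) / scale                  # (N, 1)
    return np.concatenate([points_cam, nearest], axis=1)            # (N, 4)

# Toy usage with a synthetic frustum point cloud and 17 COCO-style joints.
rng = np.random.default_rng(0)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
pts = rng.uniform([-1, -1, 2], [1, 1, 6], size=(1024, 3))  # points in front of camera
kps = rng.uniform([200, 100], [440, 380], size=(17, 2))    # detected 2D joints
feats = pose_augmented_features(pts, kps, K)
print(feats.shape)  # (1024, 4): xyz plus the pose-proximity channel
```

In a Frustum PointNet-style network, such an extra channel would simply widen the per-point input features consumed by the segmentation and box-estimation stages; whether the paper injects pose at this point or elsewhere in the pipeline is not stated in the abstract.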