ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume V-4-2022
ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., V-4-2022, 137–143, 2022
https://doi.org/10.5194/isprs-annals-V-4-2022-137-2022
18 May 2022

MULTIPLE OBJECT TRACKING USING A TRANSFORM SPACE

M. Li1, J. Li1, A. Tamayo1, and L. Nan2
  • 1College of Electronic and Information Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, China
  • 2Faculty of Architecture and the Built Environment, Delft University of Technology, Delft, the Netherlands

Keywords: Multiple Object Tracking, Tracking-by-Detection, Data Association, Deep Features, Transform Space

Abstract. This paper presents a method for multiple object tracking (MOT) in video streams. The method incorporates the prediction of people's physical locations into a tracking-by-detection paradigm. We predict the trajectories of people on an estimated ground plane and apply a learning-based network to extract appearance features across frames. The method transforms detected object locations from image space to the estimated ground space to refine the tracking trajectories. This transform space allows objects detected in multi-view images to be associated under one coordinate system. Moreover, pedestrians that are occluded in image space can be well separated on the rectified ground plane, where their motion models are estimated. The effectiveness of the method is evaluated on different datasets through extensive comparisons with state-of-the-art techniques. Experimental results show that the proposed method improves MOT performance in terms of the number of identity switches (IDSW) and fragmentations (Frag).
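
The image-to-ground transform described in the abstract can be sketched as a planar homography applied to the bottom-center points of detected bounding boxes. The sketch below is illustrative only: the 3x3 matrix `H` is a hypothetical placeholder (the paper estimates the ground plane from the scene), and the helper name `image_to_ground` is our own, not from the paper.

```python
import numpy as np

# Hypothetical 3x3 homography mapping image pixels to ground-plane
# coordinates. In the paper this transform is derived from an estimated
# ground plane; the values here are purely illustrative.
H = np.array([[0.02, 0.0,   -5.0],
              [0.0,  0.05, -10.0],
              [0.0,  0.001,  1.0]])

def image_to_ground(points, H):
    """Project an (N, 2) array of image points onto the ground plane."""
    # Append a homogeneous coordinate to each point.
    pts = np.hstack([points, np.ones((len(points), 1))])
    g = pts @ H.T
    # Normalize by the last homogeneous coordinate.
    return g[:, :2] / g[:, 2:3]

# Bottom-center points of two detected pedestrian boxes (illustrative).
feet = np.array([[320.0, 400.0],
                 [100.0, 380.0]])
ground_pts = image_to_ground(feet, H)
print(ground_pts)
```

Once all detections (possibly from multiple views) live in this common ground coordinate system, data association and motion modeling can operate on physically meaningful distances rather than on pixel offsets, which is what separates occluded pedestrians that overlap in image space.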