ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Copernicus Publications
Volume V-1-2022
17 May 2022


P. Trusheim, M. Mehltretter, F. Rottensteiner, and C. Heipke

Keywords: Image Orientation, Dynamic Scene, Bundle Adjustment, Cooperative Localisation

Abstract. In the context of image orientation, it is commonly assumed that the environment is completely static. This is why dynamic elements are typically filtered out using robust estimation procedures. Especially in urban areas, however, many such dynamic elements are present in the environment, which leads to a noticeable number of errors that have to be detected via robust adjustment. This problem is even more evident in the case of cooperative image orientation using dynamic objects as ground control points (GCPs), because such dynamic objects carry the relevant information. One way to deal with this challenge is to detect these dynamic objects prior to the adjustment and to process the related image points separately. To do so, a novel methodology to distinguish dynamic and static image points in stereoscopic image sequences is introduced in this paper, using a neural network for the detection of potentially dynamic objects and additional checks via forward intersection. To investigate the effect of considering dynamic points in the adjustment, an image sequence of an inner-city traffic scenario is used; image orientation, as well as the 3D coordinates of tie points, are calculated via a robust bundle adjustment. It is shown that, compared to a solution without considering dynamic points, errors in the tie points are significantly reduced, while the median of the precision of all 3D coordinates of the tie points is improved.
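The forward-intersection check mentioned in the abstract can be illustrated with a minimal sketch: a tie point observed in a stereo pair is triangulated at two epochs, and if its reconstructed 3D position moves by more than a threshold, it is flagged as dynamic. Note that this is a simplified illustration, not the authors' actual method; the function names, the linear DLT triangulation, and the fixed motion threshold are all assumptions for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear forward intersection (DLT) of one point from two views.

    P1, P2: 3x4 projection matrices of the left/right camera.
    x1, x2: 2D image observations of the same point.
    Returns the 3D point in Euclidean coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X: x * (P[2] @ X) - P[0] @ X = 0, etc.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def is_dynamic(P_left, P_right, obs_t0, obs_t1, threshold=0.5):
    """Flag a tie point as dynamic if its forward-intersected 3D
    position moves by more than `threshold` (scene units) between
    two epochs.  obs_t0 / obs_t1 are (x_left, x_right) pairs of
    image observations; the cameras are assumed stationary here
    for simplicity (a moving platform would use per-epoch poses).
    """
    X0 = triangulate(P_left, P_right, *obs_t0)
    X1 = triangulate(P_left, P_right, *obs_t1)
    return np.linalg.norm(X1 - X0) > threshold
```

A static point yields (up to noise) the same intersected coordinates at both epochs and passes the check, while a point on a moving vehicle exceeds the threshold and would be processed separately before the bundle adjustment.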