ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume V-1-2021
https://doi.org/10.5194/isprs-annals-V-1-2021-137-2021
17 Jun 2021

ROBUST ESTIMATION IN ROBOT VISION AND PHOTOGRAMMETRY: A NEW MODEL AND ITS APPLICATIONS

J. Li, Y. Zhang, and Q. Hu

Keywords: robust estimation, outlier, feature matching, image orientation, point cloud registration, simultaneous localization and mapping (SLAM)

Abstract. Robust estimation (RE) is a fundamental problem in robot vision and photogrammetry and forms the theoretical basis of geometric model estimation in the presence of outliers. However, M-estimators solved by iteratively reweighted least squares (IRLS) are only suitable for cases with low outlier rates (< 50%), and random sample consensus (RANSAC) can only obtain approximate solutions. In this paper, we propose an accurate and general RE model that unifies various robust costs into a common objective function by introducing a “robustness-control” parameter; the model is a superset of the typical least-squares, l1-l2, Cauchy, and Geman-McClure estimators. To optimize it, we embed a parameter-decreasing strategy in IRLS, which we call adaptive IRLS. Adaptive IRLS begins with a least-squares estimate for initialization; the “robustness-control” parameter is then decreased over the iterations so that the proposed model acts as different robust loss functions with different degrees of robustness. We also apply the proposed model to several important tasks in robot vision and photogrammetry, such as line fitting, feature matching, image orientation, and point cloud registration (scan matching). Extensive simulated and real experiments show that the proposed model is robust to more than 80% outliers while preserving the advantages of M-estimators (speed and optimality). Our source code will be made publicly available at https://ljy-rs.github.io/web.
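
The adaptive IRLS idea described in the abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example of annealed IRLS for robust line fitting: a Cauchy-type weight stands in for the unified robust loss, and a robustness-control parameter c is shrunk geometrically from a large value (near least squares) to a small one (strongly robust). The weight function, schedule, and parameter names are assumptions made for illustration, not the authors' formulation.

# Minimal sketch of adaptive (annealed) IRLS for robust line fitting (y = a*x + b).
# Assumptions, not the paper's formulation: a Cauchy-type weight w = 1/(1 + (r/c)^2)
# stands in for the unified robust loss, and the robustness-control parameter c is
# shrunk geometrically, so early iterations behave like least squares and later
# ones down-weight outliers more aggressively.
import numpy as np

def adaptive_irls_line_fit(x, y, c_start=1e3, c_end=1.0, decay=0.5, n_iter=30):
    A = np.column_stack([x, np.ones_like(x)])      # design matrix for y = a*x + b
    params = np.linalg.lstsq(A, y, rcond=None)[0]  # least-squares initialization
    c = c_start
    for _ in range(n_iter):
        r = y - A @ params                         # residuals under the current fit
        w = 1.0 / (1.0 + (r / c) ** 2)             # robust (Cauchy-type) weights
        sw = np.sqrt(w)
        params = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)[0]
        c = max(c * decay, c_end)                  # decrease the robustness-control parameter
    return params

# Synthetic example: inliers on y = 2*x + 1 contaminated with uniform outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 200)
mask = rng.random(200) < 0.6                       # roughly 60% outlier contamination
y[mask] = rng.uniform(-20.0, 20.0, mask.sum())
print(adaptive_irls_line_fit(x, y))                # estimated [slope, intercept]

Because the weights start near 1 for all residuals and only tighten as c decreases, the sketch mirrors the paper's progression from a least-squares estimate toward increasingly robust behavior; the specific loss, decay factor, and stopping rule used by the authors may differ.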