Volume IV-2/W3
ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., IV-2/W3, 17-24, 2017
https://doi.org/10.5194/isprs-annals-IV-2-W3-17-2017
© Author(s) 2017. This work is distributed under
the Creative Commons Attribution 4.0 License.

  18 Aug 2017

UCalMiCeL – UNIFIED INTRINSIC AND EXTRINSIC CALIBRATION OF A MULTI-CAMERA-SYSTEM AND A LASERSCANNER

M. Hillemann1,2 and B. Jutzi1
  • 1Institute of Photogrammetry and Remote Sensing, Karlsruhe, Germany
  • 2Fraunhofer Institute of Optronics, System Technologies and Image Exploitation, Ettlingen, Germany

Keywords: Calibration, Relative Pose, Orientation, Multi-Camera-System, Fisheye, Laserscanner

Abstract. Unmanned Aerial Vehicles (UAVs) with adequate sensors enable new applications in the scope between expensive, large-scale, aircraft-carried remote sensing and time-consuming, small-scale, terrestrial surveying. To perform these applications, cameras and laserscanners are a good sensor combination due to their complementary properties. To exploit this sensor combination, the intrinsics and relative poses of the individual cameras and the relative poses of the cameras and the laserscanners have to be known. In this manuscript, we present a calibration methodology for the Unified Intrinsic and Extrinsic Calibration of a Multi-Camera-System and a Laserscanner (UCalMiCeL). The innovation of this methodology, which extends an existing calibration of a single camera to a line laserscanner, is a unifying bundle adjustment step that ensures an optimal calibration of the entire sensor system. We use generic camera models, including pinhole, omnidirectional and fisheye cameras. For our approach, the laserscanner and each camera have to share a joint field of view, whereas the fields of view of the individual cameras may be disjoint. The calibration approach is tested with a sensor system consisting of two fisheye cameras and a line laserscanner with a range-measuring accuracy of 30 mm. We evaluate the estimated relative poses between the cameras quantitatively by using an additional calibration approach for Multi-Camera-Systems based on control points that are accurately measured by a motion capture system. In the experiments, our novel calibration method achieves a relative pose estimation with a deviation below 1.8° and 6.4 mm.
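As a minimal sketch (not part of the paper's implementation), the deviation reported in the evaluation — an angular difference between two rotation estimates and a Euclidean distance between two translation estimates of the same relative pose — can be computed as follows. The function name and the use of NumPy are illustrative assumptions:

```python
import numpy as np

def pose_deviation(R_a, t_a, R_b, t_b):
    """Compare two estimates of the same relative pose.

    Returns the rotation deviation in degrees (the angle of the
    residual rotation R_a @ R_b.T) and the translation deviation
    as the Euclidean distance between t_a and t_b (in whatever
    length unit the inputs use, e.g. millimetres).
    """
    R_res = R_a @ R_b.T
    # Rotation angle from the trace of the residual rotation,
    # clipped to guard against numerical round-off.
    cos_angle = np.clip((np.trace(R_res) - 1.0) / 2.0, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    trans_dev = np.linalg.norm(np.asarray(t_a) - np.asarray(t_b))
    return angle_deg, trans_dev
```

A calibration estimate whose rotation differs from the reference by 1.8° about one axis and whose translation differs by 6.4 mm would yield exactly those two values from this function.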