ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-3, 449-456, 2016
https://doi.org/10.5194/isprs-annals-III-3-449-2016
© Author(s) 2016. This work is distributed under
the Creative Commons Attribution 3.0 License.

06 Jun 2016

METRIC CALIBRATION OF A FOCUSED PLENOPTIC CAMERA BASED ON A 3D CALIBRATION TARGET

N. Zeller1,3, C. A. Noury2,4, F. Quint1, C. Teulière2,4, U. Stilla3, and M. Dhome2,4
  • 1Karlsruhe University of Applied Sciences, 76133 Karlsruhe, Germany
  • 2Université Clermont Auvergne, Université Blaise Pascal, Institut Pascal, BP 10448, 63000 Clermont-Ferrand, France
  • 3Technische Universität München, 80290 Munich, Germany
  • 4CNRS, UMR 6602, IP, 63178 Aubière, France

Keywords: Bundle adjustment, depth accuracy, depth distortion model, focused plenoptic camera, metric camera calibration

Abstract. In this paper we present a new calibration approach for focused plenoptic cameras. We derive a new mathematical projection model of a focused plenoptic camera which considers lateral as well as depth distortion. To this end, we derive a new depth distortion model directly from the theory of depth estimation in a focused plenoptic camera. In total, the model consists of five intrinsic parameters, the parameters for radial and tangential distortion in the image plane, and two new depth distortion parameters. In the proposed calibration we perform a complete bundle adjustment based on a 3D calibration target. The residual of our optimization approach is three-dimensional, where the depth residual is defined by a scaled version of the inverse virtual depth difference and thus conforms well to the measured data. Our method is evaluated on different camera setups and shows good accuracy. For a better characterization of our approach, we evaluate the accuracy of virtual image points projected back to 3D space.
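
For illustration only, a minimal sketch of such a three-dimensional residual, using hypothetical symbols rather than the authors' exact notation: with measured image coordinates $(x_i, y_i)$ and measured virtual depth $v_i$, model predictions $(\hat{x}_i, \hat{y}_i, \hat{v}_i)$, and a scale factor $s$, the residual for observation $i$ could take the form

\[
\mathbf{r}_i =
\begin{pmatrix}
x_i - \hat{x}_i \\
y_i - \hat{y}_i \\
s \left( \dfrac{1}{v_i} - \dfrac{1}{\hat{v}_i} \right)
\end{pmatrix},
\]

so that the depth component is a scaled difference of inverse virtual depths, in line with the description above.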