From the Calibration of a Light-Field Camera to Direct Plenoptic Odometry

Abstract

This paper presents a complete framework that extends from the calibration of a plenoptic camera to plenoptic-camera-based visual odometry. This is achieved by establishing the multiple view geometry for plenoptic cameras. Based on this novel multiple view geometry, a calibration approach is developed. The approach optimizes all intrinsic parameters of the plenoptic camera model, the 3D coordinates of the calibration points, and all camera poses in a single bundle adjustment. Our plenoptic-camera-based visual odometry algorithm, called direct plenoptic odometry (DPO), is a direct and semi-dense approach that takes advantage of the full sensor resolution. DPO also relies on our multiple view geometry for plenoptic cameras. Tracking and mapping work directly on the micro images formed by the micro lens array and therefore do not have to deal with aliasing effects in the spatial domain. The algorithm generates a semi-dense depth map based on correspondences between subsequent light-field frames, while taking differently focused micro images into account. To the best of our knowledge, it is the first method that performs tracking and mapping for plenoptic cameras directly on the micro images. DPO outperforms state-of-the-art direct monocular simultaneous localization and mapping (SLAM) algorithms and can compete in accuracy with the latest stereo SLAM approaches, while supplying much more detailed point clouds.
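To illustrate the structure of a single bundle adjustment that jointly refines intrinsics, camera poses, and 3D calibration points, the following is a minimal sketch in Python/SciPy. It is not the paper's method: a simple pinhole projection and generic parameter names (`project`, `residuals`, focal length `f`, etc.) stand in for the plenoptic camera model, which additionally involves micro lens array parameters.

```python
# Minimal bundle-adjustment sketch (illustrative, not the paper's model):
# jointly refine camera intrinsics, camera poses, and 3D point coordinates
# by minimizing reprojection error over all observations at once.
# A simple pinhole projection stands in for the plenoptic camera model.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(p_cam, f, cx, cy):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    return np.array([f * p_cam[0] / p_cam[2] + cx,
                     f * p_cam[1] / p_cam[2] + cy])


def residuals(params, n_cams, n_pts, cam_idx, pt_idx, observations):
    """Stacked reprojection residuals for all observations.

    params = [f, cx, cy | 6 pose parameters per camera | 3 coordinates per point]
    """
    f, cx, cy = params[:3]
    poses = params[3:3 + 6 * n_cams].reshape(n_cams, 6)
    points = params[3 + 6 * n_cams:].reshape(n_pts, 3)

    res = np.empty((len(observations), 2))
    for k, (ci, pi, uv) in enumerate(zip(cam_idx, pt_idx, observations)):
        R = Rotation.from_rotvec(poses[ci, :3]).as_matrix()
        p_cam = R @ points[pi] + poses[ci, 3:]
        res[k] = project(p_cam, f, cx, cy) - uv
    return res.ravel()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_cams, n_pts = 3, 10
    f_true, cx_true, cy_true = 500.0, 320.0, 240.0

    # Synthetic calibration points in front of the cameras, small camera motion.
    points_true = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 6.0], size=(n_pts, 3))
    poses_true = np.zeros((n_cams, 6))
    poses_true[:, 3] = np.linspace(0.0, 0.4, n_cams)  # translate along x

    # Every camera observes every point; add pixel noise.
    cam_idx = np.repeat(np.arange(n_cams), n_pts)
    pt_idx = np.tile(np.arange(n_pts), n_cams)
    obs = np.array([
        project(Rotation.from_rotvec(poses_true[ci, :3]).as_matrix() @ points_true[pi]
                + poses_true[ci, 3:], f_true, cx_true, cy_true)
        for ci, pi in zip(cam_idx, pt_idx)
    ]) + rng.normal(scale=0.5, size=(n_cams * n_pts, 2))

    # Perturbed initial guess for the joint optimization of all parameters.
    x0 = np.concatenate([
        [480.0, 315.0, 235.0],
        (poses_true + rng.normal(scale=0.01, size=poses_true.shape)).ravel(),
        (points_true + rng.normal(scale=0.05, size=points_true.shape)).ravel(),
    ])

    result = least_squares(residuals, x0,
                           args=(n_cams, n_pts, cam_idx, pt_idx, obs))
    rmse = np.sqrt(np.mean(result.fun ** 2))
    print(f"final reprojection RMSE: {rmse:.3f} px")
```

The sketch only conveys the idea of optimizing intrinsics, poses, and points in one least-squares problem; the actual calibration replaces the pinhole projection with the plenoptic projection derived from the paper's multiple view geometry.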

Publication
IEEE Journal of Selected Topics in Signal Processing