Joint Estimation of Camera Orientation and Vanishing Points from an Image Sequence in a Non-Manhattan World
– Authors: Jeong-Kyun Lee, Kuk-Jin Yoon
– Published Date: July 2019
– Place of publication: International Journal of Computer Vision
Abstract:
A widely used approach for estimating camera orientation is to use points at infinity, i.e., the vanishing points (VPs). Enforcing the orthogonality constraint between the VPs, known as the Manhattan world constraint, enables drift-free estimation of the camera orientation. However, in practical applications, this approach is neither effective (because of noisy parallel line segments) nor applicable in non-Manhattan world scenes. To overcome these limitations, we propose a novel method that jointly estimates the VPs and camera orientation based on sequential Bayesian filtering. The proposed method does not require the Manhattan world assumption and achieves highly accurate estimation of the camera orientation. To enhance the robustness of the joint estimation, we propose a keyframe-based feature management technique that removes false positives from parallel line clusters and detects new parallel line sets using geometric properties such as the orthogonality and rotational dependence among a VP, a line, and the camera rotation. In addition, we propose a 3-line camera rotation estimation method that does not require the Manhattan world assumption. The 3-line method is applied within a RANSAC-based outlier rejection scheme to eliminate outlier measurements; therefore, the proposed method robustly estimates the camera orientation and VPs in general scenes with non-orthogonal parallel lines. We demonstrate the superiority of the proposed method through an extensive evaluation on synthetic and real datasets and through comparison with other state-of-the-art methods.
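To make the "rotational dependence among a VP, a line, and the camera rotation" concrete, the standard projective relations can be written as below. This is a generic sketch with notation chosen here (K, R, d_i, v_i, l), not taken from the paper:

```latex
% Generic VP--rotation relations (notation assumed, not from the paper):
%   v_i : vanishing point of the i-th parallel line cluster (homogeneous image coords)
%   K   : camera intrinsic matrix,  R : world-to-camera rotation
%   d_i : unit 3D direction of the i-th parallel line cluster
\begin{align}
  \mathbf{v}_i &\simeq K R \,\mathbf{d}_i
    && \text{(rotational dependence of a VP)} \\
  \mathbf{l}^\top \mathbf{v}_i = 0
    \;\Rightarrow\; \mathbf{l}^\top K R \,\mathbf{d}_i &= 0
    && \text{(a line } \mathbf{l} \text{ consistent with VP } \mathbf{v}_i\text{)} \\
  \mathbf{d}_i^\top \mathbf{d}_j &= 0
    && \text{(orthogonality, only if clusters } i, j \text{ are orthogonal)}
\end{align}
```

Under the Manhattan assumption the third relation holds for all cluster pairs; the paper's point is that the first two constraints can still be exploited when it does not.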
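For intuition on how VP measurements constrain the rotation and how RANSAC-style outlier rejection fits in, here is a minimal, self-contained Python sketch. It uses a standard Wahba/Kabsch fit over direction correspondences with a 2-sample RANSAC loop; it is illustrative only and is not the paper's 3-line solver, and all names and parameters (rotation_from_directions, ransac_rotation, iters, thresh_deg) are assumptions introduced here:

```python
import numpy as np

def rotation_from_directions(world_dirs, cam_dirs):
    """Wahba/Kabsch: find R minimizing sum_i ||c_i - R d_i||^2 for unit directions.

    world_dirs, cam_dirs: (n, 3) arrays of unit vectors with consistent signs
    (VP directions are only defined up to sign; we assume signs are resolved).
    """
    B = cam_dirs.T @ world_dirs                      # 3x3 cross-covariance sum c_i d_i^T
    U, _, Vt = np.linalg.svd(B)
    S = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # enforce det(R) = +1
    return U @ S @ Vt

def ransac_rotation(world_dirs, cam_dirs, iters=200, thresh_deg=3.0, seed=None):
    """RANSAC over direction pairs: reject VP measurements inconsistent with R."""
    rng = np.random.default_rng(seed)
    n = len(world_dirs)
    best_R, best_inliers = np.eye(3), np.zeros(n, dtype=bool)
    cos_thresh = np.cos(np.deg2rad(thresh_deg))
    for _ in range(iters):
        idx = rng.choice(n, size=2, replace=False)   # 2 non-parallel directions fix R
        R = rotation_from_directions(world_dirs[idx], cam_dirs[idx])
        # Inlier test: rotated world direction must align with the observed
        # camera-frame VP direction within the angular threshold.
        aligned = np.sum(cam_dirs * (world_dirs @ R.T), axis=1) > cos_thresh
        if aligned.sum() > best_inliers.sum():
            best_R, best_inliers = R, aligned
    if best_inliers.sum() >= 2:                      # refit on all inliers
        best_R = rotation_from_directions(world_dirs[best_inliers],
                                          cam_dirs[best_inliers])
    return best_R, best_inliers
```

The 2-direction minimal sample is chosen here only because two non-parallel direction correspondences already determine a rotation; the paper's contribution is a 3-line minimal solver that operates on raw line measurements without assuming orthogonal directions.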