Calibration and 3D Reconstruction with a Photo-Realistic Simulator Based on the Omnidirectional Vision System

Ivan Kholodilin1, Yuan Li1, Qinglin Wang1, Paul Bourke2
International Journal of Advanced Robotic Systems, Volume 18, Issue 6.
DOI: 10.1177/17298814211059313

1. State Key Laboratory of Intelligent Control and Decision of Complex Systems, School of Automation, Beijing Institute of Technology, China
2. The University of Western Australia


Recent advances in deep learning require large amounts of annotated training data covering varied environmental conditions. As a result, developing and testing navigation algorithms for mobile robots can be expensive and time-consuming. Motivated by these problems, this paper presents a photo-realistic simulator for the computer vision (CV) community working with omnidirectional vision systems. Built with Unity, the simulator integrates sensors, mobile robots, and elements of the indoor environment, and allows one to generate synthetic photo-realistic datasets with automatic ground truth annotations. With the aid of the proposed simulator, two practical applications are studied: extrinsic calibration of the vision system and 3D reconstruction of the indoor environment. Both the proposed calibration and reconstruction procedures are simple, robust, and accurate. The proposed methods are evaluated experimentally on data generated by the simulator.


Keywords: Calibration, measurements, omnidirectional vision, simulation, structured light
