Browsing by Subject "Inertial Navigation"
Item: Extrinsic and intrinsic sensor calibration (2013-12). Mirzaei, Faraz M.

Sensor calibration is the process of determining the intrinsic (e.g., focal length) and extrinsic (i.e., position and orientation (pose) with respect to the world, or to another sensor) parameters of a sensor. This task is an essential prerequisite for many applications in robotics, computer vision, and augmented reality. For example, in robotics, in order to fuse measurements from different sensors (e.g., camera, LIDAR, gyroscope, accelerometer, and odometer, for the purpose of Simultaneous Localization and Mapping, or SLAM), all the sensors' measurements must be expressed with respect to a common frame of reference, which requires knowing the relative pose of the sensors. In augmented reality, the pose of a sensor (a camera in this case) with respect to the surrounding world, along with its internal parameters (focal length, principal point, and distortion coefficients), must be known in order to superimpose an object into the scene.

When designing calibration procedures, and before selecting a particular estimation algorithm, there are two main issues that one needs to consider:
- whether the system is observable, meaning that the sensor's measurements contain sufficient information for estimating all degrees of freedom (d.o.f.) of the unknown calibration parameters;
- given an observable system, whether it is possible to find the globally optimal solution.

Addressing these issues is particularly challenging due to the nonlinearity of the sensors' measurement models. Specifically, classical methods for analyzing the observability of linear systems (e.g., the observability Gramian) are not directly applicable to nonlinear systems. Therefore, more advanced tools, such as Lie derivatives, must be employed to investigate these systems' observability. Furthermore, providing a guarantee of optimality for estimators applied to nonlinear systems is very difficult, if not impossible. This is because commonly used (iterative) linearized estimators require initialization and may only converge to a local optimum. Even with accurate initialization, no guarantee can be made regarding the optimality of the solution computed by linearized estimators.

In this dissertation, we address some of these challenges for several common sensors, including cameras, 3D LIDARs, gyroscopes, Inertial Measurement Units (IMUs), and odometers. Specifically, in the first part of this dissertation we employ Lie-algebra techniques to study the observability of the gyroscope-odometer and IMU-camera calibration systems. In addition, we prove the observability of the 3D LIDAR-camera calibration system by demonstrating that only a finite number of values for the calibration parameters can produce a given set of measurements. Moreover, we provide the conditions on the control inputs and measurements under which these systems become observable.

In the second part of this dissertation, we present a novel method for mitigating the initialization requirements of iterative estimators for the 3D LIDAR-camera and monocular camera calibration systems. Specifically, for each problem we formulate a nonlinear Least-Squares (LS) cost function whose optimality conditions comprise a system of polynomial equations. We subsequently exploit recent advances in algebraic geometry to analytically solve these multivariate polynomial systems and compute the LS critical points.
Finally, the guaranteed LS-optimal solutions are directly found by evaluating the cost function at the critical points, without requiring any initialization or iteration. Together, our observability analysis and analytical LS methods provide a framework for accurate and reliable calibration of common sensors in robotics and computer vision.
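To make the analytical least-squares step more concrete, the following is a minimal sketch of the idea on a made-up one-parameter problem: the LS cost is a polynomial, its stationarity condition is a polynomial equation, and the global optimum is found by evaluating the cost at every real critical point. The measurement model h(x) = x**2, the measurement values, and the use of SymPy's generic polynomial solver are illustrative assumptions only; the dissertation's actual solvers treat the multivariate polynomial systems arising from the LIDAR-camera and camera calibration models with algebraic-geometry techniques.

```python
# Toy illustration of the analytical least-squares idea: form a polynomial LS
# cost, solve its polynomial optimality condition, and pick the global optimum
# by evaluating the cost at the real critical points -- no initialization or
# iteration required. (Hypothetical 1-D example; the calibration problems in
# the dissertation are multivariate.)
import sympy as sp

x = sp.symbols('x', real=True)

# Made-up measurement model h(x) = x**2 and noisy "measurements" z_i.
measurements = [1.1, 0.9, 1.05]
cost = sum((x**2 - z)**2 for z in measurements)   # nonlinear LS cost J(x)

# The optimality condition dJ/dx = 0 is a polynomial equation in x.
grad = sp.diff(cost, x)
critical_points = sp.solve(sp.Eq(grad, 0), x)

# Keep the real critical points and evaluate J at each of them.
real_cps = [cp for cp in critical_points if cp.is_real]
values = [(cp, cost.subs(x, cp)) for cp in real_cps]

# The LS-optimal solution is the critical point with the smallest cost.
# (This toy model is symmetric in x, so +x_opt and -x_opt tie; min() returns one.)
x_opt, J_opt = min(values, key=lambda t: t[1])
print(f"real critical points: {[float(cp) for cp in real_cps]}")
print(f"global LS optimum: x = {float(x_opt):.6f}, J = {float(J_opt):.6f}")
```

Because every critical point of the polynomial cost is recovered, the minimum found this way is the global LS optimum by construction, which is the property the abstract emphasizes for the full calibration problems.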