Towards a Fast, Robust and Accurate Visual-Inertial Simultaneous Localization and Mapping System

Published Date

2022-05

Type

Thesis or Dissertation

Abstract

A Simultaneous Localization and Mapping (SLAM) system estimates a robot's instantaneous location using onboard sensory measurements, e.g., LiDAR (Light Detection and Ranging) sensors, cameras, and inertial measurement units (IMUs). SLAM is an essential capability in environments where GPS reception is weak or unavailable, such as indoor, urban, and underwater settings, yet these same environments make it particularly challenging. Moreover, robots operating outdoors often face poor visual conditions and have limited onboard computational resources. Our research investigated efficient and robust SLAM algorithms for robots with limited computational and energy capabilities operating in challenging scenarios. This dissertation is divided into three parts. The first part develops a stereo visual SLAM system for mobile robots operating outdoors with limited computational capacity. Unlike state-of-the-art SLAM systems, the proposed method does not rely on feature detection and matching, which makes it computationally efficient and robust under adverse visual conditions, as thoroughly validated on public datasets. In the second part, we extend the visual SLAM system to a visual-inertial system by integrating IMU data for improved accuracy and robustness. Unlike most existing visual-inertial systems, which are discrete-time, our system is continuous-time, built on a spline trajectory representation; this provides the versatility to handle SLAM-related challenges (e.g., rolling shutter distortion) and applications (e.g., smooth path planning). Extensive experiments validate its state-of-the-art accuracy and real-time computational efficiency. The third part turns to rolling shutter distortion, with the goal of improving SLAM performance on rolling shutter cameras. We propose a deep neural network for accurate rolling shutter correction from a single-view image and IMU data, enabling numerous vision algorithms (e.g., SLAM systems) to run on rolling shutter cameras and produce highly accurate results. We demonstrate its efficacy by evaluating a SLAM algorithm on rolling shutter imagery corrected by the proposed approach. In summary, this dissertation focuses on improving the efficiency and robustness of SLAM systems in challenging scenarios such as the underwater environment. By advancing the state of the art, the proposed methodologies bring SLAM systems one step closer to practical use on mobile robots in challenging environments.
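
The abstract mentions a continuous-time trajectory built on a spline representation but does not spell out the formulation. As a rough illustration only, and not the dissertation's actual SE(3) spline model, the sketch below shows how a uniform cubic B-spline over 3-D translation control points can be queried at an arbitrary timestamp; the function name, control-point layout, and knot spacing are assumptions made for this example.

```python
import numpy as np

# Blending matrix of the standard uniform cubic B-spline:
#   B(u) = [1, u, u^2, u^3] * M * [p_{i-1}, p_i, p_{i+1}, p_{i+2}]^T,  u in [0, 1)
M = (1.0 / 6.0) * np.array([
    [ 1.0,  4.0,  1.0, 0.0],
    [-3.0,  0.0,  3.0, 0.0],
    [ 3.0, -6.0,  3.0, 0.0],
    [-1.0,  3.0, -3.0, 1.0],
])

def spline_position(control_points, knot_dt, t):
    """Evaluate a uniform cubic B-spline trajectory at time t (illustrative only).

    control_points: (N, 3) array of 3-D translation control points, assumed to be
                    spaced knot_dt seconds apart starting at t = 0.
    """
    control_points = np.asarray(control_points, dtype=float)
    # Segment index containing t, and the normalized offset u within that segment.
    i = int(t / knot_dt)
    i = min(max(i, 1), len(control_points) - 3)  # segment needs points i-1 .. i+2
    u = t / knot_dt - i
    u_vec = np.array([1.0, u, u**2, u**3])
    local = control_points[i - 1:i + 3]          # the four active control points
    return u_vec @ M @ local

# Example query: position at t = 0.25 s with control points every 0.1 s.
ctrl = np.array([[0, 0, 0], [1, 0, 0], [2, 1, 0], [3, 1, 1], [4, 2, 1]], dtype=float)
print(spline_position(ctrl, knot_dt=0.1, t=0.25))
```

Because the interpolated pose (and, by differentiation, velocity and acceleration) is defined at every instant rather than only at image timestamps, such a representation can absorb per-row timestamps of a rolling shutter camera and asynchronous IMU samples, which is the kind of versatility the abstract alludes to.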

Description

University of Minnesota Ph.D. dissertation. May 2022. Major: Computer Science. Advisor: Junaed Sattar. 1 computer file (PDF); xi, 107 pages.

Suggested citation

Mo, Jiawei. (2022). Towards a Fast, Robust and Accurate Visual-Inertial Simultaneous Localization and Mapping System. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/241441.
