forked from hku-mars/r3live
R3LIVE

A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

Our preprint paper: Our preprint paper is available here.

[Updated] Date of release: We have just received the reviewer comments from the first round of paper review, and we have decided to release our code before Dec 31, 2021.

Our related video: our related video is now available on YouTube (click the images below to open, or watch it on Bilibili (1, 2)):


Introduction

R3LIVE is a novel LiDAR-Inertial-Visual sensor fusion framework that takes advantage of measurements from LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation. R3LIVE consists of two subsystems: LiDAR-inertial odometry (LIO) and visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors to build the geometric structure of the global map (i.e., the positions of 3D points). The VIO subsystem uses the data from the visual-inertial sensors to render the map's texture (i.e., the color of 3D points).
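The division of labor between the two subsystems can be illustrated with a minimal sketch. This is a hypothetical toy data structure, not the actual R3LIVE implementation (which is written in C++ and stores points in spatial structures such as an ikd-Tree): the LIO subsystem adds and positions map points, while the VIO subsystem only colors the points that already exist.

```python
# Hypothetical sketch of R3LIVE's subsystem split (illustration only):
# LIO maintains geometry (point positions), VIO maintains texture (point colors).

class MapPoint:
    def __init__(self, xyz):
        self.xyz = xyz   # geometry, written by the LIO subsystem
        self.rgb = None  # texture, written by the VIO subsystem

class GlobalMap:
    def __init__(self):
        self.points = []

    def lio_update(self, lidar_points):
        # LIO: register new LiDAR points, building the geometric structure
        for xyz in lidar_points:
            self.points.append(MapPoint(xyz))

    def vio_render(self, color_fn):
        # VIO: color existing points from the camera measurements
        # (here color_fn stands in for projecting a point into the image)
        for p in self.points:
            p.rgb = color_fn(p.xyz)

m = GlobalMap()
m.lio_update([(0.0, 0.0, 1.0), (1.0, 2.0, 3.0)])  # geometry first
m.vio_render(lambda xyz: (128, 128, 128))          # then texture
```

Note the one-way dependency: the VIO step never moves points, it only attaches colors, which mirrors the paper's description of geometry coming from LIO and texture from VIO.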

The overview of this package:
(a): R3LIVE reconstructs a dense, 3D, RGB-colored point cloud of the traveled environment in real time; the white path is the trajectory traveled while collecting the data. (b): The mesh reconstructed with our offline utilities. (c): The mesh after being textured with the vertex colors rendered by our VIO subsystem.

R3LIVE is developed based on our previous work R2LIVE. With careful architecture design and implementation, it is a versatile and well-engineered system suited to a variety of applications: it can not only serve as a SLAM system for real-time robotic applications, but can also reconstruct dense, precise, RGB-colored 3D maps for applications such as surveying and mapping. Moreover, to make R3LIVE more extensible, we develop a series of offline utilities for reconstructing and texturing meshes, which further narrows the gap between R3LIVE and various 3D applications such as simulators and video games.
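One conceptual piece of the offline texturing step described above is assigning each mesh vertex the color of a nearby point in the RGB-colored cloud. The sketch below is a hypothetical nearest-neighbor illustration of that idea, not R3LIVE's actual utility (which is C++-based and would use an efficient spatial index rather than brute force):

```python
# Hypothetical illustration: color mesh vertices from a colored point cloud
# by nearest-neighbor lookup (brute force; a real system would use a KD-tree).

def nearest(colored_points, v):
    # colored_points: list of ((x, y, z), (r, g, b)); v: vertex position
    return min(colored_points,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], v)))

def texture_mesh(vertices, colored_points):
    # For each mesh vertex, take the color of the closest map point.
    return [nearest(colored_points, v)[1] for v in vertices]

pts = [((0, 0, 0), (255, 0, 0)), ((1, 1, 1), (0, 255, 0))]
verts = [(0.1, 0.0, 0.0), (0.9, 1.0, 1.0)]
colors = texture_mesh(verts, pts)  # → [(255, 0, 0), (0, 255, 0)]
```

Because the VIO subsystem has already colored the point cloud online, this offline pass only has to transfer colors to mesh vertices, which is what makes the exported meshes directly usable in simulators and game engines.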

We use the maps reconstructed by R3LIVE to build the car (in (a)) and drone (in (b)) simulators with AirSim (https://microsoft.github.io/AirSim). The images in the green and blue frameboxes are the depth and RGB images queried from AirSim's API, respectively.
We use the maps built by R3LIVE to develop video games for mobile platforms (see (a)) and desktop PCs (see (b)). In (a), the player is controlling the actor to explore the campus of HKU. In (b), the player is fighting the dragon by shooting rubber balls on the campus of HKUST.

Related works

  1. FAST-LIO: A computationally efficient and robust LiDAR-inertial odometry package.
  2. R2LIVE: A robust, real-time tightly-coupled multi-sensor fusion package.
  3. ikd-Tree: A state-of-the-art dynamic KD-Tree for 3D kNN search.
  4. LOAM-Livox: A robust LiDAR Odometry and Mapping (LOAM) package for Livox LiDAR.
