Holistic Fusion: Task- and Setup-Agnostic Robot Localization and State Estimation with Factor Graphs

Authors: Julian Nubert ([email protected]), Turcan Tuna, Jonas Frey, Cesar Cadena, Katherine J. Kuchenbecker, Shehryar Khattak, Marco Hutter

Holistic Fusion (HF) is an open-source library for flexible, task- and setup-agnostic robot localization and state estimation. It provides a range of features useful in common robotic workflows, including online sensor fusion, offline batch optimization, and calibration. While it already supports a set of measurement factors common in field robotics applications, a core purpose of HF is to simplify the integration of new measurement types without manually deriving Jacobians, by i) utilizing GTSAM expression factors and ii) requiring new measurements to follow a common parent-class interface that depends on the measurement type. Currently, HF supports three general measurement types: i) absolute measurements, ii) landmark measurements, and iii) local & relative measurements.

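As a minimal illustration of why expression factors remove the need for hand-derived Jacobians, the following self-contained GTSAM (C++) sketch adds one absolute pose measurement and one relative pose measurement as expression factors and lets GTSAM differentiate the expressions automatically. Note that this is plain GTSAM rather than HF's actual interface; the keys, noise values, and measurements are illustrative assumptions.

#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/ExpressionFactorGraph.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/slam/expressions.h>

int main() {
  using gtsam::symbol_shorthand::X;

  gtsam::ExpressionFactorGraph graph;

  // Expressions for two unknown robot poses (keys are illustrative).
  gtsam::Pose3_ x0(X(0)), x1(X(1));

  // Absolute measurement: a measured pose of state 0 in the world frame.
  gtsam::Pose3 zAbs(gtsam::Rot3(), gtsam::Point3(1.0, 2.0, 0.0));
  auto absNoise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector6() << 0.01, 0.01, 0.01, 0.1, 0.1, 0.1).finished());
  graph.addExpressionFactor(x0, zAbs, absNoise);

  // Relative measurement: measured motion between states 0 and 1, expressed
  // via gtsam::between(); Jacobians come from the expression tree itself.
  gtsam::Pose3 zRel(gtsam::Rot3::Yaw(0.1), gtsam::Point3(0.5, 0.0, 0.0));
  auto relNoise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector6() << 0.02, 0.02, 0.02, 0.05, 0.05, 0.05).finished());
  graph.addExpressionFactor(gtsam::between(x0, x1), zRel, relNoise);

  // Optimize from a trivial initial guess; no hand-written Jacobians needed.
  gtsam::Values initial;
  initial.insert(X(0), gtsam::Pose3());
  initial.insert(X(1), gtsam::Pose3());
  gtsam::Values result =
      gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize();
  result.print("optimized states:\n");
  return 0;
}

Within HF, a new measurement type additionally derives from the parent class of its measurement category (absolute, landmark, or local/relative); see the Read the Docs page linked below for the actual interfaces.
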
Disclaimer: The framework is still under development and will be updated, extended, and further generalized in the future. More instructions will be added to the Read the Docs page soon.

Modules and Packages

This repository contains the following modules:

  1. Graph MSF: The core library for sensor fusion. It depends mainly on Eigen and GTSAM and can be used with any communication layer (including ROS1 and ROS2).
  2. Graph MSF ROS: This package provides an example class for using GraphMsf in ROS. It depends on GraphMsf and ROS.
  3. ROS1 Examples: Examples of how to use GraphMsf and GraphMsfRos.
  4. ROS2 Examples: COMING SOON

Instructions

Please refer to our Read the Docs for detailed instructions regarding installation and usage.

Note that the documentation is still under construction and will be updated soon.

Code Documentation

Please refer to our Doxygen for documentation of the code.

Paper

If you find this code useful or use it in your work, please consider citing:

[1] arXiv 2025

@misc{nubert2025holisticfusiontasksetupagnostic,
      title={Holistic Fusion: Task- and Setup-Agnostic Robot Localization and State Estimation with Factor Graphs}, 
      author={Julian Nubert and Turcan Tuna and Jonas Frey and Cesar Cadena and Katherine J. Kuchenbecker and Shehryar Khattak and Marco Hutter},
      year={2025},
      eprint={2504.06479},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2504.06479}, 
}

[2] ICRA 2022, Philadelphia

@inproceedings{nubert2022graph,
  title={Graph-based Multi-sensor Fusion for Consistent Localization of Autonomous Construction Robots},
  author={Nubert, Julian and Khattak, Shehryar and Hutter, Marco},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2022},
  organization={IEEE}
}

Acknowledgments

The authors thank their colleagues at ETH Zurich and NASA JPL for their help in conducting the robot experiments and evaluations and for using HF on their robots. Special thanks go to Takahiro Miki and the ANYmal Hike team at the Robotic Systems Lab (RSL), ETH Zurich; Nikita Rudin and David Hoeller for the ANYmal Parkour experiments; Patrick Spieler for running the deployments on the JPL RACER vehicle; the entire excavation team at RSL and Gravis Robotics; Thomas Mantel and the teaching assistants of the ETH Robotic Summer School for their help with the SuperMegaBot; and Mayank Mittal for his help in generating the renderings.