Holistic Fusion: Task and Setup-agnostic Robot Localization and State Estimation with Factor Graphs
Julian Nubert, Turcan Tuna, Jonas Frey, Cesar Cadena, Katherine J. Kuchenbecker, Shehryar Khattak, Marco Hutter
Abstract:
Modern robotic systems must operate seamlessly in challenging environments and often require both low-latency local motion estimation (e.g., for dynamic maneuvers) and accurate \textit{global} localization (e.g., for wayfinding).
While most existing sensor fusion solutions are designed with specific tasks or setups in mind, this work introduces Holistic Fusion (HF), a flexible solution for task- and setup-agnostic multi-modal sensor fusion that distinguishes itself through its generality and simple adaptability to new tasks.
HF formulates sensor fusion as a holistic (combined) estimation problem of i) the local and global robot state and ii) the relationships among a (theoretically unlimited) number of dynamic context variables, e.g., for the automatic alignment of coordinate reference frames. This formulation fits a large body of real-world applications without any conceptual modifications.
In particular, the proposed factor-graph solution enables the direct fusion of a (theoretically) arbitrary number of absolute, local, and landmark measurements expressed w.r.t. different reference frames.
To account for the drift of the different absolute measurements, the evolution of their reference frames is modeled as a random walk.
Moreover, to handle jumps in the robot state belief, particular attention is paid to local smoothness and consistency while ensuring global accuracy.
The proposed solution enables low-latency and smooth state estimation on typical robot hardware and provides low-drift, globally consistent estimates of the robot state at IMU rate.
The efficacy of the proposed framework, which is released open source, is demonstrated in six distinct real-world scenarios on three robotic platforms, each with different task requirements.
GitHub
Video
Read The Docs
Doxygen
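To make the core idea concrete, the following is a minimal, self-contained 1-D sketch (not HF's actual API; all variable names and noise values are illustrative assumptions) of the estimation problem the abstract describes: the robot positions and a slowly drifting reference-frame offset are estimated jointly, with the offset's evolution modeled as a random walk, by solving a small weighted least-squares problem of the kind a factor graph encodes.

```python
import numpy as np

# Hypothetical 1-D illustration: jointly estimate robot positions x_k and a
# drifting reference-frame offset o_k. Absolute measurements are expressed in
# the drifting frame (x_k + o_k), and the offset evolves as a random walk.
N = 5
true_x = np.arange(N, dtype=float)               # robot moves 1 m per step
true_o = np.array([0.0, 0.1, 0.2, 0.3, 0.4])     # slow frame drift
odom = np.diff(true_x)                           # local (relative) measurements
absm = true_x + true_o                           # absolute measurements, drifting frame

# Build a linear system over the stacked state z = [x_0..x_4, o_0..o_4].
rows, rhs, wts = [], [], []

def add_factor(coeffs, target, sigma):
    """One factor: sum(c * z[idx]) should equal target, with noise sigma."""
    row = np.zeros(2 * N)
    for idx, c in coeffs:
        row[idx] = c
    rows.append(row); rhs.append(target); wts.append(1.0 / sigma)

add_factor([(0, 1.0)], 0.0, 0.01)                # prior: x_0 = 0
add_factor([(N, 1.0)], 0.0, 0.01)                # prior: o_0 = 0
for k in range(N - 1):
    add_factor([(k + 1, 1.0), (k, -1.0)], odom[k], 0.05)     # odometry factor
    add_factor([(N + k + 1, 1.0), (N + k, -1.0)], 0.0, 0.2)  # random walk on offset
for k in range(N):
    add_factor([(k, 1.0), (N + k, 1.0)], absm[k], 0.1)       # absolute factor

w = np.array(wts)
A = np.array(rows) * w[:, None]                  # row-weighted design matrix
z, *_ = np.linalg.lstsq(A, np.array(rhs) * w, rcond=None)
x_hat, o_hat = z[:N], z[N:]
print("trajectory:", np.round(x_hat, 2))
print("frame drift:", np.round(o_hat, 2))
```

Because the odometry and absolute factors are consistent here, the recovered trajectory stays close to the ground truth while the random-walk factors keep the estimated frame offset smooth; in HF this same structure is applied to full robot states and multiple reference frames simultaneously.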