Deep Learning-Based Fusion of Multiple IMUs for Robust Inertial Navigation
This demonstration will focus on inertial sensor modeling and its critical role in navigation and localization for marine robotics, specifically for surface vessels such as ships and autonomous surface vehicles (ASVs). We will present a real-time inertial navigation system that combines a multi-IMU setup with deep learning. Using only data from inertial sensors, we will show how accurate position estimation can be achieved without GNSS or other external references, which is crucial in challenging marine environments.

The core of the system is a Bidirectional LSTM (Bi-LSTM) neural network trained to learn and compensate for sensor errors. The demonstration involves live streaming of accelerometer and gyroscope data from multiple IMUs mounted on a surface platform. The neural network processes the data in real time and outputs a 2D trajectory estimate, which is visualized alongside a baseline method, an extended Kalman filter (EKF) fusing the same IMU setup, for comparison. This approach highlights the benefits of data-driven IMU modeling and its potential to improve autonomous marine navigation.

The demo is lightweight, reproducible, and deployable on various USV platforms, including the H2OmniX developed at UNIZG-FER LABUST. Participants will gain insight into the challenges of IMU data interpretation and the benefits of combining deep learning and multi-sensor approaches for robust sensor characterization in navigation systems. Practical examples will show how these approaches are integrated within ROS2-based frameworks for real-time navigation.
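To make the underlying pipeline concrete, the following is a minimal sketch of the non-learned part of such a system: naive fusion of several co-located IMUs by per-sample averaging, followed by planar strapdown dead reckoning (yaw from the gyroscope, body-frame acceleration rotated into the world frame and integrated twice for position). The function names, array shapes, and the averaging scheme are illustrative assumptions, not the authors' implementation; in the demonstrated system, the Bi-LSTM replaces this naive fusion to compensate for the sensor errors that make raw double integration drift.

```python
import numpy as np

def fuse_imus(gyro_z, accel_xy):
    """Naively fuse N co-located IMUs by averaging each sample.
    gyro_z: (N, T) yaw rates [rad/s]; accel_xy: (N, T, 2) body-frame accel [m/s^2].
    (Illustrative baseline only; a learned model would replace this step.)"""
    return gyro_z.mean(axis=0), accel_xy.mean(axis=0)

def dead_reckon_2d(gyro_z, accel_xy, dt):
    """Planar strapdown integration: integrate yaw rate to heading, rotate
    body-frame acceleration into the world frame, then integrate twice."""
    T = len(gyro_z)
    yaw = np.cumsum(gyro_z) * dt          # heading from gyro integration
    pos = np.zeros((T, 2))
    vel = np.zeros(2)
    p = np.zeros(2)
    for k in range(T):
        c, s = np.cos(yaw[k]), np.sin(yaw[k])
        R = np.array([[c, -s], [s, c]])   # body -> world rotation
        vel = vel + R @ accel_xy[k] * dt  # acceleration -> velocity
        p = p + vel * dt                  # velocity -> position
        pos[k] = p
    return pos
```

Even this toy version shows why multiple IMUs help: uncorrelated sensor noise partially averages out before integration, while any residual bias still accumulates quadratically in position, which is exactly the error the learned model is trained to suppress.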
