Autonomous navigation of unmanned vehicles in GPS-denied environments is a challenging problem, especially for small ground vehicles and micro aerial vehicles (MAVs), which are characterized by small payloads, short battery lifetimes, and limited processing resources. Stereo vision positioning has been introduced as a scale-free positioning technique, but it is computationally expensive. Monocular vision systems aided by an inertial measurement unit (IMU) are more computationally efficient but suffer from IMU random biases and scale errors. In this paper, we propose a hybrid visual-inertial odometry solution that minimizes the computational load by dividing the mission into two interchangeable stages. The first is a stereo vision stage in which a loosely coupled integration between the stereo cameras and the IMU is performed; in this stage, an extended Kalman filter (EKF) automatically and dynamically estimates the IMU biases. Once the IMU is calibrated, a monocular stage is activated in which the system is downgraded to a single camera, obtaining the motion scale from the calibrated IMU. The proposed solution has been tested using the popular IMU-enabled ZED Mini tracking camera. We compared our stereo vision solution against the IMU-aided monocular solution, and the results showed accurate positioning with the advantage of reduced computation. Further analysis is provided in which we compare our solution with the built-in solutions of the ZED Mini camera and the Intel RealSense T265 tracking camera.
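The core idea of the calibration stage, estimating IMU biases by comparing IMU readings against vision-derived motion, can be illustrated with a toy example. The sketch below is not the paper's implementation (which uses a full loosely coupled EKF over the vehicle state); it is a minimal scalar Kalman filter that models a single gyro bias as a random walk and observes the difference between IMU angular rates and (assumed unbiased but noisy) stereo-vision-derived rates. All names, noise values, and the scalar simplification are illustrative assumptions.

```python
import numpy as np

def estimate_gyro_bias(imu_rates, vision_rates, q=1e-6, r=1e-2):
    """Scalar Kalman filter estimating a slowly varying gyro bias.

    The bias is modeled as a random walk with process variance q.
    Each measurement is the difference between the IMU angular rate
    and the vision-derived rate (unbiased but noisy, variance ~r).
    This is an illustrative simplification, not the paper's EKF.
    """
    b, p = 0.0, 1.0                 # initial bias estimate and variance
    for imu, vis in zip(imu_rates, vision_rates):
        p += q                      # predict: random-walk process noise
        z = imu - vis               # measurement: rate difference = bias + noise
        k = p / (p + r)             # Kalman gain
        b += k * (z - b)            # update bias estimate
        p *= (1.0 - k)              # update variance
    return b

# Simulated data: a true bias of 0.05 rad/s buried in sensor noise.
rng = np.random.default_rng(0)
true_rate = rng.normal(0.0, 0.5, 500)
imu = true_rate + 0.05 + rng.normal(0.0, 0.01, 500)
vis = true_rate + rng.normal(0.0, 0.05, 500)
print(f"estimated bias: {estimate_gyro_bias(imu, vis):.3f}")
```

Once the filter variance `p` has converged, the estimated bias can be subtracted from subsequent IMU readings, which is the precondition for switching to the monocular stage in the proposed scheme.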

7th IEEE Global Conference on Signal and Information Processing, GlobalSIP 2019
Department of Systems and Computer Engineering

Mahmoud, A., & Atia, M. (2019). Hybrid IMU-aided approach for optimized visual odometry. In Proceedings of the 7th IEEE Global Conference on Signal and Information Processing (GlobalSIP 2019). doi:10.1109/GlobalSIP45357.2019.8969460