Academic Journal of Computing & Information Science, 2020, 3(1); doi: 10.25236/AJCIS.2020.030114.
Jian Lin1, Jierui Peng2, Zhichao Hu3, Xiaofeng Xie4 and Rui Peng*
1 School of Electronic Engineering, University of Electronic Science and Technology of China, Chengdu, China
2 Brandeis University, Computer Science Department, Waltham, USA
3 Research and Education Center High IT School, National Research Tomsk State University, Russia
4 College of Information Engineering, Wuchang Institute of Technology, Wuhan, China
* Corresponding author, Department of Mechanical Engineering, The Hong Kong Polytechnic University, Hong Kong
In this paper, we propose a lightweight multi-sensor fusion method combining ORB-SLAM, an IMU and wheel odometry for localization and navigation of an indoor mobile robot in GPS-denied environments. As a generally accepted visual simultaneous localization and mapping (SLAM) system, ORB-SLAM computes the camera pose in real time based on feature matching. The inertial measurement unit (IMU) measures the angular velocity of the robot with one of its gyroscopes, while the wheel odometry provides the robot's linear velocity and records the distance it has traveled. By leveraging both the rotational measurements of the IMU and the linear measurements of the wheel odometry, a rough localization estimate for the robot is obtained. During each navigation run, this rough estimate provides a relatively accurate real-world mapping scale for ORB-SLAM, and this scale corrects the monocular camera pose of ORB-SLAM to yield a global robot pose estimate in the real world. In the experiments, the robot localizes itself with tolerable error and demonstrates strong navigation ability in a specific scene.
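The fusion described above can be sketched in a few lines, under simplifying assumptions (planar motion, a single yaw-rate gyroscope, and scale recovery by comparing trajectory lengths; all function names are illustrative, not from the paper):

```python
import math

def dead_reckoning(pose, v, omega, dt):
    """Integrate wheel-odometry linear velocity v (m/s) and IMU yaw rate
    omega (rad/s) over one timestep dt to update the planar pose
    (x, y, theta) of the robot."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

def estimate_scale(slam_positions, odom_positions):
    """Estimate the metric scale of a monocular (up-to-scale) SLAM
    trajectory by comparing its path length with the metric path length
    from IMU/wheel dead reckoning over the same interval."""
    def path_length(pts):
        return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    slam_len = path_length(slam_positions)
    if slam_len == 0:
        raise ValueError("SLAM trajectory has zero length")
    return path_length(odom_positions) / slam_len
```

Multiplying the ORB-SLAM camera positions by the recovered scale factor then yields a metric global pose estimate, which is the correction step the abstract refers to.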
mobile robot, ORB-SLAM, multi-sensor fusion, indoor localization
Jian Lin, Jierui Peng, Zhichao Hu, Xiaofeng Xie and Rui Peng. ORB-SLAM, IMU and Wheel Odometry Fusion for Indoor Mobile Robot Localization and Navigation. Academic Journal of Computing & Information Science (2020), Vol. 3, Issue 1: 131-141. https://doi.org/10.25236/AJCIS.2020.030114.
[1] Raúl Mur-Artal, José María Martínez Montiel, Juan D. Tardós, "ORB-SLAM: a versatile and accurate monocular SLAM system," IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, 2015.
[2] Hailan Kuang, Xiaodan Wang, Xinhua Liu, Xiaolin Ma, Ruifang Li, "An Improved Robots Localization and Mapping Method Based on ORB-SLAM," in 2017 10th International Symposium on Computational Intelligence and Design (ISCID), vol. 2, pp. 400-403, 2017.
[3] Shirong Wang, Yuan Li, Yue Sun, Xiaobin Li, Ning Sun, Xuebo Zhang, Ningbo Yu, "A localization and navigation method with ORB-SLAM for indoor service mobile robots," in 2016 IEEE International Conference on Real-time Computing and Robotics (RCAR), pp. 443-447, 2016.
[4] García, Walterio Mayol-Cuevas, "Towards autonomous flight of micro aerial vehicles using ORB-SLAM," in 2015 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS), pp. 241-248, 2015.
[5] Omid Esrafilian, Hamid D. Taghirad, "Autonomous flight and obstacle avoidance of a quadrotor by monocular SLAM," in 2016 4th International Conference on Robotics and Mechatronics (ICROM), pp. 240-245, 2016.
[6] Tong Qin, Peiliang Li, Shaojie Shen, "VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator," IEEE Transactions on Robotics, vol. 34, no. 4, pp. 1004-1020, 2018.
[7] Wen Fu, Ao Peng, Biyu Tang, Lingxiang Zheng, "Inertial sensor aided visual indoor positioning," in 2018 International Conference on Electronics Technology (ICET), pp. 106-110, 2018.
[8] Seyed Jamalaldin Haddadi, Eugenio B. Castelan, "Visual-Inertial Fusion for Indoor Autonomous Navigation of a Quadrotor Using ORB-SLAM," in 2018 Latin American Robotic Symposium, 2018 Brazilian Symposium on Robotics (SBR) and 2018 Workshop on Robotics in Education (WRE), pp. 106-111, 2018.
[9] Jinglin Shen, David Tick, Nicholas Gans, "Localization through fusion of discrete and continuous epipolar geometry with wheel and IMU odometry," in Proceedings of the 2011 American Control Conference, pp. 1292-1298, 2011.
[10] Ethan Rublee, Vincent Rabaud, Kurt Konolige, Gary Bradski, "ORB: An efficient alternative to SIFT or SURF," in 2011 International Conference on Computer Vision (ICCV), pp. 2564-2571, 2011.
[11] Joseph Redmon, Ali Farhadi, "YOLO9000: Better, Faster, Stronger," in 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 6517-6525, July 2017.
[12] Jean-Paul Laumond, Jean-Jacques Risler, "Nonholonomic systems: controllability and complexity," Theoretical Computer Science, vol. 157, pp. 101-114, 1996.