Abstract: To meet the requirements of indoor navigation and localization for mobile robots, an approach for autonomous exploration, localization and mapping based on visual FastSLAM is proposed. First, an exploration position is selected from the frontiers of the explored region by weighing information gain against path distance, and a shortest-distance path to that position is then planned to maximize exploration efficiency and ensure the task is completed. FastSLAM 2.0 serves as the basis of the proposed localization and mapping algorithm, with observations obtained by robot vision: observation efficiency is increased by fusing panoramic scanning with landmark tracking, and data association is improved by incorporating visual landmark information into the estimation. Experimental results show that the proposed approach selects the best exploration position accurately, plans paths reasonably, and accomplishes the exploration task successfully, while the localization and mapping results are robust and highly accurate.
CUI Shuai, GAO Jun, ZHANG Jun, FAN Zhiguo. Autonomous Exploration Approach of Mobile Robot Based on Visual FastSLAM. Pattern Recognition and Artificial Intelligence, 2016, 29(12): 1083-1094.
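The frontier-based selection step described in the abstract can be illustrated with a short sketch. The abstract does not give the exact scoring formula, so the following is a minimal illustrative example, not the authors' implementation: it assumes an occupancy-grid map and a utility of the form gain(f) * exp(-lambda * dist(f)), a common way to trade information gain off against path distance in frontier exploration. All names and parameters here (the FREE/UNKNOWN cell codes, `radius`, `lam`) are hypothetical.

```python
import numpy as np
from collections import deque

# Hypothetical occupancy-grid cell codes.
FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontier_cells(grid):
    """Return free cells that border at least one unknown cell (frontier cells)."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            neighborhood = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighborhood == UNKNOWN).any():
                frontiers.append((r, c))
    return frontiers

def information_gain(grid, cell, radius=5):
    """Count unknown cells inside a square sensor footprint around a candidate."""
    r, c = cell
    patch = grid[max(r - radius, 0):r + radius + 1, max(c - radius, 0):c + radius + 1]
    return int((patch == UNKNOWN).sum())

def path_length(grid, start, goal):
    """Breadth-first search over free cells; returns step count or None if unreachable."""
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < grid.shape[0] and 0 <= nxt[1] < grid.shape[1]
                    and nxt not in visited and grid[nxt] == FREE):
                visited.add(nxt)
                queue.append((nxt, dist + 1))
    return None

def select_exploration_target(grid, robot_cell, lam=0.1):
    """Pick the frontier maximizing gain * exp(-lam * distance) (illustrative utility)."""
    best, best_score = None, -np.inf
    for f in find_frontier_cells(grid):
        d = path_length(grid, robot_cell, f)
        if d is None:  # frontier not reachable from the robot's cell
            continue
        score = information_gain(grid, f) * np.exp(-lam * d)
        if score > best_score:
            best, best_score = f, score
    return best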