Simultaneous Localization and Mapping
Matthew Thompson, UF
matthewbot@ufl.edu

The Problem

Simultaneous Localization and Mapping, or SLAM, is a problem in the field of autonomous vehicles. Its solution, found only in the last decade, has been called "a 'Holy Grail' of the autonomous vehicle research community" [3]. Until several papers released in 2001 detailed new approaches to SLAM, most roboticists had studied either mapping unknown environments with a robot that always knew its exact position, or determining the position of a robot that already had advance knowledge of its surroundings [6]. SLAM, true to its name, is the problem of performing both tasks simultaneously, without prior information about either the environment or the robot's own position. It is rather akin to the question of "which came first, the chicken or the egg?" [6]. To build an accurate map of its environment, a robot first needs to know its own position in the world, yet to determine its position, the robot must first have an accurate map of its environment.

The Solution and Ongoing Research

Although no single SLAM technique is clearly best, and indeed even the most important papers listed here take varied approaches, most use an estimation-theoretic formulation built on a Kalman filter [1]. The Kalman filter is a widely used method for processing uncertain measurements and producing an optimal estimate of the actual state of a system. The estimation-theoretic approach has been successfully implemented many times, and has been proven to converge to an ideal map and perfect robot position even in the face of noisy measurements and uncertainty in the robot's motion [3].

Ongoing research covers many improvements to the basic approach. For large environments the storage requirements of the naive approach grow too quickly, so researchers are exploring potentially valuable trade-offs in the underlying implementation of the map [1][5]. Experiments are also ongoing in achieving SLAM with sensors other than a LIDAR scanner, such as a sonar array [7]. Another area is improving the efficiency of methods used for relaxation of the robot's map, a process used to "close the loop" when a robot has traveled in a circle and senses previously mapped landmarks, but at a different estimated position due to accumulated error [6].
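To make the estimation-theoretic approach concrete, the sketch below shows a minimal extended-Kalman-filter SLAM cycle in Python. It is illustrative only and is not the implementation from any of the cited papers: the function names (predict, update), the velocity motion model, the range/bearing measurement model, and all noise values are assumptions chosen for the example.

    # Minimal EKF-SLAM sketch (illustrative; not taken from any cited paper).
    # The state x stacks the 2-D robot pose [x, y, heading] with the 2-D positions
    # of already-initialized landmarks; P is the joint covariance over all of them.
    import numpy as np

    def predict(x, P, v, w, dt, Q_pose):
        """Propagate the robot pose with a simple velocity motion model; landmarks stay fixed."""
        x = x.copy()
        th = x[2]
        x[0] += v * np.cos(th) * dt
        x[1] += v * np.sin(th) * dt
        x[2] += w * dt
        F = np.eye(len(x))                        # Jacobian of the motion model
        F[0, 2] = -v * np.sin(th) * dt
        F[1, 2] = v * np.cos(th) * dt
        Q = np.zeros_like(P)                      # process noise acts on the pose block only
        Q[:3, :3] = Q_pose
        return x, F @ P @ F.T + Q

    def update(x, P, z, i, R):
        """Fuse one range/bearing measurement z = [r, phi] of landmark i."""
        j = 3 + 2 * i
        dx, dy = x[j] - x[0], x[j + 1] - x[1]
        q = dx * dx + dy * dy
        r = np.sqrt(q)
        z_hat = np.array([r, np.arctan2(dy, dx) - x[2]])
        H = np.zeros((2, len(x)))                 # measurement Jacobian (pose and landmark i blocks)
        H[0, [0, 1, j, j + 1]] = [-dx / r, -dy / r, dx / r, dy / r]
        H[1, [0, 1, 2, j, j + 1]] = [dy / q, -dx / q, -1.0, -dy / q, dx / q]
        y = z - z_hat
        y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap the bearing innovation
        S = H @ P @ H.T + R                       # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # One robot and two landmarks; run a single predict/update cycle.
    x = np.array([0.0, 0.0, 0.0, 2.0, 1.0, 4.0, -1.0])
    P = np.eye(7) * 0.1
    x, P = predict(x, P, v=1.0, w=0.1, dt=0.5, Q_pose=np.diag([0.01, 0.01, 0.005]))
    x, P = update(x, P, z=np.array([2.1, 0.45]), i=0, R=np.diag([0.05, 0.02]))

Because the covariance P couples the robot pose to every landmark, each observation tightens the whole map, which is the property behind the convergence result cited above [3]. It is also why storage grows quadratically with the number of landmarks, the scaling problem that motivates the research into alternative map representations [1][5].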
[Figure] A mobile vehicle making vehicle-feature relative observations of features and landmarks in a global reference frame. From "A Solution to the Simultaneous Localization and Map Building (SLAM) Problem", Dissanayake G, Newman P, Clark S.
[Figure] Closing the loop after traveling along a rectangular hallway. From "A Multilevel Relaxation Algorithm for Simultaneous Localization and Mapping", Frese U, Larsson P, Duckett T.
[Figure] Sonar and LIDAR readings compared (laser scan vs. sonar returns). From "Robust Mapping and Localization in Indoor Environments Using Sonar Data", Tardos, Neira, Newman, Leonard.
[Figure] A radar-based SLAM tracking features along a vehicle's path (radar observations, vehicle path, and radar reflectors; axes in meters). From "A Solution to the Simultaneous Localization and Map Building (SLAM) Problem", Dissanayake G, Newman P, Clark S, Durrant-Whyte H, Csorba M.

Applications

SLAM has had very immediate applications in autonomous vehicles such as those participating in the DARPA Grand Challenge. SLAM is also important for indoor robots, since it can calculate the position of a robot in the absence of GPS. It represents a huge step towards future domestic robots that interact and work with humans in a human-centric world, where robots will have to learn for themselves and will be expected to adapt to an already existing environment. Another important application is search and rescue robots, which could aid disaster response teams by quickly and efficiently mapping a disaster site and reporting back the locations of survivors.

[Figure] ActivMedia PeopleBot exploring an environment. From "A Multilevel Relaxation Algorithm for Simultaneous Localization and Mapping", Frese U, Larsson P, Duckett T.
[Figure] NaviGATOR, from the Center for Intelligent Machines and Robotics at the University of Florida. http://www.research.ufl.edu/publications/explore/vllnl/story2.html

Prolific Authors

Name | Total Papers
Dissanayake, G | 11
Bailey, T | 10
Newman, P | 10
Nuchter, A | 9
Rajannani, SK | 9
Neira, J | 8
Grisetti, G | 8
Leonard, JJ | 7

Important Papers

Title | Year | Times Cited | Institutions or Organizations
Simultaneous map building and localization for an autonomous mobile robot | 1991 | N/A (conference proceedings) | Princeton, University of Oxford
Topological simultaneous localization and mapping (SLAM): toward exact localization without explicit localization | 2001 | 175 | Carnegie Mellon
A solution to the simultaneous localization and map building (SLAM) problem | 2001 | 407 | IEEE
Optimization of the simultaneous localization and map-building algorithm for real-time implementation | 2001 | 212 | IEEE
Robust mapping and localization in indoor environments using sonar data | 2002 | 156 | University of Zaragoza, MIT
A multilevel relaxation algorithm for simultaneous localization and mapping | 2005 | 49 | IEEE
MonoSLAM: Real-Time Single Camera SLAM | 2007 | 118 | IEEE

Top Institutions

Name | Papers
University of Sydney | 25
MIT | 15
University of Zaragoza | 15
University of Oxford | 12

Top Journals

Name | Papers | Impact Factor
International Journal of Robotics Research | 40 | 1.993
IEEE Transactions on Robotics | 39 | 2.035
Robotics and Autonomous Systems | 36 | 1.361
Journal of Field Robotics | 25 | 1.989

Key Paper

MonoSLAM: Real-Time Single Camera SLAM stands out on the list of highly cited papers as the most recent, as well as the only paper whose number of citations per year is still increasing. It takes a breakthrough approach to SLAM by achieving it with a single camera [2]. This presents a very difficult SLAM scenario, because a single camera produces a large volume of high-speed image data to process but no direct distance measurements [2]. Technology exists to reconstruct the motion of a camera passing through a static environment, but only offline, by processing the entire sequence from beginning to end. MonoSLAM provides a technique whereby the estimates of the camera position and of the map of the environment it is in are updated with each new frame of information as it is acquired, in real time [2]. The camera can travel in a fixed environment indefinitely while maintaining a bounded error [2], and the method is efficient enough to operate at an update rate of 30 Hz [2].

[Figure] Snapshots (a)-(d) of a humanoid robot running MonoSLAM software on a single camera as it walks in a circle. From "MonoSLAM: Real-Time Single Camera SLAM", Davison A, Reid I, Molton N, Stasse O.
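As noted above, a single camera yields bearing-only measurements: each image feature constrains the direction to a point but not its depth. The small Python sketch below illustrates, outside of any filter, why depth becomes observable once the camera has translated: bearing rays taken from two different camera positions intersect at the feature. This is plain geometric triangulation offered only as an illustration; the names and numbers are invented for the example and it is not the probabilistic scheme used in MonoSLAM.

    # Why one camera gives no range: a single bearing ray leaves depth unknown,
    # but rays from two camera positions intersect at the feature.
    import numpy as np

    def triangulate(p1, d1, p2, d2):
        """Least-squares intersection of two rays x = p + t * d (d unit length)."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, d in ((p1, d1), (p2, d2)):
            M = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
            A += M
            b += M @ p
        return np.linalg.solve(A, b)

    feature = np.array([2.0, 1.0, 5.0])                 # true feature position (unknown to the camera)
    cam1 = np.array([0.0, 0.0, 0.0])
    cam2 = np.array([1.0, 0.0, 0.0])                    # camera after translating sideways
    ray1 = (feature - cam1) / np.linalg.norm(feature - cam1)   # bearing-only observations
    ray2 = (feature - cam2) / np.linalg.norm(feature - cam2)
    print(triangulate(cam1, ray1, cam2, ray2))          # recovers approximately [2, 1, 5]

In MonoSLAM the same constraint is accumulated probabilistically over many frames inside the filter rather than solved in one shot, which is what allows the per-frame, real-time updates described above [2].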
References

Choset, H., & Nagatani, K. (2001). Topological simultaneous localization and mapping (SLAM): Toward exact localization without explicit localization. Robotics and Automation, IEEE Transactions on, 17(2), 125-137.

Davison, A. J., Reid, I. D., Molton, N. D., & Stasse, O. (2007). MonoSLAM: Real-time single camera SLAM. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 29(6), 1052-1067.

Dissanayake, M. W. M. G., Newman, P., Clark, S., Durrant-Whyte, H. F., & Csorba, M. (2001). A solution to the simultaneous localization and map building (SLAM) problem. Robotics and Automation, IEEE Transactions on, 17(3), 229-241.

Frese, U., Larsson, P., & Duckett, T. (2005). A multilevel relaxation algorithm for simultaneous localization and mapping. Robotics, IEEE Transactions on, 21(2), 196-207.

Guivant, J. E., & Nebot, E. M. (2001). Optimization of the simultaneous localization and map-building algorithm for real-time implementation. Robotics and Automation, IEEE Transactions on, 17(3), 242-257.

Leonard, J. J., & Durrant-Whyte, H. F. (1991). Simultaneous map building and localization for an autonomous mobile robot. Intelligent Robots and Systems '91: Intelligence for Mechanical Systems, Proceedings IROS '91, IEEE/RSJ International Workshop on, 1442-1447, vol. 3.

Tardos, J. D., Neira, J., Newman, P. M., & Leonard, J. J. (2002). Robust mapping and localization in indoor environments using sonar data. The International Journal of Robotics Research, 21(4), 311-330.