Simultaneous localization and mapping (SLAM) is the problem of having a robot simultaneously estimate a map of an unknown environment and track its own location within that map. In its general form, SLAM assumes the robot has no a priori knowledge of the environment, and it applies to many kinds of moving sensors (on robots, vehicles, mobile devices, etc.). SLAM is used by robots and autonomous vehicles to construct a map of an unknown environment (no a priori knowledge), or to update a map of a known environment (a priori knowledge from a given map), while at the same time keeping track of their current location.
In this assignment, you will become familiar with the SLAM process by building a simple map using ROS's default navigation stack. No new Turtlebot packages will be developed for this assignment. The details of implementing SLAM are beyond the scope of this course, but in future assignments you will use the map you construct here. This assignment simply introduces the problem and some of the challenges involved in localization and mapping.
As an example of SLAM, below is a map of the CIT fourth floor generated by the ROS navigation stack (Room 404 is missing in the map):
Note: This map was produced by the PR2, with much more accurate, precise, and expensive ranging and odometry measurement equipment than the Kinect on the Turtlebot. Your maps probably will not be quite this nice, but you should be able to get close.
Assigned: Oct 5, 2011
Due: Oct 12, 2011, 12:30pm
In this assignment you will build a closed-loop map.
Your tasks for this assignment are as follows:
Two netbooks are provided in CIT Room 404. You will need to use one of these netbooks as your workstation and run SLAM on it to build a map. The password for all accounts is 123456.
In order to access and control a robot remotely, two Linux environment variables need to be set. If the setup is done correctly, "rostopic list" should show the list of available topics from the robot. If you see "Could not contact ROS master", either something is wrong with your environment settings or the robot has not yet been started. Note that setting an environment variable in a particular terminal window affects only commands given in that terminal window.
On the robot:
> export ROS_HOSTNAME=[ROBOT_IP]

On the workstation:
> export ROS_HOSTNAME=[WORKSTATION_IP]
> export ROS_MASTER_URI=http://[ROBOT_IP]:11311
Please refer to the Turtlebot Network Setup page for more details.
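Before running "rostopic list", it can help to sanity-check the two variables. The sketch below is a hypothetical helper (the function name `check_ros_env` and the example IPs are made up for illustration, not part of the assignment):

```python
def check_ros_env(env):
    """Return a list of problems with the remote-ROS environment settings."""
    problems = []
    uri = env.get('ROS_MASTER_URI', '')
    # The master URI must point at the robot's IP on the default port 11311.
    if not uri.startswith('http://') or ':11311' not in uri:
        problems.append('ROS_MASTER_URI should look like http://[ROBOT_IP]:11311')
    # ROS_HOSTNAME must name the machine this terminal is running on.
    if not env.get('ROS_HOSTNAME'):
        problems.append("ROS_HOSTNAME should be set to this machine's IP")
    return problems

# A correctly configured workstation environment (IPs are placeholders):
ok = {'ROS_HOSTNAME': '10.0.0.5', 'ROS_MASTER_URI': 'http://10.0.0.9:11311'}
print(check_ros_env(ok))   # -> []
print(len(check_ros_env({})))  # an empty environment has both problems -> 2
```

Remember that each new terminal window needs the exports repeated before this check (or "rostopic list") will succeed.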
> roslaunch [turtlebot name].launch
> rosrun turtlebot_dashboard turtlebot_dashboard
Please refer to the previous robot bringup instructions for more details.
Open a terminal on the workstation and run the keyboard teleoperation node:
> rosrun teleop_twist_keyboard teleop_twist_keyboard.py /cmd_vel:=/turtlebot_node/cmd_vel
> rosrun mjpeg_server mjpeg_server
> rosrun rviz rviz -d `rospack find turtlebot_navigation`/nav_rviz.vcg
> roslaunch turtlebot_navigation gmapping_demo.launch
After this step, you should be able to see the Turtlebot model, the current progress of the map, and the laser sensor data in rviz. For more details, refer to the Turtlebot SLAM page.
Here is the sample closed-loop on CIT fourth floor:
Once your robot has navigated around the hall and built a beautiful closed-loop map in rviz, save the map and shut everything down:
> rosrun map_server map_saver
This command saves the map as two files, "map.pgm" (the occupancy image) and "map.yaml" (the map metadata). Refer to the map_server page for details.
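If you want to inspect the saved map programmatically, the sketch below parses the binary (P5) PGM that map_saver writes. The cell values used here (254 = free, 0 = occupied, 205 = unknown) are map_saver's defaults; the demo bytes are a hand-made toy map, not a real one:

```python
def parse_pgm(raw):
    """Parse a binary (P5) PGM like the one map_saver writes.

    Returns (width, height, pixel bytes). Skips '#' comment lines,
    which map_saver includes in the header.
    """
    tokens, idx = [], 0
    while len(tokens) < 4:
        nl = raw.index(b'\n', idx)
        line = raw[idx:nl]
        idx = nl + 1
        if not line.startswith(b'#'):
            tokens.extend(line.split())
    assert tokens[0] == b'P5', 'expected a binary PGM'
    width, height = int(tokens[1]), int(tokens[2])
    return width, height, raw[idx:idx + width * height]

# Demo on a made-up 4x2 map (254 = free, 0 = occupied, 205 = unknown):
raw = b'P5\n# CREATOR: demo\n4 2\n255\n' + bytes([254, 254, 254, 254, 0, 205, 254, 0])
w, h, px = parse_pgm(raw)
free = sum(1 for v in px if v > 250)
occupied = sum(1 for v in px if v < 50)
print(w, h, free, occupied)  # -> 4 2 5 2
```

Counting free versus occupied cells like this is a quick way to check that your saved map is not mostly unknown space.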
To submit this assignment, create a branch of your repository named “run_slam_[Your team name]”. You must submit a ROS package called “slam” which includes your generated map (both the pgm and yaml files), and a README file. The README file should explain your mapping procedure and anything you want us to know.
For extra credit, you can provide an additional image that is a Minkowski sum of the map with a circle of Turtlebot diameter. You will have to do this anyway for the path planning assignment, so it might as well be done earlier rather than later.
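On a grid map, that Minkowski sum amounts to dilating the occupied cells by a disk. The sketch below is a minimal pure-Python version; the function name `inflate` and the toy grid are assumptions, and the disk radius in cells would come from the Turtlebot's size divided by the map resolution in map.yaml:

```python
def inflate(grid, radius_cells):
    """Dilate occupied cells (1) by a disk of the given radius in cells --
    the Minkowski sum of the obstacle set with that disk."""
    h, w = len(grid), len(grid[0])
    # Precompute the disk's cell offsets once.
    offs = [(dy, dx)
            for dy in range(-radius_cells, radius_cells + 1)
            for dx in range(-radius_cells, radius_cells + 1)
            if dy * dy + dx * dx <= radius_cells * radius_cells]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if grid[y][x]:
                # Stamp the disk around every occupied cell.
                for dy, dx in offs:
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        out[ny][nx] = 1
    return out

# Demo: a single obstacle in a 5x5 grid grows into a plus-shaped blob.
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
inflated = inflate(grid, 1)
print(sum(map(sum, inflated)))  # -> 5
```

For the actual extra credit you would run this (or an equivalent image-processing dilation) over the occupied pixels of map.pgm and save the result as an image.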
This assignment is a preparation for the next assignment, which will involve path planning. Make sure you understand how to set up the remote workstation access and SLAM.
Note that the map does not have to be perfect since the Turtlebot odometry and Kinect range sensors are both fairly noisy. However, the map you build should be better than this:
Turtlebot mapping example from ROS nav_stack tutorials: