In this assignment you will develop a ROS package for navigating to a goal location using the “Bug2” planar navigation algorithm with an m-line simplification. The task involves following a path, marked by a piece of tape on the ground, while navigating around obstacles that block the path.
A video of Kayle Gishen's test implementation of this assignment:
Assigned: Sep 28, 2011
Due: Oct 5, 2011, 12:30 PM
Bug algorithms typically assume the robot knows its position relative to a goal location, but has no knowledge of obstacles in its world. Bug2 uses localization to construct an “m-line” (the line from the start location to the goal), which is used for greedy navigation toward the goal. Bug2 navigates along the m-line until bumping into an obstacle. The robot then navigates around the obstacle until it re-encounters the m-line at a point closer to the goal, at which point it continues along the m-line toward the goal.
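The control loop described above can be sketched as a two-state machine. This is only an illustrative sketch: the state names and the sensor predicates (`bumped`, `on_mline`, `dist_to_goal`) are assumptions for this example, not part of any ROS API; in a real controller they would be derived from the Create's bump and cliff sensors.

```python
# Minimal Bug2 state-machine sketch. The sensor predicates and the
# distance-to-goal reading are assumed inputs supplied each cycle.

GOAL_SEEK = "goal_seek"      # drive along the m-line toward the goal
WALL_FOLLOW = "wall_follow"  # circumnavigate an obstacle

class Bug2:
    def __init__(self):
        self.state = GOAL_SEEK
        self.hit_dist = None  # distance to goal where we last hit an obstacle

    def step(self, bumped, on_mline, dist_to_goal):
        """Advance one control cycle; returns the current state."""
        if self.state == GOAL_SEEK and bumped:
            # Obstacle on the m-line: remember how far out we were,
            # then start going around it.
            self.hit_dist = dist_to_goal
            self.state = WALL_FOLLOW
        elif (self.state == WALL_FOLLOW and on_mline
              and dist_to_goal < self.hit_dist):
            # Re-encountered the m-line closer to the goal: resume.
            self.state = GOAL_SEEK
        return self.state
```

The key Bug2 invariant is the `dist_to_goal < self.hit_dist` check: leaving the obstacle boundary only at a strictly closer m-line crossing is what guarantees progress and avoids looping around the same obstacle forever.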
In place of localization, we will approximate the m-line with a line on the floor (marked with tape) that a robot can follow. This tape-based m-line can be sensed by the iRobot Create's cliff detectors. These cliff detectors are infrared sensors that emit light and sense the amount of light reflected back from nearby surfaces. This sensor is typically used to determine when the robot is nearing a cliff or stair. In this case, we use it to detect the difference between carpet and tape, which reflect IR differently. Regardless of this m-line approximation, a properly implemented Bug2 will be able to navigate the robot to the goal in the presence of arbitrary obstacles on the m-line.
Your tasks for this assignment are as follows:
For this assignment you will need to navigate around a random pattern of obstacles along a line, ranging from no obstacles to 5 or more. Your controller should be able to navigate around consecutive obstacles without returning to the line between them (i.e., the robot may not fit between two obstacles, so it should go around the second as well). The robot should not continuously grind against obstacles in its path; it should back up off an obstacle before turning to go around it.
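The "back up before turning" requirement can be expressed as a short timed maneuver triggered by a bump. This is a sketch only: the speeds and durations are illustrative placeholders you would tune, and the left-bump/right-bump inputs stand in for the Create's bump sensor fields.

```python
# Sketch of the "back up, then turn" reaction to a bump, expressed as a
# list of (linear m/s, angular rad/s, duration s) commands to execute in
# order. All numeric values here are placeholders to tune on the robot.

def bump_maneuver(left_bump, right_bump):
    """Return velocity commands that back the robot off the obstacle
    before turning, so it never grinds against what it hit."""
    cmds = [(-0.1, 0.0, 1.0)]           # first: reverse off the obstacle
    if left_bump and not right_bump:
        cmds.append((0.0, -0.8, 1.0))   # obstacle on the left: turn right
    else:
        cmds.append((0.0, 0.8, 1.0))    # otherwise: turn left
    return cmds
```

Note that for wall following to rejoin the m-line reliably, the turn direction (and hence which side of the robot faces the obstacle) should be kept consistent across bumps.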
The line to be followed may or may not be straight and will be about 2m in length and 2.54cm in width.
> roslaunch [turtlebotname].launch
> rosrun turtlebot_dashboard turtlebot_dashboard
Please refer to the previous robot bringup instructions for more details.
In order to detect the m-line, you need to calibrate each of the 4 cliff sensors on the robot to distinguish carpet from tape. The iRobot Create comes equipped with 4 cliff sensors, shown in the image above. The Turtlebot publishes cliff sensor values in ROS on the turtlebot_node/sensor_state topic, under the boolean fields “cliff_left”, “cliff_front_left”, “cliff_front_right”, and “cliff_right”. You can see these values as they are published using rostopic:
rostopic echo /turtlebot_node/sensor_state
You should see a noticeable difference in published cliff sensor values when the sensor is placed directly over tape as compared to carpet.
The recommended calibration method for this assignment is to read the IR values over a period of time and then choose a threshold between the observed max and min values to signal whether the IR sensor is over tape or not. Note: tape lines reflect light better than carpet and will produce higher IR values.
Calibration for this assignment should be done at run-time to permit usage in many different environments. That is, your robot should be able to calibrate without modification or recompilation of its controller.
If your calibration is working, you should be able to make a simple modification to your enclosure escape controller to wall follow around tape boundaries in addition to walls. An example is shown below: Jon Mace and Stephen Brawner's modified enclosure escape, wall following around tape boundaries and objects:
Devise a strategy for line following using the cliff sensors. Once implemented, use your wall and line followers within the Bug2 algorithm for navigation. It is not expected that your robot will be able to sense the goal location, only that it reach the goal location.
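One possible line-following strategy is to map the four thresholded cliff detections directly to a velocity command. This is a sketch of one reasonable mapping, not the required strategy; the speeds are placeholders to tune, and the lost-line fallback is deliberately left minimal.

```python
# Line-following sketch: map the four thresholded cliff detections to a
# (linear m/s, angular rad/s) command. Each argument is True when that
# cliff sensor sees tape. All speeds are placeholders to tune.

def follow_line(left, front_left, front_right, right):
    if front_left and front_right:
        return (0.2, 0.0)    # centered on the line: drive straight
    if front_left or left:
        return (0.1, 0.5)    # line is to the left: turn left to recenter
    if front_right or right:
        return (0.1, -0.5)   # line is to the right: turn right
    return (0.05, 0.0)       # line lost: creep forward (or start a search)
```

Since the line may curve, slowing the forward speed while turning (as above) helps keep at least one front sensor over the tape through bends.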
To submit this project, create a branch of your repository named “buggin_out_[Your team name]”. You must submit your code in the form of a ROS package. Further, your submission must include a file named buggin_out.launch. To test your code, the course staff (after using rosmake to build your packages) will invoke the command:
Additionally, provide two images inside the top level directory of your packages:
Your ROS package will be checked out and run by the course staff, without modification. This code will be tested against five trials with two different obstruction patterns. Your controller must have an 80% success rate over all trials, where the robot is able to navigate around each obstruction and follow the line to the end. A trial is considered successful if the robot reaches the goal within 2 minutes. This does not mean moving the robot as fast as possible, but rather finding a good combination of parameters.
Sample enclosures are outlined below: