buggin_out – Introduction to Autonomous Robotics

CS 148 Project 3: Buggin' Out



In this assignment you will develop a ROS package for navigating to a goal location using the “Bug2” planar navigation algorithm with an m-line simplification. The task involves following a path, defined by a piece of tape on the ground, while navigating around obstacles that block the path.

A video of Kayle Gishen's test implementation of this assignment:


Important Dates

Assigned: Sep 28, 2011

Due: Oct 5, 2011, 12:30 PM


Bug algorithms typically assume the robot knows its location relative to a goal, but has no knowledge of obstacles in its world. Bug2 uses localization to construct an “m-line” (the line from the start location to the goal location), which it uses for greedy navigation towards the goal. Bug2 navigates along the m-line until bumping into an obstacle, then navigates around the obstacle until re-encountering the m-line at a location closer to the goal, at which point the robot continues along the m-line to the goal.
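In coordinate terms, the Bug2 leave condition described above (back on the m-line, and strictly closer to the goal than the point where the obstacle was first hit) can be sketched as pure geometry. This is an illustrative sketch only, not required code; the function names, the point representation, and the tolerance value are assumptions:

```python
import math

def on_m_line(p, start, goal, tol=0.05):
    """True if point p lies within tol (meters) of the start-goal segment."""
    (px, py), (sx, sy), (gx, gy) = p, start, goal
    dx, dy = gx - sx, gy - sy
    seg_len = math.hypot(dx, dy)
    if seg_len == 0:
        return math.hypot(px - sx, py - sy) <= tol
    # Projection parameter: reject points beyond the segment endpoints.
    t = ((px - sx) * dx + (py - sy) * dy) / (seg_len ** 2)
    if t < 0 or t > 1:
        return False
    # Perpendicular distance from p to the line through start and goal.
    dist = abs(dy * px - dx * py + gx * sy - gy * sx) / seg_len
    return dist <= tol

def should_leave_obstacle(p, hit_point, start, goal, tol=0.05):
    """Bug2 leave condition: on the m-line again AND strictly closer
    to the goal than the hit point where wall following began."""
    closer = math.hypot(goal[0] - p[0], goal[1] - p[1]) < \
             math.hypot(goal[0] - hit_point[0], goal[1] - hit_point[1])
    return on_m_line(p, start, goal, tol) and closer
```

In this assignment the tape stands in for the coordinate m-line, so `on_m_line` is effectively replaced by the cliff sensors detecting tape; the "closer to goal" check is what prevents the robot from leaving the obstacle at the same point where it arrived.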

In place of localization, we will approximate the m-line with a line on the floor (marked with tape) that a robot can follow. This tape-based m-line can be sensed by the iRobot Create's cliff detectors. These cliff detectors are infrared sensors that emit light and sense the amount of light reflected back from nearby surfaces. This sensor is typically used to determine when the robot is nearing a cliff or stairs. In this case, we use it to detect the difference between carpet and tape, which reflect IR differently. Regardless of this m-line approximation, a properly implemented Bug2 will be able to navigate the robot to the goal in the presence of arbitrary obstacles on the m-line.

Assignment Instructions

Your tasks for this assignment are as follows:

  1. Bring up Turtlebot
  2. Automatically Calibrate IR values
  3. Perform wall following based on IR thresholds
  4. Follow the line based on the thresholds determined
  5. Combine wall and line following to navigate to goal
  6. Experiments and Submission
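Steps 3–5 above amount to a two-state controller: follow the tape until the bumper fires, then wall-follow until the tape is re-encountered. A minimal sketch of that state machine is below; the state and event names are illustrative assumptions, and in a real node the `bumped` and `on_line` flags would come from the bumper and cliff sensor topics:

```python
# Two-state Bug2 controller skeleton. In a ROS node, next_state() would be
# called from the sensor callback and the resulting state would select
# between the line-following and wall-following velocity commands.
LINE_FOLLOW, WALL_FOLLOW = "line_follow", "wall_follow"

def next_state(state, bumped, on_line):
    """Transition function for the Bug2 tape controller."""
    if state == LINE_FOLLOW and bumped:
        return WALL_FOLLOW            # an obstacle blocks the m-line
    if state == WALL_FOLLOW and on_line and not bumped:
        return LINE_FOLLOW            # re-encountered the tape: resume
    return state                      # otherwise keep doing what we're doing
```

Note that staying in `WALL_FOLLOW` while `bumped` is still true is what lets the robot handle consecutive obstacles without returning to the line between them.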

For this assignment you will need to navigate around a random pattern of obstacles along a line, ranging from no obstacles to 5 or more. Your controller should be able to navigate around consecutive objects without needing to return to the line between objects (i.e., the robot may not be able to fit between two obstacles, so it should go around the second as well). The robot should not continuously grind against obstacles in its path; it should back up off the object before turning to go around it.

The line to be followed may or may not be straight and will be about 2m in length and 2.54cm in width.

Running Turtlebot Driver

  • Run Turtlebot Driver, substituting the name of the robot for [turtlebotname]:
> roslaunch [turtlebotname].launch
  • Open another terminal and run Turtlebot Dashboard
> rosrun turtlebot_dashboard turtlebot_dashboard

Please refer to the previous robot bringup instructions for more details.

Cliff Sensor

In order to detect the m-line, you need to calibrate each of the 4 cliff sensors on the robot to distinguish carpet from tape. The iRobot Create comes equipped with 4 cliff sensors, shown in the image above. The Turtlebot publishes cliff sensor values in ROS on the turtlebot_node/sensor_state topic, under the boolean fields “cliff_left”, “cliff_front_left”, “cliff_front_right”, and “cliff_right”. You can see these values as they are published using rostopic:

rostopic echo /turtlebot_node/sensor_state

You should see a noticeable difference in published cliff sensor values when the sensor is placed directly over tape as compared to carpet.


The recommended calibration method for this assignment is to read the IR values over a certain time period and then choose a threshold between the max and min values to signal whether the IR sensor is over the tape or not. Note: tape lines reflect light better than carpet and will have higher IR values.
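The max/min thresholding step can be sketched as follows. This is a hedged sketch, assuming you can collect raw IR readings (`samples`) while sweeping each sensor over both carpet and tape at startup; the `margin` parameter and function names are illustrative, not part of the assignment spec:

```python
def calibrate(samples, margin=0.5):
    """Given raw IR readings collected over carpet and tape during a
    calibration sweep, return a detection threshold between the observed
    min and max. margin=0.5 places the threshold midway; since tape
    reflects more IR than carpet, readings at or above the threshold
    are classified as tape."""
    lo, hi = min(samples), max(samples)
    return lo + margin * (hi - lo)

def over_tape(reading, threshold):
    """Classify a single IR reading using the calibrated threshold."""
    return reading >= threshold
```

Running this once per sensor at startup (rather than hard-coding thresholds) is what satisfies the run-time calibration requirement below.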

Calibration for this assignment should be done at run-time to permit usage in many different environments. That is, your robot should be able to calibrate without modification or recompilation of its controller.

Wall Following

If your calibration is working, you should be able to make a simple modification to your enclosure escape controller to wall-follow around tape boundaries in addition to walls. An example is shown in the video below: Jon Mace and Stephen Brawner's modified enclosure escape, wall following around tape boundaries and objects.

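One common way to adapt an enclosure escape controller is a bang-bang reflex: back off on contact (so the robot never grinds against the obstacle), then alternate between arcing toward and away from the boundary. The sketch below assumes left-hand wall following and returns an illustrative (linear m/s, angular rad/s) pair; the magnitudes are placeholder assumptions to tune on the robot:

```python
def wall_follow_cmd(bumped, wall_seen):
    """Bang-bang left-hand wall follower.
    bumped:    bumper currently pressed
    wall_seen: boundary detected on the robot's left side
    Returns (linear m/s, angular rad/s); magnitudes are illustrative."""
    if bumped:
        return (-0.1, 0.0)   # back up off the obstacle before turning
    if wall_seen:
        return (0.2, -0.5)   # boundary on the left: arc away to keep clearance
    return (0.2, 0.5)        # boundary lost: arc back toward it
```

In a ROS node this tuple would be packed into a geometry_msgs/Twist and published on the velocity command topic.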

Line following and Bug2

Devise a strategy for line following using the cliff sensors. Once implemented, use your wall and line followers within the Bug2 algorithm for navigation. It is not expected that your robot will be able to sense the goal location, only reach the goal location.
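One possible line-following strategy maps the four thresholded cliff detections directly to a steering command. This is a sketch of one approach, not the required one; the centering scheme, the sign convention (positive angular = counterclockwise, as in ROS), and all gains are assumptions to tune against your robot and tape width:

```python
def line_follow_cmd(cliff_left, cliff_front_left, cliff_front_right, cliff_right):
    """Map four boolean tape detections to a (linear m/s, angular rad/s)
    command that steers to keep the tape under the front sensors.
    Gains are illustrative placeholders."""
    if cliff_front_left and cliff_front_right:
        return (0.2, 0.0)    # tape under both front sensors: drive straight
    if cliff_front_left or cliff_left:
        return (0.1, 0.5)    # tape drifting left: turn left to recenter
    if cliff_front_right or cliff_right:
        return (0.1, -0.5)   # tape drifting right: turn right to recenter
    return (0.05, 0.3)       # tape lost: creep forward and sweep to search
```

Since the tape is only about 2.54 cm wide, you may find an edge-following variant (holding one sensor on the tape boundary) more robust than centering; either fits into the Bug2 state machine the same way.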

Submission and Test

To submit this project, create a branch of your repository named “buggin_out_[Your team name]”. You must submit your code in the form of a ROS package. Further, your submission must include a file named buggin_out.launch. To test your code, the course staff (after using rosmake to build your packages) will invoke the command:

roslaunch buggin_out.launch

Additionally, provide two images inside the top level directory of your packages:

  • bug2_tape_fail.png: an outline of an environment where the tape-based m-line will fail
  • bug2_tape_improve.png: an outline of an environment where the tape-based m-line will perform better than Bug2

Your ROS package will be checked out and run by the course staff, without modification. This code will be tested against five trials with two different obstruction patterns. Your controller must have an 80% success rate over all trials, where the robot is able to navigate around each obstruction and follow the line to the end. A trial is considered successful if the robot reaches the goal within 2 minutes. This does not imply moving the robot as fast as possible, but rather finding a good combination of parameters.

Sample enclosures are outlined below:

buggin_out.txt · Last modified: 2011/10/03 15:03 by brian