object_seeking – Introduction to Autonomous Robotics

CS 148 Project 2: Object Seeking

Introduction

In this assignment, you will develop a ROS package for perceiving two types of objects: (1) objects of a solid color and (2) objects labelled with an augmented reality (AR) tag. Your controller will drive the robot to move between these objects.

A video of Lisa Miller's implementation of this assignment from Spring Semester 2009:

[embedded Flash video]

and an updated version by the course staff for the Turtlebot:

[embedded Flash video]

Important Dates

Assigned: Sept 19, 01:00pm, 2011

Due: Sept 26, 12:30pm, 2011

Description

In this assignment, you will build a controller in ROS to perform “object seeking”. In this seeking task, your robot will perceive and drive to objects that are visually recognizable by a solid color appearance or labeled with an AR tag from the robot's visual sensing (i.e., camera). For this assignment, you will be working primarily with the turtlebot and a Microsoft Kinect. For object recognition in ROS, you will use the cmvision package for color blobfinding and the ar_recog package for AR tag recognition.

Assuming perception of objects salient by color or pattern, you will develop an object seeking package for this assignment that enables a robot to continually drive between these (non-occluded) objects in a sequence given at run-time. Your controller's decision making should take the form of a finite state machine (FSM). This FSM should use one state variable to specify the currently sought object. For motion control, you should use proportional-derivative (PD) servoing to center objects in the robot's field of view. As a whole, your controller should put the current object in the center of view, drive as close as possible to the object without hitting it, increment the state variable, and continue the process for the next object.
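The PD servoing term described above can be sketched as a pure function of the object's horizontal position in the image. The image width and gains below are illustrative assumptions, not values given by the assignment; you will need to tune gains on the real robot.

```python
# Sketch of a PD steering term for centering a tracked object in the camera
# image. IMAGE_WIDTH, KP, and KD are assumed values for illustration only.
IMAGE_WIDTH = 640   # assumed Kinect RGB image width in pixels
KP = 0.005          # proportional gain (assumption, tune on the robot)
KD = 0.002          # derivative gain (assumption, tune on the robot)

def pd_turn(object_x, prev_error):
    """Return (angular velocity, new error) for centering object_x in view.

    object_x:   horizontal pixel coordinate of the object (e.g., blob centroid)
    prev_error: the error returned by the previous control tick
    """
    error = (IMAGE_WIDTH / 2.0) - object_x   # positive when object is left of center
    d_error = error - prev_error             # discrete derivative, one tick apart
    return KP * error + KD * d_error, error
```

Each control tick, the returned angular velocity would go into the angular.z field of the Twist message published to the robot, and the returned error would be carried over to the next tick.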

ROS support packages

image_view (ros.org)

  • A simple viewer for subscribing to and displaying ROS image topics. Includes a specialized viewer for stereo and disparity images.

cmvision (ros.org)

  • Based on the CMU CMVision library, the cmvision package performs segmentation of solid colored regions (or “blobs”) in an image, reported as bounding boxes. cmvision thresholds and groups pixels in an image based on given YUV color ranges to estimate blobs. To calibrate color ranges, the colorgui node is included within cmvision to build color ranges from selected pixels in published image topics.

ar_recog (brown-ros-pkg)

  • Based on the ARToolkit augmented reality library, ar_recog recognizes augmented reality tags in an image. ar_recog publishes various information about recognized tags, such as each tag's corners in image space and its relative 6DOF pose in camera space.

To install the cmvision and ar_recog packages, see Software Environment Setup.

Assignment Instructions

Your tasks for this assignment are as follows.

  1. View image topics from the Kinect camera using image_view
  2. Color calibration: generate a color calibration file for cmvision to recognize the solid color objects (balls and multi-color fiducials)
  3. AR pattern recognition: recognize augmented reality tags using ar_recog
  4. Drive to a single object: write a ROS controller to seek out, find, and drive towards a single object
  5. Drive to a sequence of objects: extend the controller to continually drive between a set of objects in a sequential order
  6. Experiments and submission

For installation, color calibration, and AR pattern recognition, please see Software Environment Setup.

Running Turtlebot Driver

  • Run Turtlebot Driver, substituting the name of the robot for [turtlebotname]:
> roslaunch [turtlebotname].launch
  • Open another terminal and run the Turtlebot Dashboard:
> rosrun turtlebot_dashboard turtlebot_dashboard

Please refer to the previous robot bringup instructions for more details.

Running image_view

Assuming the Turtlebot driver is running, view the camera's image stream using the image_view node, subscribing to the Kinect's image topic ”/camera/rgb/image_color”:

> rosrun image_view image_view image:=/camera/rgb/image_color

If successful, you should see a new window appear displaying the image stream from the robot's camera; an example is shown below. Stop image_view with ctrl-c in the terminal before proceeding.

An example of ros image_view.

Color Calibration and Recognition

AR Tag Recognition

Follow these instructions for using ar_recog, specifically under the “Recognition/Execution” subsection.

Seeking Individual Objects

  • Create a ROS package for object seeking. Object seeking has more dependencies than enclosure escape: cmvision is required for color blobfinding, and ar_recog is required for AR tag recognition.
> roscreate-pkg object_seeking rospy std_msgs turtlebot_node cmvision ar_recog
  • Within this package, create a nodes directory and, within it, a file called object_seeking.py (or a similar file in the client library of your choice).
  • Object seeking should subscribe to the following topics:
  • ”/turtlebot_node/sensor_state”, of message type “TurtlebotSensorState” from the “turtlebot_node” package
  • “blobs”, of message type “Blobs” from the “cmvision” package
  • “tags”, of message type “Tags” from the “ar_recog” package
  • Object seeking should publish the following topic:
  • ”/turtlebot_node/cmd_vel”, of message type “Twist” from the “geometry_msgs” package

Note: The appropriate message types must be imported/included in your source code. To see what message types are available in a given package, or to check the fields of a given message type, it is helpful to look at the .msg files in that package's msg directory.
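The subscription pattern above usually takes the form of callbacks that store the latest message for use by the control loop. A minimal sketch of that pattern, with the rospy boilerplate omitted so it stands alone: in the real node, these callbacks would be registered with calls like rospy.Subscriber("blobs", Blobs, seeker.blobs_cb) and rospy.Subscriber("tags", Tags, seeker.tags_cb), and the class and method names here are assumptions, not required names.

```python
# Minimal callback skeleton: each subscriber callback stashes the most
# recent message; the control loop reads the stashed copies each tick.
class ObjectSeeker(object):
    def __init__(self):
        self.latest_blobs = None   # most recent cmvision Blobs message
        self.latest_tags = None    # most recent ar_recog Tags message

    def blobs_cb(self, msg):
        # Called by rospy whenever a new "blobs" message arrives.
        self.latest_blobs = msg

    def tags_cb(self, msg):
        # Called by rospy whenever a new "tags" message arrives.
        self.latest_tags = msg

    def have_data(self):
        # The control loop should not act before both sensors have reported.
        return self.latest_blobs is not None and self.latest_tags is not None
```

Keeping the callbacks this small avoids doing control work inside rospy's callback threads; the event loop then decides what to publish on /turtlebot_node/cmd_vel.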

  • Write an event loop in object_seeking.py that can distinguish the following objects (ordered by integer identifier):
  1. Yellow ball
  2. Pink fiducial
  3. Green over orange fiducial
  4. Orange over green fiducial
  5. Alpha AR tag

and move the robot to any one of these objects (specified as seekorder by rosparam). For example, the following command will set the “Green over orange” object as the target:

> rosparam set seekorder "3"

From your code, you can access the parameter using the get_param function.

Note: these identifiers do not necessarily need to match identifiers in the colorfile.

Given appropriate color calibration, recognizing single solid color and AR tag objects should be straightforward.

Remember to use the proportional-derivative servo when coding the seeking behavior of your robot.

Seeking a Sequence of Objects

Given a specific ordering, your client should drive the robot to visit each of the given objects continuously in this order. The order of visitation should not be hard coded, but rather should be easily changeable. As such, this order is specified as a string using rosparam. For example, the ordering “3 2 1 4 5” should direct the robot to visit the green/orange fiducial, pink fiducial, yellow ball, orange/green fiducial, and alpha AR tag, in that order. This order would be specified by rosparam as:

> rosparam set seekorder "3 2 1 4 5"  

A finite state machine (FSM) is a good choice for controlling the decision making process for object seeking. An FSM for seeking should have transition conditions that determine when the robot should seek the next object in the input sequence.
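The sequencing logic of such an FSM can be sketched in pure Python. In the real node the order string would come from rospy.get_param("seekorder"); here it is passed in directly so the sketch is self-contained, and the class and function names are illustrative assumptions.

```python
# Sketch of the seek-order FSM: parse the rosparam string into object ids,
# then cycle through them, advancing on each "object reached" transition.
def parse_seek_order(order_str):
    """Turn a rosparam string like "3 2 1 4 5" into a list of object ids."""
    return [int(tok) for tok in order_str.split()]

class SeekFSM(object):
    def __init__(self, order):
        self.order = order
        self.index = 0            # state variable: position in the seek order

    def current_target(self):
        """The integer id of the object currently being sought."""
        return self.order[self.index]

    def object_reached(self):
        """Transition: advance to the next target, wrapping around so the
        robot continually revisits the sequence."""
        self.index = (self.index + 1) % len(self.order)
```

The object_reached transition would fire when the controller's proximity condition is met (for example, the target blob or tag fills enough of the image), which is where the per-robot tuning lives.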

Submission and Test

To submit this project, create a branch of your repository named “object_seeking_[Your team name]”. This code will be checked out and run by the course staff without modification. It will be tested in five trials with two different seek orders. Your controller must achieve an 80% success rate over all trials, where success means the robot reaches every object in a trial.

You must submit your code in the form of ROS packages. Further, your submission must include a file named object_seeking.launch. To test your code, the course staff (after using rosmake to build your packages) will invoke the command:

> roslaunch object_seeking.launch

Appendix: RGB-YUV Conversions

The color conversion routines used by CMVision for blobfinding are below:

#define YUV2RGB(y, u, v, r, g, b) \
  r = y + ((v*1436) >> 10); \
  g = y - ((u*352 + v*731) >> 10); \
  b = y + ((u*1814) >> 10); \
  r = r < 0 ? 0 : r; \
  g = g < 0 ? 0 : g; \
  b = b < 0 ? 0 : b; \
  r = r > 255 ? 255 : r; \
  g = g > 255 ? 255 : g; \
  b = b > 255 ? 255 : b

#define RGB2YUV(r, g, b, y, u, v) \
  y = (306*r + 601*g + 117*b) >> 10; \
  u = ((-172*r - 340*g + 512*b) >> 10) + 128; \
  v = ((512*r - 429*g - 83*b) >> 10) + 128; \
  y = y < 0 ? 0 : y; \
  u = u < 0 ? 0 : u; \
  v = v < 0 ? 0 : v; \
  y = y > 255 ? 255 : y; \
  u = u > 255 ? 255 : u; \
  v = v > 255 ? 255 : v
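For checking calibration values without compiling C, the macros above can be transcribed into Python. The arithmetic follows the fixed-point C code line by line; note that Python's >> on negative integers is an arithmetic (floor) shift, which matches the behavior these macros rely on.

```python
# Python transcription of the CMVision fixed-point color conversions above.
def clamp(x):
    """Clamp a channel value to the 0..255 range, as the macros do."""
    return 0 if x < 0 else (255 if x > 255 else x)

def yuv2rgb(y, u, v):
    # u and v are expected already centered (offset by -128), as in the macro.
    r = y + ((v * 1436) >> 10)
    g = y - ((u * 352 + v * 731) >> 10)
    b = y + ((u * 1814) >> 10)
    return clamp(r), clamp(g), clamp(b)

def rgb2yuv(r, g, b):
    y = (306 * r + 601 * g + 117 * b) >> 10
    u = ((-172 * r - 340 * g + 512 * b) >> 10) + 128
    v = ((512 * r - 429 * g - 83 * b) >> 10) + 128
    return clamp(y), clamp(u), clamp(v)
```

For example, rgb2yuv(255, 255, 255) yields (255, 128, 128): full luma with both chroma channels at their centered value, which is useful when sanity-checking the YUV ranges you place in the cmvision colorfile.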
object_seeking.txt · Last modified: 2011/09/24 09:28 by jihoonl