In this assignment, you will develop a ROS package for perceiving 2 types of objects: (1) objects of a solid color and (2) objects labelled with an augmented reality (AR) tag. Your controller will drive the robot to move between these objects.
Videos: Lisa Miller's implementation of this assignment from Spring Semester 2009, and an updated version by the course staff for the Turtlebot.
Assigned: Sept 19, 01:00pm, 2011
Due: Sept 26, 12:30pm, 2011
In this assignment, you will build a controller in ROS to perform “object seeking”. In this seeking task, your robot will perceive and drive to objects that are recognizable either by a solid color appearance or by an AR tag label, using the robot's visual sensing (i.e., its camera). For this assignment, you will work primarily with the Turtlebot and a Microsoft Kinect. For object recognition in ROS, you will use the cmvision package for color blobfinding and the ar_recog package for AR tag recognition.
Assuming perception of objects salient by color or pattern, you will develop an object seeking package for this assignment that enables a robot to continually drive between these (non-occluded) objects in a sequence given at run-time. Your controller's decision making should take the form of a finite state machine (FSM). This FSM should use one state variable to specify the currently sought object. For motion control, you should use proportional-derivative (PD) servoing to center objects in the robot's field of view. As a whole, your controller should put the current object in the center of view, drive as close as possible to the object without hitting it, increment the state variable, and continue the process for the next object.
To install the cmvision and ar_recog packages, see Software Environment Setup.
Your tasks for this assignment are as follows.
For installation, color calibration, and AR pattern recognition, please see Software Environment Setup.
> roslaunch [turtlebotname].launch
> rosrun turtlebot_dashboard turtlebot_dashboard
Please refer to the previous robot bringup instructions for more details.
Assuming the turtlebot driver is running, view the camera's image stream using the image_view node, subscribing to the Kinect's image topic “/camera/rgb/image_color”:
> rosrun image_view image_view image:=/camera/rgb/image_color
If successful, you should see a new window displaying the image stream from the robot's camera (example below). Stop image_view with ctrl-c in the terminal before proceeding.
Follow these instructions for CMVision color calibration.
Follow these instructions for using ar_recog, specifically under the “Recognition/Execution” subsection.
> roscreate-pkg object_seeking rospy std_msgs turtlebot_node cmvision ar_recog
Note: The appropriate message types must be imported/included in your source code. To check the message type of a given topic, run rostopic type <topic>; to see the fields of a message type, look at the .msg files in a package's msg/ directory (or run rosmsg show <type>).
Your controller should read the currently sought object (specified as the <tt>seekorder</tt> parameter via rosparam) and move the robot to that object. For example, the following command will set the “Green over orange” object as the target:
> rosparam set seekorder "3"
From your code, you can access the parameter using the get_param function.
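As a sketch, the visit order arrives as a space-separated string of object identifiers; in a running node it would be retrieved with rospy's get_param, but the parsing step itself is plain Python:

```python
# Sketch: parse the visit order set by `rosparam set seekorder "3 2 1 4 5"`.
# In a running node the string would come from rospy.get_param('seekorder');
# here we show only the parsing step.
def parse_seek_order(param_value):
    """Turn a space-separated string of object ids into a list of ints."""
    return [int(tok) for tok in str(param_value).split()]

order = parse_seek_order("3 2 1 4 5")
```

A single identifier such as "3" parses to a one-element list, so the same code handles both the single-object and multi-object cases.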
Note: these identifiers do not necessarily need to match identifiers in the colorfile.
Given appropriate color calibration, recognizing single solid color and AR tag objects should be straightforward.
Remember to use the proportional-derivative servo when coding the seeking behavior of your robot.
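One way to servo on a detected object is to compute an angular velocity from the horizontal offset between the object's image position and the image center. The sketch below assumes a 640-pixel-wide Kinect image and illustrative (untuned) gains; the blob's x coordinate would come from your perception callback:

```python
# Minimal proportional-derivative steering term. The gains KP and KD are
# illustrative placeholders, not tuned values; image_width assumes the
# Kinect's 640x480 RGB stream.
KP, KD = 0.005, 0.001

def pd_turn(x, prev_error, image_width=640):
    """Return (angular velocity, current error) to center the target.

    x: horizontal pixel coordinate of the target in the image.
    prev_error: the error from the previous control step.
    """
    error = (image_width / 2.0) - x   # positive when target is left of center
    derivative = error - prev_error   # change in error per control step
    return KP * error + KD * derivative, error
```

Each control cycle you would feed the returned error back in as prev_error and publish the angular velocity in a Twist message; the derivative term damps oscillation as the object nears the image center.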
Given a specific ordering, your client should drive the robot to visit each of the given objects continuously in this order. The order of visitation should not be hard coded, but rather should be easily changeable. As such, this order is specified as a string using rosparam. For example, the given ordering “3 2 1 4 5” should direct the robot to visit the green/orange fiducial, pink fiducial, yellow ball, orange/green fiducial, alpha AR Tag. This order would be specified by rosparam as:
> rosparam set seekorder "3 2 1 4 5"
A finite state machine (FSM) is a good choice for controlling the decision making process for object seeking. An FSM for seeking should have transition conditions that determine when the robot should seek the next object in the input sequence.
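The FSM described above can be sketched with a single state variable that indexes into the visit order and advances when the transition condition fires. The is_reached flag here is a hypothetical signal from your perception/driving code (e.g., the object fills enough of the image):

```python
# Sketch of the seeking FSM: one state variable (an index into the visit
# order) and one transition condition (the current object was reached).
class SeekFSM:
    def __init__(self, order):
        self.order = order   # e.g. [3, 2, 1, 4, 5] parsed from seekorder
        self.index = 0       # state variable: position in the visit order

    @property
    def target(self):
        return self.order[self.index]

    def step(self, is_reached):
        """Advance to the next object when the current one is reached."""
        if is_reached:
            # Wrap around so the robot visits the objects continuously.
            self.index = (self.index + 1) % len(self.order)
        return self.target
```

The modulo wrap-around implements the "continuously visit" requirement: after the last object in the sequence, the robot returns to the first.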
To submit this project, create a branch of your repository named “object_seeking_[Your team name]”. This code will be checked out and run by the course staff, without modification. This code will be tested against five trials with two different seek orders. Your controller must have an 80% success rate over all trials, where the robot is able to reach every object in each trial.
You must submit your code in the form of ROS packages. Further, your submission must include a file named object_seeking.launch. To test your code, the course staff (after using rosmake to build your packages) will invoke the command:
> roslaunch object_seeking.launch
The color conversion routines used by CMVision for blobfinding are below:
#define YUV2RGB(y, u, v, r, g, b) \
  r = y + ((v*1436) >> 10); \
  g = y - ((u*352 + v*731) >> 10); \
  b = y + ((u*1814) >> 10); \
  r = r < 0 ? 0 : r; g = g < 0 ? 0 : g; b = b < 0 ? 0 : b; \
  r = r > 255 ? 255 : r; g = g > 255 ? 255 : g; b = b > 255 ? 255 : b
#define RGB2YUV(r, g, b, y, u, v) \
  y = (306*r + 601*g + 117*b) >> 10; \
  u = ((-172*r - 340*g + 512*b) >> 10) + 128; \
  v = ((512*r - 429*g - 83*b) >> 10) + 128; \
  y = y < 0 ? 0 : y; u = u < 0 ? 0 : u; v = v < 0 ? 0 : v; \
  y = y > 255 ? 255 : y; u = u > 255 ? 255 : u; v = v > 255 ? 255 : v
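A direct Python transcription of these macros can be handy for checking a color calibration offline. Note the asymmetry in the macros as given: RGB2YUV returns u and v offset by +128, while YUV2RGB expects u and v already centered about zero (i.e., with 128 subtracted):

```python
# Python transcription of the CMVision conversion macros above.
def _clamp(x):
    """Clamp to the 0..255 range, as the macros do with ternaries."""
    return max(0, min(255, x))

def yuv2rgb(y, u, v):
    """u and v are expected centered about zero (128 already subtracted)."""
    r = y + ((v * 1436) >> 10)
    g = y - ((u * 352 + v * 731) >> 10)
    b = y + ((u * 1814) >> 10)
    return _clamp(r), _clamp(g), _clamp(b)

def rgb2yuv(r, g, b):
    """Returns u and v offset by +128, matching the macro."""
    y = (306 * r + 601 * g + 117 * b) >> 10
    u = ((-172 * r - 340 * g + 512 * b) >> 10) + 128
    v = ((512 * r - 429 * g - 83 * b) >> 10) + 128
    return _clamp(y), _clamp(u), _clamp(v)
```

Python's `>>` on negative integers floors toward negative infinity, matching the arithmetic right shift typical of signed integers in C, so the transcription tracks the macros bit-for-bit on in-range inputs.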