Monday, July 27, 2009

Waterloo Robot Race 2009

Snowtires went to the races, competing in the Waterloo Robot Race 2009 against 10 other robots, qualifying for the final race, and finally placing second overall. This competition marks the culmination of this project (for now). I was ecstatic about the performance of Snowtires, and happy to note that some of the other UBC robots used elements of the Snowtires guidance code for their own vision systems.

The Waterloo Robot Race is a trio of challenges. The first is a drag race, a flat-out speed competition down a cone-lined track. The second is a round of static judging for aesthetic and technical appeal. The third round is the circuit race: a cone-lined figure-8 track with a moving purple obstacle cone at the intersection. There are also two stop signs and a traffic signal. The robots face penalties if they do not stop for three seconds at the signs and for the duration of the red light at the signal. Additional penalties are levied when robots take the wrong path or hit the orange cones.

The UBC-CERM3 Thunderbird robotics team entries:


From left to right: Snowfury MkIII, Johnny5, Snowtires, Big Dave, Quartz, and Oscar.
  • Snowfury MkIII was the original Team Thunderbird 1/10th scale racer, and is shown here in its third incarnation as a vision-guided robot. It uses the Snowtires vision code in an underdamped, aggressive configuration. The basic platform is a Tamiya Hummer with an ASUS eeePC 701 computer onboard. Snowfury was constructed and operated by Marcel Veronesi, with assistance from Tim Lee. Snowfury achieved the shortest time in its heat and placed fourth overall in the competition.
  • Johnny5 is another vision-based robot, which runs a more conservative configuration than Snowfury. Johnny5's creator Tom Huryn pioneered the rear-mounted camera configuration - the further back the camera is, the better peripheral vision the robot has, improving its ability to avoid objects at close range. The underlying platform is a Traxxas Rustler, with an eeePC for the brains. Johnny5 placed fifth in the competition.
  • Snowtires is my machine, seen here with its new bodyshell. We integrated infrared rangefinders for close-range collision avoidance, but in the bright sunlight of race day they were of little use except at point-blank ranges. Snowtires placed second in the competition.
  • Big Dave is our fourth vision machine, constructed and operated by Tim Lee. It is configured similarly to Snowtires, with an up-front camera and pure vision guidance.
  • Quartz is the team flagship, constructed and operated by Ash McKay. Equipped with a laser rangefinder, it has an accurate and detailed 2D view of the world. It uses a vision system to detect stop signs and traffic signals. The rangefinder can also detect the purple obstacle cone, making Quartz the only robot in the entire competition with a full suite of features, capable of handling all of the competition challenges. Quartz placed third in the competition, and would have placed first if the penalties for ignoring the traffic signs and lights had been of any significance. It wowed the crowd right out of the starting gate by recovering from a guidance error: backing up and correcting its course.
  • Oscar is the maverick robot, eschewing vision guidance for sonar-modulated odometry guidance. Oscar had a map of the course, and verified its position using sonar readings. However, sonar units have trouble in outdoor areas, particularly under the high winds that we experienced on race day.
Race Day!

Race day opened with an electrical storm. The competition
rules said rain or shine, so we got to work putting the waterproofing
on our robots - plastic bags covering the gaps in their bodyshells.
Fortunately the rain cleared up by the time we got there.


Here's Snowtires in race regalia, during a practice run on the
course. The course was largely dry, with some puddles remaining,
by the time we were practicing.

The crossroads, complete with stop sign and purple cone (with its crane).


A test run with Snowtires. (I don't have any race footage, as I
was poised over the panic button in case the robot made
an error; I'll post some of my colleague's footage when
I get it.) Note how Snowtires avoids puddles when it sees
the reflection of a cone in them.


Here's the drag race: my heat against the Windsor U. robot.
The Windsor robot was a torpedo: it went on to win the competition
due to its mixture of vision guidance and raw power. A little
frightening recovery at the end there, but awesome to behold.


My robot is considerably more conservative. So much so that
this video is of the same drag race, just, y'know... later.


In the second round, I was up against Quartz. My starting timer
didn't engage, so I started three seconds before I should have,
and took a three-second penalty. It didn't matter anyway, as Quartz
passed Snowtires on the straightaway to take the victory.


This is Windsor's Team Invincible running the circuit race.
Their robot has impressive speed, and really pulled out all the
stops for the main race. As they pass the intersection you can
see Johnny5 crossing the finish line as part of the previous heat.

Here's Quartz mere feet from the finish line. Due to a battery issue,
Quartz had a slow second half of the race, but placed third anyway,
having made only a single error in three whole laps.


And here's Snowfury doing its wild weasel routine. The vision
guidance is underdamped, so the robot "bounces" off the sides of
the course, as it usually can only see one side at a time. It had the
second fastest lap time after Windsor, and lapped Snowtires, but
incurred more penalties while doing so. Definitely the
"people's choice" and really entertaining to watch.

Windsor took first, and UBC took places 2-6. None of the other robots fielded were able to qualify to race, in some cases only due to issues with outdoor operation. There were several impressive contenders, but in the end it was a Windsor/UBC field. We're very pleased with the results, and we'll be passing a good legacy on to future Thunderbird robotics club members.

A big thanks goes out to our sponsors, and to everyone who made this possible. To our team leader, Dr. John Meech, and all the robotics club members this has been fantastic, thanks for one wild ride.

-t

Sunday, June 14, 2009

Indoor testing with panning camera

Here's a video of a test run indoors. I've incorporated the pan-tilt unit into the vision system, so the camera now turns in the direction the robot is steering. This allows the vision system to image the road in the predicted direction without having to wait for the robot to roll forward and the chassis to turn. This takes a large amount of latency out of the guidance loop and improves performance, allowing the robot to move faster.
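The coupling itself is only a line or two. Here's a hypothetical sketch, where the gain, the pan limit, and the way the angle reaches the servo are placeholders rather than the actual Snowtires configuration:

```python
def pan_angle(steering_cmd, gain=1.2, max_pan_deg=45.0):
    """Point the camera where the robot is about to go.

    steering_cmd is the normalized steering output from the guidance loop (-1..1);
    the return value is a pan angle in degrees, clamped to the pan-tilt unit's range.
    """
    angle = gain * steering_cmd * max_pan_deg
    return max(-max_pan_deg, min(max_pan_deg, angle))

# Each guidance cycle: steer the chassis, then pre-aim the camera the same way,
# so the next frame already shows the road in the direction the robot is heading.
steering = 0.4                      # example steering output from the vision system
camera_pan = pan_angle(steering)    # about 21.6 degrees, sent to the pan servo
```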



This is an example of cephalisation - placing sensors on a "head" whose orientation is not directly coupled with the main body, allowing the sensor suite to be directed independently from the body orientation.

Thursday, June 11, 2009

The new guidance system


Fig. 1: the guidance visualisation: on the left is the
source image, and on the right is the perspective-warped
map of the ground ahead.

Snowtires is a vision-guided robot, so here's a rundown of how the vision system works. The basic idea is really simple: the robot veers away from any objects of a designated colour in front of it. In practice this requires several steps:
  1. Capture an image of the road ahead (left side of Fig. 1)
  2. Determine the horizon line in the image (top of green box)
  3. Determine the perspective transformation from the image to the ground plane.
  4. Transform the section of the image below the horizon so that it appears as though it were viewed orthogonally from above (Fig. 1, right side).
  5. Filter the transformed image to determine if there are any orange pixels (or whatever colour you're avoiding)
  6. Draw a selection of possible paths onto the image
  7. Compute the number of orange pixels that are within a set distance of each curve - this will be the curve's penalty.
  8. Choose the curve that passes over or near the fewest orange pixels
It sounds fairly simple, and it is, with the help of OpenCV. First, OpenCV handles camera capture through its HighGUI functions. I choose the horizon line manually, and hope that the robot doesn't rock forward or back too much... You could use a horizon detector here, but I can't be bothered right now.
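In the modern Python bindings (cv2), the capture step boils down to a couple of calls. This is only a minimal sketch: the device index and the hard-coded horizon row are placeholders, not the values used on Snowtires.

```python
import cv2

cap = cv2.VideoCapture(0)        # camera device index is a placeholder
ok, frame = cap.read()           # one BGR frame of the road ahead
if not ok:
    raise RuntimeError("camera capture failed")

HORIZON_Y = 140                  # hand-picked horizon row; pixels above it are ignored
```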

Next comes the transformation. This is really easy. OpenCV has the function GetPerspectiveTransform, which takes the corners of the trapezoid in the camera image and the corners of the rectangle it should map to in the ground-plane map, and returns a 3x3 matrix representing the perspective transform in homogeneous coordinates. Next, you feed the matrix and the image into WarpPerspective, and it deposits the warped image into the map. I draw the trapezoid just to be sure all my math is lining up.
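In cv2 terms the whole thing is two calls. The coordinates below are placeholders; the real trapezoid corners depend on the camera's height and tilt, and `frame` is the captured image from the step above.

```python
import numpy as np
import cv2

# Corners of a known ground rectangle as they appear (as a trapezoid, below the
# horizon) in the camera image, and where they should land in the top-down map.
# All coordinates here are illustrative.
image_trapezoid = np.float32([[220, 140], [420, 140], [600, 470], [40, 470]])
map_rectangle   = np.float32([[0, 0], [320, 0], [320, 480], [0, 480]])

M = cv2.getPerspectiveTransform(image_trapezoid, map_rectangle)  # 3x3 homography
ground_map = cv2.warpPerspective(frame, M, (320, 480))           # orthographic map view
```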

To filter the colour, first convert the map to the HSV (Hue, Saturation, Value) colour space. The hue represents "colour" in the sense of where on the rim of the colour wheel that colour would be (0 is red, orange is 15, yellow is 30, and so on), so we can filter out any pixels that are not within a threshold of the hue we want. The saturation represents the colour's intensity. The cones have extremely high saturation, whereas the saturation of the pavement is very low, so filter out any pixels with low saturation. Finally, Value represents the grayscale illuminance of the colour. I filter out any pixels that have really low value.
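A minimal cv2 version of the colour filter, with inRange applying all three thresholds at once; the numbers are illustrative guesses rather than the values actually tuned for the cones:

```python
import numpy as np
import cv2

hsv = cv2.cvtColor(ground_map, cv2.COLOR_BGR2HSV)

# OpenCV hue runs 0-179, so orange sits around 15.  Threshold values are guesses.
lower = np.array([5, 120, 80], dtype=np.uint8)     # minimum hue, saturation, value
upper = np.array([25, 255, 255], dtype=np.uint8)
orange_mask = cv2.inRange(hsv, lower, upper)       # 255 wherever a pixel is cone-orange
```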

That leaves only the bright orange pixels. I create a series of curves, and apply a penalty to each curve dependent on the number of orange pixels within a certain distance (usually the image width/8) of the curve. The curve with the fewest orange pixels within that range "wins", and its curvature is used to determine the steering output. Note that in the figure, due to my ineptitude, curves with high penalties are green and curves with low penalties are red. Sigh.
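Here's a rough sketch of the path-scoring step, assuming the orange mask from above. The parabolic path shape and the curvature sweep are illustrative, not the exact curves Snowtires draws; the idea is just to rasterise a corridor around each candidate path and count the orange pixels inside it.

```python
import numpy as np
import cv2

H, W = orange_mask.shape
band = W // 8                                    # clearance checked around each path

def candidate_path(curvature):
    """One candidate path: a parabola, robot at the bottom centre of the map."""
    ys = np.arange(H - 1, -1, -1)                # rows from the robot outward
    xs = W / 2 + curvature * (H - ys) ** 2 / H   # lateral offset grows with distance
    return np.stack([xs, ys], axis=1).astype(np.int32).reshape(-1, 1, 2)

best_curvature, best_penalty = 0.0, None
for curvature in np.linspace(-0.4, 0.4, 17):     # sweep of candidate curvatures
    corridor = np.zeros_like(orange_mask)
    cv2.polylines(corridor, [candidate_path(curvature)], False, 255, thickness=2 * band)
    penalty = cv2.countNonZero(cv2.bitwise_and(orange_mask, corridor))
    if best_penalty is None or penalty < best_penalty:
        best_curvature, best_penalty = curvature, penalty

# best_curvature then gets scaled into the steering command sent to the chassis.
```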

Monday, June 8, 2009

eeePC battery issues

Immediately after the practice race I put the eeePC away in its sleeve. About an hour later I pulled it out, only to discover that it was extremely hot - apparently it had not shut down properly and had overheated while unable to cool itself in the sleeve. I removed the battery and allowed it to cool. Fortunately, the eeePC still seems to work normally; however, the battery will no longer charge up - it remains at 33% charge no matter how long you charge it. So, it looks like the battery was thermally damaged somehow. I'll have to get it repaired or buy a new one - race day is a month away.

The battery is indeed damaged. I haven't experimented with it yet, but I suspect that the 33% reading is accurate and that's where this battery tops out now. Anyhow, I got a new 8800 mAh battery to replace it, and I'm going to try to get NCIX to replace my existing battery under warranty.

Sunday, June 7, 2009

So it's been a while...

Well, after five months I'm blogging again. Only very recently did I make any tangible progress over what's been previously posted. Now things are rolling again, and there's a lot to report from the past month or so of work, so I'll elaborate on it over the next little while.

First, what you really want to see: a video.



Here you see Snowtires in its new form doing a lap at the Thunderbird robotics club practice intramurals race day. We set up a race to mock up the conditions of the actual Waterloo race and test the robots realistically. The robots need to do laps of a concrete course lined with the usual orange cones. Running outdoors introduces some new challenges: there's an abundance of IR radiation, which completely blinds the camera if not filtered. To this end I installed a mirror-finish sunglass lens and a pair of polarizing filters to reduce glare and block IR. This dims the image enough for effective vision even under direct sunlight. The camera is now on a pan-tilt unit mounted front and center on the chassis. It is coupled to a new guidance system that I will elaborate on later. Other modifications include the replacement of the Arduino-based chassis controller with a Furious Module: a USB-based servo and sensor controller designed by the team's own Ash McKay (for details, see his website).

Snowtires won the practice race by completing three laps of the course with only a single error. At the moment the guidance system only works at very low speed: I'll be attempting to improve that as time goes on, but we're getting close to the July 11th competition date. Other requirements: a stop sign detector, a stop light detector, and a purple-object avoidance routine. Time to get back to work...

Monday, January 19, 2009

Another test run

Here's the latest test run, this time using cones for the course markers. It does two successful laps before missing a turn. It would appear that the guidance has the most trouble with sharp turns, particularly if that results in it approaching a wall at a near-perpendicular angle. Future improvements will hopefully address this problem.



P.S. If the video plays absurdly fast, just restart it. Some browsers seem to have an issue with the video playback.

P.P.S. If you're wondering at the three-month gap in posting, I've been finishing my Master's thesis, and have only recently reemerged into the sunlit world.

Tuesday, October 21, 2008

A new video

As you can see in the following video, the eeePC now operates the robot. However, there are some issues. Since it's not nearly as powerful as the Macbook Pro, I had to reduce the resolution of the input image. This makes the line-finding less precise. Also, it's not yet tuned properly, so it tends to overcorrect and constantly "bounce" off the sides of the course. You can see as it rounds the curve that the guidance picks up the left line of the course as the right line by accident, causing it to run between the two sides of the U-shaped course. Needs some work.