Sunday, June 14, 2009

Indoor testing with panning camera

Here's a video of a test run indoors. I've incorporated the pan-tilt unit into the vision system, so the camera now turns in the direction the robot is steering. This allows the vision system to image the road in the predicted direction without having to wait for the robot to roll forward and the chassis to turn. That removes a large amount of latency from the guidance loop and improves performance, allowing the robot to move faster.



This is an example of cephalisation - placing sensors on a "head" whose orientation is not directly coupled to the main body, allowing the sensor suite to be aimed independently of the body's orientation.
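As a sketch of how the coupling works - assuming a hypothetical pan_for_steering() helper; the gain and limit here are illustrative values, not the robot's actual calibration:

    # Sketch: slave the pan servo to the steering command so the camera
    # looks into the turn. PAN_GAIN and PAN_LIMIT_DEG are made-up values.
    PAN_GAIN = 1.5          # look further into the turn than the wheels point
    PAN_LIMIT_DEG = 60.0    # mechanical limit of the pan-tilt unit

    def pan_for_steering(steering_deg: float) -> float:
        """Return a pan angle that anticipates the robot's predicted path."""
        pan = PAN_GAIN * steering_deg
        return max(-PAN_LIMIT_DEG, min(PAN_LIMIT_DEG, pan))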

Thursday, June 11, 2009

The new guidance system


Fig. 1: the guidance visualisation. On the left is the source image; on the right is the perspective-warped map of the ground ahead.

Snowtires is a vision-guided robot, so here's a rundown of how the vision system works. The basic idea is really simple: the robot veers away from any objects of a designated colour in front of it. In practice this requires several steps:
  1. Capture an image of the road ahead (left side of Fig. 1)
  2. Determine the horizon line in the image (top of green box)
  3. Determine the perspective transformation from the image to the ground plane.
  4. Transform the section of the image below the horizon so it appears as though it were viewed orthogonally from above (Fig. 1, right side)
  5. Filter the transformed image to determine if there are any orange pixels (or whatever colour you're avoiding)
  6. Draw a selection of possible paths onto the image
  7. Compute the number of orange pixels that are within a set distance of each curve - this will be the curve's penalty.
  8. Choose the curve that passes over or near the fewest orange pixels
It sounds fairly simple, and it is, with the help of OpenCV. First, OpenCV handles camera capture through its HighGUI functions. I choose the horizon line manually and hope that the robot doesn't rock forward or back too much... You could use a horizon detector here, but I can't be bothered right now.
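As a minimal sketch in Python (the cv2 calls map one-to-one onto the C-era HighGUI API I'm actually using), with the horizon row picked by hand:

    import cv2

    HORIZON_ROW = 140  # hand-picked; corresponds to the top of the green box

    cap = cv2.VideoCapture(0)      # HighGUI-style camera capture
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("camera capture failed")

    road = frame[HORIZON_ROW:, :]  # keep only the rows below the horizon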

Next comes the transformation. This is really easy. OpenCV has the function GetPerspectiveTransform, which takes the four corners of the ground trapezoid in the source image and the four corners of the rectangle they should map to in the top-down view, and returns a 3x3 matrix representing the perspective transform in homogeneous coordinates. Next, you feed the matrix and the image into WarpPerspective, and it deposits the warped image into the map. I draw the trapezoid just to be sure all my math is lining up.
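In cv2-flavoured Python the whole warp is two calls. The corner coordinates below are placeholders that depend on the camera mounting, not the robot's calibrated values:

    import cv2
    import numpy as np

    # road is the below-horizon crop from the capture sketch above.
    # Trapezoid of the ground in the source image (placeholder coordinates)...
    src = np.float32([[100, 0], [220, 0], [320, 100], [0, 100]])
    # ...and the rectangle those corners map to in the top-down ground map.
    dst = np.float32([[0, 0], [320, 0], [320, 240], [0, 240]])

    M = cv2.getPerspectiveTransform(src, dst)          # 3x3 homography
    ground_map = cv2.warpPerspective(road, M, (320, 240))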

To filter the colour, first convert the map to the HSV (Hue, Saturation, Value) colour space. The hue represents "colour" in the sense of position around the rim of the colour wheel (red is 0, orange is 15, yellow is 30, and so on), so we can filter out any pixels that are not within a threshold of the hue we want. The saturation represents the colour's intensity: the cones have extremely high saturation, whereas the saturation of the pavement is very low, so filter out any pixels with low saturation. Finally, value represents the grayscale brightness of the colour; I filter out any pixels with really low value.
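A sketch of the filter using cv2.inRange, which applies all three thresholds at once. The exact numbers are guesses you would tune against real cones:

    import cv2
    import numpy as np

    # ground_map is the top-down map from the warp sketch above.
    hsv = cv2.cvtColor(ground_map, cv2.COLOR_BGR2HSV)

    # Orange sits near hue 15 on OpenCV's 0-179 hue scale; demand high
    # saturation and at least moderate value. Thresholds are illustrative.
    lower = np.array([5, 120, 60])
    upper = np.array([25, 255, 255])
    orange_mask = cv2.inRange(hsv, lower, upper)   # 255 where orange, else 0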

That leaves only the bright orange pixels. I create a series of curves and apply a penalty to each curve based on the number of orange pixels within a certain distance (usually the image width / 8) of it. The curve with the fewest orange pixels within that range "wins", and its curvature is used to determine the steering output. Note that in the figure, due to my ineptitude, curves with high penalties are green and curves with low penalties are red. Sigh.
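One way to implement the scoring, assuming a hypothetical family of curves parameterised by curvature: rasterise a thick band around each candidate and count the orange pixels that fall inside it.

    import cv2
    import numpy as np

    def curve_points(curvature, h, w, n=20):
        # Hypothetical arc family: lateral offset grows with distance ahead.
        ys = np.linspace(h - 1, 0, n)                 # bottom of map to top
        xs = w / 2 + curvature * (h - 1 - ys) ** 2    # sideways sweep
        return np.int32(np.column_stack([xs, ys]))

    def penalty(mask, pts, band):
        # Count orange pixels within `band` pixels of the curve by drawing
        # the curve with a thick stroke and intersecting it with the mask.
        region = np.zeros_like(mask)
        cv2.polylines(region, [pts], False, 255, thickness=2 * band)
        return cv2.countNonZero(cv2.bitwise_and(mask, region))

    h, w = orange_mask.shape                   # mask from the filtering step
    band = w // 8
    candidates = np.linspace(-0.01, 0.01, 9)   # nine candidate curvatures
    best = min(candidates,
               key=lambda c: penalty(orange_mask, curve_points(c, h, w), band))

The winning curvature (best) is what feeds the steering output.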

Monday, June 8, 2009

eeePC battery issues

Immediately after the practice race I put the eeePC away in its sleeve. About an hour later I pulled it out, only to discover that it was extremely hot - apparently it had not shut down properly and had overheated while unable to cool itself in the sleeve. I removed the battery and allowed it to cool. Fortunately, the eeePC still seems to work normally; however, the battery will no longer charge - it remains at 33% charge no matter how long you charge it. So it looks like the battery was thermally damaged somehow. I'll have to get it repaired or buy a new one - race day is a month away.

The battery is indeed damaged. I haven't experimented with it yet, but I suspect that the 33% reading is accurate and that's where this battery tops out now. Anyhow, I got a new 8800 mAh battery to replace it, and I'm going to try to get NCIX to replace the existing battery under warranty.

Sunday, June 7, 2009

So it's been a while...

Well, after five months I'm blogging again. Only very recently have I made any tangible progress over what was previously posted. Now things are rolling again, and there's a lot to report from the past month or so of work, so I'll elaborate on it over the next little while.

First, what you really want to see: a video.



Here you see Snowtires in its new form doing a lap at the Thunderbird robotics club practice intramurals race day. We set up a race that mimics the conditions of the actual Waterloo competition so we could test the robots realistically. The robots need to do laps of a concrete course lined with the usual orange cones. Doing this outside introduces some new challenges; namely, there's an abundance of IR radiation outdoors, which completely blinds the camera if not filtered. To that end I installed a mirror-finish sunglass lens and a pair of polarizing filters to reduce glare and block IR. This dims the image enough for effective vision even under direct sunlight. The camera is now on a pan-tilt unit mounted front and center on the chassis. It is coupled to a new guidance system that I will elaborate on later. Other modifications include the replacement of the Arduino-based chassis controller with a Furious Module: a USB-based servo and sensor controller designed by the team's own Ash McKay (for details see his website).

Snowtires won the practice race by completing three laps of the course with only a single error. At the moment the guidance system only works at very low speed; I'll be attempting to improve that as time goes on, but we're getting close to the July 11th competition date. Other requirements: a stop sign detector, a stop light detector, and a purple-object avoidance routine. Time to get back to work...