Software

Requirements

The BeagleBoard-based bot makes use of a number of software assets. The BeagleBoard itself runs a Linux distribution. We chose The Ångström Distribution (link) because of the large amount of documentation available for it, particularly in regard to the BeagleBoard. Other options include Android and Ubuntu. No Ångström-specific software has been used, so as long as the Linux distribution provides a working setup for the additional software components, the system should work. For more information, see the documentation provided by Ångström (link).

The BeagleBoard bot uses GStreamer (link) for image capture from the webcam. The following GStreamer plugins are used. Only the input plugins are required; the output plugins are purely for debugging and can be disabled in the source code. More information can be found in the GStreamer documentation (link). A minimal capture pipeline built from these elements is sketched after the table below.

Element name        GStreamer package

INPUT
v4l2src             gst-plugins-good
videobalance        gst-plugins-good
videorate           gst-plugins-base
ffmpegcolorspace    gst-plugins-base
appsink             gst-plugins-base

OUTPUT (debugging only)
appsrc              gst-plugins-base
ffmpegcolorspace    gst-plugins-base
xvimagesink         gst-plugins-base
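
The following is a minimal sketch of a capture pipeline built from the input elements above, assuming the GStreamer 0.10 C API (the version contemporary with ffmpegcolorspace and xvimagesink) and a webcam on /dev/video0. The device path, caps, and element properties are illustrative, not the exact pipeline from our source.

    /* Minimal GStreamer 0.10 capture sketch: webcam -> RGB -> appsink. */
    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GError *error = NULL;
        GstElement *pipeline = gst_parse_launch(
            "v4l2src device=/dev/video0 ! videorate ! videobalance ! "
            "ffmpegcolorspace ! video/x-raw-rgb ! appsink name=sink",
            &error);
        if (pipeline == NULL) {
            g_printerr("Pipeline failed: %s\n", error->message);
            return 1;
        }

        GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Pull one RGB frame; the robot does this in a loop. */
        GstBuffer *buffer = gst_app_sink_pull_buffer(GST_APP_SINK(sink));
        if (buffer != NULL) {
            g_print("Got frame of %u bytes\n", GST_BUFFER_SIZE(buffer));
            gst_buffer_unref(buffer);
        }

        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(sink);
        gst_object_unref(pipeline);
        return 0;
    }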

The BeagleBoard bot uses a Phidgets (link) motor controller. Phidgets offers a number of different motor controllers; we used the PhidgetMotorControl LV (low voltage) controller. It requires the Phidgets driver libraries and, for compiling, the header files, both available from the Phidgets drivers page (link). A short sketch of attaching to the controller follows.
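
The sketch below opens and drives the controller through the phidget21 C library. The attachment timeout, motor indices, and speeds are our own illustrative choices, not values taken from the project source.

    /* Minimal phidget21 sketch: attach to the motor controller and drive it. */
    #include <stdio.h>
    #include <unistd.h>
    #include <phidget21.h>

    int main(void)
    {
        CPhidgetMotorControlHandle motor = 0;

        CPhidgetMotorControl_create(&motor);
        CPhidget_open((CPhidgetHandle)motor, -1);   /* -1 = any serial number */

        /* Block until the controller is detected, with a 10 second timeout. */
        if (CPhidget_waitForAttachment((CPhidgetHandle)motor, 10000) != EPHIDGET_OK) {
            fprintf(stderr, "motor controller not found\n");
            return 1;
        }

        /* Drive both channels forward briefly, then stop. */
        CPhidgetMotorControl_setAcceleration(motor, 0, 50.0);
        CPhidgetMotorControl_setAcceleration(motor, 1, 50.0);
        CPhidgetMotorControl_setVelocity(motor, 0, 40.0);
        CPhidgetMotorControl_setVelocity(motor, 1, 40.0);
        sleep(2);
        CPhidgetMotorControl_setVelocity(motor, 0, 0.0);
        CPhidgetMotorControl_setVelocity(motor, 1, 0.0);

        CPhidget_close((CPhidgetHandle)motor);
        CPhidget_delete((CPhidgetHandle)motor);
        return 0;
    }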

The BeagleBoard bot is built with a simple GNU makefile, using GCC as the compiler.

 

Robot Overview

Below is a brief overview of the operation of the robot. For specifics regarding the individual libraries used by this bot, we direct you to their respective documentation. For specific details about the algorithms we used for obstacle detection and circle detection, please see our algorithms page (link).

First, the external libraries are initialized. This includes GStreamer and Phidgets. At this point GStreamer creates the pipelines specified in the source, which provide access to images from the webcam and, optionally, the ability to output debug images. The Phidgets library detects and attaches to the motor controller.

At this point, the bot begins processing results from the webcam. Upon receiving a new image from the webcam, the RGB image is stored in a queue that is processed by a separate thread, as sketched below.
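
The hand-off between the capture callback and the processing thread can be sketched with POSIX threads as follows. The frame type and the single-slot queue are our own simplification; the actual code may buffer more than one frame.

    /* Single-slot frame queue shared between the capture and processing threads. */
    #include <pthread.h>
    #include <stdlib.h>

    typedef struct {
        unsigned char *rgb;     /* packed RGB pixels copied out of the GstBuffer */
        int width, height;
    } frame_t;

    static frame_t        *pending = NULL;  /* newest unprocessed frame */
    static pthread_mutex_t lock    = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  ready   = PTHREAD_COND_INITIALIZER;

    /* Called from the capture side whenever a new image arrives. */
    void enqueue_frame(frame_t *f)
    {
        pthread_mutex_lock(&lock);
        if (pending != NULL) {              /* drop a stale frame if we fell behind */
            free(pending->rgb);
            free(pending);
        }
        pending = f;
        pthread_cond_signal(&ready);
        pthread_mutex_unlock(&lock);
    }

    /* Called from the processing thread; blocks until a frame is available. */
    frame_t *dequeue_frame(void)
    {
        pthread_mutex_lock(&lock);
        while (pending == NULL)
            pthread_cond_wait(&ready, &lock);
        frame_t *f = pending;
        pending = NULL;
        pthread_mutex_unlock(&lock);
        return f;
    }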

Two main operations are performed on each image. First, the image is processed to look for obstacles in front of the robot, using the technique described on the algorithms page. The second operation is the search for balls, which we used as objects of interest. These two operations provide the inputs that allow the robot to choose the appropriate action.

Obstacles are given priority in the control logic. If an obstacle is detected, the robot turns left or right to avoid it. If no obstacles are detected, the robot searches for objects of interest (in this case a colored ball). If an object of interest is found, the robot turns appropriately to move toward it. For testing purposes we added the ability to limit the robot's environment by placing blue tape on the floor; the blue tape represents an impassable barrier that the robot avoids. Currently, the robot stops upon approaching the target object; when the target is removed, it returns to exploring the region. This decision logic is sketched below.
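
The sketch below captures the priority ordering just described. The detections structure and the motor-command helpers (drive_forward, turn_left, and so on) are hypothetical names standing in for the routines described on the algorithms page and the Phidgets calls shown earlier.

    /* Per-frame control decision: obstacles first, then the ball, else explore. */
    typedef enum { NONE, LEFT, RIGHT, CENTER } heading_t;

    /* Hypothetical detection results produced from the current frame. */
    typedef struct {
        heading_t obstacle;     /* nearest obstacle or blue-tape edge, if any */
        heading_t ball;         /* direction of the ball, if one is visible   */
        int       ball_reached; /* nonzero once the robot is at the target    */
    } detections_t;

    extern void drive_forward(void);
    extern void turn_left(void);
    extern void turn_right(void);
    extern void stop_motors(void);

    void control_step(const detections_t *d)
    {
        if (d->obstacle != NONE) {
            /* Obstacles and the blue-tape boundary take priority: steer away. */
            if (d->obstacle == LEFT) turn_right(); else turn_left();
        } else if (d->ball != NONE) {
            if (d->ball_reached)       stop_motors();   /* stop at the target    */
            else if (d->ball == LEFT)  turn_left();     /* steer toward the ball */
            else if (d->ball == RIGHT) turn_right();
            else                       drive_forward();
        } else {
            drive_forward();                            /* keep exploring        */
        }
    }

Because the exploring branch is the default, removing the target simply makes the ball detection come back empty on the next frame, and the robot resumes exploring without any extra state.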