Robotic Testbed
Inspiration: During summer 2004, we visited CalTech, where we used
Richard Murray's Multi-Vehicle Wireless Testbed (MVWT),
which includes the fully autonomous Kelly vehicle shown to the right.
That vehicle has an onboard laptop computer and two ducted fans for
self-propulsion, and it wears a hat with a bar code on top that is read
by an overhead vision tracking system. Information from the vision system
is fed back to the vehicle through wireless networking, and many Kellys
can communicate on the floor of the testbed through this network.
The MVWT platform inspired us to build our own platform at UCLA. Below
we describe our first generation testbed.
First generation testbed 2005:
Kevin Leung, Chung Hsieh, and Rick Huang built the vehicle platform.
Our testbed arena is a 5 x 8 area, requiring much smaller vehicles
than the CalTech Kellys. We developed a system using radio-controlled
cars: the algorithms are programmed off-board, and controls are
sent to the individual cars over different radio frequencies.
We have an overhead vision tracking system modelled on the one
at CalTech.
Here is a schematic of the vehicle platform.
On the left is a photo of the group and the first generation of vehicles;
left to right: Maria D'Orsogna, Chung Hsieh, Kevin Leung, Yao-Li Chuang,
and Rick Huang. On the right is a close-up photo of the first generation
of vehicles.
Here is a video clip of a demo involving area servicing by three vehicles.
In the video, a student uses a stick to flash target images to the overhead
cameras; one of the vehicles must then visit that target site.
Otherwise, the vehicles maintain a holding pattern.
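The demo does not specify how the servicing vehicle is chosen; a natural rule, sketched here purely as an illustration (the nearest-vehicle choice and the function name are our assumptions, not the demo's actual logic), is to dispatch whichever vehicle is closest to the flashed target:

```python
import math

def dispatch(vehicle_positions, target):
    """Pick the vehicle nearest to the flashed target; the others hold."""
    return min(vehicle_positions, key=lambda p: math.dist(p, target))
```

For example, with vehicles at (0, 0), (5, 5), and (2, 1) and a target flashed at (2, 2), the rule selects the vehicle at (2, 1).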
A second generation of the testbed was built in summer 2006.
The original vehicle was improved with on-board range sensing,
on-board computing, and wireless communication, while
maintaining economic feasibility and scale. A second, tank-based
platform uses a flexible caterpillar-belt drive and the same
modular sensing and communication components.
We demonstrate practical use of the testbed for algorithm validation by
implementing a recently proposed cooperative steering law
involving obstacle avoidance.
The tank-based vehicle proves to be quite useful in the implementation
of an environmental mapping algorithm based on ENO interpolation.
In 2007 we mounted phototransistors on the car-based platform (shown left)
for use in designing and testing boundary tracking algorithms
with noisy data. Our testbed results show that a recently
developed algorithm (see paper by Jin and Bertozzi) is
effective in boundary tracking in noisy environments.
Experimental work on boundary tracking was published in the 2009 American Control Conference.
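The bang-bang principle behind this kind of boundary tracking can be illustrated in a few lines: the vehicle moves at constant speed and always turns at its maximum rate, switching turn direction according to a binary inside/outside sensor reading, so that it scallops back and forth across the boundary. Everything below (the region, speeds, and step sizes) is illustrative; it is not the Jin-Bertozzi algorithm's parameters nor our vehicles':

```python
import math

def track_boundary(steps=4000, dt=0.005, v=1.0, omega=10.0):
    """Bang-bang tracking of the unit-circle boundary.

    The only sensor is binary (inside or outside the region). The
    vehicle turns left while outside and right while inside, always
    at its maximum turning rate omega.
    """
    x, y, theta = 1.05, 0.0, math.pi / 2   # start just outside the disk
    radii = []
    for _ in range(steps):
        inside = x * x + y * y < 1.0
        theta += (-omega if inside else omega) * dt
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        radii.append(math.hypot(x, y))
    return radii
```

Because the turn radius v/omega is much smaller than the boundary's radius of curvature, the trajectory stays in a thin annulus around the boundary after an initial transient.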
In 2010 we built the third generation testbed, whose main new development was the ALMC-100 micro-car platform.
The third generation micro-cars (model ALMC-100,
see Fig. 2 for a hardware schematic) are purpose-built
from the ground up, in contrast to previous generation vehicles, which were modified off-the-shelf toy cars.
The ALMC-100 is designed to mimic, in a compact package, many of the
features one would expect to find in a full-sized autonomous vehicle.
The vehicle measures approximately 8 cm (wheelbase) x 5 cm
(width); the height varies from 5.8 cm to 8 cm depending on configuration. The ALMC-100 is a rear-wheel-drive, front-steering car with a maximum speed
of 20 cm/s and a maximum turning angle of 18 degrees.
Power comes from four AAA batteries supplying approximately 3.8 W, yielding a run time of greater than 30 minutes.
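As a rough consistency check on the quoted run time (the cell voltage and capacity below are assumed nominal values for alkaline AAA cells, not measurements; only the 3.8 W figure comes from the spec):

```python
# Rough battery run-time estimate for the ALMC-100 power budget.
voltage = 4 * 1.5                  # four AAA cells in series, nominal volts (assumed)
power = 3.8                        # watts drawn by the vehicle (from the spec)
current = power / voltage          # about 0.63 A average draw
capacity_ah = 1.0                  # assumed nominal AAA capacity in amp-hours
runtime_min = capacity_ah / current * 60
```

At nominal capacity this gives roughly an hour and a half; effective capacity drops substantially at this current, so the quoted figure of greater than 30 minutes is consistent.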
The ALMC-100 features two processing units on individual printed circuit boards, stacked
atop each other. The lower chassis board is the base
of the vehicle, to which the drive train is bolted along with
the electronics. The chassis board contains a 50 MHz ARM Cortex-M3 processor with 8 KB SRAM
and 64 KB flash memory. A 1 KB EEPROM is also included to store unique, non-volatile information such
as the vehicle identification number and motor control gains and offsets.
The upper processing board contains an off-the-shelf Xilinx Virtex-4 FX12 FPGA Mini-Module.
Currently, the FPGA is configured to embed a 300MHz
PowerPC405 processor plus a number of peripheral
interfaces. The interfaces allow the PPC405 to access
64MB of DDR SDRAM and 4MB of flash memory
among other peripherals.
The wireless communication system consists of
two Wi.232 radio modules, one on each board, capable of transmitting and
receiving at up to 115200 bps. The wireless module on the processing board
is configured to
115200 bps and is intended for inter-vehicle communication and for access to
the vehicle via the remote
terminal. The two radios operate on different frequencies to avoid
interference.
The driving factor behind the use of two processing units is to segregate
motion control and path planning. The motion control is accomplished on the
chassis board, which maintains its control loop at
1000 Hz while sampling the various sensors at 500
Hz. The chassis processor extracts the vehicle's own
position from the overhead tracking system's broadcast sent at 30 Hz.
The vehicle's position and other
vehicle and sensor states are relayed to the processing board also at 30 Hz
over the universal asynchronous receiver/transmitter (UART) connecting the
two boards. Thanks to the powerful processing available to the upper board,
the cars can perform all required path planning themselves;
in previous versions of the AMLT, vehicles relied on a desktop computer
to perform all such calculations and relay instructions
to the cars.
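The three rates above (1000 Hz control, 500 Hz sensing, 30 Hz relay) can be sketched as a single base-rate loop with tick dividers. The divider values and structure here are illustrative; the actual firmware on the Cortex-M3 is interrupt-driven and not written in Python:

```python
def run_loop(ticks):
    """Multirate loop sketch: one 1000 Hz base tick, with sensors
    sampled every 2 ticks (500 Hz) and vehicle state relayed to the
    upper board every 33 ticks (approximately 30 Hz)."""
    counts = {"control": 0, "sensors": 0, "relay": 0}
    for t in range(ticks):
        counts["control"] += 1        # control law runs on every tick
        if t % 2 == 0:
            counts["sensors"] += 1    # sensor sampling at half the base rate
        if t % 33 == 0:
            counts["relay"] += 1      # state relay over the UART to the upper board
    return counts
```

Running one simulated second (1000 ticks) yields 1000 control updates, 500 sensor samples, and 31 relay messages, matching the rate hierarchy described above.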