UBC Thunderbots 2009 Team Description Paper

Alim Jiwa, Amanda Li, Amir Bahador Moosavi zadeh, Howard Hu, George Stelle, Byron Knoll, Kevin Baillie, Jerome Pasion, Lawrence Wong, John Yuen

University of British Columbia
robocup@ece.ubc.ca
www.ece.ubc.ca/~robocup

Abstract. This paper gives an overview of UBC Thunderbots, a new team aiming to participate in the 2009 RoboCup Small Size League. The team has made significant progress on the electronic, mechanical, artificial intelligence, image recognition and wireless communication aspects of its robots.

1 Introduction

UBC Thunderbots is a team of undergraduate students at the University of British Columbia pursuing its first competitive entry, in the Small Size League at RoboCup 2009. The team was established in 2006 and has since made significant progress in developing its team of autonomous soccer-playing robots. This paper outlines the current state of the robot design, detailing the rationale, design and results of both the hardware and the software architecture.

2 Hardware Architecture

The hardware architecture of the robots consists of their mechanical and electronic design.

2.1 Mechanical Design

The mechanical design of the robot can be broken down into four main components: the chassis, the driving system, the kicking system and the dribbling system.

2.1.1 Chassis

The chassis holds all other components and also works as a heat sink for the H-bridges and the voltage converter. It consists of two aluminum plates, 10 cm apart, connected by four aluminum rods. The bottom plate holds the motors and the kicker mechanism, while the top plate holds the electronics; the batteries and the motors are placed between the two plates and are held by aluminum brackets. The robot has a maximum diameter of 172 mm and a height of 144 mm.

2.1.2 Driving System

The robots are four-wheel omni-directional robots. On all robots the two front wheels are separated by an angle of 110°, while the configuration of the two back wheels depends on the role of the robot. For attackers, the back wheels are separated by an angle of 60°, giving the robot a higher forward speed. For defenders, the back wheels are separated by an angle of 90°, giving the robot a higher lateral speed. The wheels are actuated by four Mabuchi brushed DC motors [1], each connected to a miniature optical encoder [2] that measures the actual wheel speed for feedback control. Figure 1 shows the SolidWorks model of the whole robot with its main components labeled.

Figure 1: Overall robot SolidWorks model (labeled components: robot identifiers, main board, top plate, Wiring board microcontroller, DC motors, omni-directional wheels, motor brackets, solenoid kicker bar, base plate)
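
As a concrete illustration of how these wheel angles enter the drive control, the sketch below maps a desired body velocity onto the four wheel rim speeds. It assumes tangentially mounted wheels, an x-forward/y-left body frame and an illustrative 80 mm wheel-mount radius; the paper specifies only the separation angles and the 172 mm overall diameter, so treat the numbers as placeholders.

    import numpy as np

    def wheel_speeds(vx, vy, omega, wheel_angles_deg, mount_radius=0.08):
        # Each omni wheel sits at angle phi from the forward axis with its
        # rolling direction tangential, so its rim speed is
        # -sin(phi)*vx + cos(phi)*vy + mount_radius*omega.
        phi = np.radians(wheel_angles_deg)
        coupling = np.column_stack((-np.sin(phi), np.cos(phi),
                                    np.full_like(phi, mount_radius)))
        return coupling @ np.array([vx, vy, omega])

    # Attacker: front wheels 110 deg apart (+-55 deg from forward),
    # back wheels 60 deg apart (180 deg -+ 30 deg).
    attacker = [55.0, -55.0, 150.0, -150.0]
    print(wheel_speeds(1.0, 0.0, 0.5, attacker))  # drive forward while turning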

2.1.3 Kicking System

The kicking system consists of a solenoid kicker designed for straight shooting and passing: a solenoid connected to a straight bar. Figure 2 shows the speed of the ball as a function of the combined mass of the kicker bar and plunger. Maximum ball speed is achieved when the kicker (the plunger plus the bar) weighs the same as the ball, i.e. 46 g.

Figure 2: Ball speed as a function of the combined mass of the plunger and kicker bar (excluding the solenoid)

The solenoid is a tubular push solenoid with a maximum stroke length of one inch and a maximum force of 340 oz. Using conservation of momentum, the maximum speed of the ball (when kicked at full power) is calculated to be 6.6 m/s, with the coefficient of restitution between the ball and the kicker estimated at 0.8. Experimental results also show that the maximum kicking speed of the robots is about 6 m/s.
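
The optimum above follows from conservation of momentum if one assumes the solenoid delivers a roughly fixed kinetic energy E to the plunger regardless of the plunger's mass (an assumption the paper leaves implicit). A kicker of mass m striking a stationary ball of mass M with coefficient of restitution e gives

    \[
    v_{\mathrm{ball}} = \frac{(1+e)\,m\,v_{\mathrm{kicker}}}{m+M},
    \qquad
    v_{\mathrm{kicker}} = \sqrt{2E/m}
    \;\Rightarrow\;
    v_{\mathrm{ball}} = \frac{(1+e)\sqrt{2Em}}{m+M}.
    \]

Setting \(dv_{\mathrm{ball}}/dm = 0\) gives \(m + M = 2m\), i.e. \(m = M\): the kicker should weigh the same as the 46 g ball, in agreement with Figure 2.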

2.1.4 Dribbling System

The dribbling system handles the ball during the game. The dribbler axis is parallel to the ground plane and located at a height of 30 mm. The dribbler consists of a sorbothane cylinder wrapped around an aluminum core and is driven by a Maxon flat DC motor.

2.2 Electronic Design

The electronics consist of three main parts: the wireless communication, the actuators and the kicking system drivers. All three parts sit on a single main board which is connected directly to the microcontroller board. The microcontroller [3] is an ATMEL ATmega128 chip pre-built on an electronics I/O board called the Wiring board. Figure 3 shows the schematics of the robot's main board.

Figure 3: Main board of the robot

The wireless communication on the main board consists of an XBee ZigBee OEM RF module [4], the microcontroller and an interface circuit [5] (AXIC) between the two that adjusts for their different operating voltages. This setup receives commands from a similar one connected to the central computer and passes them to the DC motor control circuits. The Mabuchi DC motors are controlled using two Pololu dual motor drivers [6]. Each motor driver can provide up to 10 A at 14 V through a pulse-width-modulated (PWM) signal. The control algorithm also relies on the feedback loop created by the optical encoders. Each encoder has a resolution of 360 counts per rotation and is connected to an up/down counter chip which records the wheel speed as measured by the encoder and provides it to the microcontroller when enabled. The kicker solenoid is also controlled with PWM and is capable of kicking at variable speeds. The solenoid is actuated using two MOSFETs and one capacitor.
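
The paper does not describe the firmware's control law, but the kind of feedback loop this hardware supports (360-count encoders feeding an up/down counter, PWM drive) can be sketched as follows. The gains and sample period are purely illustrative.

    COUNTS_PER_REV = 360.0   # encoder resolution quoted above
    DT = 0.01                # s, assumed control period (illustrative)

    def wheel_speed(count_delta):
        # Wheel speed in revolutions per second, from the counts
        # accumulated by the up/down counter over one control period.
        return count_delta / COUNTS_PER_REV / DT

    def pi_step(target_rps, measured_rps, state, kp=0.05, ki=0.5):
        # One step of a PI speed controller; returns a PWM duty cycle
        # in [-1, 1], with the sign selecting the H-bridge direction.
        error = target_rps - measured_rps
        state["integral"] += error * DT
        duty = kp * error + ki * state["integral"]
        return max(-1.0, min(1.0, duty))

    state = {"integral": 0.0}
    duty = pi_step(5.0, wheel_speed(12), state)  # 12 counts in 10 ms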

3 Software Design

3.1 Artificial Intelligence

The Artificial Intelligence (AI) module of the system receives inputs from the Image Recognition module and the referee box. It outputs wireless signals to the robots using a transmitter.

Figure 4: IR and AI system interface

The AI module is divided into five layers, as shown in Figure 4. Starting at the Central Analyzing Unit, data from the Image Recognition module is received and analyzed, then used by the remaining layers. The design forms a hierarchy from high-level strategy types (assigned in the Decision Unit) to lower-level robot behaviors (assigned in the Central Strategy Unit) to the lowest-level commands (assigned in the Local Strategy Unit).

The Decision Unit receives inputs from the referee box and sets the global strategy type. Global strategy types include: start of play, penalty kick, free kick, goal kick, throw-in, corner kick and ball in play. The global strategy type is then sent to the Central Strategy Unit, which determines the behavior assigned to each robot. Behaviors include stop, pass the ball to another player, receive a pass from another player, shoot the ball, chase the ball, go to a location on the field, be the goalie and be a defender. Based on the assigned behavior, the Local Strategy Unit then sends lower-level commands to the Robot Controller. Low-level commands include the direction to move, the direction to face, whether or not to dribble the ball and whether or not to kick the ball. The final layer is the Robot Controller, which translates low-level commands into wireless signals and transmits them.
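
A minimal sketch of this layered dispatch, using the strategy types and behaviors listed above; the function names, the referee-command strings and the assignment rule are our own illustrations, not the team's actual code.

    from enum import Enum, auto

    class GlobalStrategy(Enum):
        START_OF_PLAY = auto()
        PENALTY_KICK = auto()
        FREE_KICK = auto()
        GOAL_KICK = auto()
        THROW_IN = auto()
        CORNER_KICK = auto()
        BALL_IN_PLAY = auto()

    def decision_unit(referee_command):
        # Decision Unit: a referee-box command selects the global strategy.
        table = {"kickoff": GlobalStrategy.START_OF_PLAY,
                 "penalty": GlobalStrategy.PENALTY_KICK,
                 "free_kick": GlobalStrategy.FREE_KICK}
        return table.get(referee_command, GlobalStrategy.BALL_IN_PLAY)

    def central_strategy_unit(strategy, robot_ids):
        # Central Strategy Unit: assign one behavior per robot.
        if strategy is GlobalStrategy.BALL_IN_PLAY:
            return {rid: "chase ball" if rid == 0 else "be a defender"
                    for rid in robot_ids}
        return {rid: "stop" for rid in robot_ids}

    print(central_strategy_unit(decision_unit("play_on"), range(5)))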

There are also hierarchies within layers of the AI module. Within the Local Strategy Unit, some behaviors are lower level than others, which allows a behavior to be implemented as a combination of lower-level behaviors. For example, the shoot-ball behavior can use the chase-ball behavior when the robot does not have control of the ball, and the chase-ball behavior can in turn use the move behavior to get closer to the ball (see the sketch at the end of this subsection).

Figure 5: Thunderbots AI Simulator (Visualizer)

The AI module also includes a Simulator and a Visualizer. The Simulator is used to test robot strategies and behaviors without input from Image Recognition or wireless output to physical robots; it takes the place of all input and output to the AI module (Image Recognition, referee box and wireless signals). To update the positions of entities on the field, it makes predictions based on velocity and acceleration, and the Robot Controller updates the acceleration and properties of robots in the Simulator instead of sending out wireless signals. The Visualizer is a graphical interface that displays the positions of the ball and robots on the field; it can be used in both simulations and actual games to observe the current state of the AI module.
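
Returning to the behavior fallback described above, each behavior can be expressed as delegating to its lower-level parent. A minimal sketch; the class and field names are illustrative, not the team's actual code.

    class Move:
        def command(self, robot, target):
            return {"move_to": target, "face": target}

    class ChaseBall(Move):
        def command(self, robot, world):
            # Chasing is just moving toward the ball's current position.
            return super().command(robot, world["ball"])

    class ShootBall(ChaseBall):
        def command(self, robot, world):
            if not robot["has_ball"]:
                # No ball control yet: fall back to the chase behavior.
                return super().command(robot, world)
            return {"face": world["their_goal"], "kick": True}

    world = {"ball": (1.0, 2.0), "their_goal": (4.5, 0.0)}
    print(ShootBall().command({"has_ball": False}, world))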

3.2 Image Processing

Our image processing algorithm is broken down into two parts: object detection and object tracking. The reason for this split may not be obvious for one-of-a-kind objects on the field, such as the ball, whose tracking reduces to frame-by-frame detection. The situation is different for the robot players: there are five of a kind, and their identities need to be maintained over time so as not to confuse the strategic AI module. After considering various approaches to the problem, two were selected for further research.

3.2.1 Color Filtering

The first image recognition approach is based on color filtering. Given a pixel in the image, the algorithm extracts its three color parameters: hue, saturation and greyscale intensity. Hue is the position of a color in the color spectrum, saturation is a measure of how pronounced that color is with respect to black and white, and greyscale intensity is the darkness of the color. The algorithm creates a range for each parameter, yielding minimum and maximum values. The image is then traversed and a new binary image is created in which every pixel whose color falls within these ranges is highlighted. For example, a filter created from a pixel sampled from a blue marker would highlight all of the blue in the image which, if the range sizes are selected properly, should be all the other blue markers and nothing else. This takes care of detecting same-colored markers and the ball.

Our solution to the tracking problem relies on the assumption that the center of a robot cannot move outside of the robot's circumference over the duration of one frame update. Despite the robots' high degree of mobility, this is a fair assumption because the camera refresh rate is as high as 60 frames per second. Under this assumption, the algorithm creates a search area for each robot within its previous circumference and searches the filtered image for the robot's center marker inside its respective search area.
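
A minimal OpenCV sketch of the color filter just described, using OpenCV's HSV space as a stand-in for the hue/saturation/greyscale-intensity triple; the sampled color and the range widths are illustrative, and hue wrap-around near red would need extra handling.

    import cv2
    import numpy as np

    def colour_mask(frame_bgr, sample_hsv, ranges=(10, 60, 60)):
        # Build min/max bounds around the sampled marker color and
        # return a binary image highlighting the pixels inside them.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lo = np.clip(np.array(sample_hsv, int) - ranges, 0, 255).astype(np.uint8)
        hi = np.clip(np.array(sample_hsv, int) + ranges, 0, 255).astype(np.uint8)
        return cv2.inRange(hsv, lo, hi)

    frame = cv2.imread("field.png")             # hypothetical test image
    mask = colour_mask(frame, (110, 200, 180))  # pixel sampled from a blue marker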

Figure 6 below shows the program outputs during the tracking stage. At the top left is a simulator view of the robots and the ball with red boxes drawn around them, indicating successful identification. At the top center is the output of the binary yellow filter, which aligns with the positions of the yellow markers in the original image. At the top right, the hue, saturation and greyscale intensity ranges for the yellow color are displayed. At the bottom, ball and robot coordinates are printed.

Figure 6: IR tracking outputs

This technique proved to work well with videos generated by the simulator, but its performance on a game video of the 2006 RoboCup SSL champion CMDragons [7] was rather poor due to high fluctuations across the saturation scale. This is probably a video encoding problem that would not be encountered with raw streaming video, so more conclusive testing needs to be done with the actual game robots.

As far as tracking is concerned, the algorithm operated with a high degree of success, falling short in just one crucial aspect: running time. Having to perform computationally intensive color filtering on every frame, it could only process 8 frames per second, which is not fast enough for the fast-paced game style of the Small Size League. There are, however, a number of optimizations that may reduce the running time, including using the high-efficiency pixel access provided by OpenCV, creating filters only for high-interest regions of the image instead of the whole image, and removing the large overhead used for debugging and testing.
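
The high-interest-region optimization follows directly from the tracking assumption in section 3.2.1: the detector only needs to examine a window around each robot's previous center. A minimal sketch; the window size in pixels is illustrative.

    import numpy as np

    def search_window(frame, last_cx, last_cy, radius_px=90):
        # Crop roughly one robot circumference around the previous
        # center; the color filter then runs on this patch only.
        h, w = frame.shape[:2]
        x0, y0 = max(0, last_cx - radius_px), max(0, last_cy - radius_px)
        x1, y1 = min(w, last_cx + radius_px), min(h, last_cy + radius_px)
        return frame[y0:y1, x0:x1], (x0, y0)

    frame = np.zeros((480, 640, 3), np.uint8)   # placeholder image
    patch, offset = search_window(frame, 320, 240)
    # Marker coordinates found in `patch` map back to the full frame via `offset`.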

3.2.2 Lucas-Kanade Optical Flow Estimation

The other algorithm we experimented with is based on Lucas-Kanade optical flow estimation. Optical flow algorithms estimate deformations, or changes, between two image frames based on the assumption that pixel intensity does not change much [8]. One extra feature implemented to improve the performance of this algorithm in the RoboCup setting is background subtraction: before any non-stationary objects are placed on the field, a picture of the field is taken and stored, to be later subtracted from every new frame. The subtraction acts as a filter of sorts, highlighting all the objects that were not present in the original image; conveniently, these are the very objects that need to be detected and tracked.

Lucas-Kanade optical flow is more time-efficient than color filtering, yielding 12 frames per second. This is still short of the 60 frames per second provided by the camera, but a definite improvement over the color filtering method. However, optical flow has a disadvantage as well: because it relies on pixel intensity alone, it is prone to false positives. For example, when it is used to track the ball and the ball crosses a white line on the field, the tracker stops at the line, losing track of the actual ball. Background subtraction accounts for cases like this: because the white line on the field is present in the original image, it is subtracted away and is not visible in the filtered image, allowing the optical flow tracker to succeed.

Figure 7: Ball detection using Lucas-Kanade optical flow estimation

In the figures above, the optical flow tracking point is meant to track the ball, which is moving upwards away from the robots. In the left image, the tracking point gets stuck on the white line while the ball continues onwards. In the right image, the tracking point passes the white line while keeping track of the ball, as background subtraction hides the line.
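
A sketch of this combination using OpenCV's pyramidal Lucas-Kanade tracker; the subtraction threshold and window size are illustrative, and the paper does not say which implementation the team used.

    import cv2
    import numpy as np

    def track(prev_gray, curr_gray, background_gray, points):
        # Background subtraction: hide everything already present in
        # the empty-field image, including the white field lines.
        fg = cv2.absdiff(curr_gray, background_gray)
        _, fg_mask = cv2.threshold(fg, 25, 255, cv2.THRESH_BINARY)
        masked = cv2.bitwise_and(curr_gray, curr_gray, mask=fg_mask)
        # Lucas-Kanade optical flow from the previous frame's points
        # (for simplicity the previous frame is used unmasked).
        new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, masked, points, None, winSize=(21, 21))
        return new_pts[status.ravel() == 1]  # keep successfully tracked points

    # `points` is an Nx1x2 float32 array, e.g. the ball's last position:
    points = np.array([[[320.0, 240.0]]], np.float32)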

It is clear that both optical flow and color filtering have their shortcomings, and they could be improved upon or used interchangeably to achieve the best results. Ranges for color filtering have to be selected very carefully; otherwise either false positives or false negatives are bound to occur. Implementing background subtraction for color filtering may also significantly improve detection quality and speed. Information about object velocities from previous frames can be used to improve the accuracy of ball and robot location prediction, further narrowing the search area and thereby increasing detection speed. In short, color and field dynamics should be used together to yield reliable detection and tracking in a timely fashion; working out the exact way to make the best use of this information is subject to testing in a real-world setting.

3.3 Wireless Communication

The wireless communication can be described by its hardware, system and software elements. The hardware element has been detailed in section 2.2 on the onboard electronics.

The system layout of the wireless communication is shown in Figure 8.

Figure 8: Wireless system

The wireless software receives information from the AI and sends a transmit request to the coordinating XBee module. The data, containing each robot's velocity and kicking signal information, is then sent to the appropriate target XBee module as specified by the AI. The target XBee module informs its host microcontroller that information has arrived at its DI port and sends the received data to the microcontroller in the form of an API data frame; the microcontroller parses the important information out of the data frame and, based on the message received, controls the DC motors accordingly.
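
A sketch of both ends of this exchange using the Digi 802.15.4 API framing (start delimiter 0x7E, 16-bit length, frame data, checksum). The four-byte command payload is purely hypothetical, as the paper does not specify the packet format, and Python is used for brevity even though the receive side runs on the ATmega128.

    def build_tx_frame(dest_addr, payload, frame_id=1):
        # TX Request, 16-bit address (API ID 0x01), as sent by the
        # coordinating XBee on the central computer.
        data = bytes([0x01, frame_id,
                      (dest_addr >> 8) & 0xFF, dest_addr & 0xFF,
                      0x00]) + payload           # options 0x00: request ACK
        checksum = 0xFF - (sum(data) & 0xFF)
        return (bytes([0x7E, len(data) >> 8, len(data) & 0xFF])
                + data + bytes([checksum]))

    def parse_rx_frame(frame):
        # RX Packet, 16-bit address (API ID 0x81), as delivered to the
        # robot's microcontroller by the target XBee.
        assert frame[0] == 0x7E, "missing start delimiter"
        length = (frame[1] << 8) | frame[2]
        data = frame[3:3 + length]
        assert (sum(data) + frame[3 + length]) & 0xFF == 0xFF, "bad checksum"
        assert data[0] == 0x81, "unexpected API identifier"
        source = (data[1] << 8) | data[2]        # data[3]=RSSI, data[4]=options
        return source, data[5:]                  # sender address and payload

    # Hypothetical command: vx, vy, spin (signed bytes) and kick power.
    cmd = bytes([0x40, 0x00, 0x10, 0xFF])
    print(build_tx_frame(0x0003, cmd).hex())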

4 Future Plans

By June 1st, 2009 we expect to have a functioning robot team capable of competing in the RoboCup international competition. With all major subsystems complete or nearing the end of construction, the majority of the work remaining before the competition lies in integrating the parts and testing. In the meantime, preparations for the competition have begun on the assumption that we qualify, so that should we be accepted, we are ready and able to attend. Moreover, we are planning to attend the RoboCup US Open 2009 in preparation for this year's international competition in Austria.

References

[1] Mabuchi Motor. 2008. <http://www.mabuchi-motor.co.jp>
[2] US Digital. 2008. <http://www.usdigital.com/products/encoders/incremental/rotary/kit/e4p/>
[3] Wiring. 2008. <http://www.wiring.org.co>
[4] XBee. 2008. <http://www.digi.com/products/wireless/point-multipoint/xbee-series1-module.jsp>
[5] Igoe, Tom. Arduino XBee Interface Circuit. 2008. <http://mrtof.danslchamp.org/axic>
[6] Pololu. 2008. <http://www.pololu.com/catalog/product/707>
[7] CMDragons. Championship Game. 2006. <http://www.cs.cmu.edu/~coral-downloads/small/movies/index.html>
[8] El Gamal, Abbas and SukHwan Lim. Optical Flow Estimation Using High Frame Rate Sequences. 2008. <http://isl.stanford.edu/~abbas/group/papers_and_pub/icip2001.pdf>