1ms Column Parallel Vision System and Its Application to High-Speed Target Tracking


Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, April 2000

1ms Column Parallel Vision System and Its Application to High-Speed Target Tracking

Y. Nakabo, M. Ishikawa
Department of Mathematical Engineering and Information Physics, University of Tokyo
7-3-1, Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
nakabo@k2.t.u-tokyo.ac.jp

H. Toyoda and S. Mizuno
Central Research Laboratory, Hamamatsu Photonics K.K.
5000, Hirakuchi, Hamakita City, Shizuoka 434-8601, Japan

Abstract

Robot control using real-time visual feedback (visual servoing) has recently improved. Conventional vision systems are too slow for these applications because CCD cameras are restricted to the video frame rate (NTSC 30 Hz, PAL 25 Hz). To solve this problem, we have developed a 1 ms vision system that provides a far faster frame rate than conventional systems. Our 1 ms vision system has a 128×128 PD array and an all-parallel processor array connected to each other in a column-parallel architecture, so that the bottleneck of image transfer is eliminated. This system realizes 1 ms visual feedback, in which the image feature value is extracted within a 1 ms cycle time for visual servoing. We have also developed a high-speed Active Vision System (AVS-II), which moves the gaze of the vision system at high speed. In this paper, we discuss our 1 ms vision system in detail and evaluate its performance through experiments.

1 Introduction

It is effective to use visual feedback information in systems such as robot controllers or autonomous land vehicles. In recent years, studies that realize visual feedback with a real-time closed loop have become popular; this technique is called visual servoing. However, most conventional vision systems use CCD cameras for image sensing, and the transmission speed of an image in these systems is limited to the video rate (NTSC 30 Hz, PAL 25 Hz).
Therefore, the operation speed of the system has been limited to the video rate even when fast image processing is carried out. This limitation causes a serious problem when realizing visual servo control, because it is generally accepted that a servo rate of around 1 kHz is needed for robot control. Essentially, the sampling rate of conventional vision systems is too slow compared to the robot dynamics. The origin of this problem lies in the bottleneck of image transmission: a high frame rate is difficult to realize when the image information, which consists of many pixel values, must be transmitted through a small number of lines.

On the other hand, Ishikawa et al. have developed a vision chip and proposed its S3PE (Simple and Smart Sensory Processing Elements) architecture, in which all photo-detectors (PDs) are directly connected to parallel processing elements (PEs) [1, 2]. These PEs are integrated into one chip, so the bottleneck of image transmission does not occur, and high-speed vision systems with a far faster frame rate than conventional systems become possible. However, in this approach a large number of sensors and processing elements must be integrated into one chip, so the processing-element circuit must be compactly designed. This requirement conflicts with the need to carry out various kinds of image processing entirely on the PEs. It is reported that 64×64 pixels is the realistic limit for satisfying these competing requirements with present semiconductor technology [1].

0-7803-5886-4/00/$10.00 © 2000 IEEE
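The transmission bottleneck can be made concrete with a toy cycle count (illustrative arithmetic, not figures from the paper): reading a 128×128 frame one pixel at a time through a single line takes 128 times as many transfer cycles as moving all 128 column values of a row at once.

```python
# Illustrative cycle count for the image-transfer bottleneck (not from the
# paper): serial readout moves one pixel per transfer cycle, while a
# column-parallel readout moves an entire row of 128 column values per cycle.

WIDTH = HEIGHT = 128                     # PD array resolution discussed here

serial_cycles = WIDTH * HEIGHT           # one pixel per transfer cycle
column_parallel_cycles = HEIGHT          # one whole row per transfer cycle

print(serial_cycles)                                 # 16384
print(column_parallel_cycles)                        # 128
print(serial_cycles // column_parallel_cycles)       # 128x fewer cycles
```

This factor of 128 is what lets the same sensor technology escape the video-rate limit without integrating the processors onto the sensor die.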

Figure 1: Comparison with conventional image systems (cycle time vs. resolution: vision chips with S3PE architecture at up to 64×64 pixels, the CPV system with column-parallel PE array boards at 128×128 pixels and 1 ms cycle time, and CCD-camera-based conventional systems bound to the 33 ms video frame rate)

Figure 2: Overview of the 1 ms vision system (PD array module with 128×128 pixels and 128 ADCs, column-parallel PE array boards, CPV controller board, Active Vision System AVS-II with pan/tilt motors, and host DSP network of TMS320C40s)

2 Column Parallel Approach

In contrast to these approaches, the system we developed uses column-type parallel image transfer and is called the Column Parallel Vision (CPV) system. In this system, sensors and processors are not integrated into one chip but are implemented on separate chips and boards. Images are transmitted in parallel along the columns while scanning along the rows. With this method, the communication bottleneck is still avoided, while the restriction on resolution is removed. Figure 1 compares the approaches discussed above.

For image processing, the CPV system uses the same S3PE architecture as the one-chip integrated vision chip. It has all-parallel processing elements for every pixel, so massively parallel, high-speed image processing can be realized. For digitizing the captured image, the system has an 8-bit analog-to-digital converter array arranged in column parallel. As a result, by combining row scanning of the data with all-parallel processing, a high resolution of 128×128 pixels, as required in robot control applications, becomes realizable.

In our previous study, the SPE-256 system was developed as a pre-step and scale-up model of the vision chip.
Using this system, high-speed target tracking with an active vision system was developed and 1 ms visual feedback was realized. The system has also been applied to various kinds of robot control that are much more dynamic than conventional visual servo systems [3, 4, 5]. Furthermore, we proposed a novel approach to high-speed image processing and developed various algorithms utilizing the high frame rate of the vision system [6, 7]. These papers show that a high-speed vision system such as the 1 ms vision system is effective for robot control. However, the resolution of the SPE-256 system was limited to 16×16-pixel binary images, which was remarkably low due to integration problems in the implementation and did not provide adequate resolution to show the effectiveness of the high-speed vision system. Our new CPV system has 64 times higher resolution than the previous version, improved 8-bit gray-level imaging, and provides all of this within a 1 ms cycle time. There was also a problem in the previous active vision system (AVS-I): the response time of the actuator was not sufficient for high-speed visual feedback. The new AVS-II is designed by matching the vision system and the actuator in terms of compactness and responsiveness.

In the following sections, the 1 ms vision system, including the CPV system and the AVS-II, is described in detail, and then some experimental results are shown.

3 1ms Vision System

The 1 ms vision system consists of the CPV system for image processing, the active vision system (AVS-II) for gaze control, and a host DSP network for

Figure 3: Column Parallel Vision (CPV) system (128×128-pixel image input array; 128×128 PE array receiving column-parallel image data through shift registers; controller issuing instructions and performing column-parallel feature extraction to the DSP network)

motion control. An overview of the whole system is shown in Fig. 2.

3.1 Column Parallel Vision (CPV) System with 128×128 Pixels

In the CPV system, various image processing algorithms are applied after image acquisition, followed by extraction of the image feature value, which is output to the DSP (Digital Signal Processor) element. This information is used for the visual feedback control. The CPV system contains three modules: a PD array, a PE array, and a controller. The block diagram of the CPV system is shown in Fig. 3. The following paragraphs describe each module in detail.

3.1.1 PD Array

The PD array consists of photo-detectors and an AD converter array integrated into one chip. The resolution of the PD array is 128×128 pixels. The AD converter array has 128 column-parallel lines with 8-bit gray-scale resolution. The output image data from a selected row in the PD array is transmitted to the PE array over the 128 column-parallel lines. As a result, all the pixel data of each frame can be read in around 1 ms.

3.1.2 PE Array

The image processing part consists of a 128×128 parallel PE array that realizes totally parallel processing for each pixel of the image. The inner structure of the PEs is based on the S3PE architecture [2], which adopts SIMD program control, 4-neighbor connection, a bit-serial ALU, 24 bits of local memory, and memory-mapped I/O. All of these satisfy the needs of compactness and functionality. The controller sends the instructions to the PEs.
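The frame timing in Sec. 3.1.1 implies a simple per-row budget (illustrative arithmetic only, not a figure from the paper): if a whole frame must be read in roughly 1 ms and rows are read one at a time through the 128 column-parallel ADC lines, each row slot is on the order of 8 µs for conversion plus transfer.

```python
# Back-of-the-envelope row budget implied by the ~1 ms frame readout:
# 128 rows read sequentially, each row converted/transferred in parallel
# across the 128 column lines.

ROWS = 128
FRAME_TIME_US = 1000.0          # ~1 ms per frame, from the text

row_slot_us = FRAME_TIME_US / ROWS
print(round(row_slot_us, 1))    # ~7.8 us available per row
```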
A full description of the controller is given in the next section. The pixel data from the PDs are transferred to the corresponding PEs by shift registers implemented in the CPV system. Through this interface, the sensor data are forwarded one by one from the selected column of the PD array and are taken into the PEs of the corresponding column. For the implementation of this massively parallel PE array, 128 FPGA (Field Programmable Gate Array) chips (Xilinx XC4044XL) are used. Each chip holds 16×8 PEs, and all the chips are mounted on eight circuit boards. One instruction of the system is currently executed within 330 ns. Figure 4 shows the architecture of S3PE with column-parallel data transfer used in the PE array.

Figure 4: S3PE architecture with column-parallel data transfer

3.1.3 Controller

In general, when constructing a parallel processing system, the controller plays an important role: it regulates the operation of the whole system and provides an interface with outer systems without creating a bottleneck. The architecture of the controller is shown in Fig. 5. Corresponding to the purpose of the CPV system given at the beginning of this section, the functions realized in this controller are as follows.

Extracting and calculating image feature values: extracting and calculating the image feature value resulting from processing in the PE array. A dedicated circuit makes it possible to summarize all outputs of the PEs without delay.

Data input/output interface with the PE array: carrying out 128-line parallel data input/output with the PE array.

Figure 5: Architecture of the controller (instruction and data buses, 1D buffer, dual-port 32×64k main memory shared with the host DSP network)

Figure 6: Block diagram of AVS-II control (the target position extracted by the CPV system drives the pan/tilt motor controller to follow the target motion)

SIMD control of the S3PE: sending instructions to the PE array and regulating the SIMD control.

System control by the user program: all operations of the system are controlled by a user program downloaded to the main memory.

Interface with the outer system: carrying out data input/output with the host DSP system without a bottleneck, using a shared-memory method.

Synchronization of the total system: managing the synchronization of all modules in the CPV system.

The controller is also implemented in FPGAs, using two chips (Xilinx XC4044XL) mounted on one board. As a result, the CPV system satisfies the goal of a 1 ms cycle time, because the series of processing steps and the flow of information for visual feedback control are implemented without a bottleneck.

3.2 Active Vision System (AVS)-II

The AVS-II is the actuator part of the 1 ms vision system that moves the gaze at high speed. The PD array is mounted on a mobile platform with two degrees of freedom, pan and tilt; each axis is driven by a Σ-II AC servo motor (100 W) made by Yaskawa Electric Corporation. The motors are direct-drive, without any gears, which removes the disturbance caused by friction. High power and compact size are realized together with a fast response time compared to the previous version of our active vision system [3, 6].
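As a purely illustrative sketch of the kind of high-rate loop the AVS-II closes (the proportional gain and the integrator plant below are invented for illustration; the real system uses torque control of the direct-drive servo motors, as described in Sec. 3.2), a controller sampled every 1 ms drives the gaze angle toward the target:

```python
# Hypothetical 1 ms visual-feedback loop (invented gain and plant, not the
# authors' controller). Each cycle the vision system would supply the angular
# error; here a proportional velocity command is integrated over one cycle.

KP = 50.0        # assumed proportional gain [1/s]
DT = 1e-3        # 1 ms visual feedback cycle

gaze, target = 0.0, 1.0      # angles in arbitrary units
for _ in range(200):         # 200 ms of simulated tracking
    error = target - gaze
    gaze += KP * error * DT  # velocity command integrated over one cycle
print(round(gaze, 3))        # 1.0 (the gaze has converged to the target)
```

The point of the sketch is the sampling rate: with a 1 ms cycle, even a modest gain closes the error in a few tens of milliseconds, which a 33 ms video-rate loop cannot match.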
Figure 7: CPV system with AVS-II

The motion control is carried out in the host DSP network, which uses parallel processing DSPs (TMS320C40 by TI). Using the DSP network enables a 1 ms cycle time without bottlenecks by distributing the load of processing and I/O [5]. The block diagram of the AVS-II control is shown in Fig. 6. A real-time visual servo controller based on torque control is constructed. A photograph of the CPV system with the AVS-II is shown in Fig. 7.

4 Experimental results

4.1 High Speed Image Processing

First, several kinds of commonly used image processing were carried out to confirm the performance of the CPV system. In the experiments, 8-bit digitized images are continuously obtained at the PD array, and image processing is applied at every frame cycle by parallel

processing on the PE array. To make it easy to verify the results of the image processing, the frame rate was temporarily reduced to around 50 fps to avoid the effects of sensor noise. The image processing executed in the experiments comprises 2-neighbor blur, 7 iterations of 4-neighbor blur, 2-neighbor embossing, and 2-neighbor edge detection. The instruction steps and processing times are shown in Table 1, and the dumped images of each result are shown in Fig. 8. The results of each image processing operation are obtained correctly. It should also be emphasized that the number of steps executed for the image processing is remarkably small owing to the all-parallel processing in the PE array.

Figure 8: Image processing carried out in the CPV system (8-bit input raw image; 7-times 4-neighbor blurred, embossed, and 2-neighbor edged images)

Table 1: Image processing time in the CPV system

    processing contents    steps    time
    8-bit data input         8       26.6 µs
    2-neighbor blur         58       19.3 µs
    7× 4-neighbor blur     406      135.3 µs
    2-neighbor emboss       59       19.6 µs
    2-neighbor edge         59       19.6 µs

4.2 Tracking Irregular Motion of a Target

The next experiments were carried out using the active vision actuator to verify the performance of the system as a visual servo system, applying it to target tracking. Because the speed of the system is what matters in this experiment, realizing the 1 ms cycle time in visual feedback is set as the goal. In practice, 3-bit images are used and a 1 kHz frame rate is realized, which is adequate for the target tracking application. As the image processing, filtering, thresholding, self windowing [6] for target recognition, and extraction of the centroid of the target pattern are executed on every frame.

Table 2: Processing time for target tracking

    processing contents    steps    time
    3-bit data input         3       1.0 µs
    filtering               41      13.6 µs
    self windowing           6       2.0 µs
    extracting centroid    126      42.0 µs
    total                  176      58.6 µs

Figure 9: Photo strip of target tracking (t = 0, 66, 132, 198, 264, 330 ms)
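The processing times in Tables 1 and 2 line up closely with the 330 ns instruction cycle quoted in Sec. 3.1.2: because all 128×128 PEs execute in lockstep, each entry is roughly steps × 330 ns, independent of image resolution. A quick sanity check (illustrative arithmetic only; the tabulated times are slightly larger, consistent with 330 ns being quoted as an upper bound):

```python
# Sanity check of the table arithmetic: one SIMD instruction per ~330 ns,
# all PEs in lockstep, so processing time scales with instruction count only.

CYCLE_NS = 330

def processing_time_us(steps):
    """Time for a given instruction count, in microseconds."""
    return steps * CYCLE_NS / 1000.0

print(round(processing_time_us(58), 1))    # 2-neighbor blur    -> ~19.1 us
print(round(processing_time_us(406), 1))   # 7x 4-neighbor blur -> ~134.0 us
print(round(processing_time_us(126), 1))   # centroid extraction -> ~41.6 us
```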
The instruction steps and processing times are shown in Table 2. Figure 9 shows the result of the target tracking: the active vision system keeps tracking the white ball as the target, as shown in the photos. The experiment proved that the system tracks reliably even when the target moves fast and irregularly.
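The centroid feature used for tracking can be written as the ratio of first to zeroth image moments. On the hardware, the PE array and the controller's summation circuit compute these moments in parallel; the sketch below (an assumed serial formulation, not the authors' implementation) computes the same quantity over a binary target mask:

```python
# Minimal sketch (assumed formulation, not the authors' code) of centroid
# extraction from a binary target mask: the (x, y) centroid is the ratio of
# the first image moments (m10, m01) to the zeroth moment m00.

def centroid(mask):
    """Return the (x, y) centroid of nonzero pixels in a 2D 0/1 grid."""
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v
            m10 += v * x
            m01 += v * y
    if m00 == 0:
        return None            # no target pixels found
    return (m10 / m00, m01 / m00)

# A 2x2 target block whose centre lies at (1.5, 2.5):
mask = [[0] * 4 for _ in range(4)]
for y in (2, 3):
    for x in (1, 2):
        mask[y][x] = 1
print(centroid(mask))  # (1.5, 2.5)
```

On the CPV system the per-pixel products are formed in the PEs and summed by the controller's dedicated circuit, which is why the whole extraction fits in 126 instructions (Table 2).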

Figure 10: Frequency response of target tracking (gain [dB] and phase for the pan and tilt axes over 2-50 Hz, showing a cutoff frequency above 10 Hz)

4.3 Frequency Response of Target Tracking

The frequency response of each axis of the system was measured during target tracking. In this experiment, the target is fixed to the base of the system, and the goal position given in the image plane is a sinusoid of varying frequency. From the trajectory and the goal position of the tracking motion, the gain and the phase shift are calculated. For the control law of the motion, a phase-lead compensation method is used. The result is shown in Fig. 10. It shows that the actuators of both axes, tilt and pan, can follow violent changes of the goal position, with a cutoff frequency of more than 10 Hz. From this result, it is clear that our visual servo system provides a very high response characteristic, much higher than that of conventional systems.

5 Conclusion

In this paper, several vision system approaches were discussed: the conventional vision systems using CCD cameras restricted by the video frame rate, then the one-chip vision chip approach and its scale-up model that have already been developed. Compared with these approaches, the CPV system, combining column-parallel image data transfer and an all-parallel processing architecture based on the S3PE architecture, has been realized, and provides the higher resolution with 1 ms cycle time required for the applications of visual servoing. From the details of the system described in the earlier part of the paper, it is possible to say that the system
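The gain and phase values plotted in Fig. 10 can be recovered from recorded trajectories by demodulating each signal at the excitation frequency. The sketch below (an assumed analysis method, not the authors' code) recovers a known gain and phase lag from synthetic 1 kHz samples:

```python
# Illustrative gain/phase estimation by demodulation at the excitation
# frequency (assumed method, not from the paper). Over an integer number of
# periods, the in-phase and quadrature sums give amplitude and phase exactly.

import math

def demodulate(signal, f_hz, dt_s):
    """Return (amplitude, phase) of the f_hz component of a sampled signal."""
    w = 2 * math.pi * f_hz
    n = len(signal)
    i = sum(s * math.sin(w * k * dt_s) for k, s in enumerate(signal)) * 2 / n
    q = sum(s * math.cos(w * k * dt_s) for k, s in enumerate(signal)) * 2 / n
    return math.hypot(i, q), math.atan2(q, i)

# Synthetic 1 kHz samples: a tracker lagging a 10 Hz goal by 30 degrees at
# half the goal amplitude (i.e. gain -6 dB).
dt, f = 1e-3, 10.0
t = [k * dt for k in range(1000)]                     # 1 s = 10 full periods
goal = [math.sin(2 * math.pi * f * tk) for tk in t]
traj = [0.5 * math.sin(2 * math.pi * f * tk - math.pi / 6) for tk in t]

a_goal, _ = demodulate(goal, f, dt)
a_traj, ph = demodulate(traj, f, dt)
gain_db = 20 * math.log10(a_traj / a_goal)
print(round(gain_db, 1), round(math.degrees(ph), 1))  # -6.0 -30.0
```

Sweeping the excitation frequency and repeating this fit at each point yields the gain and phase curves of a Bode-style plot like Fig. 10.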