EECS150 - Digital Design Lecture 12 Project Description, Part 2


EECS150 - Digital Design
Lecture 12: Project Description, Part 2
February 27, 2003
John Wawrzynek / Sandro Pintz
Spring 2003 EECS150 lec12-proj2 Page 1

VidFX Video Effects Processor
Everyone (working in groups of 2) will design, implement, debug, and demo a VidFX system on the Calinx board. The video stream is decoded, processed, and redisplayed. Commands are sent by you from a networked Linux command server: the receiver gets a command from the server, decodes it, and dynamically changes the video processing. Your particular board is addressed by your private MAC address.

Calinx Board
  Video & audio ports: AC97 codec & power amp; video encoder & decoder
  Flash card & micro-drive port
  Four 100 Mb Ethernet ports (quad Ethernet transceiver)
  8 Meg x 32 SDRAM
  Prototype area
  Seven-segment LED displays
  Xilinx Virtex 2000E

Outline
1. Rough calculations / feasibility
2. Network side: LANs, network stacks, Ethernet, network architecture, VidFX packet format, Calinx network interface
3. Video side: digital video basics, example video standards, Calinx video interface (encoder/decoder), video effects processing
4. Frame-buffer design
5. VidFX high-level organization
6. Review of schedule and design checkpoints

Digital Video Basics
Pixel Array: A digital image is represented by a matrix of values, where each value is a function of the information surrounding the corresponding point in the image. A single element in an image matrix is a picture element, or pixel. A pixel includes info for all color components. The array size varies for different applications and costs. Some common sizes:
  SIF: 352 x 240 (82 Kpx)
  Video: 640 x 480 (300 Kpx)
  PC/Mac: 800 x 600 (1/2 Mpx)
  Workstation: 1152 x 900 (1 Mpx)
  High-Definition Television (HDTV): 1280 x 720 (1 Mpx)
  High-Definition Television (HDTV): 1920 x 1080 (2 Mpx)
Frames: The illusion of motion is created by successively flashing still pictures called frames.

Refresh Rates & Scanning
The human perceptual system can be fooled into seeing continuous motion by flashing frames at a rate of around 20 frames/sec or higher; much lower and the movement looks jerky and flickers. TV in the US uses 30 frames/second (originally derived from the 60 Hz line-current frequency).
Images are generated on the screen of the display device by drawing or scanning each line of the image one after another, usually from top to bottom. Early display devices (CRTs) required time to get from the end of a scan line to the beginning of the next, so each line of video consists of an active video portion and a horizontal blanking interval portion. A vertical blanking interval corresponds to the time to return from the bottom to the top. In addition to the active (visible) lines of video, each frame includes a number of non-visible lines in the vertical blanking interval. These days the vertical blanking interval is used to send additional information such as closed captions and stock reports.

Interlaced Scanning
Early inventors of TV discovered that they could reduce the flicker effect by increasing the flash rate without increasing the frame rate. Interlaced scanning forms a complete picture, the frame, from two fields, each comprising half the scan lines. The second field is delayed half the frame time from the first. The first field, the odd field, displays the odd scan lines; the second, the even field, displays the even scan lines. Non-interlaced displays are called progressive scan.

Pixel Components
A natural way to represent the information at each pixel is with the brightness of each of the primary color components: red, green, and blue (RGB). In the digital domain we could transmit one number for each of the red, green, and blue intensities.
Engineers had to deal with this issue when transitioning from black-and-white TV to color. The signal for black-and-white TV contains the overall pixel brightness (a combination of all color components). Rather than adding three new signals for color TV, they decided to encode the color information in two extra signals to be used in conjunction with the B/W signal by color receivers; these could be ignored by the older B/W sets.
The color signals (components) are the color differences, defined as B-Y and R-Y, where Y is the brightness signal (component). In the digital domain the three components are called:
  Y: luma, overall brightness
  CB: chroma, B-Y
  CR: chroma, R-Y
Note that it is possible to reconstruct the RGB representation if needed. One reason this representation survives today is that the human visual perceptual system is less sensitive to spatial information in chrominance than in luminance. Therefore the chroma components are usually subsampled with respect to the luma component.
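The luma/color-difference construction described above can be sketched numerically. This is an illustrative full-range conversion using the BT.601 luma weights; the offsets and headroom a real encoder adds per the standard are omitted here:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert full-range RGB (0..1) to luma and color differences.

    Uses the BT.601 luma weights; the divisors scale CB and CR into
    -0.5..0.5. Illustrative only - real encoders add offsets/headroom.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b   # overall brightness (luma)
    cb = (b - y) / 1.772                     # scaled B-Y color difference
    cr = (r - y) / 1.402                     # scaled R-Y color difference
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Invert the transform, recovering RGB (up to rounding)."""
    r = y + 1.402 * cr
    b = y + 1.772 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b
```

Note that pure white maps to Y = 1 with zero color difference, and the round trip back to RGB is exact up to floating-point rounding, matching the slide's point that RGB can be reconstructed if needed.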

Chroma Subsampling
[Diagram: sampling grids for RGB 4:4:4; YCBCR 4:4:4; 4:2:2 (ITU-601), where each CB/CR is shared by a horizontal pair of luma samples; and 4:2:0 (MPEG-1) and 4:2:0 (MPEG-2), where each CB/CR is shared by a 2x2 block of luma samples.]
Variations include subsampling horizontally only, or both vertically and horizontally. Chroma samples are either coincident with alternate luma samples or sited halfway between alternate luma samples.

Common Interchange Format (CIF)
Example 1: commonly used as the output of MPEG-1 decoders. Developed for low- to medium-quality applications (teleconferencing, etc.). Variations: QCIF, 4CIF, 16CIF.
Examples of component streaming:
  line i:   Y CR Y Y CR Y Y ...
  line i+1: Y CB Y Y CB Y Y ...
Alternate (different packet types):
  line i:   Y CR Y CB Y CR Y CB Y ...
  line i+1: Y Y Y Y Y ...
Bits/pixel: 6 components / 4 pixels, 48/4 = 12 bits/pixel
  Frame size: 352 x 288
  Frame rate: 30/sec
  Scan: progressive
  Chroma subsampling: 4:2:0 (2:1 in both X & Y)
  Chroma alignment: interstitial
  Bits per component: 8
  Effective bits/pixel: 12
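The bits-per-pixel arithmetic above generalizes to any subsampling pattern. A small sketch (the function and the pixel-group framing are illustrative, not from the slides):

```python
def effective_bits_per_pixel(y_samples, cb_samples, cr_samples,
                             pixels, bits_per_component):
    """Average bits per displayed pixel for a chroma-subsampled format:
    total component samples carried for a group of pixels, times bits
    per sample, divided by the number of pixels in the group."""
    total_samples = y_samples + cb_samples + cr_samples
    return total_samples * bits_per_component / pixels

# CIF, 4:2:0: one CB and one CR per 2x2 block of 4 luma samples, 8 bits each
cif_bpp = effective_bits_per_pixel(4, 1, 1, pixels=4, bits_per_component=8)

# ITU-601, 4:2:2: one CB and one CR per pair of luma samples, 10 bits each
itu601_bpp = effective_bits_per_pixel(2, 1, 1, pixels=2, bits_per_component=10)

print(cif_bpp, itu601_bpp)  # 12.0 20.0
```

This reproduces both the CIF figure (12 bits/pixel) and the ITU-R BT.601 figure (20 bits/pixel) quoted on these slides.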

ITU-R BT.601 Format
The Calinx board video encoder supports this format. Formerly CCIR-601. Designed for digitizing broadcast NTSC (National Television System Committee) signals. Variations: 4:2:0; PAL (European) version.
Component streaming:
  line i:   Y CB Y CR Y CB Y CR Y ...
  line i+1: Y CB Y CR Y CB Y CR Y ...
Bits/pixel: 4 components / 2 pixels, 40/2 = 20 bits/pixel
  Frame size: 720 x 487
  Frame rate: 29.97/sec
  Scan: interlaced
  Chroma subsampling: 4:2:2 (2:1 in X only)
  Chroma alignment: coincident
  Bits per component: 10
  Effective bits/pixel: 20

Analog Devices ADV7185 - Calinx Video Decoder
Takes an NTSC (or PAL) video signal on the analog side and outputs ITU-601/ITU-656 on the digital side. Many modes and features are not used by us; the VidFX project will use the default mode, so no initialization is needed. Generates a 27 MHz clock synchronized to the output data. The digital output side is connected to Virtex pins; the analog input side is wired to on-board connectors or headers. Camera connection is through composite video.

Analog Devices ADV7194 - Calinx Video Encoder
[Block diagram, digital input to analog output: ITU-R BT.656/601 YCrCb in 4:2:2 format enters video input processing (demux and YCrCb-to-YUV matrix); video signal processing provides color control, DNR, gamma correction, VBI/teletext/closed-caption/CGMS-WSS insertion, and chroma/SSAF/luma low-pass filters; video output processing applies 2x or 4x oversampling; analog outputs are composite video, S-Video Y/C, and RGB/YUV/YPrPb, driving a TV screen or progressive-scan display. A 27 MHz clock input feeds a PLL producing 54 MHz; an I2C interface provides control.]
Supports multiple input formats and outputs, and slave/master operational modes. The VidFX project will use the default mode: ITU-601 as slave, S-Video output. The digital input side is connected to Virtex pins; the analog output side is wired to on-board connectors or headers. The I2C interface for initialization is wired to the Virtex; we will supply this as a pre-designed module.

ITU-R BT.656 Details
Interfacing details for ITU-601:
  Pixels per line: 858
  Lines per frame: 525
  Frames/sec: 29.97
  Pixels/sec: 13.5 M
  Viewable pixels/line: 720
  Viewable lines/frame: 487
With 4:2:2 chroma subsampling we need to send 2 words per pixel (1 Y and 1 C): words/sec = 27 M, therefore the encoder runs off a 27 MHz clock. Control information (horizontal and vertical sync) is multiplexed onto the data lines.
The encoder data stream consists of luminance data (Y) interleaved with chrominance data (CR and CB).
[Figure 1, from ITU-R BT.656: composition of the interface data stream. Active video words run CB0, Y0, CR0, Y1, CB2, Y2, ... through the last sample of the digital active line (CR359, Y719); the words that follow are replaced by the EAV timing reference signal, then digital blanking data, then the SAV timing reference signal preceding the first sample of the next digital active line. Sample identification numbers in parentheses in the original figure are for 625-line systems where these differ from those for 525-line systems (see also Recommendation ITU-R BT.803).]
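The rate figures in the table above can be checked with a few lines of arithmetic (the NTSC 29.97 Hz frame rate is written exactly as 30/1.001):

```python
# Rough-numbers check of the ITU-R BT.656 timing quoted in the table.
PIXELS_PER_LINE = 858        # total, including horizontal blanking
LINES_PER_FRAME = 525        # total, including vertical blanking
FRAMES_PER_SEC = 30 / 1.001  # NTSC 29.97 Hz, written exactly

pixel_rate = PIXELS_PER_LINE * LINES_PER_FRAME * FRAMES_PER_SEC
word_rate = 2 * pixel_rate   # 4:2:2 sends 2 words per pixel (1 Y + 1 C)

print(round(pixel_rate))  # 13.5 M pixels/sec
print(round(word_rate))   # 27 M words/sec -> the 27 MHz encoder clock
```

The total raster (858 x 525 at 29.97 Hz) comes out to exactly 13.5 M pixels/sec, and doubling for the interleaved Y/C words gives the 27 MHz clock the slide derives.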

ITU-R BT.656 Details (continued)
Control is provided through End of Active Video (EAV) and Start of Active Video (SAV) timing references. Each reference is a block of four words: FF, 00, 00, <code>. The <code> word encodes the following bits:
  F = field select (even or odd)
  V = indicates vertical blanking
  H = 1 if EAV, else 0 for SAV
The horizontal blanking section consists of the repeating pattern 80 10 80 10 ...

Video Effects Processing
Required processing:
1. Freeze frame: upon receiving a freeze command from the network, the currently displayed frame is held static. A subsequent unfreeze command restores the display to normal (intervening frames are lost).
2. Zoom out: display a quarter-sized image in the center of the screen.
Optional (extra credit):
1. Variable zoom out: a zoom-out command with a parameter indicating the zoom-out amount.
2. Black and white: eliminate the color components and display a black-and-white picture.
3. Inverse video: subtract the actual component value from the maximum component value.
4. Zoom in: either simple line and pixel doubling, or linear interpolation (higher quality).
5. Dynamic change of aspect ratio: change the on-screen aspect ratio by scaling one or the other dimension.
6. Your proposal: make sure to get approval from the teaching staff.
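Building the FF 00 00 <code> timing reference can be sketched as follows. The slide only names the F, V, and H bits; the low four bits here are the standard's XOR-derived error-protection bits, included for completeness:

```python
def timing_reference(f, v, h):
    """Build a BT.656 timing reference: the 4-word block FF 00 00 <code>.

    f = field select, v = vertical blanking, h = 1 for EAV / 0 for SAV.
    The code word is 1 F V H P3 P2 P1 P0 (MSB first); P3..P0 are the
    standard's protection bits, each an XOR of F, V, and H.
    """
    p3, p2, p1, p0 = v ^ h, f ^ h, f ^ v, f ^ v ^ h
    code = (0x80 | f << 6 | v << 5 | h << 4
            | p3 << 3 | p2 << 2 | p1 << 1 | p0)
    return [0xFF, 0x00, 0x00, code]

# Field 1, active line: SAV code word is 0x80, EAV code word is 0x9D.
```

In the project hardware this logic would of course be a small block of combinational gates rather than software; the sketch just pins down which bits go where.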

Image Scaling (zoom-out)
One technique is to replace every block of 2x2 pixels by a single pixel representing the original block of 4:
a) Could just use one of the original pixels (the lower-right pixel, for instance).
b) Different (integer) scale factors can be achieved using larger blocks.
c) A better technique would use all four original pixels, for instance in an average calculation.
Other more complex and higher-quality techniques are possible but not necessary for VidFX. The minimum required technique is (a) above.

Frame Buffer
Many video application designs are eased through the use of a frame buffer. It holds one frame of video as it is being displayed. Logically, the buffer can be considered to have two ports: one for writing from the video source (the camera, in the case of VidFX) and one for reading for video display.
[Diagram: camera -> frame buffer (720 x 487 = 350,640 bytes) -> video display.]
Frame buffers are used on PC video display cards to decouple video generation from display; for VidFX it permits the freeze-frame command. The frame buffer will not fit in memory internal to the Virtex FPGA: the Virtex 2000E has 160 blockrams (4K bits each) = 655,360 bits. The buffer is to be implemented using external SDRAM, with the control logic implemented on the FPGA.
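Technique (c), block averaging, can be sketched in software (a hypothetical helper operating on one component plane held as a list of rows; the hardware version would stream pixels rather than hold whole frames):

```python
def zoom_out_2x(frame):
    """Quarter-size an image: replace each 2x2 block with its average.

    frame is a list of rows of single-component pixel values; width and
    height are assumed even. This is technique (c) from the slide - the
    minimum requirement, (a), would instead keep one pixel per block.
    """
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[y]), 2):
            block_sum = (frame[y][x] + frame[y][x + 1] +
                         frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(block_sum // 4)  # integer average of the 4 pixels
        out.append(row)
    return out
```

Because the average of four values needs only an adder tree and a 2-bit right shift, this maps cheaply onto FPGA logic, which is part of why the slide suggests it over fancier filters.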

Active Lines per Frame
Tuesday's lecture said 507; today, 487!?
NTSC: 525 lines/frame total; 487 active, 38 blanking.
Odd field (262 lines):
  lines 4-19: blanking (16 lines)
  lines 20-263: active (244)
  lines 264-265: blanking (2)
Even field (263 lines):
  lines 266-282: blanking (17)
  lines 283-525: active (243)
  lines 1-3: blanking (3)
525 lines total.
Our camera generates 525 total lines per frame, but 507 of these are active: 507 - 487 = 20 extra lines of the NTSC frame during which the camera sends active video. Which lines? It doesn't really matter: use the control bits in the video stream (the V bit) to indicate active video versus blanking, and implement a frame buffer large enough to hold the active camera data (507 lines).

VidFX High-Level Organization
[Block diagram: a PHY (25 MHz network clock) feeds the network interface, which drives the control logic. On the video path, composite video from the camera enters the ADV7185 decoder (27 MHz camera clock); the 8-bit ITU-601/656 stream goes through the video decode block, then 32-bit data flows through the special effects block, an asynchronous FIFO, the SDRAM controller (with external SDRAM), and a synchronous FIFO into the video encode block (27 MHz video encoder clock), which drives the ADV7194 encoder and composite video out. Everything except the PHY, the two video chips, and the SDRAM sits inside the FPGA.]
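The line budget above can be tallied in a few lines to confirm the 487-active / 525-total split:

```python
# Tally of the NTSC line budget quoted above (525-line frame, two fields).
odd_field_blanking = 16 + 2    # lines 4-19 and 264-265
odd_field_active = 244         # lines 20-263
even_field_blanking = 17 + 3   # lines 266-282 and 1-3
even_field_active = 243        # lines 283-525

total_lines = (odd_field_blanking + odd_field_active
               + even_field_blanking + even_field_active)
active_lines = odd_field_active + even_field_active

print(total_lines, active_lines)  # 525 487
```

The odd field's 262 lines plus the even field's 263 give the 525-line frame, and the two active spans sum to the 487 active lines the BT.601 table uses.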

Checkpoints
1. Video Decoder (1 week, check-off by week of 3/10): Decode the incoming video stream and compute a value representing each one of the 3 video components. Display that value on the 7-segment LEDs.
2. Video Encoder (2 weeks, check-off by week of 3/31): Initialize blockram memory with a video test pattern and display it on the monitor. Tests video display control and datapath; checks understanding of memory-based video control.
3. SDRAM Test (2 weeks, check-off by week of 4/14): Write and read data patterns to SDRAM.
4. Network Test (1 week, check-off by week of 4/21): Receive packets from the network and display the payload on the LED display. Tests the receiver MAC and packet filter/parser.
5. Integration, debugging, improvements (3 weeks, final check-off by week of 5/5). Extra credit for early check-off.

References for Video
1. For general background on video and digital video: Charles Poynton, "A Technical Introduction to Digital Video", Chapter 1: www.inforamp.net/~poynton/pdfs/tidv/basic_principle.pdf
2. Calinx board user's manual (on class website).
3. Tom Oberheim's "CS150 Board Digital Video in a Nutshell": VideoNutshell.doc on class website.
4. The official specifications supported by the video encoder on the Calinx board: ITU specifications 656 & 601 (ITU656.doc, ITU601.doc on class website).
5. Video encoder datasheet: ADV7194.pdf on class website.
6. Video decoder datasheet: ADV7185.pdf on class website.