Simple LCD Transmitter Camera Receiver Data Link


Grace Woo, Ankit Mohan, Ramesh Raskar, Dina Katabi

ABSTRACT

We demonstrate a freespace optical system using a consumer camera and projector in indoor environments using available devices for visual computing. Through design, prototyping and experimentation with this commodity hardware, we analyze a practical optical solution as well as the drawbacks for current wireless challenges unmet by classic RF wireless communication. We summarize and introduce some new applications enabled by such setups.

1. Introduction

We consider a simple hardware setup using a consumer camera and a standard LCD screen such as the one shown in Figure 1. There are currently many visible light communication (VLC) broadcast configurations exploring the potential of a data link established using custom freespace optical hardware. This design differs in its use of a consumer hardware setup and aims to achieve a high bit rate as a result.

Existing approaches often use an LCD display to demonstrate visible light data transfer systems using classic temporal techniques. QR Codes, Data Matrix Codes, Shot Codes and EZ Codes are examples of popular 2D barcodes which encode information to be transmitted visually. These modes of 2D visual information transmission already conform to ISO standards [2, 3, 1] developed for 2D encoding and decoding of visual information. Our implementation of a display and camera provides high bit rate transfer of up to 3Mbps using a single link, where the transmitting screen is a standard Dell inch plasma screen and the receiving camera is a Canon Rebel SLR with standard feature settings.

Figure 1: Simple setup with a commodity camera and standard LCD screen.

1.1 Contributions and Limitations

The limitations of this approach are the same as those of any existing visible light link. One primary goal of this work is to characterize the key benefits of using a visible light transmitter-to-receiver pair. Here we summarize the fundamental contributions and limitations of this work:

Contributions:
1. We build an end-to-end transmit and receive prototype system using a standard Dell LCD display as the transmitter and a commodity Canon SLR camera as the receiver, where the total oversampling rate is 1 pix to pix x pix. The bitrate demonstrated in this work is 4.96 Mbits per frame with an uncoded average BER of %. This is an assertion of a channel model for a pixel-to-pixel communication system based on an actual implemented system.
2. Based on the observed channel properties, we propose several application scenarios that make use of the visible light channel.

Limitations: Line of sight is required between the transmitter and the receiver. End-to-end visible communication poses new problems, such as geometric angular effects and resampling issues, which potentially call for new algorithms and protocols.

Related Work

Komine et al. presented prototypes and a fundamental framework for transmission over white LED light in [4, 5]. More recently, Little et al. [6] built an indoor wireless lighting system also employing OOK time modulation. This class of works represents an effort in this direction. Performance of visible light systems is ultimately determined by hardware, and key performance enhancements are determined by the hardware, which we limit ourselves to by using off-the-shelf equipment.

Figure 2: An LCD and a camera used for communication in our Netpix system. The LCD displays images corresponding to the input binary data; the camera captures a photo of the display and decodes the images to recover the data.

Figure 3: A feedback system allows for better adjustment and fine tuning of parameters between the transmitter and receiver. The correctly exposed parameters result in a system which may be adjusted depending on the number of users.

2. System Overview

We build on a simple understanding of a channel model and propose a communication system that uses a camera and an LCD display to communicate using visible light. We describe the main components of our transmitter and receiver chain.

2.1 Transmitter

The proposed transmitter considers a scenario such as the one depicted in Figure 4. We are interested in the bitrates received at each one of the cameras, with respect to angle and distance as related to users in a physical space. We consider a scenario where incoming bits are first compressed. These bits may be split into several frames to guarantee independence. Next, forward error correction is applied to protect the bits through a lossy channel. An additional block in this chain creates a feedback system to determine how these protection bits must be placed in physical space. Figure 4 shows a block diagram for a system with multiple receive cameras and a single transmitting display.

We consider the details of the exposed parameters and how they are related to the results. Parameter k determines how bits are placed in time depending on the temporal coherence of the channel. Parameter m determines how bits are placed depending on where users are in physical space. Considering the need for these two parameters, we send two calibration frames before placing bits sequentially in the data frames. A feedback system such as that in Figure 3 determines how often calibration frames are sent. The evaluation from the complete receive chain determines how the parameters k and m are set depending on the environment. The first calibration frame sent is the cornercal frame, consisting of four corner markers. The second calibration frame sent is the sampoints frame, consisting of a grid containing all sampling points used in the next few data frames. Following these two frames are the data frames containing all information.

2.2 Receiver

The high-level receive system block diagram is shown in Figure 5. It is split into two sections: many of the preprocessing elements are reminiscent of traditional image processing, while the postprocessing elements are more reminiscent of RF design blocks. Here we explain the role of each component.

Spatial Tracking: The purpose of spatial tracking is to determine where in the scene the data is located. We search for the corners using a modified FAST corner algorithm [7]. As there might be several corners in the scene, each of the corner candidates from the FAST corner detector is compared to a large quadrilateral generated from the second frame. This is done in the system using a function find corners, which takes as input the incoming frame imin and the sampled frames cornercal frame and sampoints frame. The quadrilateral from the second frame is obtained by repeatedly blurring and lowpassing the second sampling frame and then converting it to a high-contrast black and white image.
The square is found by discovering all white regions in the scene and calculating a square score given by

    score = min(area, (perimeter/4)^2) / max(area, (perimeter/4)^2),

which equals 1 for a perfect square and falls toward 0 for less square regions. All distances are calculated from the corner candidates in the first frame to the centroid of the square found in the next frame. The four corners with distances closest to each other are considered the corner calibration points for this block. The brightest pixel associated with each of these corner calibration points is considered a corner for this round.
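As a rough illustration of this spatial tracking step, the sketch below computes the square score for candidate regions and selects four corner candidates. It assumes OpenCV and NumPy; the names square_score and find_calibration_corners are ours rather than the system's find corners implementation, and the "distances closest to each other" rule is approximated here by deviation from the median distance.

import cv2
import numpy as np

def square_score(contour):
    # Squareness score: 1.0 for a perfect square, smaller otherwise.
    # Compares the measured area against the area implied by the
    # perimeter if the region were a square, i.e. (perimeter/4)^2.
    area = cv2.contourArea(contour)
    perimeter = cv2.arcLength(contour, closed=True)
    if perimeter == 0:
        return 0.0
    implied = (perimeter / 4.0) ** 2
    return min(area, implied) / max(area, implied)

def find_calibration_corners(cornercal_gray, sampoints_gray):
    # Corner candidates from the first calibration frame (FAST detector).
    fast = cv2.FastFeatureDetector_create()
    candidates = np.array([kp.pt for kp in fast.detect(cornercal_gray, None)])

    # Blur and threshold the second calibration frame, then keep the
    # white region that looks most like a square.
    blurred = cv2.GaussianBlur(sampoints_gray, (15, 15), 0)
    _, bw = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = max(contours, key=square_score)
    cx, cy = best.reshape(-1, 2).mean(axis=0)   # centroid of the square

    # Keep the four candidates whose distances to the centroid are most
    # similar to one another (closest to their common median distance).
    d = np.hypot(candidates[:, 0] - cx, candidates[:, 1] - cy)
    order = np.argsort(np.abs(d - np.median(d)))
    return candidates[order[:4]]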

Figure 4: The transmit chain is designed for a scenario where a single screen might be transmitting information to many onlooking receivers. The transmit chain carries properties similar to those of an RF transmit chain, with additional parameters for adjusting the frame split, forward error correction and awareness of users in physical space.

Homography: Once the scene is found, the perspective of the scene may be restored using well known techniques for recovering perspective projections. This is done in the system using a function frame recover. A transfer function is formed using the corners from find corners, and a homogeneous inversion may be performed from the formed matrix C. All the following frames are cropped using the corners found from find corners and the transform coordinates found from frame recover. These are passed through the rest of the receive chain until another dark calibration frame is found.

Timing Recovery: Timing recovery is done by detecting all dots from the sampoints frame. The sampoints frame recovers the image by adaptively equalizing the entire image and then passing it through a high-contrast filter with fixed contrast limits. This is then converted to a high-contrast black and white image. The centroid of each of these points is found using standard logic functions. These points are assumed to lie on a near perfect grid, and the points of this mesh are reordered as such. The mesh is interpolated by an upsampling factor to allow for localized timing offsets. A matched filter based on the grid length is used as a kernel for the image, and the convolved image is used to find the optimal sampling points.

Sampling: Sampling is done from the result of the timing recovery frame, where the maximum value within the region around each found grid point is sampled. Sampling takes as input an incoming color image, slices it into three color slices and treats each slice as a grayscale image. As a result of the interference discussed, each slice is adaptively equalized followed by a high-contrast adjustment. The result is sampled at the sampling points found in the timing recovery. In reporting the BER of the system, the recovered frames are compared to the random frames sent.

3. Hardware Overview

In this work, we use only commodity hardware to achieve a high bit-rate link and make no modifications. As discussed in future work, there are many prototypes in development, out of the scope of this work, which would allow a much more powerful platform. We provide in detail the specifications of the truly off-the-shelf equipment we choose. In these experiments, we choose a standard Dell UltraSharp widescreen flat display with 1920x1200 resolution. The screen contains an antiglare coating. These screens cost approximately $35 today. An n x n array of these standard displays simulates the higher resolution transmitter displays that will be available in the future. We use the Canon Rebel XS camera to capture the transmitted images. These cameras likewise cost approximately $5. The sensor resolution is approximately 10 megapixels. The lens is a standard 50mm lens with an f/8 aperture.

Figure 6: BER with respect to distance between the camera and display for encoding in all three color channels for a 1599x35 pixel transmit block.
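Before turning to the results, the perspective recovery and grid sampling steps of Section 2.2 can be illustrated with a minimal sketch. It assumes OpenCV and NumPy; recover_frame and sample_grid are illustrative stand-ins for the frame recover and sampling blocks described above, and plain histogram equalization stands in for the adaptive equalization.

import cv2
import numpy as np

def recover_frame(img, corners, out_size=(1600, 1200)):
    # Homography recovery: map the four detected screen corners
    # (ordered TL, TR, BR, BL) back to an upright rectangle.
    w, h = out_size
    dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    C = cv2.getPerspectiveTransform(np.float32(corners), dst)  # transfer matrix
    return cv2.warpPerspective(img, C, (w, h)), C

def sample_grid(rectified, grid_points, half=2):
    # Sampling: for each grid point found by timing recovery, take the
    # maximum value in a small window and threshold it to a bit,
    # separately for each color slice.
    bits = []
    for ch in cv2.split(rectified):          # three color slices
        eq = cv2.equalizeHist(ch)            # stand-in for adaptive equalization
        for (x, y) in grid_points:
            window = eq[y - half:y + half + 1, x - half:x + half + 1]
            bits.append(1 if window.max() > 127 else 0)
    return np.array(bits, dtype=np.uint8)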

4. Results

Figure 5: The receive chain is the most computationally heavy portion of the design. This block diagram gives a high level overview; the details are discussed in the text.

Using the transmit chain and receive chain described in Section 2, we carry out several experiments which demonstrate the performance of the Netpix system. While the system design is based on ideal assumptions about the nature of the optical channel, the actual performance of the system is stressed under many parameters which are exposed to the highest layer. There are two goals in these experiments. The first is to establish the performance of the system. The second is to develop a channel model, thus giving fundamental insight into the nature of a pixel-to-pixel system.

Figure 6 reports BER with respect to distance for encoding in all color channels at 4.96 Mbits/frame/screen in an n x n array of cameras and screens, where n = 3. With a shutter rate of 1/30 seconds, this gives an aggregate throughput of 1.3 Gb/s. Here, the total transmit size of each frame is 1599 x 35 pixels. In this experiment, the array setup is such that each camera is exactly in front of each display. Here, we compute the goodput as 1.3 Gb/s - BER * 1.3 Gb/s. Despite large bit errors, the overall goodput for a 3 x 3 camera/display array is still very high (about 1 Gb/s). These transmit frames were generated with 1 bit of information contained in each slice of red, green and blue.

While it is interesting to consider the circumstances under which a very high speed link works, we would also like to understand how these pixels might behave under other constraints. Figure 7 considers a 1599 x 35 pixel board; the symbol size is the number of pixels on the display that each bit occupies. Here, we consider the results with no color and how this might eliminate errors. At 1 m there are no errors, and there is a sharp jump at 1.1 m. Figure 7 reports BER when using only the black and white channel. This gives only 1/3 the throughput; however, these results, decoded using the same receive chain, show a much lower BER. This suggests a considerable effect from neighboring color bands.

Angle evaluation is done using a square black and white checkerboard carrying a throughput of 7.86 Mbits. Figure 8 shows BER as a function of different viewing angles. A viewing angle of 30 degrees may cover significant area within a room.

Figure 7: BER with respect to distance between the camera and display for encoding in black and white. This eliminates the effects of color interference and reports BER purely as a result of what is observed at the sensor.

Figure 8: BER with respect to angle between the camera and display for encoding in black and white. Angle increases in the photos from left to right (up to 30 degrees).
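To make the throughput accounting above concrete, the aggregate rate and goodput for the 3 x 3 array can be computed as follows. This is a minimal sketch using the per-frame payload and shutter rate quoted above; the helper names and the 25% BER are only illustrative assumptions.

def aggregate_throughput(bits_per_frame, frame_rate_hz, num_links):
    # Raw aggregate rate: every link delivers one frame per shutter period.
    return bits_per_frame * frame_rate_hz * num_links

def goodput(raw_rate, ber):
    # Goodput as used above: the raw rate minus the bits lost to errors.
    return raw_rate - ber * raw_rate

raw = aggregate_throughput(4.96e6, 30, 3 * 3)   # 3 x 3 camera/display array
print(raw / 1e9)                                # ~1.34 Gb/s aggregate
print(goodput(raw, 0.25) / 1e9)                 # ~1.0 Gb/s at an assumed 25% BER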

Figure 9: Coherence plots showing how BER changes with respect to time. This experimental plot suggests the presence of a stochastic channel in practice. We further explain the nature of this channel in the text.

Figure 9 shows that although the channel model for these results is analytical, the resulting bit errors still have randomness. We can see this effect in both angle and time. Figure 10 shows BER as a function of the symbol size. As the symbol size increases, the chance of neighbors interfering decreases, thus reducing the BER.

Figure 10: BER vs. symbol size as reported for various angles between the camera and the display. The symbol size is the number of pixels on the display that each bit occupies. The BER falls significantly as the symbol size goes up. Performance is similar for different angles.

5. Conclusion

We present a wireless optical system built with commodity hardware which may be used for many network scenarios. Depending on the application and the needs of the network, there are many modifications to be made which will ultimately result in significantly higher bit rates. We are currently in the process of developing such a hardware platform. Even with current commodity off-the-shelf hardware, we obtain very high bit rates. As the consumer world strives to develop the newest form factors for projectors and cameras, visual communication systems such as the one proposed in this work will allow users to use a new set of networking components capable of delivering high bit-rate wireless links.

6. References

[1] ISO. International symbology specification - MaxiCode. ISO/IEC 16023:2000, 2000.
[2] ISO. Automatic identification and data capture techniques - QR Code 2005 bar code symbology specification. ISO/IEC 18004:2006, 2006.
[3] ISO. Automatic identification and data capture techniques - Data Matrix bar code symbology specification. ISO/IEC 16022:2006, 2006.
[4] T. Komine and M. Nakagawa. Integrated system of white LED visible-light communication and power-line communication. IEEE Trans. on Consumer Electronics, Feb 2003.
[5] T. Komine and M. Nakagawa. Fundamental analysis for visible-light communication system using LED lights. IEEE Trans. on Consumer Electronics, Feb 2004.
[6] T. D. C. Little, P. Dib, K. Shah, N. Barraford, and B. Gallagher. Using LED lighting for ubiquitous indoor wireless networking. IEEE Intl. Conf. on Wireless and Mobile Computing, Networking and Communications, October 2008.
[7] E. Rosten and T. Drummond. Machine learning for high-speed corner detection. In European Conference on Computer Vision, May 2006.