Digital Video over Space Systems & Networks


SpaceOps 2010 Conference, "Delivering on the Dream," Hosted by NASA, 25-30 April 2010, Huntsville, Alabama. AIAA 2010-2060.

Rodney P. Grubbs
NASA Marshall Space Flight Center, Mail Code EO50, MSFC, AL 35824

Digital video promises many things: better quality, greater consistency, and all the advantages that treating video as data provides. But it also presents some serious challenges for system designers. This paper will first provide some background about digital video and compression, with examples of Standard and High Definition video uses in space applications over the past two or three years. Particular emphasis will be on interoperability challenges within different spacecraft systems and between ground systems. The main body of the paper will discuss the challenges of implementing digital video in existing spacecraft systems, as well as designing digital video for use in future space applications. Topics covered will include: transport streams vs. elementary streams; video over IP; link integrity; bandwidth trades; latency; and format conversions. The paper will also include a discussion of the importance of standards and interoperability specifications, a review of future plans for digital video applications from the International Space Station and NASA's Constellation Program, and the goal of the Consultative Committee for Space Data Systems (CCSDS) Motion Imagery and Applications Working Group to provide guidelines and best practices that avoid the interoperability issues that plague current systems.

I. Introduction

In the early days of human spaceflight, motion imagery was accomplished with motion picture cameras, set at varying frame rates depending on lighting conditions. Upon safe return, the film was processed and eventually shared with the world via documentaries or television. Inevitably, live video became operationally desirable for situational awareness and to satisfy the public's interest in high-profile events such as the Moon landings and the Apollo-Soyuz Test Project. Compromises were made with those first video systems to fit within the constraints of bandwidth, avionics, and transmission systems. Even in the modern era, video systems on spacecraft are a hybrid of analog and digital systems, typically made to work within the spacecraft's avionics, telemetry, and command/control systems. With the advent of digital cameras, encoding algorithms, and modulation techniques, it is desirable to treat video as data and to utilize commercially available technologies to capture and transmit live and recorded motion imagery, possibly in high definition or even better.

Future human spaceflight endeavors are expected to be collaborations between many agencies, with complex interactions between spacecraft and Lunar/Mars surface systems, and with intermediate locations (extra-vehicular activity crew, habitats, etc.) requiring the ability to view video generated by another agency's systems. Interoperability between these systems will therefore be essential to mission success and, in some cases, crew safety.

II. Digital Video Parameters

Digital video requires far more consideration from a system designer than analog video does. Analog video typically means using one of three worldwide standards: National Television System Committee (NTSC), Phase Alternating Line (PAL), or Sequential Color with Memory (SECAM). These worldwide standards are defined by a limited set of parameters, including horizontal and vertical resolution and frame rate.
Digital video, on the other hand, has a variety of horizontal and vertical resolutions and frame rates, and additional aspects to consider, including scanning, compression, aspect ratio, distribution protocols, and color sampling. All of these parameters are discussed in much greater detail in a draft Consultative Committee for Space Data Systems (CCSDS) Green Book, Motion Imagery & Applications [1]. Readers interested in learning more about digital video parameters are encouraged to read section 3 of that draft document.

(This material is declared a work of the U.S. Government and is not subject to copyright protection in the United States.)
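To make those parameter axes concrete, below is a small illustrative data structure listing a few common digital video formats. The entries are representative examples chosen by the editor, not drawn from the Green Book, and the list is far from exhaustive.

```python
# A few representative digital video formats, showing the parameter axes a
# designer must pin down (common example values; not an exhaustive list).

formats = [
    # name               width height frame_rate scanning      aspect color_sampling
    ("480i NTSC-derived", 720,  480,  29.97, "interlaced",  "4:3",  "4:2:2"),
    ("576i PAL-derived",  720,  576,  25.0,  "interlaced",  "4:3",  "4:2:2"),
    ("720p HD",          1280,  720,  59.94, "progressive", "16:9", "4:2:2"),
    ("1080i HD",         1920, 1080,  29.97, "interlaced",  "16:9", "4:2:2"),
]

for name, w, h, rate, scan, aspect, sampling in formats:
    print(f"{name:<20} {w}x{h:<5} {rate:>6} fps  {scan:<11} {aspect:<5} {sampling}")
```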

All of these parameters provide considerable flexibility for users of video, but they also introduce challenges for interoperability and distribution among spacefaring nations on collaborative projects. For example, imagine a scenario in which three astronauts from three different countries' space agencies are conducting an extra-vehicular activity. It is logical to assume that the mission control rooms would want to monitor the helmet-mounted camera of each astronaut. It is likely that all three mission control rooms would have different digital video monitoring systems, and that those systems may differ in native format, aspect ratio, compression algorithm, or other parameters from what is used on the helmet-mounted cameras or the spacecraft's video distribution system. Thus, format conversion or image transcoding will likely be required for multi-agency monitoring of events, to account for differences in aspect ratio, compression, frame rate, and resolution. Many of these interoperability challenges are to be addressed by the CCSDS Motion Imagery and Applications Working Group for designers of future spacecraft and future missions. This paper will address the challenges of implementing digital video in already existing spacecraft and ground systems.

Integrating digital video into already deployed systems or spacecraft can be very complex. Many of these systems were not designed to handle the bandwidths required. Cabling and switching systems capable of handling analog video may not be able to handle the rigorous requirements of digital video, especially High Definition video. Use of the Internet Protocol (IP) over space links provides one option for implementing digital video, but there are pitfalls. This paper will attempt to address some of these and other pitfalls, based on systems tested or deployed operationally on the Space Shuttle and the International Space Station.

III. Video over IP

Here on Earth in the year 2010, video over IP is ubiquitous. Commercial web sites such as YouTube and Hulu allow users to watch high-quality video on their home or office computers. Streaming video takes advantage of the stability of Internet networks and of improvements in compression technology. Streaming digital video from a spacecraft is far more complicated. There are several important considerations to address when attempting to stream video over IP from a spacecraft: compression, transport streams, and protocols.

A. Compression

Uncompressed streaming digital video requires a great deal of bandwidth; uncompressed High Definition video, for example, requires just under 1.5 Gbps. Therefore, the use of digital video over a network will require compression of the video. Two industry-standard compression algorithms are widely utilized: Moving Picture Experts Group (MPEG)-2 and MPEG-4 Part 10, the latter also referred to as H.264. With MPEG compression, the screen is divided into pixel blocks called macroblocks. In MPEG-2, the macroblocks of an entire frame are compared to the macroblocks of the next frame to determine differences; this comparison is done looking both forwards and backwards in a video stream. MPEG-4 takes this a step further, working at the macroblock level in its comparisons. This allows information to be sent only for the macroblocks with significant changes, making MPEG-4 more efficient than MPEG-2, but requiring more processing power.

If the video is intended to be redistributed, it is best to compress it as little as possible; in other words, to utilize as much bandwidth as possible. Compressing video that has already been compressed once introduces additional artifacts and reduces the quality of the video. Video intended for analysis purposes, where individual frames are extracted from a motion video sequence and analyzed further, requires intra-frame rather than inter-frame encoding. MPEG utilizes inter-frame encoding, in which groups of pictures are combined to provide efficiency. Intra-frame encoding algorithms, such as Motion JPEG2000 (from the Joint Photographic Experts Group), are a better choice for video applications where individual frames are important. Motion JPEG2000 requires considerably more bandwidth, since each frame is encoded separately. JPEG2000 uses a frequency sub-band (wavelet) compression technique instead of macroblocks, which provides a more accurate image than MPEG-based compression algorithms.
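As a check on the "just under 1.5 Gbps" figure quoted above, the short sketch below works out the arithmetic for a 1080-line signal, assuming 4:2:2 color sampling, 10-bit samples, and 30 frames per second; the assumptions are the editor's illustrative choices, matching the timing of a standard HD serial interface.

```python
# Back-of-the-envelope data rate for uncompressed HD video.
# Assumptions (illustrative): 1080-line video, 4:2:2 color sampling,
# 10-bit samples, 30 frames (60 interlaced fields) per second.

ACTIVE_W, ACTIVE_H = 1920, 1080   # active picture samples per frame
TOTAL_W, TOTAL_H = 2200, 1125     # total samples including blanking (SMPTE 274M timing)
FPS = 30
BITS_PER_SAMPLE = 10
SAMPLES_PER_PIXEL = 2             # 4:2:2 -> one luma + one (alternating) chroma sample

def gbps(w: int, h: int) -> float:
    return w * h * FPS * BITS_PER_SAMPLE * SAMPLES_PER_PIXEL / 1e9

print(f"Active picture only             : {gbps(ACTIVE_W, ACTIVE_H):.3f} Gbps")  # ~1.244
print(f"With blanking (serial link rate): {gbps(TOTAL_W, TOTAL_H):.3f} Gbps")    # ~1.485
```

The second figure, 1.485 Gbps, is the familiar HD-SDI link rate: the serial interface clocks through the blanking intervals as well as the active picture.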
B. Transport Stream or Elementary Stream?

Typically, the output of a modern hardware video compressor, or encoder, is either IP or a format called Asynchronous Serial Interface (ASI). ASI is typically used for cable television distribution and professional video applications. When encoding video and audio, there is an option to output either a transport stream or an elementary stream. At a simplistic level, an elementary stream keeps the video and audio separate, while a transport stream combines the audio, the video, and any other elements of the stream, such as closed-captioning data, into a single multiplexed stream. Elementary streams are typically used for file-based playback of video, such as Digital Video Discs (DVDs) and video playback from computers. Transport streams are typically used for real-time video applications such as terrestrial broadcasting and digital satellite systems.
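Concretely, a transport stream is a sequence of fixed-length 188-byte packets, each tagged with a 13-bit Packet Identifier (PID) indicating which program element it carries. The minimal sketch below demultiplexes a buffer of such packets by PID, ignoring adaptation fields for simplicity; the buffer contents and PID values are hypothetical.

```python
from collections import defaultdict

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47   # every TS packet begins with this sync byte

def demux_by_pid(buffer: bytes) -> dict:
    """Split a buffer of 188-byte TS packets into per-PID payload streams.
    Simplified: assumes payload-only packets (no adaptation fields)."""
    streams = defaultdict(bytearray)
    for off in range(0, len(buffer) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = buffer[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            raise ValueError(f"lost sync at offset {off}")
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit packet identifier
        streams[pid].extend(pkt[4:])            # payload after the 4-byte header
    return streams

# Hypothetical example: one video packet on PID 0x100, one audio packet on 0x101.
def make_pkt(pid: int) -> bytes:
    header = bytes([SYNC_BYTE, (pid >> 8) & 0x1F, pid & 0xFF, 0x10])
    return header + bytes(TS_PACKET_SIZE - 4)

streams = demux_by_pid(make_pkt(0x100) + make_pkt(0x101))
print({hex(pid): len(data) for pid, data in streams.items()})  # 184 bytes each
```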

For real-time streaming applications from spacecraft, transport streams provide several advantages, but these advantages must be balanced against the increased packet overhead and the higher demands placed on space link integrity. One advantage is that the audio and video are synchronized together in a single data stream; audio does not have to be resynchronized on the ground, so lip synchronization is not an issue. Another advantage is the ease with which video can be integrated into an IP network on a spacecraft or a ground network. The IP output of a commercial encoder can be routed directly into an IP router, so the video data stream looks like any other data on the network. On the ground, the video transport stream can easily be identified automatically via its specific Packet Identifier (PID) for further distribution to other control centers, decoders, and data recorders.

A disadvantage of utilizing transport streams is the added bandwidth overhead. Existing spacecraft were likely not designed to route IP without additional protocol conversion into an avionics system utilizing CCSDS packet protocols. Thus, where an elementary stream can be routed directly into an avionics system, a transport stream must plug into an IP system that adds its own overhead before being packetized using CCSDS packet protocols for downlink. All of the various packet overhead then has to be undone on the ground to recover the original transport stream that a decoder can recognize.

A NASA project to demonstrate a live HDTV downlink capability on the International Space Station was originally designed to utilize an elementary stream and CCSDS packetization. The Columbia accident delayed the project. During the delay, a device called the Orbital Communications Adapter (OCA), which allows use of IP with a CCSDS packet-based avionics system, was updated to provide higher-bandwidth data streams. The live HDTV downlink system was redesigned to output a transport stream to the OCA. The project was successful, and the first-ever real-time HDTV downlink was carried live on the Discovery Channel and Japan's NHK in November 2006.
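As a rough illustration of that overhead, the sketch below stacks up the per-layer headers for the common practice of carrying seven 188-byte TS packets in one UDP/IPv4 datagram. The numbers are the editor's illustrative assumptions, and the CCSDS space-link framing overhead is deliberately left out, since it varies by mission.

```python
# Rough encapsulation-overhead estimate for MPEG-2 TS over UDP/IPv4.
# Illustrative assumptions: 7 TS packets per datagram (a common practice,
# since 7 * 188 = 1316 bytes fits a typical 1500-byte MTU), no RTP layer.

TS_PACKET = 188          # bytes per transport stream packet
TS_HEADER = 4            # bytes of TS header per packet
TS_PER_DATAGRAM = 7
UDP_HEADER, IP_HEADER = 8, 20

payload = TS_PER_DATAGRAM * (TS_PACKET - TS_HEADER)            # stream payload bytes
wire = TS_PER_DATAGRAM * TS_PACKET + UDP_HEADER + IP_HEADER    # bytes on the wire

print(f"TS header overhead : {TS_HEADER / TS_PACKET:6.1%}")        # ~2.1%
print(f"TS+UDP+IP overhead : {(wire - payload) / wire:6.1%}")       # ~4.2%
# CCSDS space-link framing adds a further mission-specific percentage on top.
```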
C. Real-time Transport Protocol

One packet format typically utilized for packetization of video data streams is the Real-time Transport Protocol (RTP). RTP is typically used for end-to-end multimedia applications such as voice over IP and video conferencing over IP. RTP can tolerate dropped packets and network jitter because its sequence numbers and timestamps, together with the feedback carried on its companion control protocol (RTCP), allow receivers to buffer and reorder data. Because this design assumes a low-latency, bidirectional path between sender and receiver, RTP is not a good candidate for applications like streaming digital video from a spacecraft. Furthermore, most commercial hardware decoders are not designed for RTP streams; decoding them usually requires a computer running a software application such as VideoLAN's VLC media player or QuickTime. Most control rooms and ground systems are designed for conventional analog or digital video distribution (using the Serial Digital Interface), and there is no simple way to bridge the gap between those systems and a computer-based software decode of an RTP stream. It is possible to carry a transport stream over RTP with the feedback between encoder and decoder disabled, but this is not a typical use of RTP, and the advantages of RTP are lost.
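For reference, the sketch below parses the 12-byte fixed RTP header defined in RFC 3550 and uses the sequence number to flag lost or reordered packets; this is the mechanism, noted above, that lets receivers buffer and reorder data. The example packet bytes are hypothetical.

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the 12-byte fixed RTP header (RFC 3550)."""
    if len(packet) < 12:
        raise ValueError("packet shorter than fixed RTP header")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,           # should be 2
        "payload_type": b1 & 0x7F,    # e.g. 33 = MPEG-2 transport stream
        "sequence": seq,              # increments by 1 per packet, wraps at 65536
        "timestamp": timestamp,
        "ssrc": ssrc,
    }

def packets_missing(prev_seq: int, seq: int) -> int:
    """Number of packets lost between two sequence numbers (mod 2**16)."""
    return (seq - prev_seq - 1) % 65536

# Hypothetical example: an RTP packet carrying TS data, then a one-packet gap.
pkt = struct.pack("!BBHII", 0x80, 33, 1000, 90000, 0xDEADBEEF) + b"\x47" * 188
hdr = parse_rtp_header(pkt)
assert hdr["version"] == 2 and hdr["sequence"] == 1000
print("lost packets:", packets_missing(1000, 1002))   # -> 1
```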

IV. Link Integrity

As referenced earlier, link integrity is an important aspect of successfully implementing digital video in an existing spacecraft system. Because of the dramatic compression required, especially in the case of HDTV, it is critical that packets arrive in order with little packet loss or error, or the decoder will be unable to decode the video. Tests conducted by NASA comparing MPEG-2 and MPEG-4 compression and distribution demonstrated that MPEG-4, while providing higher-quality video than MPEG-2 at the same data rate, is also more susceptible to bit errors and packet loss. With MPEG-4, data link problems such as jitter, resulting in bit errors or lost packets, can cause decoders to lock up and output no picture at all. With MPEG-2, decoders can still produce some elements of a picture even when there is a break in the data stream: the lost packets and bit errors manifest themselves as frozen blocks in the picture, known as macroblocking. Users of U.S. commercial digital television satellite services such as DirecTV or Dish Network see this effect during rain fade, for example. Because MPEG-4 is encoded at the block level as well as the frame level, it is more likely that a decoder can produce no usable picture at all when the data link is corrupted.

Similar NASA testing of Motion JPEG2000-encoded video showed less susceptibility to bit errors and packet loss. This is not surprising, given that Motion JPEG2000 utilizes intra-frame encoding. A break in the data link may result in lost frames, but since there is no group of pictures, as there is with MPEG encoding, decoders can recover quickly and produce usable video quickly.
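The toy model below illustrates why intra-frame encoding recovers faster. Under the editor's simplifying assumptions (one frame per packet, a repeating group of pictures that starts with an intra-coded frame, and a decoder that cannot reconstruct any predicted frame once its reference is gone), losing one frame in an inter-frame stream corrupts every frame until the next intra-coded frame, while an intra-only stream loses just the one frame.

```python
# Toy model of error propagation: intra-only coding vs. a simple
# I-frame + predicted-frame group of pictures (GOP). Illustrative only.

def frames_corrupted(num_frames: int, gop: int, lost: int, intra_only: bool) -> int:
    corrupted = 0
    broken_gop = False
    for i in range(num_frames):
        if i % gop == 0:                  # intra-coded frame starts a fresh GOP
            broken_gop = False
        if i == lost:
            corrupted += 1
            broken_gop = not intra_only   # loss poisons the rest of the GOP
        elif broken_gop:
            corrupted += 1                # prediction chain is broken
    return corrupted

N, GOP, LOST = 300, 15, 100               # 10 s of 30 fps video; frame 100 lost
print("intra-only (Motion JPEG2000-like):", frames_corrupted(N, GOP, LOST, True))   # 1
print("inter-frame (MPEG-like, GOP=15)  :", frames_corrupted(N, GOP, LOST, False))  # 5
```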
V. Latency

Since compression is required to make digital video practical, latency is a factor in any digital video system. As encoding algorithms mature and hardware improves, end-to-end encode/decode latencies also improve. But encoding is not the only source of latency: as discussed earlier, packetization, buffering, and other aspects of the transport protocols add to it, and each packetization of the video data stream requires depacketization on the other end before the data can be decoded and viewed as video. Latency is a fact of life when utilizing digital video.

Latency primarily impacts scenarios where video is used interactively, such as live television interviews with the crew, or situational awareness applications where video has real-time importance, such as on-orbit rendezvous. In the case of live interviews, operational factors mitigate the impact of the latency. Media are informed of the delay, so they ask their questions and wait patiently for the reply, and astronauts are sometimes given questions ahead of time so they can anticipate the end of each question and begin their answers swiftly. The latency is also tolerated because it emphasizes the implied distance of spaceflight and has become synonymous with such televised events.

Latency in situational awareness applications can be more problematic. Mission control rooms must remember that what they see on their screens can be seconds behind the mission elapsed time clocks they are also watching. Video in these applications must be augmented by other real-time sensors, and abort decisions must be based on those sensors, not on the video. For the same reasons, uses of video on spacecraft for applications such as rendezvous are limited and are augmented with other real-time, more reliable systems, such as radar- and laser-based systems.
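A simple end-to-end budget shows how the delays accumulate to the "seconds behind" figure mentioned above. Every number in the sketch below is an illustrative assumption chosen by the editor, not a measurement from the paper.

```python
# Illustrative end-to-end latency budget for a compressed video downlink
# relayed through a geostationary satellite. All figures are assumed,
# representative values, not measurements.

budget_ms = {
    "encoder (inter-frame compression)":            500,
    "packetization / protocol conversion":          100,
    "space-to-ground path via GEO relay (2 hops)":  250,
    "ground network transport":                      50,
    "decoder buffering and decode":                 400,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<45} {ms:5d} ms")
print(f"{'total':<45} {total:5d} ms   (~{total / 1000:.1f} s)")
```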

VI. Conclusion

Digital video provides many improvements over analog. Video can be treated as data, utilizing other data distribution systems such as IP, and provides higher-quality imagery with less degradation. Digital video is also more flexible, with multiple resolutions and frame rates supported. With that flexibility come trades and challenges: compression, transport protocols, link integrity, and latency are some of the factors to be considered. Designs of new spacecraft and ground systems will no doubt take these and other factors into account. The consumers of the video are the beneficiaries of all this flexibility.

At the time of this paper's writing in early 2010, HD video can be streamed from the ISS using mostly off-the-shelf commercial hardware and a protocol converter built by the Japanese Space Agency. Plans are underway for further retrofitting of the ISS to allow more, and better quality, HDTV to be streamed live. It is clear that the future of spacecraft video is digital high definition. No doubt some day humans will watch the next steps on another celestial body in crystal-clear high definition video and look back on the grainy images of Neil Armstrong's first steps on the Moon the same way we think of Alexander Graham Bell's first telephone transmission.

Appendix: Glossary of Terms

Macroblock: A subset of an image used in compression algorithms. Block sizes are usually expressed in multiples of four; MPEG-2 uses 16x16-pixel macroblocks, with the transform applied to 8x8-pixel blocks within each macroblock.

Macroblocking: A visual artifact, apparent when viewing video compressed with a macroblock-based algorithm, that results from missing data or bit errors.

Packetization: Within the context of this paper, the process of adding headers and/or identifiers to data streams containing compressed video.

Transcoding: Conversion from one digital algorithm or format to another. In this paper's context, conversion from one encoding algorithm to another, such as MPEG-2 to MPEG-4, or from one imagery format to another, such as 1920x1080 interlaced to 1280x720 progressive video.

Acknowledgments

The author would like to thank Steve Chubb and Susan Best of Marshall Space Flight Center's Mission Operations Lab, and Walt Lindblom, SAIC, Huntsville, AL, for reviewing drafts of this paper and for their suggestions, which so improved the final version.

References

[1] CCSDS Motion Imagery and Applications Working Group, Motion Imagery & Applications, draft Green Book. The latest documents can be found at http://cwe.ccsds.org/default.aspx, under SIS-MIA Draft Documents.