Multimedia Communication Systems 1 MULTIMEDIA SIGNAL CODING AND TRANSMISSION DR. AFSHIN EBRAHIMI
2 Table of Contents
1 Introduction
1.1 Concepts and terminology
    Signal representation by source coding
    Optimization of transmission
1.2 Signal sources and acquisition
1.3 Digital representation of multimedia signals
    Image and video signals
    Speech and audio signals
    Need for compression technology
3 Table of Contents
2 Fundamentals of Signal Processing and Statistics
2.1 Signals and systems
    Elementary signals
    Systems operations
2.2 Signals and Fourier spectra
    Two- and multi-dimensional spectra
    Spatio-temporal signals
2.3 Sampling of multimedia signals
    Separable two-dimensional sampling
    Sampling of video signals
2.4 Digital signal processing in multiple dimensions
2.5 Statistical analysis
    Sample statistics
    Joint statistical properties
    Spectral properties of random signals
    Markov chain models
    Statistical foundations of information theory
4 Table of Contents
2.6 Linear prediction
    Autoregressive models
    Linear prediction
2.7 Linear block transforms
    Orthogonal basis functions
    Basis functions of orthogonal transforms
    Efficiency of transforms
    Transforms with block overlap
2.8 Filterbank transforms
    Properties of subband filters
    Implementation of filterbank structures
    Discrete wavelet transform (DWT)
    Two- and multi-dimensional filter banks
    Pyramid decomposition
5 Table of Contents
3 Perception and Quality
3.1 Properties of vision
    Physiology of the eye
    Sensitivity functions
    Color vision
3.2 Properties of hearing
    Physiology of the ear
    Sensitivity functions
3.3 Quality metrics
    Objective signal quality metrics
    Subjective assessment
6 Table of Contents
4 Quantization and Coding
4.1 Scalar quantization and pulse code modulation
4.2 Coding theory
    Source coding theorem and rate-distortion function
    Rate-distortion function for correlated signals
    Rate-distortion function for multi-dimensional signals
4.3 Rate-distortion optimization of quantizers
4.4 Entropy coding
    Properties of variable-length codes
    Huffman codes
    Systematic variable-length codes
    Arithmetic coding
    Adaptive and context-dependent entropy coding
    Entropy coding and transmission errors
    Lempel-Ziv coding
4.5 Vector quantization (VQ)
    Basic principles of VQ
    VQ with uniform codebooks
    VQ with non-uniform codebooks
    Structured codebooks
    Rate-constrained VQ
7 Table of Contents
5 Methods of Signal Compression
5.1 Binary signal coding
    Run-length coding
5.2 Predictive coding
    Open-loop and closed-loop prediction systems
    Non-linear and shift-variant prediction
    Effects of transmission losses
    Vector prediction
    Prediction in multi-resolution pyramids
5.3 Transform coding
    Gain through discrete transform coding
    Quantization of transform coefficients
    Coding of transform coefficients
    Transform coding under transmission losses
5.4 Bitstreams with multiple decoding capability
    Simulcast and transcoding
    Scalable coding
    Multiple-description coding
8 Table of Contents
6 Still Image Coding
6.1 Compression of binary images
    Compression of bi-level images
    Binary shape coding
    Contour shape coding
6.2 Vector quantization of images
6.3 Predictive coding
    2D prediction
    2D vector prediction
    Quantization and encoding of prediction errors
    Error propagation in 2D DPCM
6.4 Transform coding of images
    Block transform coding
    Overlapping-block transform coding
    Subband and wavelet transform coding
    Local adaptation of transform bases by signal properties
6.5 Synthesis based image coding
    Region-based coding
    Colour and texture synthesis
    Post filtering
6.6 Still image coding standards
9 Table of Contents
7 Video Coding
7.1 Intraframe-only and frame replenishment coding
7.2 Hybrid video coding
    Motion-compensated hybrid coders
    Characteristics of interframe prediction error signals
    Quantization error feedback and error propagation
    Reference pictures in motion-compensated prediction
    Accuracy of motion compensation
    Hybrid coding of interlaced video signals
    Optimization of hybrid encoders
7.3 Spatio-temporal transform coding
    Interframe transform and subband coding
    Motion-compensated temporal filtering
    Quantization and encoding of MCTF frames
7.4 Coding of side information (motion, modes)
7.5 Scalable video coding
    Scalable hybrid coding
    Scalable 3D frequency coding
7.6 Multi-view video coding
7.7 Synthesis based video coding
    Region-based video coding
    Distributed source coding
    Super-resolution synthesis
    Dynamic texture synthesis
7.8 Video coding standards
10 Table of Contents
8 Speech and Audio Coding
8.1 Coding of speech signals
    Linear predictive coding
    Parametric (synthesis) coding
    Speech coding standards
8.2 Audio (music and sound) coding
    Transform coding of audio signals
    Synthesis based coding of audio and sound signals
    Coding of stereo and multi-channel audio signals
    Music and sound coding standards
11 Table of Contents
9 Transmission and Storage of Multimedia Data
9.1 Digital multimedia services
9.2 Network interfaces
9.3 Adaptation to channel characteristics
    Rate and transmission control
    Error control
9.4 Definitions at systems level
9.5 Digital broadcast
9.6 Media streaming
12 Introduction
Multimedia communication systems are a flagship of the information technology revolution. The combination of multiple information types, in particular audiovisual information (speech/audio/sound/image/video/graphics) with abstract (text), olfactory or tactile information, provides new degrees of freedom in the exchange, distribution and acquisition of information. Communication includes the exchange of information between different persons, between persons and machines, or between machines only. Sufficient perceptual quality must be provided, which relates to compression and its interrelationship with transmission over networks. Advanced methodologies are based on content analysis and identification, which is of high importance for automatic user assistance and interactivity. In multimedia communication, concepts and methods from signal processing, systems and communications theory play a dominant role, where audiovisual signals pose the primary challenge regarding transmission, storage and processing complexity.
13 Introduction
Books:
- Steinmetz, R.; Nahrstedt, K.: Media Coding and Content Processing. Prentice Hall.
- Steinmetz, R.; Nahrstedt, K.: Multimedia Systems. Springer Verlag.
- Steinmetz, R.; Nahrstedt, K.: Multimedia Applications. Springer Verlag.
- Ohm, J.-R.: Multimedia Communication Technology. Springer.
Magazines:
- Multimedia Systems, ACM/Springer
- Multimedia Magazine, IEEE
14 Introduction
What is Multimedia?
A simple definition: multi + media, i.e. any kind of system that supports more than one kind of medium. Is television multimedia?
Definition: Multimedia means the integration of continuous media (e.g. audio, video) and discrete media (e.g. text, graphics, images), through which digital information can be conveyed to the user in an appropriate way.
Multi: many, much, multiple.
Medium: a means to distribute and represent information.
15 Facets of Medium
1. Perception Medium: How do humans perceive information in a computer environment? (by seeing, by hearing, ...)
2. Representation Medium: How is the information encoded in the computer? (ASCII, PCM, MPEG, ...)
3. Presentation Medium: Which medium is used to output information from the computer or to bring it into the computer? (input: keyboard, microphone, camera, ...)
4. Storage Medium: Where is the information stored?
5. Transmission Medium: Which kind of medium is used to transmit the information? (copper cable, radio, ...)
6. Information Exchange Medium (combination of storage and transmission media): Which information carrier will be used for information exchange between different locations?
16 Classification of Media
Each medium defines representation values and a representation space.
Representation values determine the information representation of different media:
- continuous representation values (e.g. electro-magnetic waves)
- discrete representation values (e.g. characters of a text in digital form)
The representation space determines the technique used to output the media information, usually visually (e.g. paper, slideshow) or acoustically (e.g. speakers).
Spatial dimensions:
- two-dimensional (2D graphics)
- three-dimensional (holography)
Temporal dimensions:
- time-independent (document): discrete media (e.g. text of a book)
- time-dependent (movie): continuous media (e.g. sound, video)
17 Data Streams
When transmitted or played out, continuous media need a set of data that changes over time, i.e. data streams. How to deal with such streams?
Asynchronous transmission:
- suitable for communication with no time restrictions (discrete media)
- e.g. electronic mail
Synchronous transmission:
- transmission may only begin at well-defined times
- a clock signal maintains the synchronization between sender and receiver
Isochronous transmission:
- periodic transmissions; the time separation between subsequent transmissions is a multiple of a certain unit interval
- a maximum and a minimum end-to-end delay for each packet of a data stream (limited jitter) is required
- an end-to-end network connection is isochronous if it has a guaranteed bit rate and if the jitter is also guaranteed and small
18 Data Stream Characteristics
Strongly periodic data streams: identical intervals T between transmissions, (optimally) no jitter. Example: uncompressed audio.
Weakly periodic data streams: the intervals vary, but in a periodically repeating pattern (T1, T2, T3, T1, T2, T3, ...). Example: segmented transmission.
Aperiodic data streams: arbitrary intervals. Example: transmission of mouse control signals.
19 Data Stream Characteristics
Strongly regular data streams: the data quantity per unit remains constant during the entire lifetime of the stream. Typical for uncompressed video/audio.
Weakly regular data streams: the data quantity varies periodically (D1, D1, D2, D2, D3, D3, ...). Can result from some compression techniques, e.g. videos coded with MPEG.
Irregular data streams: the data quantity is neither constant nor periodically changing. Typical for compressed audio/video; harder to transmit and process.
20 Data Stream Characteristics
Continuous media consist of a time-dependent sequence of individual information units: Logical Data Units (LDUs).
Example: Symphony. A symphony consists of independent movements, and movements consist of scores. Using e.g. PCM, 44,100 samples are made per second. On a CD, samples are grouped into units with a duration of 1/75 second. Possible LDUs with different granularity: movements, scores, groups, samples. In digital signal processing, sampling values are used as LDUs.
Example: Movie. A movie consists of scenes represented by clips, clips consist of single frames, and frames consist of blocks of e.g. 16x16 pixels. Pixels can consist of chrominance and luminance values. Using e.g. MPEG, inter-frame coding is applied, thus image sequences are the smallest sufficient LDUs. Hierarchy: Movie > Clips > Frames > Blocks > Pixels.
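The CD granularity above can be checked with a short computation. Assuming the standard CD audio PCM rate of 44,100 samples per second (not stated explicitly on the slide), grouping samples into 1/75-second units gives the frame-level LDU size:

```python
# CD audio LDU granularity: PCM samples grouped into 1/75-second units.
SAMPLE_RATE = 44_100     # samples per second (standard CD audio rate)
FRAME_DURATION = 1 / 75  # seconds per CD frame

samples_per_frame = round(SAMPLE_RATE * FRAME_DURATION)
print(samples_per_frame)  # 588 samples form one frame-level LDU
```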
21 Fields of the Lecture
22 Content
Basics:
- Audio Technology
- Images and Graphics
- Video and Animation
Multimedia Systems: Communication Aspects and Services:
- Voice over IP, Video conferencing
- Group Communication, Synchronization
- Quality of Service and Resource Management
Multimedia Systems: Storage Aspects:
- Optical storage media
- Multimedia file systems, Multimedia databases
Multimedia Usage:
- Design and User Interfaces, Abstractions for Programming
23 Concepts and terminology
24 Concepts and terminology
The classical model assumes independent optimization of source and channel coding as the optimum solution for best performance. A source coding method which achieves optimum compression can be extremely sensitive to errors occurring in the channel, e.g. due to feedback of previous reconstruction errors into the decoding process. This requires joint optimization of the entire chain, such that the best quality is retained for the user while the rate to be transmitted over the physical channel is made as low as possible. The classical model also assumes a passive receiver ('sink'), which is very much related to broadcast services. In multimedia systems, the user can interact and influence any part of the chain, even back to the signal generation; this is reflected by providing a back channel, which can also be used by automatic mechanisms serving the user with best-quality services. Instead of transmitter and receiver, denoting the devices at the front and back ends as server and client better reflects this new paradigm.
25 Concepts and terminology
The classical model assumes one monolithic channel, for which the optimization of source coding, channel coding and modulation is made once. Multimedia communication mostly uses heterogeneous networks, which typically have largely varying characteristics; as a consequence, it is desirable to consider the channels at a more abstract level and perform proper adaptation to the instantaneous channel characteristics. Channels can be networks or storage devices. Recovery at the client side may include analysis which goes far beyond traditional channel coding, e.g. by conveying loss characteristics to the server via the back channel. Multimedia services are becoming more 'intelligent', including elements of signal content analysis to assist the user. This includes support for content-related interaction and support in finding the multimedia information which best serves the needs of the user. Hence, the information source is not just encoded at the front end; more abstract analysis can be performed in addition, and the encoded representation may also include meta information about the content. Multimedia communication systems are typically distributed systems, which means that the actual processing steps involved are performed at different places. Elements of adaptation of the content to the needs of the network, to the client configuration, or to the user's needs can be found anywhere in the chain. Finally, temporary or permanent storage of content can also reside anywhere, as storage elements are a specific type of channel, intended for later review instead of instantaneous transmission.
26 Concepts and terminology: Quality of Service (QoS)
The QoS relating to network transmission includes aspects like transmission bandwidth, delay and losses. It indirectly contributes to the perceived quality. This will be denoted as Network QoS.
The QoS relating to perceived signal quality includes the entire transmission chain, including the compression performance of source encoding/decoding and the interrelationship with the channel characteristics. This is denoted as Perceptual QoS. An overview of methods for its measurement is given in Appendix A.1.
The QoS relating to the overall service quality is at the highest level. It includes aspects like the level of user satisfaction with the content itself, but also the satisfaction concerning additional services, e.g. how good an adaptation to the user's needs is made. Some methods used to express this category of QoS with regard to content identification are described in Appendix A.2. This may be denoted as Semantic QoS.
27 Signal representation by source coding
By multimedia signal compression, systems for transmission and storage of multimedia signals shall generate the most compact representation such that the highest possible perceptual quality is achieved. Immediately after capturing, the signal is converted into a digital representation having a finite number of samples and amplitude levels. This step already influences the final quality. If the range of rates that a prospective channel can convey, or the resolution required by an application, is not known at the time of acquisition, it is advisable to capture the signal at the highest possible quality and scale it later. In the source coder, the data rate needed for the digital representation shall be reduced as much as possible. Properties of the signal which allow reduction of the rate can be expressed in terms of redundancy (e.g. the typically expected similarity of samples from the signal). The opinion about the quality of the overall system is ruled by the purpose of consumption at the end of the chain. If the sink is a human observer, it is useful to adapt the source coding method to perceptual properties of humans, as it would be useless to convey a finer granularity of quality than the user can (or would desire to) perceive. In advanced methods of source coding, content-related properties can also be taken into consideration. This can e.g. be done by putting more emphasis on parts or pieces of the signal in which the user is expected to be most interested.
28 Signal representation by source coding
The encoded information is usually represented in the form of binary digits (bits). The bit rate is measured either in bit/sample or in bit per second (bit/s), where the latter results from the bit/sample value multiplied by the number of samples/s (the sampling rate). An important criterion to judge the performance of a source coding scheme is the compression ratio: the ratio between the bit rate necessary for representation of the uncompressed source and that of its compressed counterpart. If e.g. for digital TV the uncompressed source requires 165 Mbit/s and the rate after compression is 4 Mbit/s, the compression ratio is 165:4 = 41.25. If compressed signal streams are stored as files on computer discs, the file size can be evaluated to judge the compression performance. When translating into bit rates, it must be observed that file sizes are often measured in KByte, MByte etc., where one Byte consists of 8 bit, 1 KByte = 1,024 Byte, 1 MByte = 1,024 KByte etc.
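The digital-TV example can be reproduced numerically; the sketch below also converts an assumed file size (the 2,048 KByte value is illustrative only) into bits using the binary prefixes defined above:

```python
# Compression ratio of the digital-TV example: 165 Mbit/s uncompressed
# versus 4 Mbit/s compressed.
uncompressed_rate = 165e6  # bit/s
compressed_rate = 4e6      # bit/s
ratio = uncompressed_rate / compressed_rate
print(f"compression ratio = {ratio:.2f}:1")  # 41.25:1

# File-size bookkeeping: 1 Byte = 8 bit, 1 KByte = 1,024 Byte.
file_size_kbyte = 2_048                      # illustrative value
file_size_bit = file_size_kbyte * 1_024 * 8
print(file_size_bit)                         # 16777216 bit
```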
29 Signal representation by source coding
30 Signal representation by source coding
Signal analysis: Important principles for this are prediction of signals and frequency analysis by transforms. In coding applications, the analysis step shall be reversible; by a complementary synthesis performed at the decoder, the signal shall be reconstructed with as much fidelity as possible. Hence, typical approaches of signal analysis used in coding are reversible transformations of the signal into equivalent forms, by which the encoded representation is as free of redundancy as possible. If linear systems or transforms are used for this purpose, the removal of redundancy is often called decorrelation, as correlation expresses linear statistical dependencies between signal samples. To optimize such systems, the availability of good and simple models reflecting the properties of the signal is crucial. Methods of signal analysis can also be related to the generation (e.g. properties of the acquisition process) and to the content of signals. Besides the samples of the signal or its equivalent representation, additional side-information parameters can be generated by the analysis stage, such as adaptation parameters which are needed during decoding and synthesis.
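The decorrelation idea can be illustrated with a minimal first-order predictor applied to a synthetic correlated signal (the AR(1) model with coefficient 0.95 is an assumed toy example, not from the slides): the prediction residual carries far less energy than the signal itself, which is exactly the redundancy a coder wants to remove.

```python
import random

random.seed(0)  # deterministic toy experiment

# Synthetic correlated source: AR(1) signal with coefficient rho = 0.95.
rho = 0.95
signal = [0.0]
for _ in range(10_000):
    signal.append(rho * signal[-1] + random.gauss(0, 1))

# First-order linear prediction: predict each sample from its predecessor;
# the residual is what would actually need to be encoded.
residual = [signal[n] - rho * signal[n - 1] for n in range(1, len(signal))]

def variance(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

# Redundancy removal: the residual variance is far below the signal variance.
print(variance(signal) > 5 * variance(residual))  # True
```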
31 Signal representation by source coding
Quantization: maps the signal, its equivalent representation or additional parameters into a discrete form. If the required compression ratio does not allow lossless reconstruction of the signal at the decoder output, perceptual properties or circumstances of usage should be considered during quantization to retain as much of the relevant information as possible.
Bit-level encoding: aims to represent the discrete set of quantized values by the lowest possible rate. The optimization of encoding is mostly performed on the basis of statistical criteria.
32 Signal representation by source coding
Important parameters in optimizing a source coding algorithm are rate, distortion, latency and complexity. These parameters mutually influence each other. The relationship between rate and distortion is determined by the rate-distortion function, which gives a lower bound on the rate if a certain maximum distortion limit is required. Improved rate/distortion performance (i.e. improved compression ratio while keeping distortion constant) can usually be achieved by increasing the complexity of the encoding/decoding algorithm. Alternatively, increased latency also helps to increase compression performance; if, for example, an encoder is able to look ahead at the effects of current decisions on future encoding steps, this provides an advantage.
33 Optimization of transmission
The interface between the source coder and the channel is also of high importance for the overall Perceptual QoS. The source encoder removes redundancy from the signal, while the channel encoder adds redundancy to the bit stream for the purpose of protection and recovery in case of losses. At the receiver side, the channel decoder removes the redundancy inserted by the channel encoder, while the source decoder supplements the redundancy which was removed by the source encoder. In this sense, the operations of source encoding and channel decoding are similar, and vice versa. Actually, the more complex part is usually on the side where redundancy is removed, which means finding the relevant information within an overcomplete representation. Source and channel encoding play counteracting roles and should be optimized jointly for optimum performance.
34 Optimization of transmission
In the context of multimedia systems it is often advantageous to view the channel as a 'black box' for which a model exists. This in particular concerns error/loss characteristics, bandwidth, delay (latency) etc., which are the most important parameters of Network QoS. When parameters of Network QoS are guaranteed by the network, adaptation between source coding and the network transmission can be made in an almost optimum way. This is usually done by negotiation protocols. If no Network QoS is supported, specific mechanisms can be introduced for adaptation at the server and client sides. This includes application-specific error protection based on estimated network quality, or usage of re-transmission protocols. Introduction of latency is also a viable method to improve the transmission quality, e.g. by optimization of transmission schedules, temporary buffering of information at the receiver side before presentation is started, or scrambling/interleaving of streams when bursty losses are expected.
35 Optimization of transmission
Today's digital communication networks as used for multimedia signal transmission are based on the definition of distinct layers with clearly defined interfaces. On top of the physical transmission layer, a hierarchy of protocol stacks performs the adaptation up to the application layers. In such a configuration, optimization over the entire transmission chain can only be achieved by cross-layer signaling, which however imposes additional complexity on the transmission.
36 Signal sources and acquisition
Multimedia systems mainly process digital representations of signals, while the acquisition and generation of natural signals will in many cases not directly be performed by a digital device; electro-magnetic (microphone), optical (lens) or chemical (film) media may be involved. In such cases, the properties of the digital signal are influenced by the signal conversion process during acquisition. The analog-to-digital conversion itself consists of a sampling step, which maps a spatio-temporally continuous signal into discrete samples, and a quantization step, which maps an amplitude-continuous signal into numerical values. If natural signals are captured, part of the information originally available in the outside (three-dimensional) world is lost due to:
- limited bandwidth or resolution of the acquisition device;
- 'non-pervasiveness' of the acquisition device, which resides at a singular position in the 3D exterior world, such that the properties of the signal are available only for this specific viewing or listening point; a possible solution is the usage of multiple cameras or microphones, where however acquisition of 3D spatial information will always remain incomplete.
37 Signal sources and acquisition
38 Signal sources and acquisition
In digital imaging the signal is also sampled in the horizontal dimension, and is converted (quantized) into numerical values instead of continuous-amplitude electrical signals. The image plane of width S1 and height S2 is mapped into N1 x N2 discrete sampling locations and represents a frame sample within a time-dependent sequence. Sampled and spatially bounded images can be expressed as matrices. In the indexing of the samples, the top left pixel of the image is often assigned coordinate (0,0) and is the top left element of the matrix as well.
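As a small illustration of this matrix view (the 4x3 resolution and sample values are an arbitrary toy choice), an image with N1 columns and N2 rows indexed from the top-left corner can be sketched as:

```python
# A sampled image as a matrix: N2 rows (height) by N1 columns (width),
# with coordinate (0, 0) at the top-left.
N1, N2 = 4, 3  # toy resolution: width x height
image = [[10 * y + x for x in range(N1)] for y in range(N2)]

print(image[0][0])            # top-left sample, coordinate (0, 0): 0
print(image[N2 - 1][N1 - 1])  # bottom-right sample: 23
```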
39 Signal sources and acquisition
In analog video technology (and still in the first generations of digital video cameras), interlaced acquisition is widely used, where the even and odd lines are captured at different time instances. Here, a video frame consists of two fields, each containing only half the number of lines. This method incurs a time shift between the even and odd lines of the composite frames. When the entire frame is captured simultaneously (as done by movie cameras), the acquisition is progressive. It is expected that in the future most content will be captured progressively.
40 Digital representation of multimedia signals
The process of digitization of a signal consists of sampling (see sec. 2.2) and quantization (see more details in chapter 4). The resulting 'raw' digital format is denoted as the Pulse Code Modulation (PCM) representation. These formats are often regarded as the original references in digital multimedia signal processing applications. To capture and represent color images, the most common representation consists of three primary components of active light: red (R), green (G) and blue (B). These components are separately acquired and sampled. This results in a sample count which is higher by a factor of three compared to monochrome images. True representation of color may even require more components in a multi-spectral representation. Color images and video are often represented by a luminance component Y and two chrominance (color difference) components. For the transformation between R,G,B and luminance/chrominance representations, different definitions exist, depending on the particular application domain.
41 Digital representation of multimedia signals
For example, in standard TV resolution video, the luminance/chrominance transform defined in ITU-R BT.601 is mainly used, while for high definition (HD) video formats, the ITU-R BT.709 transform is more common. The possible color variations in the R,G,B color space are restricted such that perceptually and statistically more important colors are represented more accurately. Chrominance components are in addition usually sub-sampled, which is reasonable as the human visual sense is not capable of perceiving differences in color at the same high spatial resolution as for the luminance component.
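The transform matrices themselves did not survive in this transcription, so the sketch below assumes the standard definitions: ITU-R BT.601 luma weights for standard TV and ITU-R BT.709 weights for HD, with the usual normalization of the color-difference components to the range of roughly -0.5 to +0.5.

```python
def rgb_to_ycbcr(r, g, b, hd=False):
    """Convert normalized R,G,B in [0,1] to Y,Cb,Cr.

    Luma weights follow ITU-R BT.601 (SD) or BT.709 (HD); the
    chrominances are scaled color differences (B-Y) and (R-Y).
    """
    kr, kb = (0.2126, 0.0722) if hd else (0.299, 0.114)
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = 0.5 * (b - y) / (1 - kb)
    cr = 0.5 * (r - y) / (1 - kr)
    return y, cb, cr

# Grey has no chrominance; pure red saturates Cr at +0.5 (BT.601 case).
print(rgb_to_ycbcr(0.5, 0.5, 0.5))  # (0.5, ~0, ~0)
print(rgb_to_ycbcr(1.0, 0.0, 0.0))  # (0.299, ~-0.169, 0.5)
```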
42 Digital representation of multimedia signals
In interlaced sampling, sub-sampling of chrominances is mostly performed only in the horizontal direction to avoid color artifacts in case of motion, while for progressive sampling both the horizontal and vertical directions of chrominance can be sub-sampled into lower resolution. Component sampling ratios are often expressed in a notation C1:C2:C3 to express the relative numbers of samples. For example:
- when the same number of samples is used for all three components, as in R,G,B, the expression is '4:4:4';
- a Y,Cb,Cr sampling structure with horizontal-only sub-sampling of the two chrominances is expressed by the notation '4:2:2', while '4:1:1' indicates horizontal sub-sampling by a factor of 4;
- if sub-sampling is performed in both directions, i.e. half the number of samples in chrominances along both the horizontal and vertical directions, the notation is '4:2:0'.
The respective source format standards also specify the sub-sampled component sample positions in relation to the luminance sample positions.
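These notations translate directly into sample counts. A sketch (the 720x576 luminance raster is an assumed example) counting the luminance plus two sub-sampled chrominance components:

```python
# (horizontal, vertical) sub-sampling factors of each chrominance component.
SUBSAMPLING = {
    "4:4:4": (1, 1),  # full chrominance resolution
    "4:2:2": (2, 1),  # horizontal-only sub-sampling
    "4:1:1": (4, 1),  # horizontal sub-sampling by a factor of 4
    "4:2:0": (2, 2),  # sub-sampling in both directions
}

def total_samples(width, height, scheme):
    h, v = SUBSAMPLING[scheme]
    luma = width * height
    chroma = 2 * (width // h) * (height // v)  # two chrominance components
    return luma + chroma

for scheme in ("4:4:4", "4:2:2", "4:1:1", "4:2:0"):
    print(scheme, total_samples(720, 576, scheme))
```

Note that '4:1:1' and '4:2:0' yield the same total sample count; they differ only in where the chrominance resolution is sacrificed.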
43 Digital representation of multimedia signals
44 Digital representation of multimedia signals
For video representation, besides the total number of bits required e.g. to store a movie, the number of bits per second is important for transmission. It is obtained by multiplying the number of bits per frame by the number of frames per second, instead of by the total number of frames.
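For instance, assuming ITU-R BT.601-style 4:2:2 sampling (720x576 luminance samples, two half-width chrominance components, 8 bit per sample) at 25 frames per second, this multiplication reproduces the uncompressed digital-TV rate of roughly 165 Mbit/s quoted earlier:

```python
width, height, fps, bits_per_sample = 720, 576, 25, 8

# 4:2:2: full-resolution luminance plus two half-width chrominances.
samples_per_frame = width * height + 2 * (width // 2) * height
bits_per_frame = samples_per_frame * bits_per_sample
rate = bits_per_frame * fps

print(f"{rate / 1e6:.1f} Mbit/s")  # 165.9 Mbit/s
```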
45 Digital representation of multimedia signals
For standard TV resolution, the source of the digital TV signal is the analog TV signal of 625 lines in Europe (525 lines in Japan or the US), typically recorded by an interlaced scheme. These analog signals are sampled at a rate of 13.5 MHz for the luminance. After removal of vertical blanking intervals, 575 (480) active lines remain. The horizontal blanking intervals (for line synchronization) are also removed, which gives around 704 active pixels per line. The digital formats listed in Tab. 1.2 store only those active pixels, with a very small overhead of a few surplus pixels from the blanking intervals. Japanese and US (NTSC) formats traditionally use 60 fields per second (30 frames per second), while in Europe, 50 fields per second (25 frames per second) is used in analog TV (PAL, SECAM). The digital standards defining HD formats are more flexible in terms of frame and field rates, allowing 24, 25, 30, 50 or 60 frames/second and 50 or 60 fields/second; movie material, interlaced and progressive video are supported. For higher resolutions, the '720p' format (720 lines progressive) is widely used in professional digital video cameras. All 'true' HDTV formats have 1080 lines in the digital signal. There are other commonly used formats, some of which are generated by digitally down-converting the standard TV resolution, e.g. the half horizontal resolution (HHR), the Common Intermediate Format (CIF) or Standard Intermediate Format (SIF), and the Quarter CIF (QCIF). For computer displays or mobile devices, formats such as VGA and QVGA are also commonly used. Higher resolutions beyond HD are currently expected to emerge from the professional area (Digital Cinema) into consumer applications. Current plans are to introduce formats with double the number of samples horizontally/vertically as compared to HD1080, then called 4Kx2K, or quadrupling the number (8Kx4K). Those formats will only support progressive sampling, but frame rates may become even higher in the future (72 frames per second and beyond).
46 Digital representation of multimedia signals
This figure gives a coarse impression of the sampled image areas supported in formats between QCIF and HDTV. An increased number of samples can either be used to increase the resolution (spatial detail), or to display scenes at a wider angle. For example, in a cinema movie, close-up views of human faces are rarely shown. Movies displayed on a cinema screen allow the observer's eye to explore the scene, while on standard definition TV screens, and even more so for the smaller formats, this capability is very limited. For medical and scientific purposes, digital images with much higher resolution than in movie production are used; resolutions of up to 10,000x10,000 = 100,000,000 pixels are quite common. Such formats are not yet realistic for real-time acquisition by digital video cameras, as the clock rates for sampling would be extremely high.
47 Digital representation of multimedia signals
Speech and audio signals: For audio signals, parameters such as sampling rate and precision (bit depth) have the greatest influence on the resulting data rates of the digital representation. These parameters highly depend on the properties of the signals and on the requirements for quality. In speech signal quantization, nonlinear mappings using logarithmic amplitude compression are used, which for low amplitudes provide quantization noise as low as in 12-bit uniform quantization, even though only 8 bit/sample are used. For music signals acquired at audio CD quality, a linear 16-bit representation is most commonly used. For some specialized applications, even higher bit depths and higher sampling rates than for CD are used.
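The logarithmic compression principle can be sketched with a mu-law style characteristic (mu = 255 as in G.711-type speech coding; this is the smooth formula illustrating the idea, not the exact piecewise-linear codec):

```python
import math

MU = 255.0  # mu-law parameter used in G.711-type speech coding

def compress(x):
    """Logarithmic amplitude compression of x in [-1, 1]."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def expand(y):
    """Inverse mapping (expander)."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

# Low amplitudes are expanded before uniform 8-bit quantization, so they
# receive relatively fine quantization steps; the mapping is invertible.
print(compress(0.01) > 10 * 0.01)                # True: small inputs boosted
print(abs(expand(compress(0.5)) - 0.5) < 1e-9)   # True: round trip
```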
Need for compression technology

Due to the tremendously high data rates necessary to represent the original uncoded formats, the need for data compression through image, video and audio coding is permanently present, even though the available transmission bandwidth keeps increasing through advances in communications technology. In general, past experience has shown that multimedia traffic grows faster than new capacity becomes available, and compressed transmission of data is inherently cheaper. If sufficient bandwidth is available, it is used more efficiently, in terms of the quality delivered to the user, by increasing the resolution of the signal. Further, certain types of communication channels exist (in particular in mobile transmission) where bandwidth is inherently expensive due to physical limitations. This must, however, be weighed against the complexity necessary to implement a compression algorithm, which may lead to higher device cost and higher power consumption, the latter being particularly critical for mobile devices.
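The gap between uncompressed rates and typical channel capacities can be illustrated with a back-of-the-envelope calculation. The channel rates used here (a 10 Mbit/s broadcast channel, a 128 kbit/s audio stream) are ballpark assumptions for illustration, not figures from the text:

```python
# Compression factor needed to fit an uncompressed signal into a channel.

def compression_factor(raw_bits_per_s, channel_bits_per_s):
    """Ratio of uncompressed rate to available channel rate."""
    return raw_bits_per_s / channel_bits_per_s

# HD1080 at 50 frames/s, 4:2:0 sampling, 8 bit -> 12 bit/pixel on average
raw_hd = 1920 * 1080 * 50 * 12      # ~1.24 Gbit/s uncompressed
# CD audio: 2 channels x 44100 samples/s x 16 bit/sample
raw_cd = 2 * 44100 * 16             # ~1.41 Mbit/s uncompressed

print(f"HD video into a 10 Mbit/s channel: "
      f"factor {compression_factor(raw_hd, 10e6):.0f}")
print(f"CD audio into a 128 kbit/s stream: "
      f"factor {compression_factor(raw_cd, 128e3):.1f}")
```

A required compression factor above 100 for HD video makes clear why the efficiency of the coding algorithms treated in the following chapters matters more than raw channel growth.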