RESTORATION OF ARCHIVED TELEVISION PROGRAMMES FOR DIGITAL BROADCASTING

J-H Chenot (1), J.O. Drewery (2) and D. Lyon (3)
(1) INA, France; (2) BBC R&D, UK; (3) Snell & Wilcox, UK

ABSTRACT

The increasing number of television channels and the demand for access to the national heritage of old film have created a need for restoration techniques that allow a wider range of archived material to be used. This paper begins by explaining the drawbacks of the current methods of archive restoration and concludes that successful exploitation of the vast archives held by broadcasters will be achieved only by an automated system able to work in real time while remaining open to manual intervention. This is the starting point of the European AURORA consortium, whose work is then described. The nature of common defects is discussed, together with the specification of the user requirements of the system. Prototype hardware is then described, composed of two units which deal with large-area and localised defects respectively. Finally, mention is made of assessment techniques, but it is too early to quote results as the project has not yet reached completion.

INTRODUCTION

There is an important market for archive material because of its historical and scientific value. It is commonly used in repeats of old broadcasts and for the insertion of short-length material into news programmes or documentaries, as well as for historical research. Accurate and cost-effective restoration of this valuable source of information is essential, but the prohibitive cost of archive restoration currently limits the level of penetration into the restoration market.

The need for restoration has, however, increased with the introduction of digital delivery. This is because such delivery uses compression techniques that rely heavily on the existence of redundancy in the picture, that is, on the picture information being predictable to some degree. Artefacts commonly encountered in archive material use up a significant proportion of the digital capacity, leaving less capacity for conveying the real information.

The pan-European AURORA (AUtomated Restoration of ORiginal film and video Archives) consortium was set up in 1995 to address some of the issues and problems encountered in archive restoration. All the AURORA partners have an interest in improving the quality of film and television archives: either they have archives of their own, are users of current archive restoration techniques, or have an existing research base in this field. This paper explores the research performed by the AURORA consortium into the possibilities offered by automated, real-time archive restoration, a process that requires an efficient, modern and cost-effective solution. The project will conclude at the end of 1998, when pre-production prototype systems will be manufactured.

CURRENT STATE OF THE ART

Film and video based material bears the inherent defects of its medium and of the equipment on which it was generated, together with additional defects from capture and storage. The first stage of archive restoration is therefore transfer onto a more stable (digital) medium, such as computer disk or digital video tape; the second stage is the restoration itself.

Most real-time restoration is limited to correction of colour and aperture distortion and to noise reduction. The latter is limited to low levels because of the "dirty window" effect: if the noise reducer works by temporal integration, the image appears to move against a fixed background of residual noise. It is thought, however, that this situation could be considerably improved if the noise reducer used motion compensation.
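To illustrate the dirty-window behaviour, the following minimal sketch (Python/NumPy, written for this discussion rather than taken from any system described here; the recursion constant and synthetic frames are arbitrary) implements plain temporal integration as a first-order recursive filter. Because the accumulator is never aligned to the scene motion, its residual noise stays fixed on screen while the picture moves behind it; a motion-compensated version would warp the accumulator towards each new frame before mixing.

    import numpy as np

    def recursive_noise_reduce(frames, k=0.25):
        """First-order recursive temporal filter: out = k*new + (1-k)*accumulator.

        Without motion compensation the accumulator is a fixed 'window' of
        residual noise that a moving picture appears to slide behind.
        `frames` is an iterable of equally sized luma arrays.
        """
        acc = None
        for frame in frames:
            f = frame.astype(np.float32)
            if acc is None:
                acc = f.copy()
            else:
                # A motion-compensated reducer would warp `acc` towards `f`
                # (using per-pixel vectors) before this mix.
                acc = k * f + (1.0 - k) * acc
            yield acc

    # Illustration on synthetic noisy frames (hypothetical data only)
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        noisy = 20 * rng.standard_normal((10, 64, 64)).astype(np.float32)
        for out in recursive_noise_reduce(noisy):
            pass
        print("residual noise std:", out.std())  # noticeably lower than the ~20 input std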
Badly impaired programmes can only be restored by non-real-time operations, which compel the operator to spend a great deal of time specifying complex suites of operations. Even then the result is often little better than the original. The tools used today for restoring television programmes also suffer from a large number of different and incompatible interfaces, and the operator is constrained to interact with the different systems in a specific order.

Even when tools have the possibility of storing edit lists or timecoded sequences of instructions, the lack of a common centralised tool prevents the storage, in a single file, of all the instructions necessary to repeat the restoration process. As a consequence, it is not possible to change one restoration component without respecifying all the others.

Some disk-based tools designed for special effects and non-linear editing are sometimes used for moving-image restoration. These systems may have the following features:

- Uncompressed image storage on disks (optional compression)
- Random access to images
- Real-time playback of source and processed images (limited buffer size, from a few seconds to tens of minutes)
- Some or all effects computed on demand; this takes time, but the results are visible in real time after computation
- Dedicated to non-linear editing and/or to special effects
- VTR control for importing and exporting images

As far as restoration is concerned, these systems are generally used as rotoscoping tools, for semi-interactive dirt and scratch concealment, and for image stabilisation. Most of these features need a considerable operator commitment to give an acceptable result. In some cases, filters may be applied to images, with very long batch processing times, to sharpen the image and to reduce noise and grain.

The main drawback of these tools is the very long time spent per image. For dirt and scratch concealment, a common claim is that one can restore several frames per minute (the work is usually of very high quality). As far as noise reduction is concerned, specialised noise reducers produce a superior result in real time (in video resolution only). A second drawback is the lack of tools specifically designed for video restoration.

So it can be seen that archive restoration is expensive because of the amount of time spent by the operator on manual processing of the material. A system that worked in real time would reduce the time spent on restoring source material to something closer to the actual length of the material, rather than hours and days. An automated real-time archive restoration system would be more cost effective, more efficient and less labour intensive than current methods.

AIMS OF THE AURORA PROJECT

The main aims of the AURORA project are:

- To analyse film and video archive material to identify the nature, frequency and relative severity of impairments
- To develop and test algorithms which offer more effective solutions for the removal or suppression of artefacts in the video domain
- To develop hardware prototype equipment that implements the algorithms and techniques and provides for the continuous automatic removal of artefacts, complying with an outline system specification
- To provide a system capable of performing interactive archive restoration in real time
- To assess the performance of the processing system through extensive field trials

THE NATURE OF FILM AND VIDEO ARTEFACTS

One of the first pieces of work completed by the archive restoration team was to identify the range of impairments, the frequency with which each occurred and the importance of correcting them. Many defects were found and catalogued; in total some 167 were identified from both video and film sources. Analysis based on the frequency and severity with which they occurred provided the basis for prioritising the need for correction. The most frequent and severe defects, for both video and film and covering both local-area and whole-picture impairments, are listed in Table 1.

TABLE 1: The major classes of defects identified (video and film, local-area and whole-picture): random noise, video noise, break-up, camera shake, scratches, film judder, colour errors, sparkle, film grain, drop-out, flicker, dirt, unsteadiness and missing frames.

USER REQUIREMENTS

The structure of the proposed restoration system was defined by devising a list of known functions for removing various types of artefact. Some basic manipulations already existed in current restoration equipment, including non-linear editing, mixing, keying, DVE (2D) and colour correction, but most functions were new and unique to the project. Table 2 shows a sample of these. It was important that a real-time correction system should be self-adaptive to the severity of the defect, according to the content of the image (detail and motion), but also retain the capacity to be optimised manually.

TABLE 2: A sample of the non-standard functions to be provided by the proposed archive restoration system

Examples of impairments                                Non-standard function
Noise, grain                                           Noise/grain and continuous impairment correction
Sparkle, loss of emulsion, dirt, drop-outs             Masks for erratic and impulsive impairments
Film and video scratches, hum, kinescope moire,        Compensation for fixed or structured impairments
fixed dirt, dirty window
Banding, PAL phase error, non-uniform colour,          Colour and luminance stabilisation
colour fading
Trail, echoes, detail enhancement                      Filters
Unsteadiness                                           Positional stabilisation
Flicker                                                Luminance stabilisation
Lens imperfections, image spreading, scanning spot     Edge correction
size in camera tube, display spot size,
multi-generation loss, channel bandwidth loss,
VTR filter limitations

From the knowledge of the drawbacks of current systems, one can draw up an outline of a possible future television archive restoration system:

- The system will be fully integrated: the operator will simply tell the system what he wants as a result, without specifying how to do it. For example, "Here there is a piece of dirt; find it and do your best."
- All the operations will be stored in a Restoration Plan, which can be edited up to the last minute, and the final restored programme will be conformable on demand (for example, when the destination VTR is available). This requires that the integration is complete enough to prevent the operator from being tempted to use the different tools manually, without storing his settings in the Restoration Plan. (A minimal sketch of such a plan appears after this list.)
- The system will make considerable use of pixel-wise motion estimation and motion compensation. This is the only way of automating efficient noise reduction and dirt and drop-out concealment. For the moment, and still for several years, such real-time processing requires computing power largely beyond the capability of current software tools; dedicated hardware has to be built.
- It will run in real time, tape to tape, for the largest part of its work. Batch processing of segments will be reserved, as much as possible, for complex effects limited to very short sequences.
- Editing tools will be integrated to allow, when necessary, local correction, editing out or replacing scenes that are too damaged.
- There will be seamless integration between the real-time (tape-to-tape) mode, where the process is carried out continuously, and the off-line mode, where the user will be able to work down to the field level or tune a set of parameters on a short looping sequence. Delays associated with the pre-roll and post-roll of broadcast VTRs will be suppressed.
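By way of illustration only, a Restoration Plan of this kind might be held as a timecoded list of operations in a single file, along the lines of the following sketch (Python; the field names, operation names and JSON format are assumptions for the example, not the project's actual file format).

    from dataclasses import dataclass, field, asdict
    from typing import Dict, List
    import json

    @dataclass
    class Operation:
        """One restoration step applied over a timecoded span of the programme."""
        tc_in: str                  # e.g. "10:01:20:12" (hh:mm:ss:ff)
        tc_out: str
        tool: str                   # e.g. "noise_reduce", "dirt_conceal", "stabilise"
        params: Dict[str, float] = field(default_factory=dict)

    @dataclass
    class RestorationPlan:
        """Every operation for a programme, editable until the plan is conformed."""
        programme_id: str
        operations: List[Operation] = field(default_factory=list)

        def add(self, op: Operation) -> None:
            self.operations.append(op)

        def save(self, path: str) -> None:
            with open(path, "w") as f:
                json.dump(asdict(self), f, indent=2)

    # Hypothetical usage: edit the plan at will, then conform when the VTR is free.
    plan = RestorationPlan("archive_tape_0042")
    plan.add(Operation("10:00:00:00", "10:05:00:00", "noise_reduce", {"strength": 0.3}))
    plan.add(Operation("10:02:13:05", "10:02:13:07", "dirt_conceal", {}))
    plan.save("restoration_plan.json")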
These requirements result in the need for temporary disk storage of the video and audio signals. The size of the cache is no longer a limitation: video disk stores currently allow for typically one hour of uncompressed digital video storage. But the loading and off-loading of the cache have to run in the background, without affecting the interactive part of the work.
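One way to picture this is a background worker that fills the cache from the source while the interactive tools read only from the cache, as in the rough Python sketch below (the field source, cache store and sizes are invented placeholders, not the AURORA control software).

    import threading

    def ingest_worker(source_fields, cache_writer, stop_event):
        """Copy fields from the source into the disk cache in the background."""
        for n, data in enumerate(source_fields):
            if stop_event.is_set():
                break
            cache_writer(n, data)   # one field written to the cache store

    # Hypothetical wiring: the interactive tools read from `cache`, never the tape.
    cache = {}
    stop = threading.Event()
    fields = (b"\x00" * (1440 * 288) for _ in range(100))   # stand-in for SDI fields
    worker = threading.Thread(target=ingest_worker,
                              args=(fields, cache.__setitem__, stop), daemon=True)
    worker.start()
    worker.join()
    print(len(cache), "fields cached")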

THE HARDWARE PROTOTYPE

Initially, a signal processing chain consisting of a parallel or serial arrangement of individual modules, each performing a unique operation, was devised. This was abandoned after investigating the nature of the defects; instead, processing would take place according to the type and severity of defect encountered by the system. Small, random and irregular features such as noise, dirt and scratches can be corrected using pixel-by-pixel methods, whereas large-scale artefacts like unsteadiness and flicker, which affect the entire picture, are best corrected using whole-area or block-based correction methods.

As a result, a hardware solution was developed, which was to be composed of two prototype units. These would provide separate solutions for large-scale and small-scale impairments, using whole-area and pixel-by-pixel corrections respectively. The prototype would also have an automatic defect detection and flagging system and would provide an experimental graphical user interface to allow interactive user access to the various functions.

The modules comprising the AURORA hardware prototype are shown in Figure 1. The pre-processed video signal is flicker corrected, then passed to both the motion estimation system and the unsteadiness correction unit. The appropriate global vectors from the motion estimator are passed to the unsteadiness corrector for processing, before local-area scratch and dirt removal and spatial noise reduction. The final step is motion-compensated recursive noise reduction, which requires high-quality forwards and backwards motion vectors in order to work correctly. If these vectors are not provided, then processing stops at the output of the spatial filter.

Figure 1 - Block diagram of the real-time restoration units (flicker estimation and correction, unsteadiness correction, motion estimator, scratch detector, spatial noise reduction, and recursive noise reduction with dirt, scratch and large-area repair)

Motion Estimation

The accurate prediction of motion between successive fields (or frames in film) is a key element in picture reconstruction, especially where large areas of a frame are missing. Without this technique, blurred images and aliasing would soon become a problem and introduce a new array of faults into the archive material. One of the AURORA partners already had considerable experience of the phase correlation technique (1) and this was imported as a subsystem of the AURORA hardware. The system is actually a combination of two methods of motion estimation, broad movement measurement and pixel-by-pixel measurement, which have been combined to give the most accurate picture repair. The noise reduction unit and the unsteadiness and flicker unit both make use of motion estimation in their processing chains.

The Prototype Unsteadiness And Flicker Unit

Unsteadiness. Picture unsteadiness can be detected from the global motion vectors derived from phase correlation and corrected by shifting the picture so that it appears steady (2); a sketch of this steadying step is given below, after the flicker example. The picture is also enlarged slightly to prevent the picture edges becoming visible.

Amplitude flicker. Low-frequency flicker from both film and TV cameras will have the same intensity in both fields, so a correction algorithm was developed to correct the intensity globally across each frame (3). The misalignment caused by twin-lens effects can be corrected by identifying motion vectors from the inter-field displacements and then fitting a surface to each field which describes the spatial variation of the vectors, while the brightness variation is corrected as for amplitude flicker (4).
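As a crude illustration of the amplitude-flicker idea, the sketch below (Python/NumPy) applies a global gain and offset to each field so that its mean and spread track a slowly updated reference; the smoothing constant and the 8-bit range are assumptions, and the published algorithm (3) is considerably more sophisticated.

    import numpy as np

    def deflicker(fields, smooth=0.9):
        """Equalise global brightness field by field.

        Each field is scaled and offset so that its mean and standard deviation
        match a slowly updated reference, suppressing low-frequency flicker.
        """
        ref_mean = ref_std = None
        for f in fields:
            f = f.astype(np.float32)
            m, s = f.mean(), f.std() + 1e-6
            if ref_mean is None:
                ref_mean, ref_std = m, s
            gain = ref_std / s
            corrected = (f - m) * gain + ref_mean
            # Let the reference drift slowly so genuine lighting changes survive.
            ref_mean = smooth * ref_mean + (1 - smooth) * m
            ref_std = smooth * ref_std + (1 - smooth) * s
            yield np.clip(corrected, 0, 255)   # assuming 8-bit luma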

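The global part of the motion measurement and the steadying shift can be pictured with the phase-correlation sketch below (Python/NumPy, whole-frame FFTs and integer-pixel shifts only), a simplification of the techniques described in (1) and (2) rather than the hardware implementation. In practice the real system refines this broad measurement with pixel-by-pixel vectors, which the sketch does not attempt.

    import numpy as np

    def global_shift(prev, curr):
        """Estimate the dominant translation between two frames by phase correlation."""
        F1 = np.fft.fft2(prev.astype(np.float32))
        F2 = np.fft.fft2(curr.astype(np.float32))
        cross = F1 * np.conj(F2)
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Map the correlation peak position to a signed shift.
        h, w = corr.shape
        if dy > h // 2:
            dy -= h
        if dx > w // 2:
            dx -= w
        return dy, dx

    def steady(frames):
        """Shift each frame so that the accumulated unsteadiness is cancelled."""
        prev, off_y, off_x = None, 0, 0
        for f in frames:
            if prev is not None:
                dy, dx = global_shift(prev, f)
                off_y += dy
                off_x += dx
            prev = f
            # np.roll stands in for the corrective shift; the real unit also
            # enlarges the picture slightly so the moved edges stay hidden.
            yield np.roll(np.roll(f, off_y, axis=0), off_x, axis=1)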
The Prototype Noise Reduction Unit

Recursive noise reduction. The recursive noise reduction hardware performs an average over a sequence of fields from the past, present and future (5). Normally this highly effective technique can only be used when the motion of objects between frames is very small, but motion compensation allows it to be used in many more situations. Large missing areas, or entirely missing frames or fields, do not, however, provide a full set of forwards and backwards motion vectors, and for this case a non-temporal, spatial noise reduction system is required.

Spatial noise reduction. Spatial noise reduction takes place using wavelets, which is an effective method of noise reduction although it does not perform as well as recursive noise reduction. Spatial filters, such as median filters, are readily available, but could always benefit from improvements. The spatial filter used in this project works by decomposing the picture, via convolution with a wavelet transform function and the use of low-pass and high-pass filters, into a series of components containing low- and high-frequency information. The highest-frequency components are generally noise and their removal does not result in a significant loss of detail from the image. The relevant frequency ranges are removed by means of a coring or thresholding function, before the picture is reassembled. This system has the major advantage that it does not require motion vectors and works on one frame at a time, but it does not give the same level of noise reduction as the recursive noise reducer. The output from the spatial filtering system forms part of the input to the recursive noise reducer and, when the motion vectors cannot be used, the output from the spatial filter passes through the unit unchanged (a toy sketch of this arrangement is given at the end of this section).

Large area reconstruction. Large area reconstruction techniques are used for dirt, scratch and large drop-out removal (6). They are usually performed by motion compensation followed by temporal interpolation. Dirt detection is performed in real time for a variety of blotch sizes on individual frames. All the edges in the picture are flagged, then single-pixel blemishes are deleted by replacing them with the average of the surrounding pixels, having first ensured that there are no edges present. A detection is treated as a false alarm if the activity of the region does not change significantly after the operation. Dirt detection can also be performed using the recursive noise reducer.

Scratches have very different properties from dirt. As they usually persist for more than one frame, they are not temporal discontinuities and cannot be treated as such. A scratch is not curved and it is assumed to extend across the entire vertical height of the frame. To detect scratches, which possess characteristic sidelobes, the image is vertically subsampled in order to suppress noise and enhance vertical line features. The signal is then thresholded and the candidate lines flagged. False alarms can be caused by vertical, straight and narrow objects such as railway lines and telegraph poles; those candidates which are not false alarms are removed by spatial interpolation.
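Bringing the two noise reducers together, the following toy sketch (Python/NumPy) shows an arrangement in which the spatial filter always runs and the motion-compensated temporal average is added only where the vectors are trusted; the simple blur standing in for the wavelet coring, the nearest-neighbour warp and the per-frame confidence flag are all assumptions made for brevity. Feeding the recursive stage with the spatially filtered picture, rather than the raw input, matches the behaviour described above: where the vectors fail, the output simply falls back to the spatial result.

    import numpy as np

    def spatial_filter(frame):
        """Stand-in for the wavelet coring stage: a light separable blur."""
        f = frame.astype(np.float32)
        ker = np.array([0.25, 0.5, 0.25], np.float32)
        f = np.apply_along_axis(lambda r: np.convolve(r, ker, mode="same"), 1, f)
        return np.apply_along_axis(lambda c: np.convolve(c, ker, mode="same"), 0, f)

    def warp(frame, vectors):
        """Shift each pixel by its (dy, dx) vector; nearest-neighbour for brevity."""
        h, w = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        sy = np.clip(ys + vectors[..., 0], 0, h - 1).astype(int)
        sx = np.clip(xs + vectors[..., 1], 0, w - 1).astype(int)
        return frame[sy, sx]

    def noise_reduce(frames, motion, k=0.3):
        """Spatial filter always runs; the temporal average is blended in only
        where the motion vectors are trusted, otherwise the spatial output
        passes through unchanged."""
        acc = None
        for frame, (vectors, ok) in zip(frames, motion):
            spat = spatial_filter(frame)
            if acc is None or not ok:
                out = spat                      # no usable vectors: spatial only
            else:
                out = k * spat + (1 - k) * warp(acc, vectors)
            acc = out
            yield out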
COMPLETE SYSTEM

As mentioned earlier, there will always be parts of the material that cannot be adequately restored by the hardware and must therefore be processed in non-real-time, requiring manual intervention. Part of the function of the hardware is to flag these portions so that the operator's attention can be drawn to them. They may then be processed according to a restoration plan, dependent on the information from the flags. The non-real-time processing will be carried out on a workstation, taking its input from a disk server and writing its output to another. The disk servers, in turn, will interact with digital video tape recorders, which will form the primary input and output of the system. So the control of the devices and the routing of the signal between the real-time hardware units, the servers and the VTRs will amount to a further function of the workstation.

ASSESSMENT OF THE SYSTEM

Once a system has been designed it is important to know how well it works. The AURORA work plan therefore includes an assessment phase comprising two separate exercises. The first exercise will involve a series of subjective tests, conducted with non-expert observers. The tests have been designed in co-operation with the EBU TAPESTRIES group and will comprise both double-stimulus and single-stimulus varieties. In the first variety, the material processed will contain predominantly a single kind of artefact whereas, in the second variety, combinations of artefacts will be allowed and the material will be longer.

The second exercise will seek the views of expert users by allowing them to work with the equipment, processing many hours of material. Reactions will be fed back to the system designers to help improve the prototype.

CONCLUSION

This project, due to complete in late 1998, aims at producing and evaluating prototype hardware. It is hoped that the overall performance will significantly advance on that of equipment currently available commercially and lead to the next generation of archive restoration equipment, with the significant advantage of close-to-real-time, cost-effective processing. The final phase of the project will be to undertake restoration trials by three broadcasters using a wide range of damaged archive material.

ACKNOWLEDGEMENT

The authors wish to acknowledge the contributions to the project by their colleagues, particularly the following: A. Kokaram, Trinity College, Ireland; L. Laborelli, INA, France; R. Prytherch, BBC I&A, UK; S. Sommerville, Snell & Wilcox, UK; P. van Roosmalen, TU Delft, The Netherlands; T. Vlachos, Univ. of Surrey, UK; M. Weston, Snell & Wilcox, UK.

REFERENCES

1. Lau, H. and Lyon, D. 1992. Motion compensated processing for enhanced slow-motion and standards conversion. IBC 92 (Amsterdam), 4-7 July 1992. IEE Conference Publication No. 358, pp 62-66.

2. Vlachos, T. 1996. Improving the efficiency of MPEG-2 coding by means of film unsteadiness correction. SPIE Proc. Digital Compression Technologies and Systems for Video Communications, Vol. 2952, pp 534-542, October 1996.

3. van Roosmalen, P., Lagendijk, R.L. and Biemond, J. Correction of intensity flicker in old film sequences. Submitted to IEEE Trans. on Circuits and Systems for Video Technology.

4. Vlachos, T. and Thomas, G.A. 1996. Motion estimation for the correction of twin-lens telecine flicker. IEEE Proc. ICIP 96, Vol. 1, pp 109-112, September 1996.

5. Drewery, J.O., Sanders, J.R. and Storey, R. 1978. An adaptive noise reducer for PAL and NTSC signals. IBC 78, 25-29 September 1978. IEE Conference Publication No. 166, pp 231-233.

6. Kokaram, A. 1998. Motion Picture Restoration: Digital Algorithms for Artefact Suppression in Archived Film and Video. Springer-Verlag.