A Composition for Clarinet and Real-Time Signal Processing: Using Max on the IRCAM Signal Processing Workstation

Cort Lippe
IRCAM, 31 rue St-Merri, Paris, 75004, France
email: lippe@ircam.fr

Introduction.

The composition Music for Clarinet and ISPW, by the author, was created using the IRCAM Signal Processing Workstation (ISPW) and the software Max. The piece was commissioned by the Center for Computer Music & Music Technology, Kunitachi College of Music, Tokyo, and realized at IRCAM during 1991 and at the Kunitachi College of Music during a composer residency in 1991-92.

From 1988 to 1991, IRCAM developed a real-time digital signal processing system, the IRCAM Signal Processing Workstation (ISPW) [1]. Miller Puckette has developed a version of Max for the ISPW that includes signal processing objects in addition to many of the standard objects found in the Macintosh version of Max [2][3]. Currently, there are over 40 signal processing objects in Max. Objects exist for most standard signal processing tasks, including filtering, sampling, pitch tracking, threshold detection, direct-to-disk recording, delay lines, and FFTs. With the ISPW version of Max, the flexibility with which one creates control patches in the original Macintosh version of Max carries over into the domain of signal processing.

Prototyping Environment.

The ability to test and develop ideas interactively plays an important role in musical applications. Because of its single architecture, the ISPW is a powerful prototyping and production environment for musical composition [4]. Prototyping in a computer music environment often combines musical elements which traditionally have fallen into the categories of orchestra (sound generators) or score (control of sound generators).
Mainly due to computational limitations, real-time computer music environments have traditionally placed hardware boundaries between orchestra and score: the sound generation is done on one machine while the control is done remotely from another. When developing a synthesis algorithm which makes extensive use of real-time control, it is extremely helpful, if not essential, to develop the synthesis algorithm and the control software together. This is greatly facilitated when sound generation and control run on the same machine and in the same environment.

Control and Signal Processing.

Real-time signal analysis of instruments for the extraction of musical parameters gives composers useful information about what an instrumentalist is doing. One of the signal processing objects found in Max offers rapid and accurate pitch detection. In Music for Clarinet and ISPW, the incoming clarinet signal is converted via an analog-to-digital converter and analyzed by this pitch-detection algorithm. The pitch tracker outputs MIDI-style pitches which are sent to a score follower [5] (using the explode object [6]). As the score follower advances, it triggers the electronic score, which is stored in event lists. The event lists directly control the signal processing modules. In parallel, compositional algorithms also control the signal processing. These compositional algorithms are themselves controlled by the information extracted from the clarinet input. Thus, the raw clarinet signal, its envelope, continuous pitch information from the pitch detector, the direct output of the score follower, and the electronic score all contribute to control of the compositional algorithms employed in the piece (see Figure 1).

Figure 1. Control and signal processing flow: the microphone and analog-to-digital converter feed the pitch tracker; the pitch tracker feeds the score follower and compositional algorithms; the score follower triggers the event list; event list and compositional algorithms control the signal processing modules, whose outputs pass through digital-to-analog converters to sound distribution.

The signal processing used in Music for Clarinet and ISPW includes several standard signal processing modules: reverb, delay, harmonizing, flanging, frequency shifting, spatializing, and frequency/amplitude modulation.
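The control chain described above — pitch-tracker output matched by a score follower, which in turn fires stored event lists — can be sketched outside Max. The following Python sketch is a hypothetical illustration only, not the Max explode object: the score, module names, parameters, and the semitone tolerance are all invented for the demo.

```python
# Illustrative score-follower-driven control (not the Max "explode"
# object): tracked pitches advance a pointer through a stored score,
# and each matched position fires its event-list control messages.

SCORE = [62, 64, 66, 67, 69]        # expected pitches (MIDI note numbers)

# Event list: score position -> control messages for DSP modules.
EVENT_LIST = {
    0: [("harmonizer", "transposition", -5)],
    2: [("reverb", "time", 4.0), ("flanger", "depth", 0.3)],
    4: [("spatializer", "position", 90)],
}

class ScoreFollower:
    def __init__(self, score, events, tolerance=1):
        self.score = score
        self.events = events
        self.pos = 0                  # next expected score position
        self.tolerance = tolerance    # allowed pitch error, semitones

    def hear(self, pitch):
        """Feed one tracked pitch; return any triggered control messages."""
        if self.pos < len(self.score) and \
                abs(pitch - self.score[self.pos]) <= self.tolerance:
            fired = self.events.get(self.pos, [])
            self.pos += 1
            return fired
        return []                     # wrong note or noise: stay put

follower = ScoreFollower(SCORE, EVENT_LIST)
# Simulated tracker output: 63 is a near-miss for 64 (still advances),
# 61 is a spurious detection (ignored).
for detected in [62, 63, 61, 66, 67, 69]:
    for module, param, value in follower.hear(detected):
        print(f"{module}.{param} <- {value}")
```

The tolerance makes the follower robust to small tracker errors, at the price of occasionally accepting a wrong note one semitone away.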

Several non-standard sampling techniques are also used, including a time-stretching algorithm, developed by Puckette, which allows for the separation of sample transposition and sample duration. Thus, one can slow down a sample playback while maintaining the original pitch, or change the pitch of a sample playback without changing its duration. Another sampling technique, a kind of granular sampling developed from techniques described by Xenakis [7] and Roads [8] for sound synthesis, is also used. Ten-second sound samples can be played back in a variety of ways and orderings, taking approximately 20-millisecond sound grains of the sample at a time. (All of the samples are made up of clarinet phrases sampled in real time during the performance of the piece.) Finally, using an automated signal crossbar (similar to a studio patch-bay) to connect modules to each other, signals can be sent from the output of practically every module to the input of every other module. This signal crossbar maximizes the number of possible signal paths and allows for greater flexibility when using a limited number of signal processing modules [9] (see Figure 2).

Figure 2. Crossbar of interconnections among signal processing modules: the outputs of the reverb, frequency shifter, harmonizer, noise modulation, samplers, and filters can each be routed to the inputs of those same modules and of the spatializer.

Real-time Continuous Control Parameters.

Real-time audio signal analysis of acoustic instruments, for the extraction of continuous control signals that carry musically expressive information, can be used to drive signal processing and sound generating modules, and can ultimately provide an instrumentalist with a high degree of expressive control over an electronic score [10]. In the frequency domain, pitch tracking can be used to determine the stability of pitch on a continuous basis for recognition of pitch bend, portamento, glissando, trill, tremolo, etc.
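The frequency-domain stability analysis just described can be sketched as follows. This is a hypothetical illustration, not the ISPW analysis: the window of pitch estimates, the thresholds, and the category names are all invented for the demo.

```python
# Illustrative pitch-stability classifier: given recent continuous
# pitch estimates (MIDI note numbers as floats), decide whether the
# motion looks steady, oscillating (trill/tremolo), or drifting
# (glissando/portamento). All thresholds are invented.

def classify_pitch_motion(pitches):
    span = max(pitches) - min(pitches)      # total pitch range covered
    drift = pitches[-1] - pitches[0]        # net movement over the window
    # Count direction reversals to separate oscillation from drift.
    reversals = sum(
        1 for a, b, c in zip(pitches, pitches[1:], pitches[2:])
        if (b - a) * (c - b) < 0
    )
    if span < 0.3:
        return "steady"
    if reversals >= 4 and span >= 0.5:
        return "trill/tremolo"
    if abs(drift) >= 1.0 and reversals < 4:
        return "glissando/portamento"
    return "unstable"

steady = [60.0, 60.05, 59.98, 60.02, 60.01, 60.0, 60.03, 59.99]
trill  = [60.0, 62.0, 60.0, 62.0, 60.0, 62.0, 60.0, 62.0]
gliss  = [60.0, 60.5, 61.0, 61.5, 62.0, 62.5, 63.0, 63.5]
```

A real detector would of course run on a sliding window of tracker output and smooth its decisions over time; the point here is only that a few cheap statistics over continuous pitch suffice to separate these gestures.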
In the amplitude domain, envelope following of the continuous dynamic envelope for articulation detection enables one to determine flutter-tongue, staccato, legato, sforzando, crescendo, etc. In the spectral domain, FFTs, pitch tracking, and filtering can be used to track continuous changes in the spectral content of sounds for detection of multiphonics, inharmonic/harmonic ratios, timbral brightness, etc. High-level event detection combining the analyses of the frequency, amplitude, and spectral domains can provide rich control signals that reflect subtle changes found in the input signal.

The Musician's Role.

The dynamic relationship between performer and musical material, as expressed in the musical interpretation, can become an important aspect of the man/machine interface for the composer and performer, as well as for the listener, in an environment where musical expression is used to control an electronic score. The richness of compositional information useful to the composer is obvious in this domain, but other important aspects exist: compositions can be fine-tuned to the individual performing characteristics of different musicians, intimacy between performer and machine can become a factor, and performers can readily sense the consequences of their performance and their musical interpretation.

References.

[1] E. Lindemann, M. Starkier, and F. Dechelle. The IRCAM Musical Workstation: Hardware Overview and Signal Processing Features. In S. Arnold and G. Hair, eds., Proceedings of the 1990 International Computer Music Conference. San Francisco: International Computer Music Association, 1990.

[2] M. Puckette. The Patcher. In C. Lischka and J. Fritsch, eds., Proceedings of the 1988 International Computer Music Conference. San Francisco: International Computer Music Association, 1988.

[3] M. Puckette. Combining Event and Signal Processing in the Max Graphical Programming Environment. Computer Music Journal 15(3):68-77, 1991.

[4] C. Lippe et al. The IRCAM Musical Workstation: A Prototyping and Production Tool for Real-Time Computer Music. In Proceedings of the 9th Italian Colloquium of Computer Music, Genoa, 1991.

[5] M.
Puckette. EXPLODE: A User Interface for Sequencing and Score Following. In S. Arnold and G. Hair, eds., Proceedings of the 1990 International Computer Music Conference. San Francisco: International Computer Music Association, 1990.

[6] M. Puckette and C. Lippe. Score Following in Practice. In Proceedings of the 1992 International Computer Music Conference. San Francisco: International Computer Music Association, 1992.

[7] I. Xenakis. Formalized Music. Bloomington: Indiana University Press, 1971 (Pendragon, 1991).

[8] C. Roads. Automated Granular Synthesis of Sound. Computer Music Journal 2(2):61-62, 1978.

[9] C. Lippe and M. Puckette. Musical Performance Using the IRCAM Workstation. In B. Alphonce and B. Pennycook, eds., Proceedings of the 1991 International Computer Music Conference. San Francisco: International Computer Music Association, 1991.

[10] D. Wessel, D. Bristow, and Z. Settel. Control of Phrasing and Articulation in Synthesis. In Proceedings of the 1987 International Computer Music Conference. San Francisco: International Computer Music Association, 1987.
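The amplitude-domain articulation detection described under Real-time Continuous Control Parameters can be sketched as follows. This is an illustration only, not the analysis used in the piece: the envelope-follower time constants, the 10 ms analysis window, and the rise threshold are all invented.

```python
import numpy as np

# Illustrative amplitude-domain articulation cue: a one-pole envelope
# follower with separate attack/release times, plus a crude attack-rate
# test that separates sharp onsets from gradual swells. All constants
# are invented for the demo.

SR = 44100

def envelope(signal, attack=0.001, release=0.050):
    """One-pole envelope follower with separate attack/release times."""
    a = np.exp(-1.0 / (attack * SR))
    r = np.exp(-1.0 / (release * SR))
    env = np.zeros_like(signal)
    level = 0.0
    for i, x in enumerate(np.abs(signal)):
        coeff = a if x > level else r
        level = coeff * level + (1 - coeff) * x
        env[i] = level
    return env

def classify_onset(env, window=0.010):
    """Articulation cue: largest envelope rise per 10 ms window."""
    n = int(window * SR)
    rises = [env[i + n] - env[i] for i in range(0, len(env) - n, n)]
    return "sharp attack" if max(rises) > 0.3 else "gradual attack"

t = np.arange(SR // 4) / SR
sharp = np.sin(2 * np.pi * 440 * t) * np.exp(-t * 8)          # abrupt burst
gradual = np.sin(2 * np.pi * 440 * t) * np.minimum(t * 4, 1)  # slow swell
```

With these settings, `classify_onset(envelope(sharp))` reports a sharp attack and the swell a gradual one; a real detector would combine such cues with pitch and spectral analysis, as the text describes.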