Toward a Computationally-Enhanced Acoustic Grand Piano


Andrew McPherson
Electrical & Computer Engineering, Drexel University
3141 Chestnut St., Philadelphia, PA 19104 USA
apm@drexel.edu

Youngmoo Kim
Electrical & Computer Engineering, Drexel University
3141 Chestnut St., Philadelphia, PA 19104 USA
ykim@drexel.edu

Abstract
Although the capabilities of electronic musical instruments have grown exponentially over the past decades, many performers continue to prefer acoustic instruments, perceiving them to be more expressive than their electronic counterparts. We seek to create a new application for computer music interfaces by augmenting, rather than replacing, acoustic instruments. Starting with an acoustic grand piano, an optical keyboard scanner measures the continuous position of every key while electromagnetic actuators directly induce vibrations in the strings. Unlike on the traditional piano, the performer is given the ability to continuously modulate the sound of each note, resulting in a new creative vocabulary. Ongoing work explores the creation of intelligent mappings from sensed user input to acoustic control parameters that build on the existing musical intuition of trained pianists, creating a hybrid acoustic-electronic instrument offering new expressive dimensions for human performers.

Keywords
Music Interfaces, Piano, Multidisciplinary Design

ACM Classification Keywords
H.5.5 Information Interfaces and Presentation: Sound and Music Computing

General Terms
Design, Experimentation

Copyright is held by the author/owner(s). CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA. ACM 978-1-60558-930-5/10/04.

Introduction
Despite the rapid advancement of electronic musical performance systems, traditional acoustic instruments remain central to many styles of music. Though computer synthesis offers an unprecedented diversity of sounds, and computer performance interfaces can provide more dimensions of control than any acoustic instrument, performers ultimately evaluate musical interfaces by the difficult-to-quantify notion of expressivity. Hundreds of years of refinement have produced acoustic instruments which are extremely adept at transforming a performer's intention into sound; replacing them with electronic interfaces poses a substantial challenge.

We have developed a system which uses computation to augment, rather than replace, acoustic instruments. We focus our efforts on the grand piano, a highly refined and versatile instrument whose present design dates back over a century. Using felt hammers to strike steel strings, the piano is capable of both complex polyphony and slow, sustained lines. In comparison to other instruments, however, the piano has a surprising limitation: there is no way to alter the sound of a note after it has been struck. Moreover, at the onset of each note, the only control parameter available to the performer is the velocity with which the hammer strikes the string.

By integrating electronic sensing and actuation into the piano, we provide new creative tools for the performer to continuously shape the sound of the instrument. Our system has two parts: first, optical sensors on the piano keys generate a continuous data stream reflecting the performer's interaction with the keyboard; second, electromagnetic actuators directly induce vibrations in the strings, allowing control of their sound independently of the piano's hammer mechanism. A computer controls the mapping from performer input (key position, velocity, and acceleration) to parameters of actuation (amplitude, frequency, spectrum); a minimal code sketch of this data flow appears after the next section.

The following sections describe each component, with a focus on integrating them into an intuitive, expressive interface for continuously modulating the sound of the piano. Lessons learned here are potentially applicable to the broader question of creating human-computer interfaces that encourage creative artistic expression.

Previous Work
Over the past decade, interest has grown in using computation to augment traditional instruments. Electronic modification of acoustic sounding mechanisms has previously been attempted [3, 5, 6] in applications including a violin bridge and a xylophone bar. [2] demonstrates active electromagnetic control of a steel musical instrument string; related commercial technologies include the EBow and the Moog Guitar. Electromagnetic actuation of the acoustic piano is described in [1, 4]. In contrast to previous efforts, which control a limited number of strings, our work allows continuous control over the entire range of the piano (up to 88 notes). [9] presents our actuator system design in detail.

Separately, new interfaces have been designed that expand the keyboard model to include continuous position sensing [7] as well as horizontal motion and touch sensitivity on the key surface [10]. However, these interfaces are implemented as separate controllers rather than being integrated with the piano keyboard. This distinction is important from a control standpoint, as performers' interactions with the piano are influenced by the haptic feedback they receive from the keyboard, which differs considerably between an acoustic piano and electronic controllers [8].
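To make the sensing-to-actuation data flow from the Introduction concrete, the following minimal sketch models its two halves as data types joined by a mapping function. The type and function names, field choices, and the deliberately naive placeholder mapping are illustrative assumptions, not the system's actual implementation; Python is used for all code sketches in this article.

```python
from dataclasses import dataclass

@dataclass
class KeyState:
    """Per-key performer input derived from continuous optical sensing."""
    position: float      # 0.0 = key at rest, 1.0 = fully depressed
    velocity: float      # d(position)/dt, in key-travel units per second
    acceleration: float  # second derivative of position

@dataclass
class ActuationParams:
    """Per-note electromagnetic actuation parameters."""
    amplitude: float        # drive level of the electromagnet
    frequency_ratio: float  # drive frequency relative to the string fundamental
    harmonics: list         # relative amplitudes of drive partials
    phase_offset: float     # phase relative to the current string vibration

def map_input_to_actuation(key: KeyState) -> ActuationParams:
    """Deliberately naive placeholder: one-to-one position -> amplitude.
    The Ongoing Work section argues that useful mappings must be richer."""
    return ActuationParams(amplitude=key.position, frequency_ratio=1.0,
                           harmonics=[1.0], phase_offset=0.0)
```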

Electromagnetic Actuation
Figure 1 shows a picture of the complete system. Electromagnetic solenoids induce the piano strings to vibration using ferromagnetic attraction. One electromagnet is used for each note of the piano, up to 88 notes total (48 in the current prototype). Each actuator is driven by a dedicated amplifier; signals are generated by computer to reinforce the natural vibration of each string, based on input from a pickup on the piano soundboard [9].

Figure 1: Keyboard sensor interface and electromagnetic actuators for an augmented grand piano.

The actuation system allows several parameters of control for each note, all of which can be continuously varied:

- Amplitude
- Frequency, relative to the string fundamental
- Waveform: relative amplitude and phase of multiple harmonics, plus noise components
- Phase offset: phase of the actuator signal relative to the current string vibration

In combination, these parameters shape each note's musical qualities, including pitch, dynamics, articulation and timbre. By controlling groups and sequences of notes, they also influence larger-scale musical qualities of phrasing and voicing. The vocabulary of the augmented piano includes infinite sustain, notes that grow from silence, harmonics, and time-varying timbres.

Results show that the electronic system produces tones of comparable amplitude to the acoustic piano, facilitating integration of traditional and electronic sounds [9]. Waveforms produced by electromagnetic actuation tend to be more spectrally pure than those of hammer-actuated notes, which produce dozens of harmonics; these spectral differences, combined with the slower attack of electromagnetically actuated notes, give the electronic sounds a mellow, ethereal tone quality.

While the preceding discussion illustrates the performance of the actuation system alone, the technology is most compelling (from both a computational and musical point of view) when integrated into a performance interface that gives a human player continuous, intuitive control over the musical qualities of the instrument.
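As a rough illustration of how the four parameters above could combine, this sketch synthesizes one block of open-loop drive signal for a single string. In the actual system the drive is generated relative to the string's measured vibration via the soundboard pickup [9]; the function name, sample rate, and noise model here are assumptions for illustration only.

```python
import numpy as np

def actuator_drive(fundamental_hz, amplitude, freq_ratio, harmonics,
                   noise_level, phase_offset, n_samples, sample_rate=44100.0):
    """Synthesize one block of drive signal for one string's electromagnet.

    harmonics: list of (relative_amplitude, relative_phase) per partial.
    This open-loop version only shows how the parameters combine; it does
    not phase-lock to the string as the real system does.
    """
    t = np.arange(n_samples) / sample_rate
    f0 = freq_ratio * fundamental_hz  # drive frequency relative to fundamental
    signal = np.zeros(n_samples)
    for k, (rel_amp, rel_phase) in enumerate(harmonics, start=1):
        signal += rel_amp * np.sin(2 * np.pi * k * f0 * t
                                   + rel_phase + phase_offset)
    signal += noise_level * np.random.randn(n_samples)  # noise component
    return amplitude * signal

# e.g. drive A4 (440 Hz) at its fundamental plus a soft second harmonic
block = actuator_drive(440.0, amplitude=0.3, freq_ratio=1.0,
                       harmonics=[(1.0, 0.0), (0.25, 0.0)],
                       noise_level=0.01, phase_offset=0.0, n_samples=4096)
```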

Performance Interface
The electromagnetic actuation system was used in a November 2009 concert featuring music composed for the instrument by Andrew McPherson. In the performance, two keyboard interfaces were used: the primary piano keyboard, equipped with a MIDI (Musical Instrument Digital Interface) sensor bar [11], and a second MIDI keyboard mounted above the piano keys. Electronically actuated sounds could be controlled from both keyboards; the secondary keyboard was intended specifically for situations where no hammer action was desired.

The drawback to this approach lies in the MIDI protocol, which typically reports key presses and releases as discrete events. The actuation system aims to give pianists a means of continuously shaping each note, but to remain compatible with MIDI interfaces, time-varying parameters had to be programmed in advance for the concert. At the same time, a performance interface based on the keyboard is preferable: it builds on existing piano technique without forcing pianists to learn a new set of unrelated gestures, and it can be integrated into the main piano keyboard, allowing simultaneous control of traditional and electronic sounds.

We have therefore developed a system which uses optical sensing to extend the capabilities of the piano keyboard, based on a modified Moog Piano Bar [11]. Pairs of LEDs and photodiodes measure the position of each key at a sampling rate of 600 Hz for white keys and 1.8 kHz for black keys (Figure 2). Though the Piano Bar is intended as a MIDI controller, we isolate the analog photodiode signals within the keyboard scanner and route them to a dedicated analog-to-digital converter. The resulting input stream consists of 88 channels of continuous position data updated every 0.55 ms.

Figure 2: Optical sensing of piano key position.
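A minimal sketch of how per-key velocity and acceleration might be estimated from this 0.55 ms position stream, assuming simple finite differences; the paper does not specify the differentiation or filtering method actually used.

```python
import numpy as np

DT = 0.55e-3  # each of the 88 channels updates every 0.55 ms

def key_derivatives(position, dt=DT):
    """Estimate velocity and acceleration for one key from its continuous
    position stream (0 = at rest, 1 = fully depressed).

    A real pipeline would low-pass filter the signal first, since
    differentiation amplifies sensor noise.
    """
    velocity = np.gradient(position, dt)       # first derivative
    acceleration = np.gradient(velocity, dt)   # second derivative
    return velocity, acceleration
```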

Not only does this interface allow the performer to provide continuous gestural input; it also serves as a platform for better understanding traditional pianistic expression. Figure 3 shows a short musical excerpt captured using continuous key position sensing, compared with a simulation of the same excerpt as standard MIDI data. Though ultimately only the velocity with which the hammer strikes the string affects the sound of each note, these data suggest that pianists transmit additional information to the keyboard which can be used to deduce their expressive intent, including:

- Force on a key after note onset, which slightly compresses the felt pad underneath the key. Varying force after onset can be seen clearly in the long note (F) of each repetition.
- Continuous velocity and acceleration during onset and release: multiple samples taken during the short duration of a key press can indicate the specific force profile the performer exerts on the key. Similarly, the speed of release indicates the degree of the performer's continued contact with the keyboard.
- Overlap between notes in a phrase. This can be roughly captured with MIDI data, but continuous sensing provides a clearer measurement.
- Partial key-press gestures which do not create a sound. Though in traditional piano technique such motions are often inadvertent, they can be harnessed as a further control device in an electronically augmented instrument.

Figure 3: Continuous position data versus MIDI data (simulated) for a musical phrase, showing the continuous position of four keys (C4, D4, E4, F4) over nine seconds alongside a MIDI simulation of the same performance. In the MIDI plot, onset events are depicted as positive impulses proportional to their velocity; the velocity of key release is not recorded.

Ongoing Work: Intelligent Mapping
Acoustic instruments unify sensing and actuation in their mechanical design. On the piano, a series of levers between key and hammer determines both the sound production of the instrument and its feel to the performer. Recreating this intuitive link between sensing and sound production is critical to building expressive electronic instruments. Ideally, a performer playing an electronically augmented piano would be unaware of the computer in the loop: gestures made at the keyboard would map intuitively, and with minimal latency, to musical qualities of the piano sound. Realizing this goal is a significant challenge in human-computer interaction. On the input side, measurements of pianists' performance actions must be analyzed to extract correlations between key motion and expressive intent. On the output side, correlations must be identified between acoustic parameters of string actuation (amplitude, frequency, waveform) and musical qualities (dynamics, phrasing, timbre). Finally, the computer must produce a mapping between input and output which recreates the natural couplings found in acoustic instruments.

Mapping from the performance interface to the actuators requires more sophistication than simple one-to-one relationships (e.g. position to amplitude, or velocity to frequency), as the sketch below illustrates.
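To illustrate why one-to-one relationships fall short, this hypothetical sketch (reusing the illustrative KeyState and ActuationParams types from the earlier sketch) maps two of the expressive cues listed above, post-onset pressure and slow partial key presses, to different combinations of actuation parameters. All thresholds and curves are invented for illustration and are not values from the authors' system.

```python
def expressive_mapping(key: KeyState) -> ActuationParams:
    """Hypothetical mapping, richer than one-to-one; the paper proposes
    learning such mappings from pianist performance data instead."""
    params = ActuationParams(amplitude=0.0, frequency_ratio=1.0,
                             harmonics=[1.0], phase_offset=0.0)
    if key.position > 0.95:
        # Aftertouch: pressing into the felt after onset. Map the extra
        # pressure to added upper partials, brightening the timbre.
        pressure = (key.position - 0.95) / 0.05
        params.harmonics = [1.0, 0.5 * pressure, 0.25 * pressure]
        params.amplitude = 0.5 + 0.5 * pressure
    elif abs(key.velocity) < 0.5:
        # Slow partial key press: too slow to throw the hammer, so let
        # the actuator swell the note from silence instead.
        params.amplitude = key.position
    return params
```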

We plan to conduct a study of pianistic expression in which skilled pianists play an instrument equipped with continuous key position sensors. The study will include existing piano repertoire as well as short excerpts focusing on particular emotional/expressive cues (e.g. delicate, heavy, mournful). From these data, correlations between expressive intention and key motion will be extracted. Eventually, machine learning techniques will be used to develop mappings which act as an intuitive extension of existing piano technique, creating an augmented piano accessible to any trained pianist. The quality of each candidate mapping will be evaluated by soliciting feedback from pianists playing both notated and improvised passages on the augmented instrument.

Impacts
This work has important benefits for both musical and technical fields. For performers and composers, the instrument will be a new creative tool providing a greatly expanded musical vocabulary while preserving the rich sound and expressive nuance of the acoustic grand piano.

As a study in human-computer interaction, this work will begin to answer important questions related to creative artistic expression. The ideal computer music interface will be intuitive to the performer, drawing on years of training. Though any musician's technique is in part specific to a particular instrument, musicians share a common vocabulary of qualitative, expressive descriptors that are not easily quantified. How can these qualities be understood by computers? How can they be mapped to quantitative acoustic features? The planned piano performance studies, together with qualitative feedback from performers, will suggest correlations between expressive intent and physical gesture with broad application to computer music interfaces.

Acknowledgements
This material is based upon work supported by the National Science Foundation under Grant #0937060 to the Computing Research Association for the CIFellows Project.

References
[1] E. Berdahl, S. Backer, and J. Smith. If I had a hammer: Design and theory of an electromagnetically-prepared piano. In Proc. ICMC, 2005.
[2] E. Berdahl, G. Niemeyer, and J. Smith. Active control of a vibrating string. In Proc. Acoustics '08.
[3] C. Besnainou. Transforming the voice of musical instruments by active control of the sound radiation. In Proc. ACTIVE, 1999.
[4] P. Bloland. The electromagnetically-prepared piano and its compositional implications. In Proc. ICMC, 2007.
[5] H. Boutin and C. Besnainou. Physical parameters of an oscillator changed by active control: Application to a xylophone bar. In Proc. DAFx, 2008.
[6] H. Boutin and C. Besnainou. Physical parameters of the violin bridge changed by active control. In Proc. Acoustics '08.
[7] A. Freed and R. Avizienis. A new music keyboard featuring continuous key-position sensing and high-speed communication options. In Proc. ICMC, 2000.
[8] W. Goebl and C. Palmer. Tactile feedback and timing accuracy in piano performance. Experimental Brain Research, 186(3):471-479, April 2008.
[9] A. McPherson. The magnetic resonator piano: Electronic augmentation of an acoustic grand piano. Journal of New Music Research, in press.
[10] R. A. Moog and T. L. Rhea. Evolution of the keyboard interface: The Bösendorfer 290 SE recording piano and the Moog multiply-touch-sensitive keyboards. Computer Music Journal, 14(2):52-60, Summer 1990.
[11] Products of interest: Piano Bar. Computer Music Journal, 29(1):104-113, 2005.