Cymatic: a real-time tactile-controlled physical modelling musical instrument


19th INTERNATIONAL CONGRESS ON ACOUSTICS, MADRID, 2-7 SEPTEMBER 2007

Cymatic: a real-time tactile-controlled physical modelling musical instrument

PACS: 43.75.-z

Howard, David M; Murphy, Damian T
Audio Lab, Intelligent Systems Research Group, Department of Electronics, University of York, Heslington, York, YO10 5DD, United Kingdom; dh@ohm.york.ac.uk

ABSTRACT

A physical modelling music synthesis system known as Cymatic is described that enables virtual instruments to be controlled in real time via a standard PC gaming force-feedback joystick and a force-feedback mouse. These serve to provide the user with gestural controllers whilst in addition giving tactile feedback. Cymatic virtual instruments are set up via a graphical user interface in a manner that is highly intuitive. Users design and play these virtual instruments by interacting directly with their physical shape and structure, in terms of the physical properties of basic objects such as strings, membranes and solids, which can be interconnected to form complex structures. The virtual instrument can be excited at any point mass by the following: bowing, plucking, striking, a sine/square/sawtooth/random waveform, or an external sound source. Virtual microphones can be placed at any point masses to deliver the acoustic output. This paper describes the underlying structure and principles upon which Cymatic is based, and illustrates its acoustic output in the context of music making.

INTRODUCTION

When musicians interact with acoustic instruments they are intimately coupled to their instruments, such that their smallest gestural movements have the potential to affect the sound being produced. Each time a note is played there will be subtle differences in the gestures made when the player energises the instrument, for example by bowing, plucking, striking or blowing. Observation of the output waveform and/or spectrum for the same note played more than once on a given instrument reveals correspondingly subtle acoustic differences, perhaps in one or more of the note's onset, steady state or offset [1]. Such note-to-note variation is generally absent from the output of electronic synthesisers. One of the key reasons for this lies with the underlying models upon which the various methods used for music synthesis are based, such as additive, subtractive or frequency modulation (FM) synthesis [2]. Methods such as these make use of control parameters that are non-intuitive for the performer. This is particularly true in the case of FM synthesis, for which Hunt and Kirk [3] have made the following observation: "there is no straightforward perceptual relationship between the modulation values and the timbre produced, and hence it is difficult to use FM synthesis to create a specific sound that you might require". It can thus be very difficult to predict, for a given parameter change, how the output sound will be perceived. Furthermore, the subtle variations between notes that occur when a performer plays an acoustic instrument are not present. Physical modelling synthesis offers the possibility of directly simulating the sound production processes of an acoustic instrument, with no underlying sound production model other than the physics of vibration itself. When this is coupled with a user interface that enables musicians to interact with the instrument in a manner that is intuitive and intimate, there is the potential for a new breed of electronic musical instruments.
Traditional electronic musical instruments produce sounds that are often described as, for example, 'dull' or 'lifeless', which is clearly not the case with acoustic musical instruments, where adjectives such as 'intimate' or 'organic' might be employed. There is also anecdotal evidence from the world of pipeless electronic organs, potentially one of the easier acoustic instrumental sounds to create electronically, that although the output sounds very convincing on first hearing, it becomes increasingly less convincing and less interesting to the ear after weekly listening over three to six months.
This paper describes Cymatic, a new electronic musical instrument that makes use of physical modelling synthesis controlled by a tactile and gestural interface. Its physical modelling origins derive from TAO [4], and it shares common approaches with other physical modelling sound synthesis systems including Mosaic [5] and CORDIS-ANIMA [6]. Its tactile and gestural interfaces are designed to enable it to be played in a manner that is both intuitive and intimate. Cymatic is implemented under Windows, and the player interacts with the underlying physical modelling synthesis engine by means of a force feedback mouse and joystick. The force feedback devices enable Cymatic to provide haptic feedback in addition to the acoustic output, thereby providing users with a playing experience that is both immersive and tactile. The haptic senses, rather than the visual senses, come second only to hearing in providing the user with a means of observing and interacting with a musical instrument [7]. Complex and realistic musical expression only results when both tactile (vibrational and textural) and proprioceptive cues are made available to the player in conjunction with aural feedback [8, 9]. Cymatic provides its players with a tactile, immersive and organic musical experience that is much more typical of that found with acoustic instruments than with traditional electronic instruments.

BUILDING A CYMATIC INSTRUMENT

Cymatic is a real-time physical modelling sound synthesis system that is based on the interconnection of virtual instruments designed by the user [10]. Cymatic makes use of a mass-spring physical modelling paradigm with which its individual elements are constructed. Standard force feedback PC gaming controllers are used to provide gestural control and tactile feedback. Acoustic output is via ASIO audio drivers and a compatible soundcard. User interaction with Cymatic is a two-stage process of virtual instrument design and real-time sound synthesis. A graphical interface is used for virtual instrument design, where individual vibrating elements made up of strings, sheets and blocks can be interconnected. The ends of the strings and the edges of the sheets and blocks can be locked or left free, and any mass in the system can be connected to any other mass. The mass and spring tension values of each element are user-defined, and these can either be left static or dynamically controlled using a gestural controller during synthesis. Interconnection of individual elements enables complex virtual instruments to be constructed. Individual masses can be deleted or locked in position, either as a fixed feature or dynamically during synthesis, enabling these complex instruments to have arbitrary shapes as desired. The resonant frequency of each basic element is a function of its mass-spring structure, and this can be adjusted by altering the number of masses or the mass and tension values, thereby allowing the instrument to be tuned and/or varied in timbre. Basic physical principles apply in such a system, such as the octave drop that results from doubling the length of a string.
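As a worked illustration of that last point (a standard ideal-string result from elementary acoustics, not a formula taken from the Cymatic implementation), the fundamental frequency of a string of length L, tension T and linear mass density mu is

\[ f_1 = \frac{1}{2L}\sqrt{\frac{T}{\mu}} \]

so doubling L halves f_1, giving the octave drop mentioned above, while quadrupling T raises the pitch by an octave. In a Cymatic element the analogous trends are obtained by varying the number of masses, the mass values, or the spring tensions.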
In order to create a sound from a Cymatic virtual instrument, an input sound source and a means of deriving an output sound from the system are required. An input excitation function can be placed on any mass that is not fixed, and a choice is available from the following: pluck, bow, random, sine wave, square wave, triangular wave, or live audio. Parameters relating to the selected excitation, including excitation force and, where appropriate, its velocity and time of application, can be specified by the user. Multiple excitations are possible, since any non-fixed mass can be excited in this way. A monophonic sound output is achieved by placing a virtual microphone on any non-fixed mass within the virtual instrument. Multi-channel output can be realised by placing multiple microphones within the system, each of which can have its output panned between the left and right outputs of the soundcard. Cymatic supports whatever range of sampling rates is available on the soundcard. For example, when used with an Edirol UA-5 USB audio interface, the following are available: 8kHz, 9.6kHz, 11.025kHz, 12kHz, 16kHz, 22.05kHz, 24kHz, 32kHz, 44.1kHz, 48kHz, 88.2kHz and 96kHz.
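To make the mass-spring paradigm and the pluck/virtual-microphone arrangement concrete, the following is a minimal sketch (our illustration, not code from the Cymatic engine; all parameter values are invented) of a one-dimensional string element with locked ends, plucked at mass 10 and read from a virtual microphone at mass 36:

```python
import numpy as np

# Minimal sketch of a mass-spring string element (illustrative only,
# not the Cymatic engine): a chain of point masses joined by springs,
# integrated once per output sample. End masses are locked; a "pluck"
# displaces one mass and a "virtual microphone" reads another.

N = 40             # number of masses in the string
mass = 0.001       # kg per point mass (invented value)
k = 2000.0         # spring constant, N/m (invented value)
damping = 0.0002   # simple velocity damping coefficient
fs = 44100         # output sampling rate, Hz
dt = 1.0 / fs

x = np.zeros(N)    # displacements; x[0] and x[N-1] stay locked at zero
v = np.zeros(N)    # velocities
x[10] = 0.005      # pluck: displace mass 10 and release

mic = 36           # virtual microphone position
out = np.empty(fs) # one second of monophonic output

for n in range(fs):
    # net spring force on each interior mass from its two neighbours
    f = k * (x[:-2] - 2.0 * x[1:-1] + x[2:]) - damping * v[1:-1]
    v[1:-1] += (f / mass) * dt   # semi-implicit Euler: velocity first...
    x[1:-1] += v[1:-1] * dt      # ...then position; locked ends never move
    out[n] = x[mic]              # virtual microphone output sample
```

Raising the spring constant or reducing the masses raises the resonant frequencies, which is the tuning behaviour described above; sheet and block elements extend the same update to neighbours in two or three dimensions, and multi-channel output simply reads several such masses with panning.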

Figure 1: Example Cymatic virtual instrument consisting of a string with 40 masses, a sheet of 6 by 7 masses and a block of 5 by 4 by 3 masses. The masses at the ends of the string and the corners of the sheet and block are locked (shown in red). The block has had some of its masses cut to illustrate how arbitrary shapes can be achieved.

Figure 2: GammaTone human hearing modelling spectrograms of the impulse responses for the three elements of the virtual Cymatic instrument shown in figure 1, obtained using the pluck excitation function and virtual microphones set on each element as follows: string - pluck at mass 10, microphone at mass 36; sheet - pluck at mass (3,3), microphone at mass (4,3); block - pluck at mass (2,2,2), microphone at mass (3,3,2).

Figure 1 illustrates a three-element Cymatic virtual instrument consisting of a string of 45 masses, a sheet of 7 by 9 masses, and a block of 4 by 4 by 3 masses. The block has had a portion removed to indicate how arbitrary shapes can be produced, which is achieved simply by clicking on the individual masses to be cut away. The masses at the ends of the string and the corners of the sheet and block are locked. Each element has a different tension value for its springs, and the masses in the block have double the value of those in the string and sheet. The impulse response of each element is shown spectrographically in figure 2; these were obtained by placing a pluck excitation and a virtual microphone on each element as follows: string - pluck at mass 10, microphone at mass 36; sheet - pluck at mass (3,3), microphone at mass (4,3); block - pluck at mass (2,2,2), microphone at mass (3,3,2). GammaTone hearing modelling spectrography is employed since it provides a picture that is closer to that presented to the brain by each ear. GammaTone spectrography [11-14] makes use of knowledge of the peripheral human hearing system: the frequency response of the outer and middle ears is taken into account, and the critical band mechanism [1] is used to set the bandwidths of a bank of filters with GammaTone impulse responses, which closely approximate human auditory filter responses [15]. It can be seen that the frequency components in the impulse response of each of the three elements are different, but that in each case the output remains static, indicating that: (1) there are no dynamic changes being made; and (2) the different shapes, with their altered mass and tension values, are having an effect.
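As an aside, a single channel of this kind of GammaTone analysis can be sketched as follows. This is a textbook formulation using a widely quoted ERB approximation; the figures in this paper were produced with the published formulations of refs [11-15], so treat this purely as an indicative sketch:

```python
import numpy as np

def erb(fc):
    # Equivalent Rectangular Bandwidth in Hz at centre frequency fc
    # (the commonly used Glasberg & Moore approximation; an assumption
    # here, not necessarily the exact formula behind figures 2 and 3)
    return 24.7 * (4.37 * fc / 1000.0 + 1.0)

def gammatone_ir(fc, fs, dur=0.05, order=4):
    # Sampled impulse response of an order-4 gammatone filter:
    # t^(order-1) * exp(-2*pi*b*t) * cos(2*pi*fc*t)
    t = np.arange(1, int(dur * fs) + 1) / fs
    b = 1.019 * erb(fc)   # bandwidth scaling for a 4th-order filter
    g = t ** (order - 1) * np.exp(-2 * np.pi * b * t) * np.cos(2 * np.pi * fc * t)
    return g / np.sum(np.abs(g))   # crude gain normalisation

def gammatone_channel(signal, fc, fs):
    # One analysis channel; a spectrogram stacks many such channels
    # with centre frequencies spaced on an ERB-rate scale
    return np.convolve(signal, gammatone_ir(fc, fs), mode="same")
```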

AN EXAMPLE CYMATIC INSTRUMENT

In order to illustrate how Cymatic can be used in a non-real-time manner, the three-element Cymatic virtual instrument shown in figure 1 is used with more than one excitation. The elements themselves are connected together (string to sheet to block) by linking the masses that were used in the generation of the impulse responses shown in figure 2; hence mass 36 of the string is linked to mass (3,3) of the sheet, and mass (4,3) of the sheet is linked to mass (2,2,2) of the block. The virtual microphone is placed at mass (3,3,2) of the block.

Figure 3: Output waveform (upper) and a GammaTone hearing modelling spectrogram (lower) for the example Cymatic virtual instrument shown in figure 1, which has been internally connected and excited (see text) with excitation functions indicated (1-4) as follows: (1) random input at mass (3,1,1) of the block from the start and ending at 5 seconds; (2) pluck at mass 10 of the string at 1 second; (3) random input at mass (4,1) of the sheet starting at 1 second and ending at 3 seconds; and (4) sawtooth input at mass 15 of the string starting at 4 seconds and ending at 6 seconds.

The excitation functions available (see above) can be adjusted in terms of their start and end times and strength. Any excitation can be applied to any non-locked mass in the instrument. For this particular example, four excitation functions are used as follows:
1. random input at mass (3,1,1) of the block from the start, ending at 5 seconds;
2. pluck at mass 10 of the string at 1 second;
3. random input at mass (4,1) of the sheet starting at 1 second and ending at 3 seconds;
4. sawtooth input at mass 15 of the string starting at 4 seconds and ending at 6 seconds.

The output waveform is saved to file. Figure 3 shows the time waveform and a GammaTone hearing modelling spectrogram for this example instrument. The onsets of the excitations can be seen in the spectrogram, as can the continuity of the random and sawtooth excitations, and the decay associated with the impulse response of each element (see figure 2) can be observed after the final excitation function has ceased (at 6 seconds). The onset of the pluck is particularly clear.

REAL-TIME OPERATION

Cymatic can be played in real time, allowing a player to interact with a number of its parameters by means of gaming interfaces. Force feedback controllers can be employed, enabling the provision of tactile feedback with a high degree of customisability. To date, Cymatic has been used with the Microsoft Sidewinder Force Feedback Pro joystick and the Logitech iFeel mouse. The force instructions are communicated via MIDI to control the force feedback devices. The amount of force feedback is controlled by the acoustic amplitude of the signal from a user-specified virtual microphone, which need not necessarily be a main audio output signal.

Figure 4: Example dialog box for mapping force feedback mouse movements to Cymatic physical modelling synthesis parameters associated with the Random1 excitation to the sheet.

When the user moves the device, various gestures are captured which can be mapped to any of the parameters associated with the physical modelling process on an element-by-element basis. The joystick offers four degrees of freedom (x, y, z-twist and a rotary throttle controller) and eight buttons. The mouse has two degrees of freedom (x, y) and three buttons. Cymatic parameters that can be controlled include the mass or tension of any of the basic elements that make up the instrument, and the parameters associated with the chosen excitation, such as bowing pressure, excitation force, or excitation velocity. The buttons can be configured to suppress the effect of any of the gestural movements, enabling the user to move to a new position while making no change; the change can then be made instantaneously by releasing the button. In this way, step variations can be accommodated. This is illustrated in figure 4, which shows an example dialog box from Cymatic for mapping force feedback mouse movements to synthesis parameters associated with the Random 1 excitation to the sheet.
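A minimal sketch of this kind of gesture-to-parameter mapping follows; all of the names here (the mapping table, set_parameter, the axis labels) are hypothetical illustrations rather than Cymatic's actual internals:

```python
def scaled(value, lo, hi):
    # map a normalised controller axis value in [0, 1] onto a parameter range
    return lo + value * (hi - lo)

# hypothetical mapping: each gestural axis drives one synthesis parameter
MAPPING = {
    "x":        ("sheet.random1.force", 0.0, 1.0),
    "y":        ("sheet.tension",       0.5, 5.0),
    "throttle": ("string.bow.velocity", 0.0, 1.0),
}

def update_parameters(instrument, axes, suppress_held):
    # While a configured button is held, gestures are ignored, so the
    # player can move to a new position without changing the sound;
    # releasing the button then applies the change at once (the step
    # variation described above).
    if suppress_held:
        return
    for axis, (param, lo, hi) in MAPPING.items():
        instrument.set_parameter(param, scaled(axes[axis], lo, hi))
```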

CONCLUSIONS

A real-time Windows-based physical modelling electronic musical instrument, known as Cymatic, has been described in terms of the underlying physical principles upon which it is based, the available excitation models, and the graphical user interface implemented for setting up complex multi-dimensional virtual instruments. Cymatic can be played in real time, using force feedback gestural controllers whose control changes can be arbitrarily mapped to underlying physical modelling parameters, or operated off-line. Acoustic output is gained from virtual microphones that can be associated with any individual masses within the instrument. Tactile feedback is obtained from virtual microphones associated with any masses of the instrument which are set to control the force feedback applied to the gestural controller. Cymatic was implemented as a result of a desire to interact with original and inspiring new instruments, which in themselves may not be physically practical or possible in the real world (such as dimensionality greater than three, dynamically changing physical properties such as mass or tension, and dynamic removal of parts of the instrument with virtual 'scissors'). Physical modelling provides users with an intuitive system that is easy to understand and use, with the potential for a high degree of playing virtuosity. Haptic feedback complements aural and visual feedback cues, thereby reinforcing the user's internal model of how the instrument reacts to gesture.

ACKNOWLEDGEMENTS

The authors thank the Engineering and Physical Sciences Research Council, UK, for funding this work under grant number EPSRC-GR/M94137/01.

References:
1. Howard, D.M. and Angus, J.A.S. (2006). Acoustics and Psychoacoustics, 3rd ed., Oxford: Focal Press.
2. Russ, M. (1996). Sound Synthesis and Sampling, Oxford: Focal Press.
3. Hunt, A.D. and Kirk, P.R. (1999). Digital Sound Processing for Music and Multimedia, Oxford: Focal Press.
4. Pearson, M.D. and Howard, D.M. (1996). Recent developments with TAO physical modelling system, Proceedings of the International Computer Music Conference, ICMC-96, 97-99.
5. Morrison, J.D. and Adrien, J.M. (1993). MOSAIC: A framework for modal synthesis, Computer Music Journal, 17(1), 45-56.
6. Cadoz, C., Luciani, A. and Florens, J.L. (1993). CORDIS-ANIMA: A modelling system for sound and image synthesis, the general formalism, Computer Music Journal, 17(1), 19-29.
7. Cook, P.R. (1999). Music, Cognition and Computerized Sound: An Introduction to Psychoacoustics, London: MIT Press, p. 229.
8. MacLean, K.E. (2000). Designing with haptic feedback, www.cs.ubc.ca/~maclean/publics/icra00-DesignWithHaptic-reprint.PDF
9. Howard, D.M., Rimell, S., Hunt, A.D., Kirk, P.R. and Tyrrell, A.M. (2002). Tactile feedback in the control of a physical modelling music synthesiser, in Proceedings of the 7th International Conference on Music Perception and Cognition, Stevens, C., Burnham, D., McPherson, G., Schubert, E. and Renwick, J. (eds.), Adelaide: Causal Productions, 224-227.
10. Howard, D.M. and Rimell, S. (2004). Real-time gesture-controlled physical modelling music synthesis with tactile feedback, EURASIP Journal on Applied Signal Processing (Special Issue on Model-Based Sound Synthesis), 7(15), 1001-1006.
11. Brookes, T., Tyrrell, A.M. and Howard, D.M. (2000). On the differences between conventional and auditory spectrograms of English consonants, Logopedics Phoniatrics Vocology, 25, 72-78.
12. Howard, D.M., Hirson, A., Brookes, T. and Tyrrell, A.M. (1995). Spectrography of disputed speech samples by peripheral human hearing modelling, Forensic Linguistics, 2(1), 28-38.
13. Howard, D.M. and Tyrrell, A.M. (1997). Psychoacoustically informed spectrography and timbre, Organised Sound, 2(2), 65-76.
14. Howard, D.M. (2005). Human hearing modelling real-time spectrography for visual feedback in singing training, Folia Phoniatrica et Logopaedica, 57(5/6), 328-341.
15. Moore, B.C.J. and Glasberg, B.R. (1983). Suggested formulae for calculating auditory-filter bandwidths and excitation patterns, Journal of the Acoustical Society of America, 74(3), 750-753.