Musical Instrument User Interfaces: the Digital Background of the Analogue Revolution (Keynote at NIME 2011, Oslo)

Tellef Kvifte
September 3, 2011

1 Introduction

According to the Polish musicologist Ludwik Bielawski, a musical instrument is a transformer, transforming bodily gestures in physical time and space into musical gestures in musical time and space. As illustrated in the short musical introduction, a certain set of musical gestures can be created in different ways, by different instruments, with different relations between the physical and musical gestures. As we would phrase it today, there are different mappings between input and output. But, as Bielawski points out, there are certain broad regularities in the large and diverse world of acoustic musical instruments.

Figure 1: Bielawski's diagram.

First of all, there is the obvious connection that the timing of physical gestures is connected to durations and rhythmic factors in the music. Further, a common relation is between positions in space and pitch values: there is, for example, a definite relation between pitch and finger position on the fingerboard of the violin. The force of a physical gesture is in many instances related to loudness, like the force of blowing into the saxophone mouthpiece, and, in certain cases, force also influences pitch, as in many wind instruments.

Finally, timbre is in Bielawski's scheme normally connected to the spatial aspect of the physical gestures, as when pulling a stop on an organ.

Bielawski makes some further valuable observations: the properties (parameters, variables) of both the physical and musical gestures can be classified as either discontinuous or continuous. For example, a major scale (a musical gesture) consists of discrete entities, discontinuous scale steps, while a glissando is a continuous musical gesture. Instrument interfaces (keyboards, slides, buttons, fingerboards) can also be described in this way: keys on a keyboard form a discontinuous interface, while slides can be moved continuously. And so on.

I will translate Bielawski's concept pair continuous/discontinuous into the concept pair digital/analogue. The reason for this is that it allows me to connect the discussion of interfaces to a wider context of communication research, as exemplified by Gregory Bateson and his colleagues. They use the concepts analogue and digital to characterize different aspects of communication in general, in ways that are relevant to our discussion of user interfaces, as I will argue later.

But let me tie the concepts more directly to music. To use the concept pair to characterize pitch perception is to point to two different aspects of our perception, where the digital aspect refers to pitch classes, the steps of our scales, as distinct and discrete conceptual entities, while the analogue aspect refers to pitch as continuous, as in glissandi, vibrato and intonation of various qualities. When we listen to, for example, a singer, we can easily identify the discrete scale steps intended (the digital aspect) and, at the same time, notice obvious continuous pitch inflections (the analogue aspect), even if there is a wide vibrato going more than a half tone in either direction. These two aspects of perception are mutually dependent on each other: the physical phenomena that affect our sense organs are always continuous (analogue) in nature, so the digital entities have to be inferred from them. On the other hand, to be able to characterize intonation as good or bad, or to feel whether a glissando is right or wrong, we need the digital entities as reference and points of orientation, as a structure for understanding what the analogue entities mean.

A similar observation can be made for the other parameters in the Bielawski diagram. For example, durations are perceived both as the discrete (digital) duration classes of whole, half and quarter notes, and in continuous (analogue) values, as slightly early or late, or swinging in the right way, and so on.

With this starting point, with the help of Bielawski and the concepts digital and analogue, I will discuss certain aspects of the development of instrument user interfaces over the last couple of hundred years. I think Bielawski's diagram is a rather good, though quite coarse, illustration of general properties of mappings in traditional acoustic instruments. It also reminds us that contemporary instruments obviously are much more diverse in this respect: in electronic instruments we can make any mapping we fancy between physical and musical gestures. Bielawski's paper was published in 1979, and is, as far as I can see, one of the earliest papers to address mappings seriously and systematically as a general aspect of musical instruments.
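As a concrete, if simplified, illustration of these two aspects of pitch, the sketch below (my own illustration, not from Bielawski's paper) splits a continuous frequency into a discrete scale step and a continuous deviation from that step. The 12-tone equal-tempered grid and the A4 = 440 Hz reference are assumptions made only for the example.

```python
import math

A4_HZ = 440.0  # reference pitch; equal temperament and A4 = 440 Hz are assumptions

def analyse_pitch(freq_hz):
    """Split a continuous (analogue) frequency into a discrete (digital)
    scale step and the continuous deviation from that step, in cents."""
    # Continuous position on the equal-tempered grid, in semitones from A4
    semitones = 12.0 * math.log2(freq_hz / A4_HZ)
    step = round(semitones)              # digital aspect: the nearest scale step
    cents = 100.0 * (semitones - step)   # analogue aspect: inflection around the step
    return step, cents

# A sung note with some vibrato and imperfect intonation still reads as one scale step:
for f in (436.0, 441.5, 447.0):
    step, cents = analyse_pitch(f)
    print(f"{f:6.1f} Hz -> step {step:+d} semitones from A4, {cents:+6.1f} cents off")
```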
I will, however, start even earlier in my attempt to infuse some meaning into a chaotic and many-faceted development, with the development of two classes of musical instruments in the 19th century. Adolphe Sax got his patent for the saxophone in 1846, at a time when Europe had seen an almost explosive development of new interfaces for musical expression.

The saxophone was one of the results of the development of the woodwinds, and in the family of free reed instruments a great number of diverse inventions and interfaces were made. We shall have a look at both these families of instruments.

2 The woodwinds

The baroque traverso flute was, at the beginning of the 19th century, still the most common form of the flute, and may serve as an example of the basic interface of many of the woodwinds. The sound is produced by blowing across the hole at the top. The pitch is controlled by opening and closing the six finger-holes and the single key, in effect changing the length of the active air column. The full length of the air column produces the root of a major scale, and the scale can be played by uncovering the holes one by one, and finally closing all of them again and blowing harder to get the octave above the starting tone.

Figure 2: Baroque traverso.

The remaining semitones of the octave can be played by using the single key, and by various so-called fork fingerings: combinations of open and closed holes where not all the open holes are consecutive holes from the bottom. Such fingerings produce tone qualities that differ from the tones produced by the normal fingerings; they may be slightly out of tune, and they are generally awkward to play in fast tempi. In other words, this user interface is well suited, logically and practically, for melodies in keys close to the key of the fundamental of the instrument, but increasingly difficult for more remote keys.

While details differ in the other orchestral woodwinds, this basic issue is the same for all of them, and as the orchestral music of the 19th century became increasingly chromatic, several inventions were introduced. Most of the inventions involved a larger number of keys, added to the six finger-holes. The aim was primarily to allow easy access and equal tone quality for all twelve tones in the octave.

Figure 3: Traverso with additional keys.

The traverso introduced by Theobald Boehm in 1847 has many of the important traits of the modern orchestral flute. The conical bore of the baroque traverso is replaced by a cylindrical bore, and the holes are almost all of the same size. Finally, due to the greater size of the holes, they have to be closed by pads rather than directly by the fingers, with certain consequences for the playing technique.
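The basic fingering logic described above can be sketched as a simple mapping from fingerings to scale degrees. The sketch below is a deliberately crude model of the principle only; it ignores the key, the fork fingerings and all intonation details, and the choice of D as the fundamental is an assumption made for illustration.

```python
# A deliberately simplified model of the six-hole flute interface described above.
# Assumption for illustration only: a flute with D as its fundamental, so the
# "native" scale is D major (real baroque flutes, their key and their fork
# fingerings are considerably more involved).
D_MAJOR = ["D", "E", "F#", "G", "A", "B", "C#"]

def six_hole_pitch(open_holes, overblown=False):
    """Map the number of uncovered finger-holes (counted from the bottom, 0-6)
    to a note name; overblowing raises the result by an octave."""
    if not 0 <= open_holes <= 6:
        raise ValueError("a six-hole flute has 0 to 6 open holes")
    octave = 5 if overblown else 4          # nominal octave numbers
    return f"{D_MAJOR[open_holes]}{octave}"

# Playing the major scale by uncovering one hole at a time, then closing
# all holes again and blowing harder for the upper octave:
scale = [six_hole_pitch(n) for n in range(7)] + [six_hole_pitch(0, overblown=True)]
print(" ".join(scale))   # D4 E4 F#4 G4 A4 B4 C#4 D5
```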

Figure 4: Boehm model traverso.

The saxophone, which was patented at about the same time, continues the sophistication of these user interfaces, with a large number of pads, keys and mechanical details. Notice also that even if the instrument is constructed with the aim of easy access to all keys and all twelve notes in the octave, it still builds on the basic six-finger-hole principle of the traverso pitch interface.

Figure 5: Soprano sax and traverso.

3 The free reed instruments

But it is in the free reed family of instruments that we find most of the radically new interfaces. The free reed as a sound-producing device was, from the beginning of the 19th century, implemented in a large number of instruments. Like other types of reeds, the free reed is driven by an air stream, provided either by blowing directly into the instrument or by some mechanical system of bellows. In the typical free reed instrument, there is one reed for each available pitch.

This fundamental aspect of the basic sound production allowed for playing several tones simultaneously, and also for great freedom and variety in the construction of the user interface. Further, the free reed is a very robust and stable sound producer that keeps in tune for a long time. Whether for this reason, or for its sound quality, or for its novelty, instruments based on the free reed gained enormous popularity during the century. Along came a range of interfaces, as part of different mouth-blown and bellows-driven instruments, eventually known as the harmonium, mouth harmonica, accordion, concertina, aeolina, Psallmelodikon and cecilium, to name a few. Some interfaces were modelled on existing instruments; some were new inventions. One of the more conventional designs is of course the harmonium or house organ, which became very common and was produced in large quantities in many places in the world.

Figure 6: A selection of free reed instruments (accordion, harmonium, mouth harmonica, concertina, aeolina, Psallmelodikon, cecilium).

A more short-lived example was the Cecilium, shaped like a cello, with the bow replaced by a handle that operated the bellows, and the fingerboard furnished with a set of small keys organised in analogy with the pitch pattern of fingering on the fingerboard of the cello. Fewer than 200 were made, and today only a few examples survive in museums.

Of the many novel designs, most were short-lived, but not necessarily without interest, as they illustrate ideas and principles that were important in the general development of instruments at the time. Wheatstone's Symphonions are a case in point. They are mouth-blown instruments with keys on the sides. The layout of the keys is such that a scale is divided, so to speak, between the two hands, as this illustration from the patent papers shows.

Figure 7: From Wheatstone's patent application.

This system, it can be argued, is both logical and musically sensible: melodies move more often than not in scale steps, so this system allows the two hands to alternate and makes legato playing possible. Also, the basic triads are easy to play, with one hand pressing adjacent keys. This principle was continued in Wheatstone's most important musical invention, the concertina. Wheatstone describes the system like this: "The notes of the scale are placed alternately on each side of the instrument; all the notes written on spaces being on the right side... and all those written on lines on the left-hand side... By this arrangement, to perform a diatonic scale in any key the first and second fingers of both hands only are needed, and no crossing of the fingers ever occurs."

The principle of dividing the notes of the major scale into two groups is also found in the diatonic accordion, where each button controls two different pitches, depending on the direction of the air, that is, on pushing or pulling the bellows. In a typical ordering, the push action on the bellows will let the right-hand buttons produce all pitches of the major triad, while the pull action will produce the remaining four pitches of the major scale. The left-hand buttons, producing roots and chords, work in a similar fashion, usually producing a tonic chord on push and a dominant on pull. Diatonic accordions may have more than one row of right-hand buttons; two and three rows are quite common, with the additional rows tuned to a different key. The diatonic button system affords a compact user interface, as each button is used for two pitches, and a basic ordering of the available pitches into the two categories of root triad and the rest. In certain styles, this is a sensible and practical set-up, with an almost fool-proof system for harmonization of melodies.
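As a concrete illustration of this push/pull principle, the following sketch models a single right-hand row of a hypothetical one-row diatonic accordion in C. The exact button-to-pitch assignment varies between makers and models, so this particular layout is an assumption made only for the example.

```python
# Hypothetical one-row diatonic layout in C, for illustration only;
# real instruments differ in compass and in the exact assignment.
# Push gives notes of the tonic triad, pull the remaining scale degrees.
RIGHT_HAND = {
    # button: (push, pull)
    1: ("C4", "D4"),
    2: ("E4", "F4"),
    3: ("G4", "A4"),
    4: ("C5", "B4"),
    5: ("E5", "D5"),
}

def sound(button, bellows):
    """Return the pitch a right-hand button produces for a bellows direction."""
    push, pull = RIGHT_HAND[button]
    return push if bellows == "push" else pull

# One button, two pitches: the compactness of the diatonic interface.
print(sound(3, "push"), sound(3, "pull"))            # G4 A4

# A C major scale alternates bellows directions almost every note.
fingering = [(1, "push"), (1, "pull"), (2, "push"), (2, "pull"),
             (3, "push"), (3, "pull"), (4, "pull"), (4, "push")]
print(" ".join(sound(b, d) for b, d in fingering))   # C4 D4 E4 F4 G4 A4 B4 C5
```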

Figure 8: Two-row button accordion (push and pull pitches for the buttons of each row).

Also common are varieties of chromatic button accordions. In this example the buttons are ordered in minor thirds vertically and minor seconds diagonally. Characteristic of these chromatic button user interfaces is their symmetry: fingerings in principle follow the same patterns in all keys, as in the two major triads shown in Figure 9.

Figure 9: Chromatic accordion.

It is striking that almost all keys and buttons are used for the control of one single parameter in music: digital pitch, that is, pitch in the form of scale steps. The only exception is the stops on the harmonium, for the selection between a limited number of timbres. Also evident is that, despite the very different layouts, they all relate quite directly to the dominant basic pitch organization in scales and chords of the Western world, and are easily connected to the pitch representational system of standard notation. The pattern of black and white buttons on the chromatic accordion is a direct analogy to the standard piano keyboard, and also to notes without sharps or flats in standard notation, and Wheatstone organized the buttons of the two sides of the concertina to correspond with notes between the lines of standard notation on the one side, and notes on the lines on the other.

On the woodwinds, as on the free reed instruments, the interface is heavily weighted toward pitch class control. The next most important control in both families is the continuous loudness control, while timbral control is more limited. On the flute, the new user interfaces also decreased the control over certain effects involving continuous pitch change, like glissandi and slides, because the introduction of pads broke the direct contact between the musician's fingers and the finger-holes.

Figure 10: Control types (pitch, loudness, timbre).

4 Characteristics of 19th century development

To make a general characterization of this development, we can say that greater weight was now placed on digital control of pitches, at the expense of analogue control. Striking in this connection is that standard music notation is useful almost exclusively for digital entities. Pitch classes, the discrete steps of scales, are precisely and easily accounted for, both as melodies and as harmonies. But analogue entities like pitch inflections, glissandi, slides, vibrato and intonation cannot be accounted for in notation in any precise way. Similar arguments can be made for other musical parameters. The analogue aspect of hand-written musical notation, how it is written, does not contain much musical information, but it carries information about the writer: it is easy to see the difference between real professionals and amateurs.

What I am trying to put together is a picture of a century in which digital pitch control was a central theme in the construction of user interfaces, at the expense of other musical parameters, like analogue pitch control and timbre. Without pointing to possible causes for this, I argue that the development of musical instrument interfaces is not an isolated cultural phenomenon, but part of a larger picture. Besides instrument interfaces, such a picture includes at least the preferred tools for production and distribution, in this case Standard Music Notation, and certain aesthetic preferences, seen for instance in the development of the chromatic styles of the art music of the period, with much focus on the arrangement of pitch classes in novel ways. In other words, we find digital pitch control at the centre of all three areas.

Figure 11: Analogue notation of digital music symbols.

Figure 12: The Theremin.

Let us have a look at the picture today, and see if the elements fit together in a similar fashion.

5 20th century instruments

In the last part of the previous century, we saw a development in instrument and user interface design of a magnitude similar to, or greater than, that of the 19th century, this time based on electric and electronic technology in analogue and digital varieties. Obviously, the user interfaces developed in this period were not made primarily to control digital pitch, like the keys and button interfaces of the 19th century. We may notice that the first really radical user interface developed in the 20th century, that of the Theremin, had no digital control at all. The performer controls pitch and loudness by moving the hands relative to the antennas of the instrument. The later VCS3, one of the early synthesizers, originally had no keyboard or any other control for digital pitch at all, but a large number of controls that affect timbral qualities, most of which are analogue, continuous controls.
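To contrast with the quantized keyboard and button mappings discussed earlier, the sketch below models a Theremin-like mapping in which both pitch and loudness are continuous functions of hand position. The specific distance ranges and the frequency span are assumptions chosen only to illustrate the absence of discrete steps.

```python
# A Theremin-like continuous mapping, sketched for illustration.
# Assumed ranges: pitch hand 5-60 cm from the antenna, covering four octaves;
# volume hand 0-30 cm above the loop antenna, covering silence to full level.
F_MIN, F_MAX = 110.0, 1760.0     # Hz (roughly A2 to A6), an assumption
D_NEAR, D_FAR = 0.05, 0.60       # metres from the pitch antenna

def hand_to_frequency(distance_m):
    """Closer to the antenna means higher pitch; the mapping is exponential,
    so equal hand movements correspond to roughly equal musical intervals."""
    d = min(max(distance_m, D_NEAR), D_FAR)
    position = (D_FAR - d) / (D_FAR - D_NEAR)        # 0.0 far .. 1.0 near
    return F_MIN * (F_MAX / F_MIN) ** position

def hand_to_amplitude(height_m):
    """Volume hand: lower over the loop antenna means quieter."""
    return min(max(height_m / 0.30, 0.0), 1.0)

# No scale steps anywhere: every position between the extremes is a valid pitch.
for d in (0.60, 0.40, 0.20, 0.05):
    print(f"{d:.2f} m -> {hand_to_frequency(d):7.1f} Hz")
```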

As is pointed out in the manual: "the VCS3 is not played like a conventional musical instrument, and since its controls are all continuously variable the varieties of sound obtainable are literally endless."

Figure 13: The VCS3.

And even if later synthesizers and sample-based instruments usually came with a keyboard, the most obvious development was nevertheless that of analogue timbral control possibilities. The Minimoog, though equipped with a keyboard, sported more than forty other control knobs, most of which influence timbral qualities. This is even more pronounced in contemporary software synthesizers and samplers, where the controls are spread over many layers and screens and number in the hundreds.

Figure 14: The Minimoog.

At the same time, both aesthetic preferences and the dominating technologies for production and distribution of music changed.

Sound, as records and broadcasts, took over as the dominating medium for the distribution of music. This medium is radically different from standard notation, as composition and performance are distributed seamlessly together, and hence both the digital and analogue parameters of music are included. In the latter half of the century, the production of music was also increasingly done with sound recording technology rather than with the manipulation of symbols in standard notation. This put analogue parameters of music, like pitch inflections and timbral changes, into the realm of music production in a new way, and we see the same tendency in music production as in the instruments: an explosive increase in control possibilities for timbral qualities, and relatively less focus on the development of pitch class ordering and manipulation.

Figure 15: Software synths: the Jupiter, an emulation of a hardware synth.

I argued that standard music notation was one important factor in the 19th century, and without such a tool for manipulating, storing and communicating pitch patterns, it is hard to see how the melodic and harmonic complexity of the romantic era could have been possible. I argue that the timbral aspect of much contemporary music is of comparable or greater complexity, and that some kind of notational tool has been necessary for this development.

Compared to pitch, timbre is a rather messy variable. Unlike pitch, there exists no agreed-upon ordering of timbral qualities, not even an agreed-upon classification. Instead of a simple low-to-high ordering, there exists a large number of possible continuous variables that can characterize timbral qualities, like bright/dull, sharp/soft, cold/warm, and innumerable others with considerably more exotic names. As I have argued that standard notation is not very useful for analogue parameters, the question is: what kind of notational tool is used to generate the complexities of contemporary timbral qualities? Obviously, the user interfaces of the present technology are good candidates. Looking back at the VCS3, the user interface provides, like standard notation, a representation of the relevant parameters that can be used both for producing music and, equally important, for thinking, reflecting and acting on the reflection. Unlike notation on paper, however, this representation is not stored, but vanishes as soon as the controls are manipulated to get another sound.

Therefore, the VCS3 was accompanied by so-called dope sheets, where the settings of the controls could be written down. Not very convenient, but when such settings could be given permanence and stored on later instruments, be they synthesizers, samplers or the software tools of digital audio workstations, detailed control of timbral qualities was no longer only a matter for performers, but also for producers and composers.

Figure 16: Dope sheet for a VCS3.

6 Conclusion

To sum up: without claiming a specific causal connection, I argue that the areas of instrument interfaces, tools for production and distribution, and certain aesthetic preferences concerning the musical sound fit together in a convincing way: the stepwise, scale- and chord-oriented interfaces, standard notation and aesthetic focus on digital pitch patterns of the 19th century on the one hand, and the interfaces dominated by continuous controls for timbral variations, sound as the preferred medium, and an aesthetic focus on timbral qualities of the late 20th century on the other.

In other words, I see a development in instrument user interface design from a focus on digital pitch control to analogue timbral control; in tools of production and distribution, from standard musical notation to analogue recording technology, analogue in the sense that it also communicates the analogue aspect of music; and a shift from a focus on digital pitch classes to a focus on analogue timbral qualities. In my opinion, therefore, the main characterization of the development over the last couple of hundred years is not the conventional "from analogue to digital". That is certainly a valid point about the inner workings of some of the technology in use, but more important is the change in the music, which is in a way the opposite: from digital to analogue.

Figure 17: Instrument interface, tools for production and distribution, and aesthetic preferences.

What is the driving force in the transition? I do not like to point out single causes in this picture. The development of technology involves complex, interacting processes of a technical, cultural and economic nature, and is best understood in some kind of systems perspective. But it is nevertheless tempting to simplify, and as the picture I have drawn is already extremely simplified, I might as well add another simplification as a small finale to this lecture. At the start of this conference, the obvious simplification to make is to say that the transition is motivated by the quest for musical expression.

Let me amplify this claim by a small detour. In the tradition of communication research initiated by the anthropologist Gregory Bateson and his collaborators, the concepts of analogue and digital are used in a way that is strikingly relevant to the theme of this lecture. Taking his departure, as we did, in discrete versus continuous as a defining characterization of digital and analogue, Bateson proceeds to characterize aspects of verbal communication, where concepts, words and letters are discrete entities and hence digital, while voice quality and inflections are continuously variable in nature and hence analogue. What I say is the digital part of the communication, and how I say it is the analogue part. One of Bateson's points is that while digital communication is mainly conscious and concerned with factual information, analogue communication is more unconscious, and primarily concerned with relations and expression. In his view, most of the expressive communication goes through the analogue channel. Thus, it is no coincidence that the field of musicology that is concerned with the analogue aspects of played rhythms, that is, how played rhythms differ from the theoretical values of notation, is called expressive timing. The present interest in control over analogue musical variables like timbre is, in this perspective, easily understood as an interest in human and musical expression. But what about the 19th century and its focus on one single digital variable?

Following Bateson, it seems that in the 19th-century music production chain of composer, notation and performer, an important part of the expressive potential in music is left to the performer, outside the control of the composer. That is because the central tool for the production and distribution of music, standard notation, simply does not work for the expressive qualities of analogue communicational parameters. This is rather paradoxical, as precisely in this period European composers seem to focus increasingly on expression rather than on structure. So, is Bateson wrong in holding that expression is most importantly communicated through analogue parameters, or are music historians wrong in insisting on the expressiveness of romantic composers? I think both are right.

First, there was also development in the expressive possibilities of the instruments. The development of the piano aimed at a keyboard instrument with analogue dynamic possibilities, and even if the Boehm traverso did away with several expressive possibilities in timbre and intonation, it added expressivity in the form of an increased dynamic range. Also in the free reed instruments, expressivity through control of loudness was a major sales argument. Even more important, though, is in my opinion that composers who relied on a tool that more than anything else favoured the manipulation of digital pitch explored exactly this parameter in their quest for expression. Chromatic harmony is a most expressive form of harmony, and it was pushed to the extreme by the composers that are characterized by music historians as, yes, you guessed it, the expressionists.

In this light, both periods of development of user interfaces can be seen as driven by the quest for musical expression, or, to be more precise, by the quest for New Interfaces for Musical Expression.

As a kind of conclusion, let me return to Bielawski's definition of a musical instrument. To allow for even more development, I'll introduce a simplified version as a guide for further work:

Gestures of movement (time, space, dynamic) → musical gestures (duration, pitch, sound colour, loudness)

..or even simpler:

Gestures of movement (time, space, dynamic) → musical gestures (anything)