Investigations of Between-Hand Synchronization in Magaloff's Chopin


Werner Goebl, Sebastian Flossmann, and Gerhard Widmer

Institute of Musical Acoustics, University of Music and Performing Arts Vienna, Anton-von-Webern-Platz 1, 1030 Vienna, Austria, goebl@mdw.ac.at
Department of Computational Perception, Johannes Kepler University Linz, Altenberger Strasse 69, 4040 Linz, Austria, {sebastian.flossmann, gerhard.widmer}@jku.at
Austrian Research Institute for Artificial Intelligence, Freyung 6/6, 1010 Vienna, Austria

Computer Music Journal, 34:3, pp. 35-44, Fall 2010. © 2010 Massachusetts Institute of Technology.

Investigations of Between-Hand Synchronization in Magaloff's Chopin

This article presents research towards automated computational analysis of large corpora of music performance data. In particular, we focus on between-hand asynchronies in piano performances, an expressive device in which the performer's timing deviates from the nominally synchronized timing of the score. Between-hand asynchronies play an important role, particularly in Romantic music, but they have not been assessed quantitatively in any substantial way. We give a first report on a computational approach to analyzing a unique corpus of historic performance data: essentially the complete works of Chopin, performed by the Russian-Georgian pianist Nikita Magaloff. Corpora of that size, comprising hundreds of thousands of played notes with substantial expressive (and other) deviations from the written score, require a level of automation of analysis that has not been attained so far. We describe the required processing steps, from converting scanned scores into symbolic notation, through score-performance matching, to the definition and automatic measurement of between-hand asynchronies, and we present a computational visualization tool for exploring and understanding the extracted information.

Temporal asynchronies between the members of musical ensembles have been found to exhibit specific regularities: The principal instruments in classical wind and string trios tend to be about 30 msec ahead of the others (Rasch 1979), and soloists in jazz ensembles show systematic temporal offsets relative to the rhythm group (Friberg and Sundström 2002). As the two hands of a pianist are capable of producing different musical parts independently (Shaffer 1984), differences in timing organization may be utilized as a means of artistic expression. Typically, such asynchronies include bass anticipations, where the bass tone precedes the other notes by 70 msec or more (Vernon 1936; Goebl 2001), or sequences of right-hand lags in jazz piano solos, where the soloist delays the onsets of a series of notes relative to the beat (played, e.g., by the left-hand chords, bass, and drums) only to come back into time again (heard, e.g., in "Red Top" on the 1955 Erroll Garner Trio album Concert by the Sea). A similar effect is documented for the Classical-Romantic piano repertoire: Chopin in particular recommended that the right hand take as much temporal freedom as desired, while the left hand was instructed to keep strict time, like a conductor (tempo rubato in the earlier meaning; Hudson 1994). Furthermore, the melody voice in expressive piano performance (the most salient voice, usually the highest-pitched part) has been found to occur around 30 msec earlier than the tones of the other voices (melody lead; Palmer 1996); this effect, however, is associated with differences in the loudness of the tones and is best explained as an artifact of the different key and hammer velocities (Repp 1996; Goebl 2001).
In particular, melody lead within the same hand is caused by velocity differences, and within-hand asynchronies are also usually smaller than those found between the hands (Repp 1996; Goebl 2001). Thus, asynchronies in piano performance contain a wealth of potentially expressive features and at the same time reflect quite subtle effects such as melody lead. This article investigates in particular the more expressive aspects of between-hand asynchronies, such as bass anticipations and regions of tempo rubato in the earlier meaning. We present preliminary results on the between-hand asynchronies in Magaloff's Chopin to demonstrate the variety of insights that such large corpora can offer. Toward the end of the article, we attempt to model these asynchronies on the basis of mostly local score features. Finally, we discuss future pathways of this research endeavor and its potential for computational modeling and musicological investigation.

The Chopin Corpus

The analyzed Chopin corpus comprises live concert performances by the Georgian-Russian pianist Nikita Magaloff (1912-1992), who played almost the entire solo repertoire of Chopin in a series of six recitals between January and May 1989 at the Mozart-Saal of the Wiener Konzerthaus in Vienna, Austria. This concert hall provides about 700 seats (www.konzerthaus.at) and ranks among the most distinguished halls in Vienna. In this unprecedented project, Magaloff, by that time already 77 years old, performed all the works of Chopin for solo piano that appeared in print during Chopin's lifetime, keeping a strict ascending order by opus number, starting with the Rondo, op. 1, and ending with the three Waltzes, op. 64. The repertoire includes the 3 sonatas, 41 mazurkas, 25 préludes, 24 études, 18 nocturnes, 8 waltzes, 6 polonaises, 4 scherzos, 4 ballades, 3 impromptus, 3 rondos, and other works (Variations brillantes, Bolero, Tarantelle, Allegro de Concert, Fantaisie, Berceuse, Barcarole, and Polonaise-Fantaisie). The works not played were either piano works with orchestra accompaniment (op. 2, 11, 13, 14, 21, and 22), works with other instruments (op. 3, 8, and 65), or works with higher opus numbers (published posthumously, starting from op. 66, the Fantaisie-Impromptu) or no opus numbers at all. (Only recently were several additional recordings discovered that Magaloff had played as encores; they have not yet been included in the corpus. These are: the Fantaisie-Impromptu op. 66, the Variations "Souvenir de Paganini", the Waltz in E minor, the Waltz in E-flat major, the Ecossaises op. 72, no. 3, and the Waltz op. 69, no. 1.)

Magaloff performed this concert series on a Bösendorfer SE computer-controlled grand piano (Moog and Rhea 1990) that recorded his performances onto a computer hard disk. The SE format stores the performance information symbolically with high precision (see Goebl and Bresin 2003), providing detailed information on the onset and offset timing of each performed note (i.e., each key depression), the dynamics in terms of the final hammer velocity of each note, and the continuous position of the three pedals (right: sustain; middle: sostenuto; left: una corda). The entire corpus comprises more than 150 individual pieces or movements, over 336,000 performed notes, and almost 10 hours of continuous performance.

Computational Analysis of Performance Data

Score Extraction

In order to analyze symbolic performance data automatically, the performances have to be connected to the corresponding musical scores (score-performance matching). As symbolic scores were not available for the complete works of Chopin, the first step was to extract this information from the printed music scores.
We used music-recognition software (SharpEye 2.63 by Visiv) to convert the 946 pages of scanned music into a MusicXML (http://recordare.com) representation. Extensive manual verification was necessary to eliminate a considerable number of conversion errors, along with scripted post-correction of notation that the software could not handle (ottava lines, parts crossing staves, etc.).
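To give a concrete picture of what this step produces, the following is a minimal sketch of reading such a MusicXML file into a flat list of score events, using the open-source music21 toolkit; the article does not specify the authors' own parsing code, and the file name here is hypothetical.

    from music21 import converter

    # Parse a MusicXML file produced by the OMR step (file name hypothetical).
    score = converter.parse("chopin_op27_no2.xml")

    # Collect (score onset in quarter notes, staff index, MIDI pitches) triples.
    # We assume each staff is exposed as a separate part, as after many OMR
    # exports; piano MusicXML may instead encode both staves in a single part.
    events = []
    for staff_idx, part in enumerate(score.parts):  # 0 = upper staff, 1 = lower
        for el in part.flatten().notes:             # notes and chords; rests skipped
            pitches = [p.midi for p in el.pitches]  # works for Note and Chord alike
            events.append((float(el.offset), staff_idx, pitches))

    events.sort()  # order events by notated onset time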

Score-Performance Matching

The symbolic MusicXML scores were then matched on a note-by-note basis to the Magaloff performances, employing a semi-automatic procedure. The matching algorithm is based on an edit-distance metric (Mongeau and Sankoff 1990). The matching results were inspected and, where necessary, corrected manually with an interactive graphical user interface that displays the note-by-note match between the score information and the performance. All incorrectly played notes and performed variants were identified and labeled. (This, incidentally, will also make it possible to perform large-scale, in-depth analyses of the kinds of errors accomplished pianists make; first results of such an analysis are described by Flossmann, Goebl, and Widmer [2009].)

Defining and Measuring Asynchronies

Our aim in the present study was to analyze the between-hand asynchronies of notes that are notated as nominally simultaneous in the score (that is, all tones belonging to the same score event). To that end, we first needed to compute these asynchronies automatically from the corpus. The staff information of the musical notation (upper versus lower staff) was used to calculate the between-hand asynchronies. As the performance data contain no information about which hand played which parts of the music, we assumed that, overall, the right hand played the upper-staff tones and the left hand the lower-staff tones. There are certainly numerous passages where this simple assumption is wrong or unlikely to hold (there is no information about the fingering or hand distribution of Magaloff's performance), but given the sheer size of the data set, the potential bias may be tolerable. We computed the between-hand asynchrony of a given score event by subtracting the (averaged) onset times of the upper staff from the (averaged) onset times of the lower staff ("lower minus upper"). Averaging the note onsets within chords is reasonable, as within-hand asynchronies are usually smaller than between-hand asynchronies (including the restricted melody-lead effect; see Goebl 2001). Following this computation, positive asynchrony values indicate that the upper-staff (right-hand) notes are early, and negative values indicate that the lower-staff (left-hand) notes are early. All notated arpeggios, ornaments, trills, and grace notes were excluded from this preliminary data analysis (about 10 percent of the entire data), as these cases feature special and usually larger asynchronies than regular score events; they deserve a separate detailed analysis that would exceed the scope of the present article.
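In code, the event-wise measure just described might look as follows; a minimal sketch that assumes the notes have already been score-performance matched and that arpeggios, ornaments, and grace notes have been filtered out (the input format is our assumption, not the authors').

    from collections import defaultdict

    def between_hand_asynchronies(matched_notes):
        """matched_notes: iterable of (score_event_id, staff, onset_sec) triples,
        where staff is "upper" or "lower". Returns {event_id: asynchrony_msec}."""
        by_event = defaultdict(lambda: {"upper": [], "lower": []})
        for event_id, staff, onset in matched_notes:
            by_event[event_id][staff].append(onset)

        asynchronies = {}
        for event_id, staves in by_event.items():
            # Only events notated simultaneously in both staves are considered.
            if staves["upper"] and staves["lower"]:
                mean_upper = sum(staves["upper"]) / len(staves["upper"])
                mean_lower = sum(staves["lower"]) / len(staves["lower"])
                # "Lower minus upper": positive = right hand (upper staff) early.
                asynchronies[event_id] = (mean_lower - mean_upper) * 1000.0
        return asynchronies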
Tool for Visualization

For a first intuitive analysis and understanding of this huge amount of measurement data, adequate visualization methods are needed. We therefore developed a dedicated computational visualization tool; a screenshot is presented in Figure 1. It comprises three panels arranged vertically, sharing the same time axis. The upper panel shows the individual tempo curves of the two hands (in the case of multiple onsets in an event within a staff, the average onset is taken to compute tempo information). The middle panel shows the average asynchronies for each score event that contains simultaneous notes in each staff. The lower panel, finally, features a piano-roll representation of the performance, with nominally simultaneous notes connected by (almost) vertical lines. The color (not shown here) of these lines is either red (indicating a right-hand lead) or green (indicating a left-hand lead). The gray area in the middle panel marks a range of ±30 msec within which asynchronies are not likely to be perceived as such (Goebl and Parncutt 2002). Furthermore, the tool indicates occurrences of bass anticipations ("B.A.", lower panel) and out-of-sync regions (horizontal bars, middle panel); see the following descriptions.
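The three-panel layout can be approximated with standard plotting tools. The sketch below uses matplotlib and placeholder data structures; the authors' actual tool is not published with the article.

    import matplotlib.pyplot as plt

    def plot_performance(times, tempo_rh, tempo_lh, async_times, asyncs, notes):
        """notes: list of (onset_sec, duration_sec, midi_pitch) triples."""
        fig, (ax_tempo, ax_async, ax_roll) = plt.subplots(3, 1, sharex=True)

        # Upper panel: per-hand tempo curves.
        ax_tempo.plot(times, tempo_rh, label="Right hand")
        ax_tempo.plot(times, tempo_lh, label="Left hand")
        ax_tempo.set_ylabel("Tempo (bpm)")
        ax_tempo.legend()

        # Middle panel: event-wise asynchronies, with the +/-30-msec band
        # inside which asynchronies are unlikely to be perceived.
        ax_async.axhspan(-30, 30, color="0.85")
        ax_async.plot(async_times, asyncs, ".")
        ax_async.set_ylabel("Asynchrony (msec)")

        # Lower panel: piano roll, one horizontal bar per note.
        for onset, duration, pitch in notes:
            ax_roll.hlines(pitch, onset, onset + duration)
        ax_roll.set_ylabel("Pitch (MIDI)")
        ax_roll.set_xlabel("Time (sec)")
        plt.show()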

Figure 1. Screenshot of the visualization tool showing bars 4-5 of Chopin's Nocturne op. 27, no. 2, as performed by Nikita Magaloff, together with the corresponding score excerpt. The upper panel shows the tempo curves of the two hands (where, for computing the tempo, we average all note onset times within a hand); the middle panel shows the mean asynchronies for events that contain simultaneous notes (positive values indicate an early right hand, negative an early left hand; the central area sketches the ±30-msec region around zero); and the lower panel features a piano-roll representation. All nominally simultaneous notes are connected by (almost) vertical lines that are plotted in red when the melody (right hand) was ahead and in green when it lagged. The black horizontal bars in the middle panel depict the extent of out-of-sync regions (see the text for more information). The authors prepared the score excerpt with notation software, following the Henle edition.

First Results

In the following, we present some preliminary results to demonstrate the scope of findings that such large-scale analyses yield.

Overall Asynchronies

The distribution of all asynchronies between the two hands is shown in Figure 2, including the mean and the mode value. The positive mode reflects an overall tendency for the right hand to be early, which is most likely attributable to the well-understood melody-lead effect (Goebl 2001). Moreover, the mean lies slightly below the mode, reflecting a histogram that is slightly skewed towards the left side. On the negative (bass-lead) side of the distribution in particular, there is a slight surplus of values, most likely due to frequent bass anticipations (thick line below the main histogram). The asynchrony distributions of the individual pieces vary considerably and depend on the specifics of the pieces. The pieces Magaloff played most synchronously are those that feature predominantly chordal textures (op. 40-1, 28-9, 28-20, 1-2; see Figure 3); the least synchronous pieces are those dominated by a single melody over a continuous accompaniment, a texture that leaves more room for artistic interpretation (see the subsequent discussion of the tempo rubato in the earlier meaning).
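The summary statistics of Figure 2 are simple to reproduce given the event-wise asynchronies; a small sketch, assuming the 1-msec bin size used in the figure.

    import numpy as np

    def mean_and_mode(asynchronies_msec, bin_size=1.0):
        """Mean and histogram mode (center of the fullest bin) of signed asynchronies."""
        values = np.asarray(list(asynchronies_msec), dtype=float)
        bins = np.arange(values.min(), values.max() + bin_size, bin_size)
        counts, edges = np.histogram(values, bins=bins)
        mode = edges[np.argmax(counts)] + bin_size / 2.0
        return values.mean(), mode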

Figure 2. Histogram of the signed between-hand asynchronies per event over the entire Chopin corpus (displaying a total of 63,344 asynchronies using a bin size of 1 msec; mean = 4.388 msec, mode = 13 msec). The y-axis is plotted logarithmically to emphasize the distribution of bass anticipations, which are drawn as an additional thicker line below the left-hand portion of the histogram (see the section Bass Anticipations in the text for a definition of this term).

There is a significant effect of speed within the investigated pieces. Figure 3 shows the mean absolute (unsigned) asynchronies per piece (a) and the standard error of the asynchronies (b) against the average event rate (in events per second). An event-rate value was computed for each score event by counting the performed events (chords) within a time window of 3 seconds around it; the average event rate is the piecewise mean of those values. We found that the faster the piece, the lower the absolute asynchrony and the lower the variability of the asynchronies, suggesting that Magaloff used the greater room available in slower pieces to employ expressive asynchronies.

We also examined potential meter effects on the between-hand asynchronies. With only a few exceptions, Chopin's music exhibits four types of meter: 2, 3, 4, and 6 beats per bar (considering only the numerator of the time signature). The majority of pieces are in a triple meter (3 beats per bar): all the mazurkas, waltzes, polonaises, and scherzos, some préludes, some sonata movements, and other pieces besides. The other three meter categories (2, 4, and 6 beats per bar) contain roughly equal numbers of pieces, as well as roughly equal numbers of performed notes. The majority of the nocturnes have 4 beats per bar; the majority of the études, 2 beats per bar. In Figure 4, the mean asynchronies and the 95-percent confidence intervals are plotted against metrical position. Asynchronies that occur between full beats are treated as intermediate categories, because they usually involve fewer notes than those on full beats; they are plotted halfway between the beats in Figure 4. The metrical profiles show a slightly arched shape, with a tendency towards higher (positive) asynchrony values in the inner regions of the bar. However, even though the differences reach statistical significance (owing to the extremely high number of data points), this tendency might be imposed by the larger negative outliers on the strong beats (melody delayed or bass anticipated). This special case is examined next.

Bass Anticipations

A score event is labeled a bass anticipation when the lowest tone of its lower-staff notes is more than 50 msec ahead of the mean onset of the upper-staff tones of that event. The overall distribution of the bass leads is shown in Figure 2 (thicker line on the left side of the histogram), and the individual pieces are shown in Figure 5. The proportion of bass anticipations is lowest on average for the études, the préludes, and the rondos (well below an average of 1 percent of simultaneous events), and highest in the mazurkas and the nocturnes (almost 2 percent). Bass-anticipation ratios of zero were found among the préludes (16 out of 25 did not contain any bass anticipations) and the études (7 of 24). An exception is the Prélude op. 28, no. 2, which exhibits both the highest mean asynchronies and the largest proportion of bass anticipations among all pieces (clearly visible in Figures 3 and 5). This very slow and short piece features a constant eighth-note accompaniment with a single-note melody above it; the sad character and the slow tempo may be the reason for the high temporal independence of the melody in Magaloff's performance. There is also an effect of event rate, suggesting that bass leads become less frequent as the tempo of the pieces increases (see Figure 5). Again, slower pieces leave more room for expressive freedom than faster pieces do. To further analyze the occurrences of bass anticipations, we categorized all score events bar-wise into first beats, on-beats (all beat events except the first beat), and off-beats.
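Both the detection rule and the bar-wise categorization reduce to a few lines; a sketch under the same input assumptions as above (the 50-msec threshold follows the definition at the start of this section).

    def is_bass_anticipation(lower_notes, upper_onsets, threshold_msec=50.0):
        """lower_notes: [(midi_pitch, onset_sec)] for the event's lower staff;
        upper_onsets: [onset_sec] for its upper staff."""
        if not lower_notes or not upper_onsets:
            return False
        _, bass_onset = min(lower_notes)  # onset of the lowest lower-staff tone
        mean_upper = sum(upper_onsets) / len(upper_onsets)
        return (mean_upper - bass_onset) * 1000.0 > threshold_msec

    def metrical_category(beat_position):
        """beat_position: notated position within the bar, in beats (0 = downbeat)."""
        if beat_position == 0:
            return "first beat"
        if beat_position == int(beat_position):  # exactly on a later full beat
            return "on-beat"
        return "off-beat"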

Figure 3. Absolute (unsigned) asynchronies (a) and standard errors (SE) of the mean asynchronies (b) against the mean event rate per piece. The hyphenated numbers refer to the opus numbers of the respective pieces.

Figure 4. Mean between-hand asynchronies against metrical position, separately for each number of beats per bar (2, 3, 4, and 6). All asynchronies between full beats are put into single intermediate categories. Error bars denote 95-percent confidence intervals of the means.

It turns out that metrical position has a significant effect: The highest number of bass anticipations falls on the first beat (1.8 percent of all simultaneous events); other on-beat events receive the next-highest number of bass anticipations (1.48 percent of simultaneous events); and .66 percent of simultaneous events are off-beat events with bass anticipations. This suggests that Magaloff used bass anticipations predominantly to emphasize strong beats.

The Earlier Type of Tempo Rubato

An expressive means with a long performance tradition is the tempo rubato in the earlier meaning (Hudson 1994). It refers to expressive temporal deviations of the melody line while the accompaniment, providing the temporal reference frame, remains strictly in time. Chopin in particular often recommended that his students keep the accompaniment undisturbed, like a conductor, and give the right hand freedom of expression through fluctuations of speed (Hudson 1994, p. 193). In contrast, the later meaning of tempo rubato came increasingly to refer to the parallel slowing down and speeding up of all parts of the music (today more generally referred to as expressive timing). In expressive performance, both forms of rubato can be present simultaneously and can be used as means of deliberate expression.
We aim to identify sequences of the earlier tempo rubato automatically across the entire corpus. To extract overall information about sequences in which Magaloff apparently employed this earlier rubato, we count the out-of-sync regions of each piece. An out-of-sync region is defined as a sequence of consecutive asynchronies, each larger than the typical perceptual threshold (30 msec), but only if the sequence contains more elements (events) than occur per second in that piece on average (i.e., more than 2-13 performed events, depending on the piece; see the x-axis of Figure 6).
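A sketch of this detector, together with the windowed event-rate estimate described earlier (3-second window); implementation details beyond the stated definitions are our assumptions.

    def mean_event_rate(event_times_sec, window_sec=3.0):
        """Average of per-event rates: events within a 3-sec window around each event."""
        rates = [sum(1 for u in event_times_sec if abs(u - t) <= window_sec / 2.0)
                 / window_sec
                 for t in event_times_sec]
        return sum(rates) / len(rates)

    def count_out_of_sync_regions(asynchronies_msec, avg_event_rate, threshold=30.0):
        """Count runs of consecutive |asynchrony| > threshold that contain more
        events than occur per second in the piece on average."""
        regions, run_length = 0, 0
        for a in asynchronies_msec:          # event-wise asynchronies in score order
            if abs(a) > threshold:
                run_length += 1
            else:
                if run_length > avg_event_rate:
                    regions += 1
                run_length = 0
        if run_length > avg_event_rate:      # close a run reaching the final event
            regions += 1
        return regions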

Figure. Proportion of bass anticipations against mean event rate per piece. Zero proportions (34 pieces) were excluded from the calculation of the regression line. Figure 6. The number of out-of-sync regions (earlier tempo rubato) per piece is plotted against the event rate. Proportion of Bass Anticipations.2.1.1. 28 2 63 2 1 6 17 4 r =.349*** n = 11 p <.1 9 2 4 2 8 2 37 1 28 13 41 124 1 62 1 1 2 8 3 9 2 27 2 17 3 33 4 6 28 23 7 2 26 1 17 2 1 1 17 124 4 28 21 3 3 248 1 7 6 3 2 1 4 1 38 1 3 4 3 34 2 36 4 2 6 1 2 23 2 8 48 2 3 1 2 7 7 24 2 1 2 9 1 9 3 28 11 2 4 18 28 2231 3 1 1 3 6 26 3 12 19 28 129 1 16 4 4 1 12 1 4 1 2 11 32 1 28 17 32 2 41 4 44 47 46 3 1 61 1 9 1 64 2 63 3 34 3 3 3 37 2 9 3 6 2 6 1 3 2 3 2 49 27 1 41 3 34 1 2 4 42 41 2 8 1 64 3 64 1 1 7 2 6 6 4 39 3 4 28 228 7 28 4 28 9 28 6 1 11 28 1 33 1 3 4 1 33 3 2 4 28 1 7 3 62 2 7 1 4 1 63 1 26 2 24 3 7 4 28 18 33 2 1 92 9 28 28 24 28 14 2 3 1 1 43 2 1 8 4 28 19 28 3 2 2 1 1 1 8 28 16 2 4 6 8 1 12 Mean Event Rate (events/sec) Number of O o S Regions 1 8 6 4 2 48 2 32 2 r =.349*** n = 89 p <.1 8 317 4 9 2 62 1 2 6 27 2 48 1 28 17 6 3 26 1 2 9 1 3 2 19 3 1 1 28 432 1 1 1 3 3 3 63 2 34 2 7 382 3 11 4 37 1 1 6 4 3 24 1 6 2 37 2 27 1 9 1 49 362 23 2 1 28 2 28 1 2 7 63 3 3 1 33 4 62 2 1 2 24 4 26 2 41 47 6146 8 1 19 29 2 2 3 3 1 328 13 33 1 17 1 4 7 3 3 2 3 24 3 4 2 2 1 41 2 64 3 1 9 31 39 28 28 1 1 12 8 4 2 11 3 4 28 2 28 7 28 9 28 6 1 11 41 1 17 217 3 3 4 1 33 3 28 1 9 29 3 24 2 6 3 6 1 7 1 4 1 63 1 6 4 41 3 28 21 44 6 2 6 1 28 11 3 7 4 28 18 34 1 2 4 7 12 64 2 28 2228 23 33 2 2 9 34 3 2 8 28 12 28 24 28 14 16 4 4 2 3 1 7 42 1 1 2 6 8 2 43 64 1 1 2 28 19 28 3 1 4 1 1 1 1 8 28 16 2 4 6 8 1 12 Mean Event Rate (events/sec) (i.e., more than 2 13 performed notes, depending on the piece; see the x-axis information of Figure 6). We link the search for out-of-sync regions to the average performance tempo (event rate), because faster pieces usually contain many shorter runs that are out-of-sync, but due to the fast tempo, these regions extend only to some fraction of a second. The region counts would otherwise be strongly biased towards higher figures at faster tempi. On average, a piece (or movement, in the case of a sonata) contains 1.8 such regions. The piece category with the lowest numbers are generally the waltzes, preludes, and études (below 1), and the pieces with the highest counts are by far the nocturnes (on average well over ), suggesting that particularly this genre within Chopin s music leaves the most room for letting the melody move freely above the accompaniment. This pattern is not an artifact of piece length; it remains the same when the out-ofsync region counts are normalized by the number of asynchronous events. Figure 6 shows the number of out-of-sync regions per piece against the average event rate of the piece. It demonstrates that faster pieces contain fewer such regions, suggesting that this form of tempo rubato is bound to slower and medium tempi (such as the nocturnes, the slowest category of piece in the Chopin corpus). This overall finding is not surprising; the earlier tempo rubato is expected to be found more often in melodic contexts than in virtuoso pieces, as the historic origins of the earlier tempo rubato go back to vocal music. To illustrate, the example of the visualization tool presented in Figure 1 is briefly discussed. It shows an excerpt (bars 4) of the Nocturne op. 27, no. 2 (including the score of the corresponding bars). 
This example contains two runs of tempo rubato as determined by the algorithm (indicated by horizontal bars in the middle panel). The first starts on the downbeat of bar 5, where Magaloff delayed the melody note by 260 msec, only to be early over the next few notes of the descending triplet passage. The beginning of the 48-tuplet figure (which is likewise interpreted as sixteenth-note triplets) also leads the accompaniment. Towards its end, the second run of tempo rubato as determined by our algorithm begins, just as Magaloff starts to make the melody lag behind the accompaniment. This lag coincides with a downward melodic motion and a notated decrescendo. The following embellishment of the B-flat (notated as thirty-second notes and thirty-second-note triplets) is again clearly ahead of the accompaniment. The first note of the next phrase is also ahead, potentially to underline the notated anticipation of the upcoming harmony change towards E-flat minor. Overall, many occurrences of tempo rubato in its earlier meaning can be found in Magaloff's performances, suggesting that he may have used these runs of between-hand asynchronies as an expressive device.

However, we do not have any information about his particular intentions regarding this parameter of expression. Moreover, we do not have comparable on-stage professional performance data that would let us say whether Magaloff's strategy differs from other performers' strategies.

Modeling of Between-Hand Asynchronies

In the previous section, we described the variety of between-hand asynchronies across Magaloff's performances of Chopin's works. Here, we attempt to model Magaloff's asynchronies and evaluate the degree to which they can be predicted from a battery of (mostly local) score-based features. A probabilistic model (see Lauritzen 1996) was used to learn the dependency of between-hand asynchronies on characteristics of the score. The system, described in Flossmann, Grachten, and Widmer (2009), has already proved suitable for a similar task: learning to predict tempo, loudness, and articulation from score features for the purpose of performance rendering (Widmer, Flossmann, and Grachten 2009). As the system is designed to process melody notes only (although the entire score is known), the asynchrony value for a melody note was calculated by averaging the asynchronies between the left and right hands at the note's onset (as described in the section Defining and Measuring Asynchronies). For melody notes that had no nominally simultaneous score event in the lower staff, a corresponding lower-staff onset value was linearly interpolated from the surrounding (lower-staff) notes.

The score features are the following: the metrical position of a score event within the bar; one binary feature per staff (upper and lower) indicating whether the event consists of a single note or several simultaneous notes; the note-density relation between upper and lower staff (the ratio of the number of onsets in the upper staff to those in the lower staff); the pitch interval from the current melody note to the following one; the ratio of the score durations of two successive melody notes; and, finally, a notion of melodic closure derived from an Implication-Realization (IR) analysis of the data, based on Narmour's (1990) melodic analysis of musical structures and computed automatically (Grachten 2006). With the exception of the IR analysis, where one value may relate to observations spanning several bars, all features describe local characteristics of the score. The data set was grouped by the number of beats per bar, as in the metrical analysis (see Figure 4). The correlation between the predicted and the actual asynchrony values is used as a measure of prediction quality. The predictive quality of a single feature or a combination of several features is indicated by the piecewise correlations averaged over a threefold cross-validation.

As a first attempt at finding significant score characteristics, all possible combinations of the features listed above were evaluated. Close inspection of one of the four data sets (the pieces with two beats per bar) reveals the following. The feature combination yielding the highest average correlation (.13) consists of metrical position, duration ratio, and note-density relation. Two pieces, the Etude op. 25, no. 11 and the Impromptu op. 29, were predicted particularly well, with average correlations over all feature combinations of .22 and .29, respectively.
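The search over feature combinations can be sketched as follows. The article's model is the probabilistic graphical model of Flossmann, Grachten, and Widmer (2009); the sketch below substitutes ordinary linear regression simply to illustrate the cross-validated, correlation-based scoring, and it averages per-fold rather than per-piece correlations.

    from itertools import combinations
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold

    def evaluate_feature_subsets(X, y, feature_names):
        """X: (n_events, n_features) matrix of score features; y: asynchronies (msec).
        Returns {feature subset: mean correlation of predicted vs. actual values}."""
        results = {}
        for k in range(1, len(feature_names) + 1):
            for subset in combinations(range(len(feature_names)), k):
                cols = list(subset)
                corrs = []
                for train, test in KFold(n_splits=3, shuffle=True).split(X):
                    model = LinearRegression().fit(X[train][:, cols], y[train])
                    pred = model.predict(X[test][:, cols])
                    corrs.append(np.corrcoef(pred, y[test])[0, 1])
                results[tuple(feature_names[i] for i in cols)] = float(np.mean(corrs))
        return results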
The best results for the two pieces are .32 (metrical position, multi-voice upper/lower staff, note-density relation, duration ratio, and IR closure) and .48 (metrical position, multi-voice lower staff, note-density relation, and IR closure), respectively. The data set also contained the two pieces with the worst results across all feature combinations: the Prélude op. 28, no. 4 (average correlation -.41) and the Etude op. 10, no. 3 (average correlation -.21). Judging by the fact that these four pieces exhibit rather constant values across all feature combinations, it is very likely that fundamental, structural differences are responsible for the inconsistent results of the model. Further analysis may provide clues concerning the nature of those systematic differences.

Summary and Future Work

This article has presented a computational approach to making large performance corpora accessible to detailed analysis.

We defined and automatically measured between-hand synchronization in one pianist's performances of over 150 pieces by Frédéric Chopin. Working with data sets of that size, i.e., performances of the complete works of a composer, or several hundred thousand played notes, requires, among other things, effective score-performance matching algorithms and interactive graphical user interfaces for post-hoc data inspection and correction. Exploratory data analysis of the between-hand synchronization demonstrated the rich use of asynchronies in Magaloff's Chopin, a historic document of a unique performance project. We sketched overall trends of asynchrony with respect to pieces, tempo, and metrical constraints, as well as the specific cases of bass anticipations and occurrences of tempo rubato in its earlier meaning. Furthermore, we tried to predict Magaloff's asynchronies from a battery of (mostly local) score features with a graphical probabilistic model. It turned out that for certain pieces such a simple model predicted the between-hand asynchronies well, but for many others it failed to do so.

This research endeavor is preliminary as it stands. Building on the insights gained, further efforts will be made to model asynchronies in Romantic scores in the spirit of Nikita Magaloff's intrinsic style. Training machine-learning algorithms on more complex, global aspects of the score, as well as on meta-information about the pieces, might lead to more predictive computational models of between-hand asynchrony. Existing performance-rendering systems can benefit greatly from such models by incorporating this important expressive device, which has hitherto been neglected. Valuable musicological insight can also be gained by trying to describe parts of the data with an interpretable rule system.

The ability to examine performance corpora of this scale automatically opens completely new pathways for computational musicology. Historic documents such as the present corpus come within manageable reach for detailed analysis. Other large corpora, such as piano rolls for historic reproducing pianos or the performance database of the Yamaha e-competition (www.piano-e-competition.com), will be additional sources for future large-scale performance investigations. Finally, detailed knowledge derived from performances by established musicians will help us develop real-time visualization tools that give intelligent feedback to practicing piano students, enhancing their awareness of what they are doing and potentially helping them improve their playing.

Acknowledgments

This research was supported by the Austrian Research Fund (FWF, grants P19349-N15 and Z159 "Wittgenstein Award"). We are indebted to Mme. Irène Magaloff for her generous permission to use her late husband's performance data for our research, and to an anonymous reviewer for very helpful comments.

References

Flossmann, S., W. Goebl, and G. Widmer. 2009. "Maintaining Skill Across the Life Span: Magaloff's Entire Chopin at Age 77." In Proceedings of the International Symposium on Performance Science 2009. Utrecht, The Netherlands: European Association of Conservatoires (AEC), pp. 119-124.

Flossmann, S., M. Grachten, and G. Widmer. 2009. "Expressive Performance Rendering: Introducing Performance Context." In Proceedings of the SMC 2009 6th Sound and Music Computing Conference. Porto, Portugal: Instituto de Engenharia de Sistemas e Computadores, pp. 1-16.

Friberg, A., and A. Sundström. 2002. "Swing Ratios and Ensemble Timing in Jazz Performance: Evidence for a Common Rhythmic Pattern." Music Perception 19(3):333-349.

Goebl, W. 2001. "Melody Lead in Piano Performance: Expressive Device or Artifact?" Journal of the Acoustical Society of America 110(1):563-572.

Goebl, W., and R. Bresin. 2003. "Measurement and Reproduction Accuracy of Computer-Controlled Grand Pianos." Journal of the Acoustical Society of America 114(4):2273-2283.

Goebl, W., and R. Parncutt. 2002. "The Influence of Relative Intensity on the Perception of Onset Asynchronies." In Proceedings of the 7th International Conference on Music Perception and Cognition (ICMPC7). Adelaide, Australia: Causal Productions, pp. 613-616.

Grachten, M. 2006. "Expressivity-Aware Tempo Transformations of Music Performances Using Case-Based Reasoning." PhD thesis, Department of Technology, Pompeu Fabra University, Barcelona, Spain.

Hudson, R. 1994. Stolen Time: The History of Tempo Rubato. Oxford, UK: Clarendon Press.

Lauritzen, S. L. 1996. Graphical Models. Oxford, UK: Clarendon Press.

Mongeau, M., and D. Sankoff. 1990. "Comparison of Musical Sequences." Computers and the Humanities 24:161-175.

Moog, R. A., and T. L. Rhea. 1990. "Evolution of the Keyboard Interface: The Bösendorfer 290 SE Recording Piano and the Moog Multiply-Touch-Sensitive Keyboards." Computer Music Journal 14(2):52-60.

Narmour, E. 1990. The Analysis and Cognition of Basic Melodic Structures: The Implication-Realization Model. Chicago, Illinois: University of Chicago Press.

Palmer, C. 1996. "On the Assignment of Structure in Music Performance." Music Perception 14(1):23-56.

Rasch, R. A. 1979. "Synchronization in Performed Ensemble Music." Acustica 43:121-131.

Repp, B. H. 1996. "Patterns of Note Onset Asynchronies in Expressive Piano Performance." Journal of the Acoustical Society of America 100(6):3917-3932.

Shaffer, L. H. 1984. "Timing in Solo and Duet Piano Performances." Quarterly Journal of Experimental Psychology 36A(4):577-595.

Vernon, L. N. 1936. "Synchronization of Chords in Artistic Piano Music." In C. E. Seashore, ed., Objective Analysis of Musical Performance, Studies in the Psychology of Music, vol. IV. Iowa City, Iowa: University Press, pp. 306-345.

Widmer, G., S. Flossmann, and M. Grachten. 2009. "YQX Plays Chopin." AI Magazine 30(3):35-48.