Multidimensional analysis of interdependence in a string quartet

International Symposium on Performance Science
© The Author 2013, ISBN tbc
All rights reserved

Panos Papiotis, Marco Marchini, and Esteban Maestre
Music Technology Group, Universitat Pompeu Fabra, Spain

In a musical ensemble such as a string quartet, the performers can influence each other's actions in several aspects of the performance simultaneously. Based on a set of recorded string quartet exercises, we carry out a quantitative analysis of ensemble interdependence in four distinct dimensions of the performance (dynamics, intonation, tempo, and timbre). We investigate the fluctuations of interdependence across these four dimensions, and in relation to the exercise that is being performed. Our findings suggest that although certain differences can be observed between the four dimensions, the most influential factor on ensemble interdependence is the musical task, as shaped by the underlying score.

Keywords: interdependence; string quartet; ensemble performance; signal processing; motion capture

Studying the inner workings of joint music performance is a complex task. Previous research (Keller, 2008) points out some important characteristics of ensemble performance: the sharing of a common goal among performers, the implicit (i.e. non-verbal) communication between performers, and the specific ensemble skills required for ensemble cohesion to be achieved. Work on musical collaboration has been carried out for tapping tasks (Repp, 2005) and piano duets (Goebl and Palmer, 2009), among others. On the subject of interdependence (as opposed to synchronization), different computational approaches for intonation and dynamics have been evaluated in Papiotis et al. (2012).

In this article, we analyze several recordings of a professional string quartet in terms of ensemble interdependence: the degree to which the musicians influence each other's performance. We extract numerical features that characterize the produced sound in terms of four performance dimensions (dynamics, intonation, tempo, and timbre), and quantify the amount of interdependence between these features for each pair of musicians. Finally, we aggregate the obtained results to investigate relationships between dimensions, as well as the effect of the underlying musical score on the overall amount of interdependence.

METHOD

Experimental material

The experimental recordings are based on an exercise handbook for string quartets (Heimann, 1995), specifically designed to assist an ensemble in improving its capabilities for collaborative expression. The material is divided into six categories (Dynamics, Intonation, Phrasing, Rhythm, Unity of Execution, and Timbre). We analyzed nine of the recorded exercises; a brief description of each exercise is provided in Table 1.

Table 1. Description of the recorded exercises per category and exercise focus.

ID  Category             Exercise focus                                                                                                      Duration
D1  Dynamics             Gradual ("crescendo/diminuendo") changes in dynamics
D2  Dynamics             Immediate ("subito") changes in dynamics
I1  Intonation           Vertical listening: the ability to adjust one's intonation according to the intonation of the rest of the ensemble  5:00
P1  Phrasing             Synchronous bow strokes of slurred notes ("legato")                                                                 3:00
R1  Rhythm               Small changes in tempo ("poco piu/meno mosso")                                                                      3:00
R2  Rhythm               Different degrees of rhythmic syncopation                                                                           3:00
U1  Unity of Execution   Sound as one instrument (chords)
U2  Unity of Execution   Sound as one instrument (ascending/descending scales)
T1  Timbre               Similar tone quality for different bow/string contact points ("sul tasto/sul ponticello") and different dynamics levels

Each exercise was recorded in two experimental conditions: solo and ensemble. In the first condition (solo), each musician performs their part alone, without having access to the full ensemble score. In the second condition (ensemble), the quartet performs the exercise together following a brief rehearsal period (~10 minutes).

Data acquisition and processing

All exercises were recorded by the same group of professional musicians. Individual audio for each musician is acquired through piezoelectric pickups attached to the bridge of the instrument, while motion capture data are acquired through a wired motion capture (MOCAP) system that tracks the movement of the bow in relation to the instrument strings.

Instrumental (sound-producing) gestures are computed from the raw motion capture data as described in Maestre (2009). For every recording, a semi-automatic alignment between the performance and the music score is carried out using a dynamic programming routine that combines audio and instrumental gesture features to detect note change events.

Interdependence estimation

The general framework for estimating interdependence in a single performance dimension is the following: first, four continuous features (one time series per musician) are extracted from the recorded performances. Then, using a sliding-window analysis, we sequentially calculate the Mutual Information between each pair of features for every window. Finally, a single overall interdependence value is obtained by averaging across all musician pairs and analysis windows (Papiotis et al., 2012).

For the Dynamics dimension, we extract the Root Mean Square (RMS) energy of each musician's individual pickup signal, mapped to a logarithmic scale. For exercises with score-imposed changes in dynamics (D1, D2, and T1), we apply a note-by-note detrending to the log-RMS feature in order to remove any bias introduced by dynamics-related indications appearing in the score.

For the Intonation dimension, we extract the so-called intonation deviations: the difference between the pitch estimated from the recordings and the reference pitch obtained from the aligned score (according to equal temperament).

For the Tempo dimension, we compute a tempo curve for each musician using the note onset times provided by the score-performance alignment. Given the relatively short duration of the exercises, Mutual Information is applied to the entire tempo curves instead of windowing them.

For the Timbre dimension, we use two separate features: the bow-bridge distance, i.e. the distance (in cm) between the bow-string contact point and the instrument's bridge, and the Spectral Crest, a descriptor of spectral peakiness that has low values for noisy signals (and therefore flat spectra) and high values for tonal signals. After computing the amount of interdependence for both features, we average the two results to obtain a single value.

The above procedure is carried out for each recorded exercise, both for the ensemble and for the (artificially synchronized) solo recordings; in this way, solo interdependence is used as a baseline that is subtracted from the ensemble interdependence, removing any bias introduced by the score. As a final post-processing step, we normalize the obtained Mutual Information values per dimension, according to the Euclidean norm across all exercise categories.
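
As an illustration of the feature extraction described above, the sketch below shows how the per-dimension features could be derived from energy, pitch, onset, and spectral data. The function names, frame sizes, the use of cents for the intonation deviations, and the peak-to-mean definition of the Spectral Crest are illustrative assumptions rather than the exact implementation used in the study.

```python
import numpy as np

def log_rms(signal, frame=2048, hop=512):
    """Frame-wise RMS energy of a pickup signal, mapped to a logarithmic (dB) scale."""
    starts = range(0, len(signal) - frame, hop)
    rms = np.array([np.sqrt(np.mean(signal[i:i + frame] ** 2)) for i in starts])
    return 20.0 * np.log10(rms + 1e-12)          # small offset avoids log(0)

def detrend_per_note(feature, note_bounds):
    """Note-by-note detrending: remove each note's mean to discard score-imposed dynamics."""
    out = feature.copy()
    for start, end in note_bounds:               # frame indices from the score alignment
        out[start:end] -= np.mean(feature[start:end])
    return out

def intonation_deviation(f0_estimated, f0_score):
    """Deviation (in cents) of the estimated pitch from the equal-tempered score pitch."""
    return 1200.0 * np.log2(f0_estimated / f0_score)

def tempo_curve(onset_times, score_beats):
    """Local tempo (BPM) from performed note onsets and nominal inter-onset intervals in beats."""
    return 60.0 * np.diff(score_beats) / np.diff(onset_times)

def spectral_crest(magnitude_spectrum):
    """Peak-to-mean ratio of a magnitude spectrum: low for flat (noisy), high for peaky (tonal)."""
    return np.max(magnitude_spectrum) / np.mean(magnitude_spectrum)

# The bow-bridge distance (in cm) is taken directly from the motion capture data.
```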

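Given four such feature series (one per musician), the windowed Mutual Information estimate and the solo-baseline subtraction could be sketched as follows. The histogram-based MI estimator, the window and hop sizes, and the bin count are assumptions made for the sake of the example; the paper does not specify these details.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y, bins=16):
    """Histogram-based Mutual Information (in nats) between two feature series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

def interdependence(features, win=500, hop=250):
    """Average windowed MI over all musician pairs (six pairs for a quartet).

    For the Tempo dimension the windowing is skipped and MI is computed
    over the entire tempo curves, as described above.
    """
    values = []
    for a, b in combinations(range(len(features)), 2):
        x, y = features[a], features[b]
        for start in range(0, len(x) - win + 1, hop):
            values.append(mutual_information(x[start:start + win], y[start:start + win]))
    return float(np.mean(values))

# The (artificially synchronized) solo recordings provide a score-induced baseline,
# which is subtracted from the ensemble value; per-dimension values are then
# normalized by their Euclidean norm across all exercises, e.g.:
#   net = interdependence(ensemble_features) - interdependence(solo_features)
#   normalized = np.array(net_per_exercise) / np.linalg.norm(net_per_exercise)
```
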
RESULTS

Figure 1 shows the mean normalized values for Mutual Information per exercise and performance dimension.

Figure 1. Normalized values of Mutual Information per exercise and performance dimension (DYN = Dynamics, INT = Intonation, TMP = Tempo, TBR = Timbre).

One can first observe that the estimated Mutual Information values for each exercise type vary according to the exercise goal: the Dynamics exercises show the highest interdependence for the Dynamics dimension, the Intonation exercise for the Intonation dimension, the Rhythm exercises for the Tempo dimension, and the Timbre exercise for the Timbre dimension; moreover, the Unity of Execution exercises show the highest interdependence for the Dynamics and Tempo dimensions. The sole exception is the Phrasing exercise, which shows the highest amounts of interdependence for the Intonation and Timbre dimensions but notably lacks interdependence in the Dynamics dimension.

Mean interdependence per dimension across all exercises is, from highest to lowest: Tempo (0.385), Dynamics (0.349), Timbre (0.340), and Intonation (0.306). The small differences across dimensions suggest that each dimension contributes roughly equally to the overall ensemble interdependence.

In addition to interdependence, we calculated two statistics for each exercise: the Mean Absolute Asynchrony (MAA) between each pair of simultaneous notes in the score, and the Mean Note Duration (MND). The obtained values for each exercise are shown in Table 2.

Table 2. Mean Absolute Asynchrony (MAA) and Mean Note Duration (MND) for each exercise.

Exercise ID      D1     D2     I1     P1     R1     R2     U1     U2     T1
MAA (seconds)    0.100  0.091  0.114  0.036  0.042  0.037  0.054  0.022  0.118
MND (seconds)    4.535  4.419  6.555  0.972  0.939  0.572  1.485  0.309  4.624

It can be seen that across the exercises, the asynchrony between musicians varies from small values (~20 milliseconds, U2) to quite large ones (~120 milliseconds, T1). The fact that the Dynamics, Intonation, and Timbre exercises sustain high amounts of interdependence despite these large asynchronies supports the notion that synchronization and interdependence are two separate qualities, each describing a different aspect of ensemble performance. A correlation analysis between Mean Note Duration and the interdependence of each performance dimension revealed a positive correlation for the Dynamics (r = 0.86, p < 0.05) and Intonation (r = 0.79, p < 0.05) dimensions.

Finally, Figure 3 shows the overall amount of interdependence per exercise, averaged across all four dimensions.

Figure 3. Mutual Information values averaged across performance dimensions for each exercise.

One can see that the highest interdependence values occur for the exercises based on simpler concepts (Dynamics, Intonation, Rhythm, and Timbre), whereas the Phrasing and Unity of Execution exercises, which require coordination in multiple aspects simultaneously, sustain lower amounts of interdependence. From this figure, it can be observed that ensemble interdependence is not an ever-present quality, but rather a varying quantity that is strongly influenced by the underlying musical score.
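
For completeness, the two statistics reported in Table 2 and the correlation analysis above can be illustrated with the sketch below; the array shapes, the variable names, and the use of scipy.stats.pearsonr are assumptions for the sake of the example, and the MI values are placeholders rather than the study's data.

```python
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr

def mean_absolute_asynchrony(onsets):
    """Mean |onset difference| over all musician pairs, for score-simultaneous notes.

    onsets: array of shape (4, n_notes) with one aligned onset time (seconds)
    per musician for each nominally simultaneous note.
    """
    pairs = combinations(range(onsets.shape[0]), 2)
    return float(np.mean([np.abs(onsets[a] - onsets[b]) for a, b in pairs]))

def mean_note_duration(note_onsets, note_offsets):
    """Mean performed note duration in seconds, across all notes and musicians."""
    return float(np.mean(np.asarray(note_offsets) - np.asarray(note_onsets)))

# Correlating Mean Note Duration with per-exercise interdependence for one dimension
# (MND values from Table 2; the MI values here are random placeholders):
mnd = np.array([4.535, 4.419, 6.555, 0.972, 0.939, 0.572, 1.485, 0.309, 4.624])
dynamics_mi = np.random.rand(9)
r, p = pearsonr(mnd, dynamics_mi)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```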

DISCUSSION

We directed our focus to a little-researched topic in ensemble music performance: the concept of interdependence between musicians. While some dimensions appear to sustain higher levels of interdependence more commonly than others, the underlying musical task, as the common goal shared by the musicians, is ultimately the most influential factor. We believe that through the analysis of more recordings, the inclusion of musical pieces besides exercises, and a more sophisticated analysis of the musical score, such a methodology can yield important conclusions on the complex subject of joint musical performance.

Acknowledgments

The work presented in this document has been partially supported by the EU-FP7 FET SIEMPRE project and an AGAUR research grant from the Generalitat de Catalunya. The authors would like to thank Marcelo Wanderley, Erika Donald, and Alfonso Perez Carrillo for their support in carrying out the experiments, as well as the CIRMMT and BRAMS labs in Montreal, Quebec, Canada for hosting them.

Address for correspondence

Panos Papiotis, Music Technology Group, Universitat Pompeu Fabra, Tanger 122-144, Barcelona, Catalunya, 08018, Spain; Email: panos.papiotis@upf.edu

References

Goebl, W. and Palmer, C. (2009). Synchronization of timing and motion among performing musicians. Music Perception, 26(5), 427-438.

Heimann, M. (1958). Exercises for the string quartet. E.S.T.A. Denmark.

Keller, P. (2008). Joint action in music performance. Emerging Communication, 10, 205.

Maestre, E. (2009). Modeling instrumental gestures: an analysis/synthesis framework for violin bowing. PhD thesis, Universitat Pompeu Fabra.

Papiotis, P., Marchini, M., and Maestre, E. (2012). Computational analysis of solo versus ensemble performance in string quartets: Dynamics and intonation. In Proceedings of the 12th International Conference on Music Perception and Cognition (ICMPC12), Thessaloniki, Greece.

Repp, B. H. (2005). Sensorimotor synchronization: a review of the tapping literature. Psychonomic Bulletin & Review, 12(6), 969-992.