
This is an electronic reprint of the original article. This reprint may differ from the original in pagination and typographic detail.

Author(s): Thompson, Marc; Diapoulis, Georgios; Johnson, Susan; Kwan, Pui Yin; Himberg, Tommi
Title: Effect of tempo and vision on interpersonal coordination of timing in dyadic performance
Year: 2015
Version:

Please cite the original version: Thompson, M., Diapoulis, G., Johnson, S., Kwan, P., & Himberg, T. (2015). Effect of tempo and vision on interpersonal coordination of timing in dyadic performance. In M. Aramaki, R. Kronland-Martinet, & S. Ystad (Eds.), Proceedings of the 11th International Symposium on CMMR, Plymouth, UK, June 16-19, 2015 (pp. 16-23). Marseille: Laboratory of Mechanics and Acoustics. Retrieved from http://cmr.soc.plymouth.ac.uk/cmmr2015/proceedings.pdf

All material supplied via JYX is protected by copyright and other intellectual property rights, and duplication or sale of all or part of any of the repository collections is not permitted, except that material may be duplicated by you for your research use or educational purposes in electronic or print form. You must obtain permission for any other use. Electronic or print copies may not be offered, whether for sale or otherwise, to anyone who is not an authorised user.

Effect of tempo and vision on interpersonal coordination of timing in dyadic performance

Marc R. Thompson 1, Yorgos Diapoulis 1, Susan Johnson 1, Pui Yin Kwan 1 & Tommi Himberg 2

1 Finnish Centre in Interdisciplinary Music Research, University of Jyväskylä, Finland
2 Department of Neuroscience and Biomedical Engineering, Aalto University, Espoo, Finland.
Corresponding author: marc.thompson@jyu.fi

Abstract. Interpersonal coordination within a dyadic musical performance requires that the two musicians share a similar mental model of the music's timing structure. In addition to a non-fluctuating inter-onset interval, matched mental models can be observed through corporeal articulations and the apparent embodiment of musical features (i.e. synchronous body sway, mimicked or complementary gestures). Our aim was to examine the effect of tempo on interpersonal coordination within a musical dyad. Violin dyads performed three unfamiliar collaborative musical sequences in facing vs. non-facing conditions. Our hypotheses were that interpersonal coordination would be weakened in the non-facing conditions, and that synchronization would be affected by both very slow and very fast tempi. The current paper reports the project's initial and general findings. We present results relating to the dyads' ability to synchronize, and we have performed tests on the motion capture data to examine how movement patterns change between the front- and back-facing conditions.

Keywords: Interpersonal Coordination; Embodiment in performance; Motion and gesture; Musical gesture analysis; Motion Capture System; Inter-onset Interval; dyad

1 Introduction

Interpersonal coordination within a dyadic musical performance requires that the two musicians share a similar mental model of the music's timing structure. Examples of matched interactive behaviour, or entrainment, include synchronous body sway, mimicked or complementary gestures, and a shared focus of attention.
These behaviours unfold spontaneously and are as unpredictable as they are inevitable. Interpersonal coordination of timing within joint musical interactions has traditionally been studied using a finger-tapping paradigm, which has revealed much about tempo cognition. For instance, London [1] noted that there exists a range of greatest pulse salience wherein both musicians and non-musicians can best perform interactive continuation tasks beginning with a pacing metronome of around 120 beats per minute. Maintaining interpersonal synchrony (and keeping the pacing metronome's tempo) is much more difficult at extremely slow and extremely fast tempi [2]. Recent work within embodied music cognition has shown that there exists a clear relationship between corporeal movement (for example in dancing) and music's metrical hierarchy [3]. Optical motion capture has increasingly been employed to study the role of the body in interpersonal coordination while performing a musical task [4]. For instance, Keller and Appel [5] tracked the movements of pianist dyads and found that musical interaction was enhanced when musicians performed in view of one another. Other studies have investigated the effects of knowing (or not knowing) the partnering musician's part [6] and of leader/follower dynamics [7]. Interestingly, Himberg and Thompson [8] investigated entrainment in dance/vocal performance within groups of experts and novices. Their findings revealed that interpersonal coordination between expert and novice dancers did not follow a leader/follower dynamic; rather, groups remained synchronized through mutual adaptation in response to each other's movements. Keller [9] provides a framework with which to study interpersonal coordination within musician dyads at both the descriptive and explanatory levels. The descriptive level deals

with quantifying behavioural measures such as the dyad's ability to remain temporally and expressively synchronized. The explanatory level deals with psychological mechanisms related to interpersonal coordination. Crucial to this level are musicians' cognitive-motor facilities for adapting their own behaviour while attending to and anticipating online changes in their partner's behaviour. In the current work, we focus on the descriptive level by examining the variability of note onsets within performances, as well as corporeal coordination as indicated by motion capture data. This project offers a novel contribution to work on interpersonal coordination in dyadic performance by 1) using instruments that allow for a full range of motion (i.e. violins performed while standing) and 2) using full-body motion capture for kinematic analysis. Our main hypotheses are that interpersonal coordination would be weakened when musicians perform outside of each other's visual field, and that synchronization would be affected by both very slow and very fast tempi. The current paper reports the project's initial and general findings. We present results relating to asynchrony measures, circular statistics measures, and how movement patterns may change between the front- and back-facing conditions.

2 Methodology

2.1 Participants

Seven violin dyads participated in this study (14 musicians in total; 12 female; age: mean = 24.4, sd = 2.4). The violinists were recruited from student populations at the University of Jyväskylä and the Jyväskylä University of Applied Sciences. The musicians had received on average 16.7 (sd = 3.5) years of instrumental training on the violin.

2.2 Procedure

The experiment consisted of dyads performing three sets of musical sequences at four different tempi in facing and non-facing conditions (a 3 × 4 × 2 design). The tempi corresponded to inter-onset intervals (IOIs) of 1 s (60 BPM), 0.667 s (90 BPM), 0.5 s (120 BPM) and 0.4 s (150 BPM).
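The tempo-to-IOI mapping above is simply the reciprocal of the beat rate. A minimal sketch (in Python rather than the MATLAB used for the study's analyses):

```python
# Convert a metronome tempo in beats per minute to the corresponding
# inter-onset interval (IOI) in seconds: IOI = 60 / BPM.
def bpm_to_ioi(bpm):
    return 60.0 / bpm

# The four tempi used in the experiment:
for bpm in (60, 90, 120, 150):
    print(f"{bpm} BPM -> IOI = {bpm_to_ioi(bpm):.3f} s")
```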
For all seven dyads, the set order was identical (the simplest sequence was performed first, and the most complex last). However, within each set, the trial order was randomized. To factor out effects of leader-follower relationships, we randomly selected which player would begin each sequence. Each performance began with an eight-beat pacing metronome, which was turned off as the musicians began playing. A description of the sequences is given in Table 1. The first, Alternating Repeating Note, is meant to simulate a tapping paradigm in that both players perform the same action (playing identical notes on the violin). The second and third (Echo Scale, in which the second player repeats the first player's notes, and Alternating Scale, an interleaved two-note scale pattern) are increasingly complex, making the task of remaining synchronized more difficult. For each sequence, the participants were instructed to focus on remaining coordinated with their partner rather than on keeping to the intended BPM. While scores were on hand to clarify what each violinist should play, musicians were encouraged to perform without them.

Table 1. Description of the performed musical sequences. The notes in brackets indicate notes performed by the second player.

Sequence                      Description
Alternating Repeating Note    Do (Do) Do (Do) Do (Do) Do (Do)
Echo Scale                    Re (Re) Fa (Fa)
Alternating Scale             Do (Re) Mi (Fa) Sol (La)

2.3 Data preprocessing

Audio. Audio of the experimental trials was recorded using two AKG C417 L wireless microphones. The microphones were positioned near each violinist's right ear lobe and secured with adhesive tape. Note onsets were detected from the recorded audio files in Pro Tools using the "tab to transient" feature, which locates the transient portion of a waveform. An edit break was created at each note onset, and mini audio files, each representing a single note, were exported from Pro Tools. The audio files were loaded into MATLAB (using the wavread function), and matrices of inter-onset intervals (IOIs) were created, with each value representing the number of audio samples in each consecutive audio clip.

Motion capture. Optical motion capture data were recorded using 8 Qualisys Oqus cameras at 120 frames per second. Twenty-six markers were placed on the joints of each musician, and five markers were placed on the instrument (2 on the bow and 3 on the violin itself). The data were labelled in the Qualisys Track Manager software and analyzed in MATLAB using functions from the MoCap Toolbox [10]. Regular video recordings of the trials were made for reference.

3 Results

In this section, we present preliminary analyses completed as of the end of February 2015, as not all data had been fully preprocessed. Figure 1 gives examples of representative data. The figure plots the inter-onset intervals (IOIs) for Dyad 1 performing the Echo Scale in the front-facing conditions.
This figure demonstrates that in performances at the slowest tempo (IOI = 1 s), musicians tended to perform faster than the pacing metronome, while at the fastest tempo (IOI = 0.4 s), musicians performed slower than the pacing metronome.

3.1 Mean Asynchronies

Mean asynchronies were calculated to evaluate the timing coordination within each performance. First, we calculated the difference in milliseconds between the musicians' note onsets (i.e. in each dyadic performance, the onsets of Player 2 were subtracted from the onsets of Player 1). Second, we computed the absolute value of each asynchrony, and calculated the mean and standard deviation to obtain an overall index of asynchrony for the entire performance. The means of these unsigned asynchronies provided a measure of synchronization accuracy for each performance, while their standard deviations provided a measure of synchronization stability.
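The asynchrony computation just described can be sketched as follows (a Python illustration with made-up onset times; the study's analysis was carried out in MATLAB):

```python
from statistics import mean, stdev

# Hypothetical note-onset times in seconds for the two players of a dyad.
p1_onsets = [0.00, 1.01, 1.99, 3.02, 4.00]
p2_onsets = [0.02, 0.99, 2.03, 3.00, 4.05]

# Signed asynchronies: Player 2's onsets subtracted from Player 1's.
signed = [a - b for a, b in zip(p1_onsets, p2_onsets)]

# Unsigned asynchronies: their mean gives an overall asynchrony index
# for the performance, their standard deviation its variability.
unsigned = [abs(d) for d in signed]
asynchrony_mean = mean(unsigned)   # in seconds; multiply by 1000 for ms
asynchrony_sd = stdev(unsigned)
```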

Fig. 1. IOI plots for Dyad 1 performing the Echo Scale sequence.

We conducted three-way ANOVAs to explore the effects of Task, Tempo and Eye contact on these measures. For synchronisation accuracy, we found significant main effects of Task (F(2,72) = 53.9, p < 0.0001) and Tempo (F(3,72) = 24.25, p < 0.0001), a significant two-way interaction of Task*Eye contact (F(2,72) = 177.63, p < 0.0001), and a significant three-way interaction of Task*Tempo*Eye contact (F(6,72) = 3.78, p = 0.0025). Synchronisation accuracy improved as tempo got faster, and it was best in the Alternating Repeating Note task, second best in Echo Scale, and worst in Alternating Scale. The two-way interaction is plotted in Figure 2A. For synchronisation stability, we found significant main effects of Task (F(2,72) = 11.2625, p < 0.0001) and Tempo (F(3,72) = 4.3051, p = 0.0075). The two-way interactions of Task*Tempo and Task*Eye contact and the three-way interaction of Task*Tempo*Eye contact were also significant. The main effects were very similar to those found for synchronisation accuracy: synchronisation stability was best in the Alternating Repeating Note (ARN) task, followed by Echo Scale (ES) and Alternating Scale (AS). The second fastest tempo was the least stable, with stability increasing at faster tempi. The two-way interaction Task*Tempo is plotted in Figure 2B.

Fig. 2. (A) Two-way interaction Task*Eye contact for synchronisation accuracy. (B) Two-way interaction Task*Tempo for synchronisation stability. The error bars represent the standard error of the mean.

3.2 Circular Features

Onset time series were converted into phase-difference time series using three different baselines. The concentration and direction of the circular distributions of these phase values represent three complementary measures of performance: using a participant's own previous inter-onset interval as the baseline gives a measure of self-stability; using the imagined metronome as the baseline yields a measure of tempo stability; and using the other performer's inter-onset intervals as the baseline gives a measure of entrainment between the participants. For each circular distribution we calculated the mean direction (theta), representing the mean phase error or asynchrony between the participant's onsets and the baseline (accuracy), and the concentration measure (R), which indicates how stable that relationship is [11]. Figure 3 shows the summary for Dyads 1-4. The self-stability measure is concentrated around 0 degrees with a high R value, indicating that participants were able to maintain a steady beat. Tempo stability is concentrated in the middle of the circle, showing that the tempo of the dyads differed from the tempo of the pacing metronome. The entrainment measure is concentrated at 180 degrees, reflecting anti-phase entrainment, with the very high R values signalling that, overall, the dyads remained strongly entrained to each other. In sum, Figure 3 demonstrates that the violinists were much better at synchronizing with each other than at staying with the metronome.
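The circular measures can be illustrated with a short sketch (Python here, with hypothetical phase values rather than the study's data): each onset's phase relative to a baseline is mapped to a unit vector on the circle, and the mean resultant vector yields the mean direction theta and the concentration R.

```python
import cmath

# Hypothetical phase differences (radians) relative to some baseline
# (own previous IOI, the imagined metronome, or the partner's onsets).
phases = [0.05, -0.10, 0.02, 0.08, -0.03]

# Mean resultant vector: average of the unit vectors e^(i*phase).
vector = sum(cmath.exp(1j * p) for p in phases) / len(phases)

theta = cmath.phase(vector)  # mean direction: mean phase error (accuracy)
R = abs(vector)              # concentration, 0..1: stability of the relation
```

Phases tightly clustered near 0 give theta close to 0 and R near 1 (a steady, accurate relationship); phases spread evenly around the circle drive R towards 0.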

Fig. 3. Summary of circular statistics for Dyads 1-4 (all tempo conditions for all three sequences; plotted measures: self-stability, tempo stability, entrainment).

3.3 Motion Analysis

We had first aimed to conduct a corporeal synchrony analysis based on the swaying motions of the violinists. However, upon viewing the video footage, the musicians appeared to sway very little. We therefore opted to investigate movement using a simple measure: the total amount of movement in the facing conditions vs. the total amount of movement in the non-facing conditions. The total amount of movement per trial was calculated as the cumulative distance travelled by 14 markers (per individual): head (mean of 4), shoulders (mean of 2), hips (mean of 4), knees (mean of 2), ankles (mean of 2), toes (mean of 2), and violin scroll. We then summed the two violinists' movement data. Figure 4 shows the difference in movement between the front-facing and back-facing conditions for Dyad 2. In each plot, the y-axis represents the total amount of movement within a trial. The values for total distance travelled have been normalized for visual presentation. Figure 4 shows that, generally, there is more movement in the front-facing conditions for the slowest and fastest tempi, and these differences are consistently significant at p = .05 (paired t-tests between facing and back-facing trials). The differences are less pronounced in the two mid-tempo trials. The slowest and fastest trials are the hardest in which to maintain synchrony. While Figure 4 covers only one dyad, observation of the video recordings indicates that similar results hold for the other dyads.
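The per-trial movement quantity described above is a cumulative distance: the sum of Euclidean distances between a marker's positions in consecutive frames. A minimal sketch (Python, with toy coordinates rather than the Qualisys data):

```python
import math

# Hypothetical 3-D positions (metres) of one marker in consecutive frames.
frames = [(0.00, 0.00, 1.50), (0.01, 0.00, 1.50), (0.01, 0.02, 1.50)]

def cumulative_distance(traj):
    """Total distance travelled by a marker across consecutive frames."""
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

total = cumulative_distance(frames)  # 0.01 m + 0.02 m = 0.03 m
```

Per-individual totals would then average such values within each marker group (head, shoulders, hips, etc.) before summing across markers and players.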

Fig. 4. The amount of total movement in each trial for Dyad 2. The p-values are the results of paired t-tests between the facing and back-facing conditions.

4 Discussion

This paper gives a general report on a study of interpersonal coordination in dyadic performance. Thus far, we have examined the inter-onset intervals, calculated circular statistics to measure synchronization within the dyads, and conducted paired t-tests on the total amount of movement per trial to see whether movement patterns differed between the front- and back-facing conditions. Regarding the inter-onset interval analysis, we found that dyads sped up at the slow tempo (60 BPM) but slowed down at the fast tempo (150 BPM). The mid-range BPM trials varied less from the pacing metronome. Dyads slowed down or sped up towards their comfortable rate. The circular statistics show clearly that the musicians concentrated more on remaining entrained to each other than on performing to the pacing metronome (the original BPM became irrelevant soon after the pacing metronome stopped). The movement analysis showed that trials at slow and fast tempi yielded more significant differences in the amount of movement between the front- and back-facing conditions. Because the extreme-tempo trials were more difficult to perform, musicians used gestures to a greater extent to entrain with each other (more so than in the mid-tempo trials). The facing conditions provided an additional channel for interaction, and the violinists took advantage of being able to see their partner. Future directions of this project will investigate the effect of social interaction on interpersonal coordination. For instance, there were numerous occasions of laughter whenever one of the musicians made a mistake. The laughter was more prominent in the front-facing conditions. There were also issues regarding leader/follower relationships. Though we did not assign leader/follower roles, we are interested to see whether such roles

developed naturally within dyads, and whether the roles were consistent throughout the trials. Finally, future work will provide a more concrete analysis of the embodiment of meter and synchronized movement using windowed cross-correlation analysis. Though synchrony and body sway were not immediately visible from the video alone, a full kinematic analysis of the motion capture data will be conducted to investigate subtler interactive gestures.

Acknowledgements: Tommi Himberg's work was funded by the European Research Council (Advanced Grant #232946 to R. Hari).

References

1. London, J. (2006). How to talk about musical meter. Retrieved September 20, 2013, from http://tiny.cc/pf1w3w.
2. Rasch, R. A. (1988). Timing and synchronization in ensemble performance. In Sloboda, J. A., editor, Generative processes in music: The psychology of performance, improvisation, and composition, pages 70-90. Clarendon Press, Oxford, England.
3. Toiviainen, P., Luck, G., and Thompson, M. R. (2010). Embodied meter: Hierarchical eigenmodes in music-induced movement. Music Perception, 28(1):59-70.
4. Repp, B. H. and Su, Y.-H. (2013). Sensorimotor synchronization: A review of recent research (2006-2012). Psychonomic Bulletin & Review, 20:403-452.
5. Keller, P. E. and Appel, M. (2010). Individual differences, auditory imagery, and the coordination of body movements and sounds in musical ensembles. Music Perception, 28(1):27-46.
6. Ragert, M., Schroeder, T., and Keller, P. E. (2013). Knowing too little or too much: The effects of familiarity with a co-performer's part on interpersonal coordination in musical ensembles. Frontiers in Psychology, 4:368.
7. Goebl, W. and Palmer, C. (2009). Synchronization of timing and motion among performing musicians. Music Perception, 26:427-438.
8. Himberg, T. and Thompson, M. R. (2011). Learning and synchronising dance movements in South African songs: Cross-cultural motion-capture study. Dance Research, 29(special electronic issue):305-328.
9. Keller, P. E. (2013).
Ensemble performance: Interpersonal alignment of musical expression. In Fabian, D., Timmers, R., and Schubert, E., editors, Expressiveness in music performance: Empirical approaches across styles and cultures. Oxford University Press.
10. Toiviainen, P. and Burger, B. (2011). MoCap Toolbox Manual. University of Jyväskylä, Jyväskylä, Finland. Available at http://www.jyu.fi/music/coe/materials/mocaptoolbox/mctmanual.
11. Himberg, T. (2014). Interaction in musical time. Doctoral dissertation, Faculty of Music, University of Cambridge, Cambridge, UK.