Measurement of Motion and Emotion during Musical Performance

R. Benjamin Knapp, PhD (b.knapp@qub.ac.uk), Javier Jaimovich (jjaimovich01@qub.ac.uk), Niall Coghlan (ncoghlan02@qub.ac.uk)

Abstract

This paper describes the use of physiological and kinematic sensors for the direct measurement of physical gesture and emotional changes in live musical performance. Initial studies on the measurement of performer and audience emotional state in controlled environments serve as the foundation for three pieces using the BioMuse system in live performance. By using both motion and emotion to control sound generation, the concept of integral music control has been achieved.

1. Introduction

The relationship between emotion and music has become an obsession for researchers and popular culture over the past several years. With popular books such as Musicophilia [1] and This Is Your Brain on Music [2] topping the best-seller lists, it is evident that this topic indeed has a very broad appeal. The field covers topics ranging from musicology to psychology, and from social science to computer science. This paper focuses on one subset of this broad field: the concept of using direct, on-body measurement of gesture and emotion to interact with digital musical instruments (DMIs). While research on the introduction of emotion as a component of human-computer interaction has been ongoing for many years (a good collection of articles can be found in [3]), the concept of integral music control, the capability to use both motion and emotion in controlling DMIs, has been around a comparatively short time [4][5][6]. In this paper, we briefly describe our research into integral music control and then present several examples of its use in live performance.

2. Review of Integral Music Control

Integral Music Control (IMC) is defined in [4] as a controller that:

1. Creates a direct interface between emotion and sound production unencumbered by a physical interface.

2. Enables the musician to move between this direct emotional control of sound synthesis and the physical interaction with a traditional acoustic instrument, and through all of the possible levels of interaction in between.

Figure 1 shows the standard technique of controlling sound generation: a thought creates a gesture, which then controls a sound generator. Both the sounds and the proprioception from the physical interaction of creating the sound are then sensed by the performer, creating a direct feedback loop. The concept of integral music control opens up the possibility of adding direct measurement of emotion as another means of interaction.

Figure 1 (from [4]): The three layers of performance feedback using IMC, with the audience included. Layer 1 represents the internal emotion and thoughts of the performer; layer 2 is the physical interface layer; layer 3 represents the consequence of the gesture - the creation of music. The dashed line represents a new path of direct measurement of emotion.

As can be seen in Figure 1, even direct measurement of the audience's emotional state can be used to manipulate sound. The question then becomes: what techniques can be used to directly measure motion and emotion during live musical performance to enable this kind of integral control?
Coupled with kinematic sensors such as gyros, accelerometers, and magnetometers, the responsiveness of physiological sensors to both motion and emotion makes them an ideal component for use as part of an IMC.

3. An Instantiation of IMC: The BioMuse System

There are many techniques for the measurement of emotion, including visual recognition of facial expression, auditory recognition of speech, and pattern recognition of physiological signals. For most musical performance environments, visual and auditory recognition systems would not be appropriate. Thus, physiological signals are the most robust technique for determining emotional state for direct emotional control of a digital musical instrument.

The BioMuse system used in this research is composed of body-worn sensors that enable unencumbered movement during live performance. Bluetooth transmitters made by Infusion Systems are used, allowing up to eight external sensor inputs. This enables several classes of sensors to be combined:

1. Kinematic sensors that measure motion of the body for use in physical gesture interaction. As mentioned previously, these include gyros, accelerometers, and magnetometers.

2. Physiological sensors that measure somatic activity for use in physical gesture interaction. These include the EMG sensors (bioflex) and EMG extracted from the EEG sensors (biowave).

3. Physiological sensors that measure autonomic activity for use in emotional state measurement. These include ECG sensors (biobeat), EEG sensors (biowave), and GSR sensors (BioEmo).

It should be noted that EMG sensors can also be used as an indicator of emotional state.

Figure 3: The EMG sensor for the BioMuse system.

4. Exploring the Effects of Emotion on Performance

In order to use emotion as an effective means of DMI control, the relationship between physiology and emotion during live performance has to be understood. There is a large body of literature relating physiological measurement to emotion (see [7][8] for a good summary). Recent work has even focused specifically on using physiological signals for emotion recognition while listening to music [9]. However, very little research has focused specifically on the measurement of the emotion of performers and audience during live performance. Some recent work [10][11][12] has begun to shed light on this important area.

Figure 2: Analysis tool for exploring the relationship between physiological, kinematic, audio, and video signals during live performance.
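As a purely illustrative aside, the kind of low-level features these three sensor classes might yield can be sketched as follows. The function names, the 250 Hz sampling rate, and the window lengths are assumptions chosen for clarity, not details of the BioMuse implementation.

```python
# Illustrative sketch only: an EMG RMS envelope (somatic gesture), a tonic GSR
# level (autonomic arousal), and a wrist tilt angle (kinematic control).
# Names, sampling rate, and window lengths are assumptions, not the BioMuse code.
import numpy as np

FS = 250  # assumed common sampling rate in Hz

def emg_envelope(emg, win_s=0.1):
    """RMS envelope of an EMG channel, a common basis for continuous gesture control."""
    n = max(1, int(win_s * FS))
    padded = np.concatenate([np.zeros(n - 1), np.asarray(emg, float) ** 2])
    return np.sqrt(np.convolve(padded, np.ones(n) / n, mode="valid"))

def gsr_tonic(gsr, win_s=5.0):
    """Slowly varying (tonic) skin-conductance level, a rough autonomic arousal cue."""
    n = max(1, int(win_s * FS))
    return np.convolve(np.asarray(gsr, float), np.ones(n) / n, mode="same")

def wrist_tilt_deg(ax, ay):
    """Wrist tilt from a bi-axial accelerometer, usable as a kinematic controller."""
    return np.degrees(np.arctan2(ay, ax))

# Quick check with synthetic data: a simulated muscle contraction after t = 1 s.
t = np.arange(0, 2, 1 / FS)
emg = np.random.randn(t.size) * (0.2 + 0.8 * (t > 1))
env = emg_envelope(emg)
print(env[-1] > env[0])  # the envelope rises after the simulated contraction
```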

4.1. Measurement of the Performer

Over the past three years, a collaboration between SARC and the University of Genoa DIST has begun to use on-body physiological and kinematic sensors, as well as high-speed cameras, to explore the interaction of emotion and performance [10][11]. Performances by violinists, chamber music quartets, and traditional Irish music quartets have all been analyzed. As shown in Figure 2, signals from all of the BioMuse system sensors, coupled with audio and high-speed video, can be analyzed to find patterns within the data.

In order for physiological data to be incorporated into IMC, the relationship between an emotion that is expressed during a performance and an emotion that is truly felt must be understood. A series of experiments using psychological emotion-induction techniques has begun to shed light on this relationship. For example, Figure 4 shows the relationship between the average heart rate (HR) of a violinist playing a Bach Canon without emotion and when playing under four conditions:

1. Expressed happiness
2. Induced happiness
3. Expressed sadness
4. Induced sadness

This clearly shows that, while there is little difference in HR between protocols in the happy condition, there is a significant difference in the sad condition.

Figure 4 (from [11]): Relationship of the average heart rate (HR) during performance compared to the neutral-state average HR.

Further results from the experiments show that while physiological and kinematic signals during performances can begin to provide clues as to emotional state, they are highly affected by the underlying emotion of the performer. Thus, it is clear that much more research is needed in this area, and the question becomes: can these signals be used as part of IMC?

4.2. Measurement of the Audience

As was discussed previously, the emotional state of the audience can also be used to control a DMI. A series of experiments at SARC has begun to focus on the measurement of audience emotional state using only heart rate and GSR sensors built into the audience's chairs.

Figure 5: Physiological sensors attached to the arm of a chair.

Results from these sensors were compared to the Self-Assessment Manikin (SAM) to understand the relationship between HR, GSR, and the assessed emotional state (see Figure 6). From this it was clear that there was indeed a relationship between the emotional state of the audience and the changing physiological parameters. As with the performer data, the results with the audience data are preliminary, but they demonstrate the ability to measure and analyze physiological signals from an audience in real time. This data can then be used to directly control a DMI.

Figure 6: Heart rate and heart rate variability of two audience members compared to their Self-Assessment Manikin (SAM) ratings before and after listening to a live musical performance.
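As a hypothetical illustration, the per-listener quantities behind a comparison like the one in Figure 6 can be reduced to a handful of numbers: mean heart rate, a short-term heart-rate-variability measure such as RMSSD, and a change in skin conductance. The feature choices, sampling rate, and window lengths below are assumptions and do not reproduce the actual audience analysis.

```python
# Illustrative sketch of summarising chair-sensor data per audience member for
# comparison with SAM ratings. Features, the 20 Hz GSR rate, and the 30 s windows
# are assumptions, not the analysis used in the study described above.
import numpy as np

def mean_hr_bpm(ibi_s):
    """Mean heart rate in beats per minute from inter-beat intervals in seconds."""
    return 60.0 / float(np.mean(ibi_s))

def rmssd_s(ibi_s):
    """RMSSD, a standard short-term heart-rate-variability measure, in seconds."""
    return float(np.sqrt(np.mean(np.diff(ibi_s) ** 2)))

def gsr_delta(gsr, fs=20, win_s=30):
    """Mean skin conductance over the final window minus the opening window."""
    n = int(win_s * fs)
    return float(np.mean(gsr[-n:]) - np.mean(gsr[:n]))

# Fake data for one listener during a five-minute excerpt.
ibi = np.random.normal(0.85, 0.05, 300)                    # simulated inter-beat intervals
gsr = np.cumsum(np.random.normal(0.0005, 0.01, 20 * 300))  # slowly drifting GSR trace
print({"mean_hr": mean_hr_bpm(ibi), "rmssd": rmssd_s(ibi), "gsr_delta": gsr_delta(gsr)})
```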
5. Three performance examples

Over the past year, three musical performances have been staged to demonstrate the use of integral music control in live performance.

5.1. BioMuse Trio

This piece, performed at the New Interfaces for Musical Expression (NIME) conference in Pittsburgh in June 2009, consisted of a trio of a violin performer, a laptop performer, and a performer using the BioMuse system. The BioMuse performer had EMG sensors on the front and back of each forearm and bi-axial accelerometers placed on the back of both wrists. From these sensors, both continuous gestures and pattern recognition of discrete gestures were used to control sonic manipulations of the violin performance sampled by the laptop. Unlike previous uses of the BioMuse, every gesture was annotated into a full musical score. While no direct measurement of emotion was used in this performance, the piece demonstrated the precise control of physiological and kinematic signals that is possible during performance, and thus the viability of using the BioMuse as a highly responsive chamber music instrument.
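To make the two control modes concrete, the following hypothetical sketch maps an EMG envelope to a continuous control value and uses a crude threshold rule to stand in for discrete-gesture pattern recognition. The thresholds and gesture labels are illustrative and do not come from the piece itself.

```python
# Illustrative sketch of continuous and discrete gesture control from forearm
# EMG and wrist acceleration. Ranges and the threshold "classifier" are
# stand-ins for whatever pattern recognition the performance actually used.
import numpy as np

def continuous_control(env, lo=0.05, hi=1.0):
    """Map an EMG envelope sample to a 0..1 value, e.g. for a filter or gain."""
    return float(np.clip((env - lo) / (hi - lo), 0.0, 1.0))

def classify_gesture(flexor_rms, extensor_rms, wrist_accel_g):
    """Toy discrete-gesture decision from per-window forearm and wrist features."""
    if flexor_rms > 0.6 and wrist_accel_g > 1.5:
        return "strike"    # sharp contraction plus fast wrist motion
    if extensor_rms > 0.4 and wrist_accel_g < 0.3:
        return "hold"      # sustained extension with the arm nearly still
    return "rest"

print(continuous_control(0.4))                        # about 0.37
print(classify_gesture(0.8, 0.1, wrist_accel_g=2.0))  # "strike"
```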

5.2. Reluctant Shaman

This piece was performed at the International Computer Music Conference in Belfast in 2008. The piece explored integral music control within the context of Irish traditional music and traditional music instrumentation. The audience wore earphones while watching a live performance. In the earphones, the audience heard exactly what the main character would hear if he were walking through an open field. They heard sonification of his heart beating and his breath, as measured by the BioMuse ECG sensor; thus the audience was able to infer his emotional state from the sounds of his breathing and heartbeat. He was also able to cause a stick to play a flute sound through the sonification of his gesture as measured by the BioMuse EMG sensor. Additionally, using the EMG sensors, the sounds created when he played a pair of wooden spoons were sonically augmented. By measuring the direction of his gaze with a magnetometer worn under his hat, the audience heard the environmental sounds exactly as he would have heard them if he were actually present in a field. They heard sonification of his footsteps, as measured by sensors on his shoes, as if he were walking in that field. The whistle player was also able to control sound with his gestures using the EMG sensors. The clarity of the audience's view of the performance (the lighting) was also modified based on the audience's emotional state as measured by the GSR sensors of the sensor chairs. Thus the audience's emotion adjusted the environment of the piece, affecting the performer's emotional state, which was then, in turn, presented back to the audience as a sonification of his breathing and heartbeat.

5.3. Stem Cells

This piece will be performed at the International Music and Emotion Conference in Durham, United Kingdom, in August 2009. It uses an existing composition originally written for laptop performance. The piece transitions between movements that require physical gesture control and movements that use direct emotional control, thus demonstrating two of the four elements of integral music control within one piece. Physical gesture is measured in much the same way as in the BioMuse Trio piece; emotional state is measured using the BioMuse system's EMG, ECG, EEG, GSR, and breath sensors. The piece opens with the performer gradually changing from a state of serenity to a state of extreme anger, causing the sound field to become increasingly complex. At the end of the piece, the performer changes from a state of joy back to a final state of serenity. Stem Cells thus demonstrates the use of precise emotional control throughout the performance.
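The gaze-to-environment mapping in Reluctant Shaman (Section 5.2) lends itself to a small worked example. The following sketch is hypothetical: it derives a compass heading from a magnetometer and applies standard equal-power panning to a single fixed environmental source. The stereo rendering and the folding of sources behind the head are simplifying assumptions, not a description of the audio engine actually used in the piece.

```python
# Illustrative sketch: pan a fixed environmental sound source according to the
# performer's gaze heading, so listeners hear it from his point of view.
import math

def heading_deg(mx, my):
    """Compass heading in degrees (0..360) from two horizontal magnetometer axes."""
    return math.degrees(math.atan2(my, mx)) % 360.0

def pan_gains(source_bearing_deg, gaze_heading_deg):
    """Equal-power (left, right) gains for a source heard relative to the gaze direction."""
    rel = math.radians((source_bearing_deg - gaze_heading_deg + 180.0) % 360.0 - 180.0)
    rel = max(-math.pi / 2, min(math.pi / 2, rel))   # fold sources behind the head
    angle = (rel + math.pi / 2) / 2.0                # 0..pi/2 across the stereo field
    return math.cos(angle), math.sin(angle)

# A source 90 degrees to the performer's right lands mostly in the right channel.
print(pan_gains(90.0, heading_deg(1.0, 0.0)))  # approximately (0.0, 1.0)
```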
6. Conclusions

In this paper we have described the measurement of motion and emotion during musical performance using the BioMuse system. By studying physiological and kinematic signals, an understanding of how these data can be incorporated into live performance is beginning to emerge. This research demonstrates that integral music control could be a new and exciting area for composition and performance.

References

[1] O. Sacks. Musicophilia: Tales of Music and the Brain. Knopf, New York, NY, 2007.
[2] D.J. Levitin. This Is Your Brain on Music: The Science of a Human Obsession. Dutton Books, 2006.
[3] C. Peter and R. Beale (eds.). Affect and Emotion in Human-Computer Interaction: From Theory to Applications. Lecture Notes in Computer Science, Vol. 4868, 2008.
[4] R.B. Knapp and P.R. Cook. The Integral Music Controller: Introducing a Direct Emotional Interface to Gestural Control of Sound Synthesis. Proceedings of the International Computer Music Conference (ICMC 2005), Barcelona, Spain, September 5-9, 2005.
[5] R.B. Knapp and P.R. Cook. Creating a Network of Integral Music Controllers. Proceedings of the New Interfaces for Musical Expression (NIME) Conference, IRCAM, Paris, France, June 5-7, 2006.
[6] M.A. Ortiz Pérez, R.B. Knapp, and M.A. Alcorn. Díamair: Composing for Choir and Integral Music Controller. Proceedings of the New Interfaces for Musical Expression 2007 Conference, New York, NY, June 7-9, 2007.
[7] J.T. Cacioppo et al. The Psychophysiology of Emotion. In Handbook of Emotions, edited by M. Lewis and J.M. Haviland-Jones, Guilford Press, pp. 173-191, 2000.
[8] M.M. Bradley and P.J. Lang. Emotion and Motivation. In Handbook of Psychophysiology, Cambridge University Press, pp. 582-607, 2007.
[9] J. Kim and E. André. Emotion Recognition Based on Physiological Changes in Music Listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 30, No. 12, pp. 2067-2083, Dec. 2008.
[10] A. Camurri, G. Castellano, R. Cowie, D. Glowinski, B. Knapp, C.L. Krumhansl, O. Villon, and G. Volpe. The Premio Paganini Project: A Multimodal Gesture-Based Approach for Explaining Emotional Processes in Music Performance. Proceedings of the 7th International Workshop on Gesture in Human-Computer Interaction and Simulation (GW2007), Lisbon, May 2007.
[11] D. Glowinski, A. Camurri, G. Volpe, C. Noera, R. Cowie, E. McMahon, J. Jaimovich, and R.B. Knapp. Using Induction and Multimodal Assessment to Understand the Role of Emotion in Musical Performance. The 4th Workshop on Emotion in Human-Computer Interaction, Liverpool, UK, September 2, 2008.
[12] T.M. Nakra. Inside the Conductor's Jacket: Analysis, Interpretation, and Musical Synthesis of Expressive Gesture. M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 518, 2000.
[13] N. Coghlan and R.B. Knapp. Sensory Chairs: A System for Biosignal Research and Performance. Proceedings of the New Interfaces for Musical Expression 2008 Conference, Genoa, Italy, June 5-8, 2008.