Intelligent Music Systems in Music Therapy


Music Therapy Today Vol. V (5) November 2004

Erkkilä, J., Lartillot, O., Luck, G., Riikkilä, K., Toiviainen, P.
{jerkkila, lartillo, luck, katariik, ptoiviai}@campus.jyu.fi
Department of Music, University of Jyväskylä, PO Box 35 (M), 40014 University of Jyväskylä, Finland

Abstract

This paper describes an ongoing research project, the purpose of which is to develop an automatic (computer-based) music analysis system that could be used in the analysis of improvisations produced in clinical music therapy. The paper begins by putting the project in context and outlining the overall method employed. Following this is a description of some of the analysis tools developed so far, after which some examples are given of how these tools might be used in the clinical setting. A small pilot study, the aim of which was to examine the appropriateness of part of the overall method, is then outlined. The paper concludes by describing some of the details of the project, particularly how the clinical data is gathered and what is required of the therapists who have agreed to participate in the project.

Background

Intelligent Music Systems in Music Therapy is a three-year (2003-2006) research project funded by the Academy of Finland. The objective of the project is to develop automatic music analysis systems that can be used, among other things, in analyzing improvisations produced in clinical music therapy. The development of the analysis methods is based on the research work carried out during the last ten years by the Music Cognition Group at the University of Jyväskylä (www.jyu.fi/musica/cognition). This work focuses on the perception of melody, rhythm, and tonality, as well as on improvisation, variation, cross-cultural music cognition, and computational music analysis.

We suppose that suitably chosen features extracted from a musical performance (e.g. a clinical improvisation) can be used to predict the assessments given, and thus the psychic meanings attained, by therapists. Furthermore, we assume that these methods could be developed into computational analysis tools that would help make clinical music therapy work more effective. Finally, we suppose that interactive music systems based on intelligent musical feature extraction would be more rewarding and efficient, from the point of view of music therapy clients, than the present ones.

To carry out automatic extraction of musical features, current knowledge about musicology, psychoacoustics, and the perception of melody, harmony, rhythm, and tonality will be applied. The methods will be based on statistical analysis as well as various modeling techniques (e.g. neural networks and dynamic systems). The connection between the extracted musical features and the perceived qualities of improvisations will be studied using listening tests. To this end, a set of improvisations will be subjected to automatic feature extraction to obtain descriptions of musical features and gestures for each improvisation.

For the same improvisations, experienced music therapists and musicologists will provide subjective evaluations of given perceived qualities. The interrelationship between the musical features and the perceived qualities will be investigated using various statistical and modeling techniques. We seek to obtain models that, given the extracted musical features, provide estimates of perceived musical qualities. To this end, various statistical and neural network methods will be utilized. Figure 1 shows a schematic overview of the methods used.

FIGURE 1. A schematic representation of the method for obtaining a system for automated analysis of improvisations. Improvisations (MIDI) undergo computational analysis, yielding musical features and gestures (texture, articulation, melody, rhythm, harmony, tonality, interaction); these are related, through modeling (statistical, neural nets), to listeners' assessments of perceived musical qualities (e.g., activity, valence, tension, salience, congruence).

Analysis tools

The computational analysis is based on the MATLAB software (www.mathworks.com). MATLAB is a programming environment for mathematical computation, analysis, algorithm development, and visualization. Depending on the application area, the MATLAB software can be supplemented with various toolboxes that contain specialized functions (e.g., signal processing, neural networks, statistics, fuzzy logic).

Currently, the analysis is carried out from MIDI files. To access and analyze them, we use the MIDI Toolbox (Eerola & Toiviainen, 2004). The MIDI Toolbox is a compilation of functions for analyzing and visualizing MIDI files in the MATLAB environment. Besides simple manipulation and filtering functions, the toolbox contains cognitively inspired analytic techniques suitable for context-dependent musical analysis, dealing with such topics as melodic contour, similarity, key-finding, meter-finding, and segmentation.

The analysis methods used specifically for music therapy improvisations are compiled into another MATLAB toolbox, the Music Therapy Toolbox (MTTB). The MTTB utilizes various functions of the MIDI Toolbox (see Figure 2). Currently, the MTTB provides graphical representations of certain musical features of the improvisation. These features are related to:

- density of notes
- dynamics of playing
- usage of register (pitch height)
- duration of notes
- clarity of pulse and tonality

When there are two improvisers, these features can be displayed separately for each improviser, allowing for the examination of interaction between the improvisers on these musical dimensions. The MTTB is continually developed and extended, using feedback received from the clinicians and music therapy researchers involved in the project. As an end product, we aim to develop an improvisation analysis software package for music therapists.
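As an illustration of the kind of computation involved, the following MATLAB sketch derives two of the features listed above, note density and mean velocity, over a sliding window. It is a minimal sketch under stated assumptions, not the MTTB itself: it assumes the MIDI Toolbox note-matrix layout (one row per note, with columns 5 and 6 holding MIDI velocity and onset time in seconds), and the function name and window parameters are our own.

    function [t, density, meanvel] = slidingfeatures(nmat, winlen, hop)
    % SLIDINGFEATURES  Note density and mean velocity over a sliding window.
    %   nmat   - note matrix (MIDI Toolbox layout assumed: column 5 =
    %            MIDI velocity, column 6 = onset time in seconds)
    %   winlen - window length in seconds (e.g., 3)
    %   hop    - hop size in seconds (e.g., 0.5)
    onsets  = nmat(:, 6);                 % note onsets in seconds
    vels    = nmat(:, 5);                 % MIDI velocities (0-127)
    t       = 0:hop:max(onsets);          % window start times
    density = zeros(size(t));
    meanvel = zeros(size(t));
    for k = 1:length(t)
        in = onsets >= t(k) & onsets < t(k) + winlen;  % notes in this window
        density(k) = sum(in) / winlen;    % note onsets per second
        if any(in)
            meanvel(k) = mean(vels(in));  % average velocity (loudness proxy)
        end
    end

Given an improvisation read in with the MIDI Toolbox function readmidi, plotting density against t produces a curve of the kind shown as trend lines in Figure 4 below.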

FIGURE 2. Hierarchy of the analysis tools used.

Evaluation and representation of musical dialog between therapist and client

One important dimension of musical expression that may be of interest for music therapy is the degree of communication between the therapist and the client playing together. In particular, when communication takes place, the players imitate one another at particular moments of the improvisation. Musical dialog may therefore be assessed by observing the degree of local similarity between the temporal evolutions of both improvisations, along the different features computed by the MTTB (density of notes, dynamics of playing, etc.). These local imitations are displayed in a new graphical representation, called the imitation diagram (ID), that has been specially designed for this purpose.
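The article does not spell out the algorithm behind the imitation diagram, but the description suggests a windowed, lagged similarity measure between the two players' feature curves. The sketch below is one plausible reading, with all names and parameters our own: for each point in time and each candidate delay, it correlates a window of player 1's curve with a correspondingly delayed window of player 2's.

    function imit = imitationmatrix(f1, f2, maxlag, winlen)
    % IMITATIONMATRIX  Lagged local similarity between two feature curves.
    %   f1, f2 - feature time series for players 1 and 2, sampled on a
    %            common time grid (e.g., density curves from the sketch above)
    %   maxlag - maximum imitation delay, in samples
    %   winlen - length of the local comparison window, in samples
    % imit(li, j) is the correlation between a window of f1 and a window of
    % f2 delayed by lags(li) samples; positive lags mean player 2 imitates
    % player 1, negative lags the reverse.
    n    = length(f1);
    lags = -maxlag:maxlag;
    imit = nan(length(lags), n - winlen - 2 * maxlag);
    for k = maxlag + 1:n - winlen - maxlag
        w1 = f1(k:k + winlen - 1);
        for li = 1:length(lags)
            w2 = f2(k + lags(li):k + lags(li) + winlen - 1);
            c  = corrcoef(w1, w2);
            imit(li, k - maxlag) = c(1, 2);   % local similarity at this lag
        end
    end

Plotting imit as an image, with time on the horizontal axis, delay on the vertical axis, and weak correlations suppressed, yields a display qualitatively like the imitation diagrams of Figure 3.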

FIGURE 3. Temporal evolution of the musical variables (on the left) and their respective imitation diagrams (on the right).

Each row of Figure 3 is dedicated to a different musical feature: note density, mean duration, pitch standard deviation, and mean velocity. On the left side, the temporal evolution of the corresponding feature is displayed for each player, player 1 in black and player 2 in green. On the right side, the respective ID is shown, where the horizontal axis again corresponds to the temporal evolution of the improvisation. Lines in the ID indicate local imitations. The color of a line indicates the strength of the imitation: blue corresponds to slight and coarse similarities, while yellow and, to an even greater extent, red correspond to distinct and close imitations. When the line is vertically centered, the imitation between the players is synchronous. When the line is at the upper side of the diagram, on the other hand, player 2 imitates player 1 after a delay given by the vertical axis, in seconds. Similarly,

when the line is at the lower side of the diagram, player 1 imitates player 2. Finally, the length of the line indicates the duration of the imitation.

This representation displays some interesting information. With reference to note density (first row), player 2 imitates player 1 from time 100 seconds, with a delay of 20 seconds. As regards mean duration (second row), player 2 imitates player 1 from time 70, with a slight delay of a couple of seconds, and player 2 moves ahead at time 115. Meanwhile, with reference to pitch standard deviation (third row), player 2 imitates player 1 from time 115, and both players become progressively synchronous. Finally, as regards mean velocity (fourth row), both players imitate each other throughout the whole improvisation, one player being ahead at some points and the other at others. All these characteristics can be seen in the graphs on the left-hand side.

Music Therapy Toolbox from a clinical point of view

We aim to test the Music Therapy Toolbox (MTTB) in real clinical contexts. Thus, we have established a network (described in detail later in this article) in which music therapists working in institutions for handicapped people will participate in the project by providing data as well as by giving the researchers feedback, for instance suggestions for improvements and comments on the usefulness of the method. We hope to have the network ready, and to receive data from the field, from Fall 2004. In the pilot stage we have tested the method mainly with music therapy students, a setting which deviates somewhat from that in institutions for handicapped people. Nonetheless, the pilot setting will help evaluate the clinical relevance of the analysis method.

The improvisation discussed next was created by two students. It was created without any givens, instructions, or predetermined roles. This way, the

starting point was up to the students, and as free as possible. Due to the specific requirements of the analysis method, however, the students could not choose the instruments themselves; two identical MIDI pianos were used. After completing the improvisation, which was recorded on the hard disk of a PC, the students were asked to listen to it again as playback, and to verbalize their images, which were also recorded on the hard disk. The students performed the verbalization task separately, without hearing each other's images until both had finished the session, so that they would not be influenced by what the other said.

MUSICAL DENSITY IN FREE IMPROVISATION

In Figure 4, the trend lines of musical density are depicted so that the red line represents improviser 1 and the blue line improviser 2. The numbers below the trend lines indicate the duration of the improvisation in seconds. Density is simply the average number of notes played in a given time window. We can see that there is a clear increase in density starting after approximately 140 seconds and lasting throughout the rest of the improvisation.

FIGURE 4. Musical density depicted as an MTTB graph.

The concept of density in the MTTB can be compared with the concepts of activity and arousal, which are well known in (music) psychology. According to McMullen (1996), activation has often been explained as an increased state of arousal, and activation is frequently used even as a synonym for arousal. McMullen also refers to the work of Osgood, Suci and

Tannenbaum, who have stated that, when depicting connotative meaning, one of the key factors is the activity dimension. Increased density in improvisation seems to arise from the contribution of several musical factors, including an increase in volume, an acceleration of tempo, shortened note durations, and an increased number of notes in a given time window. When this kind of overall increase in arousal occurs in musical expression, it is a sign of increased emotional and physiological intensity as well (Husain, Thompson, & Schellenberg, 2002).

If the theory suggested above has any clinical relevance, music therapists should be able to utilize musical density in many ways. Because of the importance of arousal and activity in music, and because density seems to be a close relative of them, any changes in density should have clinical relevance. We might, for instance, use the changes in density to divide the improvisation into sections, as in Figure 5.

FIGURE 5. Utilizing musical density in dividing an improvisation into parts.

There are three parts with increased density, of which the last and longest seems to be the most intensive for both of the improvisers. When comparing the imagery processes with certain parts of the improvisation, it became clear that the intensity of the images was higher in the part with increased musical density (from 140 sec onward).
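A minimal sketch of how such a division might be computed from the density curve follows, assuming the sliding-window features from the earlier sketch; the threshold rule and function name are our own, not the MTTB's.

    function bounds = densitysections(t, density, thresh)
    % DENSITYSECTIONS  Candidate section boundaries from density changes.
    %   t       - window start times in seconds
    %   density - note density curve (onsets per second)
    %   thresh  - density level separating "low" from "high" sections
    high    = density > thresh;            % 1 where density is high
    changes = find(diff(high) ~= 0);       % indices where the state flips
    bounds  = t(changes + 1);              % crossing times in seconds

Applied to the curves of Figure 4 with a suitable threshold, a rule of this kind recovers a three-part division like the one shown in Figure 5.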

MELODY TOGETHER WITH VOLUME

When looking very carefully at the trend lines of the improvisers from 140 sec onward, where the density starts to increase, one can see that Blue's music seems to be the more dense, especially during the first 20 seconds. Let us see what the melodic expression of the improvisers looks like within the same part (Figure 6).

FIGURE 6. Piano roll representation of the part with high density. Blue = lower stave, Red = higher stave.

It seems that Blue is very expressive. Blue is using both hands, producing melodic counter-movements with the right hand dominating. In the beginning, Red also makes an attempt to produce melodic contours, influenced by Blue, who set the new course of the improvisation around 140 sec. It seems that Red then gives up the expressive role and is satisfied with her role as accompanist to Blue.

It is possible to confirm the assumption about Blue's stronger expression by looking at the velocity graph (Figure 7). Velocity is a MIDI concept which can, with reservations, be associated with volume in musical vocabulary. According to Bruscia (1987), volume in music contributes to the emotions by indicating how much energy is directed towards the object, and how intense the emotion of the object is. Volume symbolizes power,

force, strength, size, and commitment. In this sense, the difference in volume between the improvisers, with Blue seeming to use more volume, is in harmony with the other sources that form the basis for the interpretations.

FIGURE 7. Velocity graph of the part with high density.

According to Mélen & Wachsmann (2001), infants of only 5 months are able to perform musical discrimination on the basis of melodic contours. Melodic contour, then, must be a very characteristic (important?) feature of music. Interestingly, in psychoanalytically orientated music therapy, only melody has been defined as a specific expression of emotion (see Bruscia, 1987). Can we then conclude that Blue's role in this particular part is more emotional and more expressive? In order to answer this, it might be interesting to have a look at the images at this point of the improvisation (around 140 sec):

BLUE: "Oh yes!", "Freedom and action!", "No signs of anxiety!" etc. [speaks loudly and in a dynamic manner]

RED: "The wind, it is too hard", "I don't like this", "Why doesn't it stop?" etc. [speaks softly, with much less dynamics and volume]

There seem to be clear similarities between the musical and verbal roles of the improvisers. What could be concluded on the basis of the graphs was in harmony with the reports by the improvisers. In addition, the interpretations derived from the analysis brought out many

details of the interaction between the improvisers, as well as psychic processes that would have remained at the pre-conscious level without the analysis process and the consequent interpretation: from signs to symbols, from symbols to words.

A Pilot Study: Human experience versus computational feature extraction

INTRODUCTION

This section describes a small pilot study, the purpose of which was to assist in the development of the part of the project in which experienced music therapists and musicologists would subjectively evaluate clients' improvisations (see Background above). In the pilot study, music therapy students were required to give subjective ratings of a series of short (60 sec) improvisation excerpts. Each excerpt had to be rated on two scales: perceived ACTIVITY and perceived VALENCE. These two scales were selected on the basis of a number of theoretical accounts which suggest that relationships between musical stimuli and affective/aesthetic behaviour can be predicted using a two-dimensional activation/activity and acceptance-rejection/evaluation framework (McMullen, 1996). With regard to the second of these dimensions, the concept of valence, based upon pleasant-unpleasant judgments usually investigated using several adjective pairs, is frequently used to describe affective/aesthetic responses to musical stimuli (Rauhala, 1973). Thus, activity and valence may be seen as central agents in the process of judging the psychological meaning of musical stimuli, or the differences between stimuli.

The subjective ratings obtained in this experiment were to be compared with the musical features extracted from the excerpts using the computational methods described in the previous section. It was anticipated that, in regression analyses, some of the extracted musical features would prove to be good predictors of participants' activity and valence ratings.

METHOD

Participants. Thirteen individuals participated in the pilot study. All participants were students on the Master's music therapy programme at the University of Jyväskylä.

Stimuli. Participants were presented with twenty 60-sec excerpts from therapist-client improvisations produced by students on the Master's music therapy programme in the Department of Music at the University of Jyväskylä. The improvisations were produced during a typical student training session. The excerpts were selected from thirty originally recorded excerpts, each of which was 2-4 minutes in length.

Apparatus. An Apple Macintosh computer, running Logic sequencing software, was used to present the stimuli to participants via headphones. Attached to the computer was a basic four-octave keyboard equipped with a data entry slider and a pitch-bend wheel. Participants used the data entry slider to indicate the level of perceived ACTIVITY, and the pitch-bend wheel to indicate perceived VALENCE.

Procedure. Participants were presented with the twenty excerpts in two identical blocks. The start and end of each excerpt was signalled by a percussive tone, and there was a period of six seconds of silence between each excerpt and the next. During the first block, participants were required to rate the amount of perceived ACTIVITY. During the second block, they were asked to rate perceived VALENCE. During both blocks, participants' ratings were sampled at 500 ms intervals.

RESULTS

Using the computational methods described in the previous section, the following musical features were extracted from each excerpt, with a 3-sec sliding window moving at 500 ms intervals:

TABLE 1. Abbreviations

DENS: note density (note onsets per second)
DUR: mean durational accent of notes
MEANP: mean pitch (MIDI note value)
MINP: minimum pitch
MAXP: maximum pitch
STDP: standard deviation of pitch values
MEANV: mean velocity
MINV: minimum velocity
MAXV: maximum velocity
STDV: standard deviation of velocity values
TON: tonal clarity
MAJOR: clarity of major key
MINOR: clarity of minor key
ART: articulation index
AC: pulse clarity

Ratings of ACTIVITY and VALENCE were analyzed separately. For each dimension, the values of the 15 extracted musical features were used as predictor variables in a linear regression analysis, with the mean rating made by participants as the criterion. Significant models emerged from each analysis, and are shown in Table 2 (ACTIVITY) and Table 3 (VALENCE) below.

TABLE 2. Results of regression analysis for ACTIVITY. Using the simultaneous method of variable entry, a significant model emerged [F(15, 2384) = 196.462, p < .001; adjusted R square = .553]. Significant variables are shown below.

Predictor Variable    Beta     Significance Level
DENS                  .542     p < .001
MEANV                 .248     p < .005
MINP                  .229     p < .005

MEANP                 .191     p < .05
MAXV                  .177     p < .05
AC                    -.146    p < .001
MINV                  -.130    p < .05
STDP                  -.128    p < .05
TON                   -.127    p < .005
DUR                   .100     p < .001
MINOR                 .085     p < .005

TABLE 3. Results of regression analysis for VALENCE. Using the simultaneous method of variable entry, a significant model emerged [F(15, 2384) = 24.403, p < .001; adjusted R square = .128]. Significant variables are shown below.

Predictor Variable    Beta     Significance Level
MAXV                  -.613    p < .001
MAXP                  .533     p < .001
MINP                  -.507    p < .001
STDP                  -.491    p < .001
MINV                  .326     p < .001
STDV                  .254     p < .001
TON                   .108     p < .05
ART                   -.061    p < .05
AC                    -.051    p < .05

Summary

A number of the computationally extracted variables were found to predict participants' ratings of ACTIVITY and VALENCE fairly accurately. The most significant of these variables were DENS (note density), MEANV (mean velocity), and MINP (minimum pitch).
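For concreteness, the regression step described above can be sketched in MATLAB using the Statistics Toolbox function regress. This is a minimal sketch with our own variable names, not the authors' analysis script: X is assumed to be an N-by-15 matrix of extracted features (one row per 500-ms frame) and y a column vector of the corresponding mean listener ratings; standardizing both sides makes the coefficients comparable to the beta values in Tables 2 and 3.

    % Minimal sketch of the regression step (variable names are ours).
    Xz = zscore(X);                       % standardize the 15 predictors
    yz = zscore(y);                       % standardize the mean ratings
    [b, bint, r, rint, stats] = regress(yz, [ones(size(Xz, 1), 1) Xz]);
    % b(2:end) are standardized coefficients (betas);
    % stats(1) is R-squared, stats(3) the p-value of the overall model.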

Overall, this study suggests that there is some connection between the extracted musical features and the perceived qualities of the improvisations. Subsequent experiments, which will build upon the pilot study presented here, will attempt to describe this relationship in more detail.

Clinical applications of the Music Therapy Toolbox in the project

Clinical use of the Music Therapy Toolbox is currently at a test stage. After the very first trials with music therapy students, there is a need to explore the possibility of applying the Music Therapy Toolbox in various clinical settings. A decision was made to begin this work at institutions for the intellectually disabled. In Finland there are a total of 17 districts of services for the intellectually disabled / federations of municipalities, and in addition to these there is the Rinnekoti-Foundation, which provides residential and rehabilitation services on a large scale. At 7 of these 18 institutions, one or more qualified music therapists are available, and 4 of these 7 institutions accepted the opportunity to contribute to this research project. These open-minded forerunners, who have also given human and material resources to be used in this project, are: Pääjärvi Federation of Municipalities, Rinnekoti-Foundation, Satakunta District of Services for Intellectually Disabled, and Suojarinne Federation of Municipalities. The qualified music therapists contributing to this research project are: Arto Mäkelä (Satakunta District of Services for Intellectually Disabled), Kimmo Pyhäluoto (Pääjärvi Federation of Municipalities), Heikki Raine (Rinnekoti-Foundation), Leila Varkila (Pääjärvi Federation of Municipalities), and Jukka Värri (Suojarinne Federation of Municipalities).

Music therapy clients in the contributing institutions are mostly mentally retarded, but there are other clients as well. This situation results from the development of open welfare over the years: many residents of large institutions have moved to small units or to other, more independent residences. As the services of the institutions are still available, many other parties have taken note of this and asked for residential and rehabilitation services, including music therapy. At this time, with 10 clients having contributed to the present research, only about half of them are diagnosed as mentally retarded, while the others have a psychiatric diagnosis or a neurological diagnosis other than mental retardation. The mentally retarded clients mostly have a mild disability. However, it is too early to make predictions about the diagnosis distribution of clients to come.

Research data is gathered from clinical situations. A music therapy session includes improvisation, during which client and therapist play together on two separate but identical MIDI keyboards using a piano sound. The length and content of the improvisation are not restricted. MIDI controller keyboards are used, with no built-in sound but with 88 dynamic keys with full hammer action and aftertouch. The piano sound comes from a sound module. Improvisations are recorded with sequencer software, and the MIDI files produced are then sent to the researchers via the web. This combination is rather complicated and sensitive to accidental changes; many technical problems involving poor quality and incompatibility between components have been encountered. In spite of these problems, data collection has begun.

In addition to improvisations, therapists explore clients by performing a test called the MIDI test, in which the therapist models some basic motor functioning on the client's keyboard, and the client tries to repeat what the therapist has played, or what the therapist has asked the client to play. The purpose of this test is to find out how well the client can use his or

her hands and fingers, as well as to map some simple musical skills. The MIDI test does not require concentrating on paperwork in the test situation, because the information is saved as a MIDI file that can be analyzed later.

Moreover, the therapists are required to fill in web-based forms. An improvisation assessment form must be completed after every improvisation. It is a checklist that includes both musical and non-musical assessments of a subjective nature. It also contains a free field for the therapist's and client's comments on the current improvisation. A basic information form is completed after the very first improvisation session only. This form contains questions concerning the client's age, diagnosis, and verbal as well as motor functioning. Every third month, therapists complete a follow-up form, which is the same as the basic information form. Although some of the information the therapist provides on this form may change over time, diagnostic information usually remains the same over longer periods.

MTTB analysis is then performed to investigate possible connections between the motor, mental, emotional, or social functioning of the clients and the musical features of the improvisations. Gabrielsson & Lindström (2001) have summarized research results concerning relationships between musical features and emotional expression, and Juslin (2001) has examined emotional communication in music performance as a function of musical codes placed on dimensions of valence and activity. Their work is a well-established basis for analyzing the emotional content of musical features in music therapy improvisations.

Motor functioning is present in all MIDI data. Note clusters, for example, may suggest limited motor skills. As dissonances, they can be interpreted

as emotional expression as well. To avoid incorrect interpretations, reliance upon one musical feature alone may not be enough in all cases. Some clients are probably able to express themselves using major/minor tonality, which can also be treated as an element of the emotional content of an improvisation. In addition to motor and emotional functioning, MTTB analysis is expected to reveal something about the social functioning of the client. In an MTTB graph it is easy to see the level of synchronization between the musical functioning of client and therapist. Initiatives, and the responses that follow them, can easily be detected as well. MTTB analysis may or may not reveal common features between clinical subgroups, or between other classes investigated, but before obtaining and analyzing more improvisations it is too early to make extensive generalizations about this subject.

Conclusions

The aim of this project is to develop computational improvisation analysis tools for music therapists to use in everyday clinical practice. While therapists may choose to base much of their analysis on non-computational techniques, it is anticipated that these tools will help illuminate the musical interaction and experience shared by the client and the therapist. The process of turning an auditory input, i.e., a client-therapist improvisation, into a visual output, such as a representation of client-therapist interaction, results in a static representation of a temporal event. When utilizing computers in this way, we can be sure that the visual representation is at least precise. What we cannot say yet, however, is whether the features we represent are clinically relevant. The relevance of the extracted features should become more apparent as the project unfolds.

The next stage of the project, to be carried out in collaboration with clinicians, is to test and develop the method with improvisational data gathered from the field. If the MTTB is found to be appropriate in the clinical setting with this particular group of intellectually disabled clients, it is hoped that the method can be applied to other client populations as well.

Acknowledgment

This work was supported by the Academy of Finland (grant No. 102253).

References

Bruscia, K. E. (1987). Improvisational models of music therapy. Springfield, IL: Charles C. Thomas.

Eerola, T. & Toiviainen, P. (2004). MIDI Toolbox: MATLAB tools for music research. Jyväskylä, Finland: University of Jyväskylä, Kopijyvä. Available at http://www.jyu.fi/musica/miditoolbox/.

Gabrielsson, A. & Lindström, E. (2001). The influence of musical structure on emotional expression. In P. N. Juslin & J. A. Sloboda (Eds.), Music and emotion: Theory and research. Oxford: Oxford University Press.

Husain, G., Thompson, W. F., & Schellenberg, E. G. (2002). Effects of musical tempo and mode on arousal, mood, and spatial abilities. Music Perception, 20(2), 151-171.

Juslin, P. N. (2001). Communicating emotion in music performance: A review and theoretical framework. In P. N. Juslin & J. A. Sloboda

(Eds.), Music and emotion: Theory and research. Oxford: Oxford University Press.

McMullen, P. T. (1996). The musical experience and affective/aesthetic responses: A theoretical framework for empirical research. In D. A. Hodges (Ed.), Handbook of music psychology (2nd ed., pp. 387-400). San Antonio: IMR Press, The University of Texas at San Antonio.

Mélen, M., & Wachsmann, J. (2001). Categorization of musical motifs in infancy. Music Perception, 18(3), 325-346.

Rauhala, H. (1973). Musiikkiterapia - teoria ja metodinen mallisto [Music therapy: Theory and a set of methodical models]. Jyväskylä: K. J. Gummerus osakeyhtiö.

This article can be cited as:

Erkkilä, J., Lartillot, O., Luck, G., Riikkilä, K., & Toiviainen, P. (2004). Intelligent Music Systems in Music Therapy. Music Therapy Today, Vol. V (5), November 2004.