SIME: Social Interaction based Musical Environment

Yuichiro Kinoshita
Human Communication Technologies Laboratory
University of British Columbia
2356 Main Mall, Vancouver, BC, Canada
+1 604 822 4583
yuichiro@ece.ubc.ca

Changsong Shen
Human Communication Technologies Laboratory
University of British Columbia
2356 Main Mall, Vancouver, BC, Canada
+1 604 822 4583
csshen@ece.ubc.ca

Jocelyn Smith
Sensory Perception and Interaction Research Group
University of British Columbia
201-2366 Main Mall, Vancouver, BC, Canada
+1 604 822 5108
jdsmith@cs.ubc.ca

ABSTRACT
Current technology presents the opportunity for innovative couplings between musical output and input techniques. Appropriate input techniques and mappings to output can enable non-musicians to create music using knowledge they have from other domains. The SIME system explores the idea of using social dynamics to enable novice musicians to create music in a fun and intuitive way.

Keywords
Interactive Music, Social Interaction, Musical Environment

1 INTRODUCTION
SIME, Social Interaction based Musical Environment, is a musical environment that enables people to explore and create music using their knowledge of social dynamics. Players create music by moving and interacting in SIME. Each user wears a small tag associated with a particular instrument; the system allows users to choose their instrument by choosing the corresponding instrument tag. The music created is determined by the relative positions, facing directions and movements of the players. An intuitive mapping between musical concepts and social interaction enables users to experiment freely with creating music. Thus SIME can be used by several novice users to compose social music, and it provides a new interactive form of entertainment.

The remainder of this paper is organized as follows. Section 2 provides an overview of related work. The design of the system is presented in section 3 and section 4 describes the implementation.
A description of the user evaluation of SIME is given in section 5 and section 6 presents and discusses the results of this evaluation. Finally, conclusions drawn from the project and possibilities for future work are discussed in section 7.

2 RELATED WORK
The use of technology to create novel input methods and mappings can be seen in several previous systems. One of the oldest electric musical instruments is the Theremin [6]. The Theremin used the capacitance between a player's hands and a metal loop to control the volume, and an antenna to control pitch. Newer systems such as the MetaMuse [5] explore the use of metaphors to create intuitive control. The MetaMuse uses the metaphor of water flowing from a watering can. The user controls the music created (the layering and overlapping of discrete sound samples) by controlling the flow of water from the watering can. SIME uses a metaphor that incorporates several users instead of just one.

Several systems explore the use of an individual's motion to create music. One such project, Streams [2], creates an environment in which a professional dancer creates music through movement. Here, the mapping of motion to sound is not always directly obvious: it maps specific sounds (perhaps complex) chosen for a particular piece to an individual dancer's movement vocabulary. Another system, the Iamascope [3], generates kaleidoscope images and accompanying music based on the user's body movements. These two interfaces are designed for one performer, whereas SIME is designed to be used collaboratively by three users.

An interface that provided a multi-person space in which people could manipulate or create music by interacting with the environment was the interactive dance club developed for the 25th ACM SIGGRAPH conference [11]. In the interactive club environment, people influenced the music playing by moving through and interacting with the space.
The interactions were with objects such as light beams, and the participants' musical control was limited to specific elements of the music, which varied from room to room in the club. The environment of the interactive dance club
is multi-person by nature, but only one person at a time actually interacted with the music in a particular zone.

Among systems that use movement to create music, a multi-person system that has a direct mapping between basic sounds and movement is the Musical Stage Environment developed at the MIT Media Lab [10]. Here the stage becomes the instrument and the performers are the musicians. Unlike SIME, which focuses on exploration by novice musicians, the Musical Stage focuses on performance and requires extensive practice to use.

A system designed for exploration of social interaction through music is 2Hearts [8]. A shared musical experience is created by mapping the heart rates of two players to the musical output. The heart rates affect the music, which in turn affects the social interaction and the participants' heart rates. The mapping between social interaction and music has a similar flavour to SIME; however, while heart rate is hard to control directly, the social dynamics used by SIME are simple to control. SIME also requires that the three users work together to control the music.

The main goal of SIME is to create a space in which multiple users can explore and create music in an intuitive way. Many earlier systems use movement or metaphors to create music but are designed for single users. Systems designed for multiple users either have less intuitive mappings or require substantial training and practice to use. SIME provides a space for three people to explore music through the paradigm of social interaction. The focus is on enabling intuitive exploration and creativity by utilizing people's existing knowledge of social interaction.

3 DESIGN
In this section, we give a brief description of SIME. We describe the social and musical parameters used in SIME, along with the mapping between the two. Finally, we present several detailed scenarios to illustrate this mapping.

Environment
SIME is a room-sized musical environment.
In this environment, three players create music through social interaction, as shown in Figure 1. A basic set of social interactions is used and mapped to a set of musical parameters. Thus the three players in the system control the music created by how they position themselves relative to each other in the environment.

Figure 1: Players in SIME.

Social Parameters
Two basic parameters are used to describe the social situations: the relative directions the players are facing, and the distance between the players. There are several possible facing configurations. The first is the situation where all three players are facing each other. The other situations are defined on pairs of players. Each pair of players can have four possible facing configurations: the two players A and B may be facing each other, A may be facing B while B is not facing A, B may be facing A while A is not facing B, or neither player may be facing the other. The two situations where only one player in a pair is facing the other are treated the same by SIME. The distance between players is also defined on a pairwise basis.

Musical Elements
In our musical environment, there are three instruments, one for each player. For each of the three instruments we created two melodies, a happy melody and a sad melody. Many studies have looked at people's emotional interpretation of music; Gabrielsson [4] summarized the relationship between musical factors and emotional expressions. Based on these results, we created melodies that could be identified as either happy or sad. Specifically, our happy melodies tended to be faster and higher than our sad melodies. Also, all our happy melodies were in the key of C major and our sad melodies were in C minor. These are all parameters that are strongly associated with a happy or sad interpretation. We also created collaboration instruments.
The happy collaboration melodies were all the same but had different instruments playing them depending on which players created the collaboration. We used different instruments for different collaborations since instrument is one aspect of music that people are particularly good at distinguishing [4]. The disturbing collaboration sound was a unique
instrument that sounded ominous. The same instrument and sounds were used for all disturbing collaborations.

Mapping
We created a simple mapping between the social and musical elements. This mapping was designed to be simple enough that players could easily learn it and quickly start enjoying the system, while still containing enough expressivity to be musically interesting.

Social Interaction / Effect on Musical Output:
- All three players facing each other: all players' instruments play happy melodies.
- Two players face each other: their instruments play happy melodies.
- A player does not face any other players: his instrument plays a sad melody.
- Two facing players move into each other's personal spaces: a collaboration instrument is added, and it plays a happy melody.
- A player moves into the personal space of a second player he is facing but who is not facing him: a collaboration instrument is added, and it plays a disturbing sound.
- Two players move closer into each other's personal spaces: the collaboration instrument increases in volume.

Scenarios
In this section, we present several representative scenarios of possible social situations that could be created by the three players. For each scenario, the social dynamics used and the corresponding musical output are described.

In Figure 2, the three players are all facing each other and their personal spaces are overlapping, so the music created is happy. Since they are all facing, each player will add a happy melody to the song. The overlapping personal spaces mean these three players generate three collaboration instruments. Furthermore, the collaboration instruments will play happy sounds since the players are all facing.

Figure 2: Three players are all facing and their personal spaces are overlapping.

A less pleasant situation is seen in Figure 3. Here the three players are far apart and all facing away from each other. The result is sadder musical output. Specifically, since all players are facing away, each player's instrument will play a sad melody. Since the players are far apart, no collaboration sounds will be generated.

Figure 3: All three players are far apart and facing away from each other.

The previous two scenarios were quite simple: in the first all players were in a group, and in the second they were not interacting. In Figure 4, a more complex scenario is presented. Here players A and B are facing each other, so the instruments associated with these two players will play happy melodies. Player C is facing player B but no player is facing player C. Since player C is not being faced by anyone, his instrument will play a sad melody. Finally, since the personal spaces of B and C overlap and C is facing B but B is not facing C, the disturbing collaboration sound will be generated.
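These scenarios follow mechanically from the facing and personal-space rules, which can be sketched in code. The sketch below is our own illustrative reconstruction, not the actual SIME implementation: we assume each player has a position in metres and a heading in radians, that a player "faces" everyone within a 45-degree half-angle view cone, that personal spaces overlap below 2 m, and (following the Figure 4 scenario) that a player whom nobody faces plays the sad melody. All names and thresholds are our own choices.

```python
# Illustrative reconstruction of the SIME social-to-musical mapping.
# Thresholds, field names and the view-cone model are assumptions.
from itertools import combinations
from math import atan2, hypot, pi

VIEW_CONE = pi / 4     # assumed half-angle of a player's view cone
OVERLAP_DIST = 2.0     # assumed personal-space overlap distance (m)

def faces(name, players):
    """Set of players inside `name`'s view cone (a player may face several)."""
    me = players[name]
    seen = set()
    for other, p in players.items():
        if other == name:
            continue
        angle = atan2(p["y"] - me["y"], p["x"] - me["x"])
        diff = (angle - me["heading"] + pi) % (2 * pi) - pi  # wrap to [-pi, pi)
        if abs(diff) <= VIEW_CONE:
            seen.add(other)
    return seen

def musical_state(players):
    """Map the social situation to per-player melodies and collaborations."""
    facing = {name: faces(name, players) for name in players}
    melodies = {}
    for name in players:
        # Figure 4 rule: a player faced by nobody plays a sad melody.
        faced = any(name in f for f in facing.values())
        melodies[name] = "happy" if faced else "sad"
    collaborations = []
    for a, b in combinations(players, 2):
        pa, pb = players[a], players[b]
        if hypot(pa["x"] - pb["x"], pa["y"] - pb["y"]) >= OVERLAP_DIST:
            continue  # collaborations require overlapping personal spaces
        if b in facing[a] and a in facing[b]:
            collaborations.append((a, b, "happy"))        # mutual facing
        elif b in facing[a] or a in facing[b]:
            collaborations.append((a, b, "disturbing"))   # one-way facing
    return melodies, collaborations
```

Under these assumptions, a close huddle with everyone facing the centre (the Figure 2 scenario) yields three happy melodies and three happy collaborations, while an arrangement like Figure 4 yields a sad melody for C and a single disturbing collaboration between B and C.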
Figure 4: Players A and B are facing but far apart. The personal spaces of players C and B are overlapping. C is facing B but B is not facing C.

The three scenarios presented here show some of the situations that can arise in a three-person environment. In each case, we have explained how the relationship of the three individuals affects the musical output of SIME. The next section discusses some of the implementation details of SIME.

4 IMPLEMENTATION
There are two main components to SIME: one determines the current social situation and the other controls the corresponding musical output. A camera monitors the social dynamics of the system, and the positions of the players and the situations are determined on the host computer. Pure Data (Pd) [9] is used to create a musical composition based on the positions and situations of the players. The musical output is generated by a hardware MIDI synthesizer. Figure 5 shows the SIME system setup.

Figure 5: The SIME system setup.

Tracking Interaction
Each player's position in the room and facing direction is determined by the local positioning system (LPS). There are two tags on each musical hat, as shown in Figure 5. One is used to determine the position of the player; the other indicates the player's facing direction. Each tag sends out a unique IR-ID, which is captured by the camera. The raw image is sent to the host by a capture card, and the situation is then determined by the situation calculation component of our program. This information is passed to the Pd program.

Figure 6: The musical hats.

Musical Control
Pd is a real-time graphical programming environment for audio processing. A program in the Pd environment consists of objects and connections between objects. Figure 7 shows an example of the Pd screen. Pd allows us to manipulate MIDI control sequences and control audio data. It also supports network connections, so it is able to receive position data from another computer.

Figure 7: An example of the Pd screen.

5 USER EVALUATION
Hypotheses
H0: Users will be able to use SIME to control music.
H1: Users will enjoy using SIME.
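The hand-off from the host's situation calculator to Pd, described in section 4, relies on Pd's network support: Pd's netreceive object accepts FUDI-formatted messages (whitespace-separated atoms terminated by a semicolon) over a socket. A minimal sender might look like the following sketch. The message layout ("player <id> <x> <y> <facing>") and the port number are our assumptions for illustration, not SIME's actual protocol.

```python
# Sketch of a host-side sender for Pd's netreceive (FUDI over TCP).
# The "player <id> <x> <y> <facing>" message layout is an assumption.
import socket

def fudi_message(selector, *args):
    """Format one FUDI message: atoms separated by spaces, ';' terminated."""
    atoms = [selector] + [str(a) for a in args]
    return " ".join(atoms) + ";\n"

def send_situation(host, port, positions):
    """Send each player's position and facing direction to a Pd patch."""
    with socket.create_connection((host, port)) as sock:
        for player_id, (x, y, facing) in positions.items():
            msg = fudi_message("player", player_id, x, y, facing)
            sock.sendall(msg.encode("ascii"))

# Example (requires a Pd patch with a [netreceive 3000] object listening):
# send_situation("localhost", 3000, {1: (0.5, 1.2, 90)})
```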
User Evaluation
In this section, we describe our evaluation procedure.

Pre-Questionnaire
Background information was collected from participants using a simple questionnaire. The questionnaire asked some questions to obtain demographic information about our sample population. Other questions asked about the musical background of the participant. Since musical background may influence the experience of using SIME, it was important to know what musical background our test users had. Specifically, we are interested in creating an environment in which novice musicians can create music.

Using SIME
The users in the study interacted with the system in three stages. Before this interaction occurred, each user chose a musical hat. After the users chose their hats, we played the happy and sad melodies associated with each hat for them. Finally, we took them into the SIME environment and told them that where they were facing and the distance between them would determine the music that was played.

In the first interaction segment, the users spent time exploring the interface by interacting in the SIME environment. During the second segment, the players were given a list of several social interaction tasks. The purpose of these tasks was to give the players some direction in their exploration of the system. Finally, users were given the opportunity to spend some additional time exploring the interface with the added knowledge they had gained by completing the tasks.

Post-Questionnaire and Interview
A questionnaire was used after the study to gather quantifiable information about the users' experiences with SIME. Finally, to find out more about users' experiences with SIME and their thoughts about the system, we conducted interviews. The questions were designed to find out users' understanding of the musical control and what they liked and disliked about the system.
6 RESULTS AND ANALYSIS
In this section, we present the results of our user evaluation and discuss their meaning.

Study Participants
We had three groups test our system, for a total of nine test users. The users were all in their twenties. Of these nine, three were female. Eight of our study participants were university students and the other was a game programmer. The students were studying in various fields, with half being computer science graduate students and the others coming from fields such as English and education. The musical background of the participants ranged from zero to nine years of musical training; four participants had no musical training. When asked to rate their musical ability, six rated themselves as novice, two as intermediate and one as advanced. Only one subject had experience composing music and specified that they mixed digital music frequently. Thus our study population consisted of users with varied musical backgrounds, and many of our participants were novice musicians.

Observations
We watched users while they were using the system and made some interesting observations.

1) People who knew each other spent more time using the system. Of the three groups, two were composed of people who knew each other beforehand, but in one group at least one of the users did not know the others. It seemed that the better people knew each other, the more comfortable they felt in SIME. The two groups of people who knew each other talked more about what they were doing and discussed the resulting music. This observation implies that a system like SIME may be most appropriate in situations where people know each other.

2) People identified with their instrument. Before users started using the system, we played them the melodies associated with their musical hat. During the interaction, people would mention what their instrument was doing: "Now mine is happy."
This observation agrees with our intuition that giving users an instrument helps them identify with the music and understand the musical control.

Questionnaire Results
After the participants had used the system, we asked them to answer a questionnaire indicating how much they agreed with four statements about SIME and their experience. The results are shown in Figures 8-11. The majority of users agreed or strongly agreed with the statements that by working together with the others they could control the music (see Figure 8), that using SIME was enjoyable (see Figure 9) and that the music created was interesting (see Figure 10). These results support our hypotheses that users will be able to control music using SIME and will enjoy using SIME. Participants' agreement with the statement that one player could control SIME alone ranged from disagree to agree (see Figure 11). The average response was close to neutral, and the distribution of responses was not statistically significantly different from a uniform distribution. This result, along with the result that users agreed or strongly agreed that as a group they could control the music, indicates that users felt they needed to work with others to control the music, as expected.
Figure 8: By being aware of the other two players and working with them I was able to control the music that was created.

Figure 9: I enjoyed using SIME.

Figure 10: I found the music that was created was interesting.

Figure 11: One player is able to control the music created by SIME on their own.

Interview Results
After the users had answered the questionnaire, we asked them several interview questions to learn more about their experience with the system. The first question was designed to find out how well the users felt they could control the music and how well they understood the musical control. All of the participants felt that they were able to control the music at least to some degree, and were able to describe some aspects of this control. Specifically, everyone determined that by facing each other they could create happy music and that by facing away they could create sad music. The participants in one group spent a lot of time playing with the system and were able to figure out the entire mapping.

We did not have a hypothesis about how SIME would affect users' awareness of each other, but were interested to find out if users felt that it did. When asked how using SIME changed their awareness of others in the environment, many users said it made them more aware. One user said they could tell who walked closer to them and who was facing away. Several users commented that the system required them to work together, and one noticed that their group worked together to try to find ways of getting happy music to play.

When asked what they liked about the system, users told us that they liked using social interaction to control the music. Several indicated that they liked that they could control the music, and many of these were novice musicians. The fact that users specifically mentioned the ability to compose or control music as a positive aspect of the system is a strong indication that we were successful in achieving our goal.
Others indicated that they liked the collaborative aspect of the system. We asked users what they thought of the idea of using social interaction to control music. In general, participants liked this idea; one went so far as to say "it's brilliant really". Many users suggested situations where such a
system would be interesting or fun to use. These suggestions included parties, a karaoke-like setting with friends, an intimate setting, and even helping condition children by having their behaviour affect the music.

Discussion
We achieved our goal of providing novice musicians with an enjoyable opportunity to compose music. This can be seen in both the questionnaire and interview results. In the questionnaire, all but one user agreed or strongly agreed with the statements that they could control the music by working with the other two players and that they enjoyed using the system. During the interviews, the test users indicated that they felt they could control the music and demonstrated an understanding of how the mapping worked. Finally, when asked what they liked about the system, several users indicated that they enjoyed being able to control the music.

7 CONCLUSIONS AND FUTURE WORK
In this paper, we presented the SIME system, in which people can use social interaction to explore and create music. The user evaluation shows that subjects can understand the mapping between social interaction and music. Furthermore, both novice and experienced musicians enjoyed exploring and creating music by communicating with each other.

In the future, we will make the differences between the happy and sad melodies more dramatic. Also, since different users may have different musical preferences, we would like to allow users to choose their favourite type of music. Finally, to make SIME more robust, we would like to improve our tracking system to make it more reliable in a complicated environment, such as when players move very quickly or look down at their feet.

8 ACKNOWLEDGEMENTS
We would like to thank Graeme McCaig for his valuable advice about musical expression, Baosheng Wang and Florian Vogt for their support with the tracking system, Midori Ishihara for her help in creating the melodies, and all our test users.

9 REFERENCES
[1] Bahl, P. and Padmanabhan, V.
RADAR: An in-building RF-based user location and tracking system. Proceedings of IEEE INFOCOM, Vol. 2 (March 2000), pp. 775-784.
[2] Bahn, C. and Trueman, D. interface: electronic chamber ensemble. CHI 2001 Workshop on New Interfaces for Musical Expression, Seattle, 2001. Available at http://www.arts.rpi.edu/crb/interface/examples.html
[3] Fels, S. S. and Mase, K. Iamascope: A Graphical Musical Instrument. Computers and Graphics, Vol. 23 (1999), pp. 277-286.
[4] Gabrielsson, A. and Lindström, E. The Influence of Musical Structure on Emotional Expression. In Music and Emotion. Oxford University Press, 2001, pp. 223-248.
[5] Gadd, A. and Fels, S. S. MetaMuse: Metaphors for Expressive Instruments. In Proceedings of the 2002 Conference on New Interfaces for Musical Expression (NIME, May 2002), Dublin, Ireland.
[6] Glinsky, A. Theremin: Ether Music and Espionage. University of Illinois Press, 2000.
[7] Harter, A., Hopper, A., Steggles, P., Ward, A. and Webster, P. The anatomy of a context-aware application. Proceedings of the 5th Annual ACM/IEEE International Conference on Mobile Computing and Networking (Mobicom 1999), Seattle, WA, August 1999, ACM Press, pp. 59-68.
[8] McCaig, G. and Fels, S. Playing on Heart-Strings: Experience with the 2Hearts System. 2nd International Conference on New Interfaces for Musical Expression (May 2002), pp. 54-59.
[9] Puckette, M. Pd Documentation. On-line at http://www.crca.ucsd.edu/~msp/pd_documentation/
[10] Reynolds, M., Schoner, B., Richards, J., Dobson, K. and Gershenfeld, N. An immersive, multi-user, musical stage environment. Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques (2001), pp. 553-560.
[11] Synesthesia. Interactive Dance Club. SIGGRAPH 98 Installation. http://www.elkabong.com/idc/idc.html
[12] Want, R., Hopper, A., Falcao, V. and Gibbons, J. The active badge location system. ACM Transactions on Information Systems, 10, 1 (January 1992), pp. 91-102.