Automatic Generation of Drum Performance Based on the MIDI Code


Shigeki SUZUKI, Mamoru ENDO, Masashi YAMADA and Shinya MIYAZAKI
Graduate School of Computer and Cognitive Science, Chukyo University, 101 Tokodachi, Kaizu-cho, Toyota-shi, 470-0393 Japan
School of Information Science and Technology, Chukyo University, 101 Tokodachi, Kaizu-cho, Toyota-shi, 470-0393 Japan
E-mail: shigeki@om.sist.chukyo-u.ac.jp, {endoh, myamada, miyazaki}@sist.chukyo-u.ac.jp

Abstract
This paper aims at constructing a music band over the internet. In network applications, an avatar's motion is expected to be generated intelligently, while the amount of data for controlling that motion sent over the internet has to be small for real-time response. This paper therefore proposes a method for generating drum performance automatically from the MIDI code. A set of basic motions for the parts of the human body is prepared, and high-level operations generate the motion of playing the drums. In drum performance, both hands or both legs often move simultaneously, for example preparing to beat the snare with the right hand while beating the hi-hat cymbals with the left hand. To handle such cases, several kinds of information, such as the sound, its timing, and its strength, are used to determine the motion under the constraints imposed by human body motion and by the sequence of sounds. In some cases, two or more motions are possible for the identical MIDI code; in such optional cases, the most natural and suitable motion is selected automatically.

Keywords: Music Session, Virtual Environment, Performance Generation

1. Introduction
Advances in computation and rendering technology have enabled a new man-machine interface in which three-dimensional CG characters act as avatars. High-speed networks connect them, and a new style of human communication over the internet has appeared. Not only conventional conversation but also many other kinds of real human action have been brought into the virtual space. We have been developing such an application, in which three-dimensional CG characters give musical performances. In a network application, the amount of data transferred over the network should be as small as possible for real-time response, so motion should be controlled through high-level commands. It is important to define appropriate low-level motion elements according to the application to be realized and to construct a high-level motion control algorithm on top of them. A musical performance consists of a combination of several motion parts. Because a performance involves different movements of each hand and foot, generating the motion is difficult.

Various studies have been reported on generating and displaying music-performance motion with three-dimensional CG. A method to display piano playing based on score information was proposed in [1]; it generates and displays reasonable playing motion by considering the rows of notes before and after the current one. The method described in [2] displays the motion of a drummer and a bass guitar player with three-dimensional CG synchronized with automatic playing. These methods use information about what is scheduled to be played in the future to generate the performance, so they cannot generate a performance when the future playing is not predictable. On the other hand, the method described in [3] forecasts the future melody from the preceding melody. It performs well if the melody repeats regularly, but prediction is hard if the melody contains irregular parts. In this research, we develop musical performance over the network, where three-dimensional CG characters perform automatically based on musical information. In particular, this paper proposes a method for generating drum performance based on the MIDI code.

2. Music session system using the network
A music session with friends is usually difficult because of geographic and time restrictions. We are studying a system that achieves a music session over the network. When we play a session in the real world, we communicate with the other players not only through sound but also through motion and facial expression. These visual elements have a large influence on the impression of the music and improve the expressive power of the sound; conversely, bad motion deteriorates the impression of the music.

Fig. 1. Concept of the music session system

Our music session system therefore provides a virtual studio where three-dimensional CG characters play virtual musical instruments. The following sections present an overview of the music session system, the available musical devices, and the musical performance data.

2.1. Overview of the system
The system achieves a music session among two or more people using the network, computers, and musical instruments (Fig. 1). Real-time performance is realized by using a data model based on MIDI, and a virtual studio with three-dimensional CG avatars is introduced to achieve visual communication. The software consists of lobby server software and client software. The server software manages the clients subscribing to the system and provides users with a lobby in which to find partners for a session. After a session group has been established, the client software creates P2P connections and enables the users to play the session. The client software can be divided roughly into four parts: the network part, the sound part, the graphics part, and the input controller (Fig. 2).
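As a concrete illustration of this decomposition, the sketch below shows how a client might route a single note event from the input controller to the local sound and graphics parts and to the session peers. It is only a minimal sketch of the structure suggested by Fig. 2; the class and method names (NoteEvent, SoundPart, GraphicsPart, NetworkPart, Client.on_note) are hypothetical and not taken from the paper.

```python
# Minimal sketch of the client-side routing suggested by Fig. 2.
# All names are hypothetical; the paper does not specify an API.
from dataclasses import dataclass, field

@dataclass
class NoteEvent:
    instrument: int   # MIDI note number of the percussion voice
    velocity: int     # strength of the hit
    time_ms: int      # input time in milliseconds

class SoundPart:
    def play(self, ev: NoteEvent) -> None:
        print(f"sound: note {ev.instrument} vel={ev.velocity}")

class GraphicsPart:
    def animate(self, ev: NoteEvent) -> None:
        # The animation controller turns the note into avatar motion (Sec. 3 and 4).
        print(f"graphics: animate hit on {ev.instrument} at {ev.time_ms} ms")

@dataclass
class NetworkPart:
    peers: list = field(default_factory=list)
    def send(self, ev: NoteEvent) -> None:
        for peer in self.peers:      # P2P delivery to the other session members
            peer.receive(ev)

class Client:
    def __init__(self) -> None:
        self.sound, self.graphics, self.network = SoundPart(), GraphicsPart(), NetworkPart()
    def on_note(self, ev: NoteEvent) -> None:
        # A note from the input controller is played locally, animated locally,
        # and forwarded to the session peers.
        self.sound.play(ev)
        self.graphics.animate(ev)
        self.network.send(ev)

client = Client()
client.on_note(NoteEvent(instrument=38, velocity=100, time_ms=0))  # 38 = GM acoustic snare
```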

Fig. 2. Composition of the session system

2.2. Available musical instruments
The available input devices are joystick devices, the PC keyboard, the PC mouse, and MIDI musical instruments. MIDI instruments such as keyboards, drums, and guitars can be bought in any music store, so users can play as they usually do by using these MIDI instruments.

2.3. Musical performance data
Musical performance data consist of the players' motion data and the sound data. Because the amount of performance data sent over the internet has to be small for real-time response, the sound data are not stored as waveforms but are based on MIDI. The information the system can obtain about a performance concerns only the sound and contains no information about motion. However, information about motion, such as which part of the body was moved, is necessary to display the motion of the three-dimensional CG characters. Therefore, motion data are generated and added to the performance data by the method described in the next section.
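A minimal sketch of what such augmented performance data might look like is given below. The MIDI side carries the sound information that the system actually receives, while the motion annotation (here a hypothetical limb field) stands for what the method of Section 3 adds before the graphics part displays the avatar. The field names are assumptions for illustration, not the paper's data format.

```python
# Hedged sketch of MIDI-based performance data augmented with motion data.
# Field names are hypothetical; the paper only states that motion data are
# generated from the sound data and attached to the performance data.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Limb(Enum):
    LEFT_HAND = "left_hand"
    RIGHT_HAND = "right_hand"
    LEFT_FOOT = "left_foot"    # always used for the hi-hat pedal (Sec. 3.2)
    RIGHT_FOOT = "right_foot"  # always used for the bass drum (Sec. 3.2)

@dataclass
class PerformanceEvent:
    # Sound data, taken directly from the MIDI input
    instrument: int            # MIDI note number of the percussion voice
    velocity: int              # hit strength
    time_ms: int               # input time
    # Motion data, generated afterwards by the method of Section 3
    limb: Optional[Limb] = None

# Example: a snare hit whose limb has not been decided yet (38 = GM acoustic snare).
snare_hit = PerformanceEvent(instrument=38, velocity=100, time_ms=1200)
```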

3. Generating motion data
In some cases, plural motions are possible for the same MIDI code. For instance, any finger of either hand may be used to play a given note on the piano. Such ambiguity must be resolved in order to generate motion from sound data. It can be resolved by using a special device such as a motion capture system and generating the animation of the three-dimensional CG characters from the captured motion [4][5]. However, such a special device may hinder users from playing their instruments as usual, whereas our research assumes that users play in an ordinary environment. This section describes the performance of musicians, especially drummers, and explains a method for selecting the most natural of the possible motions.

3.1. Performance of musicians
Musical performances are achieved by dynamic movements of the body. The speed and range of these movements are restricted by physical factors such as the muscles and the skeletal structure. Therefore, reasonable motion that suppresses useless movement is required when playing a musical instrument [1].

3.2. Performance of drummers
Drummers use both hands and both feet at the same time to beat two or more percussion instruments. Some motions are determined uniquely by the kind of percussion beaten: the bass drum is always played with the right foot, while the hi-hat pedal is always played with the left foot. The motions for the other percussion instruments, however, cannot be determined uniquely, because there is no rule about which hand must beat them. In this paper we aim at a method for deciding natural motions by removing unnatural motions from the possible ones. Unnatural motion in drum playing is defined as follows:
(1) performance in an impossible posture, such as the arms intersecting;
(2) beating plural percussion instruments with one hand at the same time;
(3) high-speed repeated beating with one hand.
For example, hitting both the snare drum and the floor tom with the arms crossed is unnatural according to (1). Hitting the crash cymbal and the ride cymbal with one hand at the same time is impossible and is regarded as unnatural according to (2). When different percussion instruments are beaten alternately at high speed, nobody purposely beats them with one hand, so such motion is unnatural according to (3). Motion of type (1) or (2) can occur when two percussion instruments are played at the same time, and motion of type (1) or (3) can occur when plural percussion instruments are hit repeatedly. The methods for avoiding these situations are described in the following sections.

3.3. Simultaneous hitting
Different percussion instruments are sometimes hit with both hands at the same time, for example the hi-hat cymbal together with the snare drum, or the floor tom together with the snare drum. Strictly speaking, however, the two instruments are almost never hit at exactly the same time. We therefore judge that the player attempted a simultaneous hit when two playing inputs arrive with a time difference smaller than a constant interval. Each percussion instrument is given an identifying value, as shown in Table 1; when the same instrument is beaten twice, the two values are equal, and the earlier input is assigned to the left hand and the later one to the right hand. Otherwise, the instrument with the larger value is assigned to the right hand and the instrument with the smaller value to the left hand. For instance, when the ride cymbal (value 5) and the crash cymbal (value 2) are beaten at the same time, the ride cymbal is hit by the right hand and the crash cymbal by the left hand. The value of the snare drum is set lowest so that the crossed-arm "closed hand" style, in which the right arm beats the hi-hat cymbal, is produced; the "open hand" style can be obtained simply by making the hi-hat cymbal's value lower than the snare drum's. Drum sets vary with the musical genre, and the method can accommodate such variations simply by adding to or changing the instrument value table.

Table 1. Values of the instruments for judging simultaneous hits

  NAME   Crash Cymbal  Hi-hat Cymbal  Snare Drum  Floor Tom  Ride Cymbal  TomTom1  TomTom2
  VALUE  2             3              1           4          5            7        6

3.4. Repeated hitting
Fills are played faster than the rhythm part. A constant interval for judging repeated hitting is defined and compared with the input times: while (input time) - (last input time) is smaller than this constant, the hitting is regarded as repeated, and the hand used for hitting alternates on every hit.
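The hand-assignment rules of Sections 3.3 and 3.4 can be summarized in a short sketch. The code below only illustrates the decision logic described above; the function names, the two time thresholds, and the event representation are assumptions, while the instrument values are those of Table 1.

```python
# Sketch of the hand assignment of Sec. 3.3 (simultaneous hits) and
# Sec. 3.4 (repeated hits). Thresholds and names are illustrative assumptions.

# Instrument values from Table 1 (larger value -> right hand).
INSTRUMENT_VALUE = {
    "snare_drum": 1, "crash_cymbal": 2, "hihat_cymbal": 3,
    "floor_tom": 4, "ride_cymbal": 5, "tomtom2": 6, "tomtom1": 7,
}

SIMULTANEOUS_MS = 30   # assumed constant interval for "simultaneous" hits
REPEAT_MS = 150        # assumed constant interval for "repeated" (fill) hits

def assign_simultaneous(hit_a, hit_b):
    """Assign hands to two hits whose time difference is below SIMULTANEOUS_MS.

    Each hit is an (instrument, time_ms) pair. The instrument with the larger
    Table 1 value goes to the right hand; when the values are equal (same
    instrument twice), the earlier hit goes to the left hand.
    """
    (inst_a, t_a), (inst_b, t_b) = hit_a, hit_b
    va, vb = INSTRUMENT_VALUE[inst_a], INSTRUMENT_VALUE[inst_b]
    if va == vb:
        left, right = (hit_a, hit_b) if t_a <= t_b else (hit_b, hit_a)
    elif va < vb:
        left, right = hit_a, hit_b
    else:
        left, right = hit_b, hit_a
    return {"left": left, "right": right}

def assign_repeated(hits):
    """Alternate hands during a fill (Sec. 3.4).

    While the gap to the previous hit is below REPEAT_MS, the hitting is
    treated as repeated and the hand alternates on every hit.
    """
    assigned, hand, last_time = [], "right", None   # assumed: a run starts on the right hand
    for instrument, time_ms in hits:
        if last_time is not None and time_ms - last_time < REPEAT_MS:
            hand = "left" if hand == "right" else "right"
        else:
            hand = "right"                          # a non-repeated hit resets the hand
        assigned.append((instrument, time_ms, hand))
        last_time = time_ms
    return assigned

# Example: hi-hat and snare 10 ms apart form a simultaneous hit; the hi-hat
# (value 3) is assigned to the right hand and the snare (value 1) to the left.
print(assign_simultaneous(("hihat_cymbal", 1000), ("snare_drum", 1010)))
```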

4. Displaying drum performance
The motion of playing a musical instrument can be roughly divided into three phases: pre-performance, performance, and post-performance.

Fig. 3. Discontinuous motion
Fig. 4. Motion continuation

A musical performance shifts smoothly from the post-performance phase into the next pre-performance phase. Because a drum stroke is first raised and then beaten, it is especially important to show natural motion before the instrument is actually struck.

4.1. Achievement of animation
This subsection explains the generation of the avatar animation. CG animation is often generated by continuously reproducing short motions. Drum playing moves both hands and both feet with various timings, so it is difficult to prepare every whole-body motion of the character beforehand. The motion of each individual hand or foot, however, is limited. Therefore, motions for each hand and foot are prepared beforehand, and the performance of the character is generated as a combination of them. The pre-performance phase is not considered here, because the motion is generated in response to the incoming MIDI input.

4.2. Motion continuation
Without the pre-performance phase, the motion becomes discontinuous in the animation (Fig. 3). To prevent this problem, the post-performance motion is generated following the main performance motion (Fig. 4).
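A minimal sketch of this per-limb combination, assuming a simple clip-based animation controller, is given below. The clip names and the controller interface are hypothetical; the point is only that each limb plays its own short strike clip, with a post-performance (recovery) clip appended after the strike so that consecutive strikes connect without the discontinuity of Fig. 3.

```python
# Sketch of combining prepared per-limb motion clips (Sec. 4.1) and appending
# a post-performance clip for continuity (Sec. 4.2). All names are hypothetical.

STRIKE_CLIPS = {               # short motions prepared beforehand, one per instrument
    "snare_drum": "snare_strike",
    "hihat_cymbal": "hihat_strike",
    "bass_drum": "kick",
    "hihat_pedal": "pedal_press",
}
RECOVERY_CLIP = "return_to_rest"   # post-performance motion

class LimbTrack:
    """One animation track per limb; queued clips are reproduced back to back."""
    def __init__(self, limb):
        self.limb, self.queue = limb, []

    def play_strike(self, instrument, time_ms):
        # The strike clip is queued at the hit time and is immediately followed
        # by the recovery clip, so the next strike starts from a continuous pose.
        self.queue.append((time_ms, STRIKE_CLIPS[instrument]))
        self.queue.append((time_ms, RECOVERY_CLIP))

class AnimationController:
    """Whole-body performance as a combination of independent limb tracks."""
    def __init__(self):
        self.tracks = {limb: LimbTrack(limb) for limb in
                       ("left_hand", "right_hand", "left_foot", "right_foot")}

    def on_assigned_hit(self, limb, instrument, time_ms):
        # `limb` comes from the hand-assignment rules of Section 3.
        self.tracks[limb].play_strike(instrument, time_ms)

# Example: a right-hand snare hit at 1200 ms, followed by its recovery clip.
controller = AnimationController()
controller.on_assigned_hit("right_hand", "snare_drum", time_ms=1200)
```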

5. Experiments
The following experiments evaluate the effectiveness of the proposed method by comparing the generated motion with an actual performance.

5.1. Overview of the experiments
In the experiment, a tempo is set for each of the scores (a), (b), and (c) in Fig. 5, and each score is played three times in a row; this is repeated several times at different tempos. Score (a) is a simple eight-beat rhythm using the hi-hat cymbal and the snare drum, and its difficulty is low. Score (b) is a sixteen-beat rhythm that uses the tom-toms and changes instruments in the fills. Score (c) is somewhat irregular compared with (a) and (b), and its difficulty is high. The subject is a drummer of intermediate level who can play the scores at sight. The motion actually performed is compared with the motion selected by the judgment method of this research, and the occurrence of unnatural motion is checked.

Fig. 5. Scores (a), (b), and (c) used in the experiment

5.2. Experimental results
Table 2 shows the results of the experiments. All simultaneous hits were judged correctly in every performance. On the other hand, a wrong motion was occasionally selected during fast hitting such as fills, and these misjudgments account for a large part of all misjudgments. In particular, the results show that the number of misjudgments increases significantly when the player cannot play correctly at the given tempo. In a real-time music session this problem occurs easily, because the tempo is influenced by the players' ability and by changes of key, so it is necessary to synchronize with such a performance, for example by synchronizing the tempo automatically using melody forecasting. Even when unnatural motion occurred because of a misjudgment, it was not visually conspicuous. Although the method cannot generate exactly the same motion as the actual performance, unnatural motion did not appear in any of the performances.

Table 2. Experimental results, shown as misjudgments / hits (correct-answer rate)

  Score   Hit at the same time   Repeated hitting    Whole
  (a)     0 / 108 (100%)         6 / 96 (94%)        8 / 515 (98%)
  (b)     0 / 98 (100%)          34 / 108 (69%)      36 / 539 (93%)
  (c)     0 / 59 (100%)          17 / 168 (90%)      20 / 516 (96%)

6. Conclusions
In this paper, we described a method for generating drum performance automatically based on the MIDI code. Motion tracks were used to display the animation, and natural performance was shown by connecting the motions; the generated motion did not look unnatural. In future work, the performance posture will be changed dynamically according to the melody.

Acknowledgments
This research was supported in part by the Grant-in-Aid for the Private University High-Tech Research Center.

References
[1] H. Sekiguchi and S. Eiho, "Generating and Displaying the Human Piano Performance," Transactions of Information Processing Society of Japan, vol. 40, no. 6, pp. 2827-2837, 1999.
[2] H. Matsumoto, M. Goto, and Y. Muraoka, "Generating CG Animation of Virtual Players by Musical Performances," Information Processing Society of Japan, vol. 98, no. 16, pp. 11-16, 1998.
[3] S. Matsuo, H. Katayose, and S. Inokuchi, "A Computational Model for Melody Prediction," Transactions of Information Processing Society of Japan, vol. 41, no. 2, pp. 498-508, 2000.
[4] A. Menache, Understanding Motion Capture for Computer Animation and Video Games, Morgan Kaufmann, 2000.
[5] T. Molet, R. Boulic, and D. Thalmann, "A Real Time Anatomical Converter for Human Motion Capture," Eurographics Workshop on Computer Animation and Simulation, 1996.