Jam Master, a Music Composing Interface

Ernie Lin
M.A.Sc. Candidate in VLSI
Electrical & Computer Engineering
University of British Columbia
Vancouver, BC, Canada
erlin@interchg.ubc.ca

Patrick Wu
M.A.Sc. Candidate in Comm.
Electrical & Computer Engineering
University of British Columbia
Vancouver, BC, Canada
pswu@interchg.ubc.ca

ABSTRACT

This paper presents the interface design of the Jam Master system, an interactive music composing device that interprets human body gestures as music variable settings. The system allows multiple players to participate in the music making process. The paper begins with background information describing the complexity of music composition, the history of synthetic music, and the existing technologies that make a system such as Jam Master possible. It then discusses the key characteristics and requirements of real-time, interactive, multi-user system design and looks at similar systems that have been built. Next, it describes how the hardware was selected and how the Polhemus sensors are interfaced with jMax (the software). The player interaction section then explains how the Polhemus data is used to compose music, with the possibilities of varying the volume, pitch/frequency, and different sound effects. Finally, the paper concludes with the testing and future implementation sections.

Keywords

Interactive music, interactions, multi-user, gesture recognition, interface design, real-time performance systems

1 INTRODUCTION

Recently, human interface technologies have been receiving a great deal of attention from researchers in various disciplines, including engineering, computer science, the performing arts, and music. Interfaces, serving as the link between humans and computers or electronics, interpret human gestures and signals and process these cues to generate desired outcomes. In recently published journals, the interfaces examined mainly cover visual and audio systems that exploit the characteristics of human eyes, ears, and speech; motion tracking systems that follow and react to human motion; and a great variety of input devices designed to replace pens and keyboards, such as light pens, touch-typing gloves, mice, and multi-dimensional joysticks.

The Jam Master music composing system is intended to run in real time, allowing players to generate music by moving their arms and tapping their bodies. Players can use the system to create pieces of music and learn about the different aspects underlying music composition.

Overview

The rest of the paper is organized as follows: Background describes music composition, history, and existing technologies; User Interface Design discusses the aspects of real-time, interactive, and multi-user systems that must be considered; Comparison of Existing Systems gives a brief overview of similar systems; Interfacing the Hardware describes the hardware selection and the hardware interfacing process; Player Interaction explains how the data received in hardware is used to make music; and Testing and Future Considerations close the paper with a look at what can be extended from this project.

2 BACKGROUND

The background information for the Jam Master system can be separated into the music composition, history, and existing technologies sections.

Music Composition

For centuries, people have had the desire to make music for amusement.
By playing musical instruments such as percussion, violins, or mandolins, or by using parts of their bodies to whistle, clap, or sing, millions of melodies are composed every day, and some become more famous than others. Musicians acquire their composing skills through formal training. At the fundamental level, they study the rhythmic, melodic, textural, and harmonic processes of music and how these processes create structures in a variety of styles.

Notation and aural recognition of rhythmic and pitch patterns, along with compositional and analytical skills, are vital in musicianship. Most importantly, much of the musician's craft is gained unconsciously through composing experience and training, and there is no simple answer as to how the knowledge, skills, experience, and gestures are interrelated in music making [1]. Typically, the general public has neither special knowledge of computing nor the specific musical training to make music. On the other hand, a musician would like maximum control over the music being played. Thus, a music composing system that targets both the broad public and musicians as its players should be flexible enough to let experienced users alter the music as much as possible, while giving those with little or no musical background and training the chance to participate. Such a system would open up new ways to musical creativity and musical education, regardless of the players' prior musical knowledge [2].

History

As technology advances, various synthesis methods make it possible to generate music from electrical signals. At the early stages of electronic music, people were mainly concerned with imitating the sounds of existing musical instruments, hence the electronic keyboards and electronic drum pads. These electronic instruments are not meant to replace the conventional ones, as the sounds generated, although carefully reproduced, are never close enough to the originals. The earliest interactive computer music systems include Groove, designed by M. Mathews and his team at AT&T Bell Laboratories around 1968, and A Computer Aid for Musical Composers, developed at the National Research Council (NRC) of Canada around 1970 [1]. Both systems provide the basic functions of describing, manipulating, playing, and storing music with real-time interaction. As Mathews pointed out about music composing systems, "The desired relationship between the performer and the computer is not that between a player and his instrument, but rather that between the conductor and his orchestra" [3]; it is clear that these systems are not just different pieces of apparatus but ensembles that are not limited to one particular type of music.

Existing Technologies

The birth of the Musical Instrument Digital Interface (MIDI) led to a new era in music synthesis. MIDI is a standard agreed upon by the music industry in 1983 for digitally synthesized music. It is designed to represent the sounds of certain musical apparatus, such as keyboard and wind instruments, in digital format, and can be used to exchange music data between synthesizers, sequencers, computers, interfaces, and controllers through a daisy-chained serial bus [4]. The standardization is seen as a big step in the computerized music business, and MIDI has become widely used for musical representation. Before the idea of virtual reality (VR) came into play, some operators of electronic keyboards and guitars expressed resentment towards having to actually hold devices in order to play. Therefore, as VR became popular, the physical interfaces (instruments) were taken out of the music controller systems in some designs. Instead, cyber gloves, Polhemus sensors, or other wearable sensors are used to measure the players' musical gestures while performing with an air guitar, an air drum, or an air harpoon, to determine the sounds to be generated.
To extend the VR idea further, developments in music controllers have surged as inventors allow music composing to take free form, not limited to existing musical instruments. By interacting with the human body, music controllers can generate pleasing music with the help of computers.

3 USER INTERFACE DESIGN

The user interface for Jam Master is to be designed with the following characteristics, which apply to real-time, interactive, and multi-player systems in general.

Technical Considerations for Real Time Systems

The requirements for a music interface vary from system to system, depending on what the intended achievement is and where the system will be used. Pennycook illustrates the timing requirements for real-time systems in a hierarchical fashion, as in Figure 1 [1].

The first level deals with controlling the formal music structures, where the instrumentation and the number of input channels are considered. At the second level, input data is collected from the sensor devices at performance execution rates. At the third level, performance and synthesis data are merged into a continuous stream that is sent to the synthesizer as updates. The synthesizer then computes the sound samples to be output, according to the hardware instruction rates. Finally, at the last level, the sound samples are played using the audio conversion subsystems. Since the system is intended to run in real time, the figure also lists the desirable timing constraints at each level. Because a computer music system is classified as a soft real-time system, it is not critical for each level to meet the timing constraints as absolute deadlines. If the constraints cannot be met, the music output will be somewhat distorted, but not to the point of becoming complete noise.

Features for Interaction

There are some key characteristics that an interactive system should include at the design phase. Borchers, in his paper on the WorldBeat system, described several aspects of what an interactive system should look like [2]. First of all, the system should be consistent and explorable. Players should be able to access all the functions that the system provides without having to switch between input devices. The system should also allow players to actively participate in and navigate the different features of the system, and thus produce innovative results on their own. Secondly, the system should preferably be intuitive and non-technical. Making the system easy to use will encourage people with little musical experience to operate it without much practice. In an interactive environment, it would be difficult to require all participants to have the same musical or technical training, so people with different abilities should all be able to use the system. Furthermore, since the system is designed to be interactive, it should not require players to follow a particular set of instructions in order to generate output; hence the non-technical requirement. Thirdly, the system should be cooperative. To extend the concept of interaction, making the system available for multi-player use adds another variable to the system. While the system can be operated by a single player alone, the addition of other players enables cooperative learning of music creation, and it would probably make the system more enjoyable.

Weighting between the Main Composer and Side Players

In a multi-player system, generally one leading role is assumed. This makes the decision-making process somewhat simpler and generates a more coherent result than weighting all participants as equals. However, the limited opportunities for interaction given to the side players create drawbacks for them, such as having little to distinguish using the system from listening to a piece of music made by someone else, and having a very limited sense of involvement during the composition. Therefore, the factor used to weight the activities of the main composer and the side players must be carefully decided.

4 COMPARISON OF EXISTING SYSTEMS

There are several devices that bear some resemblance at the fundamental level to what the authors plan to implement.
Predefined Music Library

The Digital Baton, built at the MIT Media Lab, incorporates optical trackers, pressure sensors, and accelerometers to track the operator's hand gestures [4]. With that information, operators can act as conductors: by moving the baton on certain beats or in large gestures, they trigger changes in transitions, sound groupings, or underlying rhythms. The Digital Baton is similar to Jam Master in that it contains a predefined music library, from which its synthesizer selects and plays pieces of music. In Jam Master, operators will be able to choose different sound effects from the library to play as they see fit.

Wearable Device

Building a wearable music interface is another important concept in the Jam Master design. Maggie Orth and Rehmi Post at the Media Lab built a musical denim jacket equipped with a 12-key keypad and a synthesizer [4]. Operators press the keypad to generate the desired music with the jacket. In another project, titled dance sneakers, sensors are built into the shoe and data is offloaded to the synthesizer over a wireless link. In Jam Master, operators wear the sensors on their bodies so that the body gesture signals can be sent to the computer for further processing. The wearable aspect gives operators mobility within some range, allowing the players a certain degree of freedom of movement.

5 INTERFACING THE HARDWARE

In order to measure body gestures, a hardware device must be chosen to receive the body signal inputs. Then, an interface for the chosen hardware must be built so that data from the hardware is ready to be accessed by the music composing interface.

Hardware Selection

There are several aspects of the music that can be modified, including volume, pitch/frequency, and sound effects, and these aspects can take on a range of values. Therefore an input device with multi-dimensional measurements must be used for the system.

The Polhemus Fastrak sensors, which provide dynamic, real-time six degree-of-freedom measurement of position (X, Y, and Z Cartesian coordinates) and orientation (yaw, pitch, and roll), are therefore the most promising electromagnetic tracking system available to the authors. Furthermore, each Polhemus system can accept data from up to four receivers at a reported update rate of 120 Hz (with a single receiver) and a remarkable 4 ms latency [5]. The received data is transmitted to the serial port of a computer for further processing. This allows the Jam Master system to have up to four players composing music at the same time. Since no line of sight is required between the sensors and the receiving base, players of the Jam Master system need not worry about their bodies blocking the receiver. For selecting music from the sound library, the icube system will be used. The icube can be connected to various types of touch sensors that will be used for sound effect selection.

Interfacing the Polhemus and jMax

jMax, the signal processing software that the authors have chosen for music composition (see the next section), uses patches as building blocks. Therefore, it is necessary to build an interface that outputs the measurements from the Polhemus in jMax. The patcher for the icube system was written by a different group that also uses jMax, and therefore is not described here. Each Polhemus sensor produces six measurements, so four sensors output a total of twenty-four values. Using the existing driver for the Polhemus, a jMax patch can access these values. Under a hierarchical structure, a patcher that outputs the twenty-four Polhemus values is illustrated in Figure 2.

Figure 2: The Interface for the Polhemus Fastrak System.

In Figure 2, metro is a patcher that generates continuous bangs at its outlet periodically once it receives a bang at its inlet. The Mypolhemustop patcher receives these continuous bangs and calls the driver that reads in the data from the Polhemus. Mypolhemustop then formats the data into four arrays, one for the data received from each sensor, and sends them to its four outlets. Mypolsensor, once it receives the data array containing the six readings from a sensor, parses the array and outputs each value on one of its six outlets.

6 THE JMAX PROGRAMMING ENVIRONMENT

The programming environment used to develop Jam Master is called jMax. It is an object-oriented, graphical programming environment based on the Max/FTS core with a Java-based GUI. Objects (called patchers) are implemented in one of two ways: a) using the GUI to construct new patchers out of existing sub-patchers graphically, by using lines to connect sub-patchers together; or b) writing C code to implement the function of the patcher, then using that C code as the basis for a jMax control object. jMax objects each have a set of inlets and a set of outlets for data. The inlets are the input ports of the patcher and, conversely, the outlets are the output ports. There are also five basic data types: integer, float, message, Boolean, and bang. The bang data type is somewhat analogous to a switch; it is used as a signaling type for events. jMax objects are interconnected by lines in the graphical interface. These lines represent conduits along which data messages flow from an outlet of one object to the inlet of another object.
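To make the data flow through the Polhemus patchers concrete, the following is a minimal sketch in plain Python (not jMax code) of how one twenty-four-value frame could be split into four per-sensor arrays and then fanned out into individual readings, mirroring what Mypolhemustop and Mypolsensor do on their outlets. The flat list-of-floats input format and the function names are assumptions chosen for illustration; the actual driver and serial protocol are not shown.

```python
# Illustrative sketch of the patcher data flow described above (not jMax code).
from typing import List, Tuple

NUM_SENSORS = 4
VALUES_PER_SENSOR = 6  # x, y, z, yaw, pitch, roll

def split_frame(frame: List[float]) -> List[List[float]]:
    """Split one 24-value frame into four 6-value sensor arrays
    (the role played by Mypolhemustop's four outlets)."""
    if len(frame) != NUM_SENSORS * VALUES_PER_SENSOR:
        raise ValueError("expected 24 values per frame")
    return [frame[i * VALUES_PER_SENSOR:(i + 1) * VALUES_PER_SENSOR]
            for i in range(NUM_SENSORS)]

def unpack_sensor(values: List[float]) -> Tuple[float, ...]:
    """Expose the six readings of one sensor individually
    (the role played by Mypolsensor's six outlets)."""
    x, y, z, yaw, pitch, roll = values
    return x, y, z, yaw, pitch, roll

if __name__ == "__main__":
    # Fabricated example frame: 24 increasing values standing in for real data.
    frame = [float(v) for v in range(24)]
    for sensor_id, values in enumerate(split_frame(frame)):
        x, y, z, yaw, pitch, roll = unpack_sensor(values)
        print(sensor_id, (x, y, z), (yaw, pitch, roll))
```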
Signal processing in jMax is accomplished using jMax signal objects. Signal objects implement various signal processing constructs, such as filters and delay lines. There is also a signal data type on which these constructs operate. Signals can be filtered, added, subtracted, multiplied, and divided, and they can also be fed into a digital-to-analog converter (DAC) object, at which point the signal is converted into sound that can be heard. The best feature of jMax (at least in the authors' opinion) is that recompilation is not necessary every time a patcher is modified, since the objects themselves do not change, only the interconnections between them. Once a change is made, seeing it work merely involves switching to a different editor mode in which the patcher can be executed immediately.

7 PLAYER INTERACTIONS

Jam Master accommodates up to four players. One person is the main composer, who decides which notes are played and the volume of each note. The other players contribute variations of the composer's note to the overall piece by shifting it slightly in frequency and/or changing its amplitude. Each of the other players can also add sounds from a sound effect library. In this way, the overall music is influenced by each player's individual taste. The players also need not be musically inclined; with some practice, the sounds created could be an expression of their collective creativity.

Volume/Amplitude Adjustment

Each player controls his/her note independently of the other players. Each player's Polhemus sensor data controls his/her own amplitude multiplication factor. The sensor data is normalized to a value between 0.0 and 1.0 along a linear scale, and the note is multiplied by this factor to change its amplitude (where 0.0 means the note is muted and 1.0 means the note volume does not change). To produce this normalized factor, the Polhemus sensor data needs to be processed. The (x, y, z) coordinates returned by the sensor are used to calculate the distance between the player and the Polhemus transmitter. This distance is then normalized to between 0.0 and 1.0 by dividing it by a normalizing factor representing the maximum separation distance between the sensor and the transmitter. This need not be the actual maximum distance the sensor is capable of reading; it could just be an arm's length or so.

Figure 3: Volume Adjusting patcher (note: Polhemus patcher detail has been abstracted away for simplicity)

Pitch / Frequency Adjustment

As with volume, pitch is controlled independently by each player. Each player (other than the composer) generates a slightly off-pitch version of the main note (the note the composer has generated). Pitch adjustment is accomplished by converting the main note to a frequency in Hertz, then adding a frequency offset (which may be positive or negative) and playing the resulting sound. It is this frequency offset that is under the control of the other players. The offset is computed from the Polhemus sensor data; the quantity of interest is the roll component of the (roll, pitch, yaw) data recorded by the sensor. Roll is measured in a range from +180 degrees to -180 degrees. We have chosen not to normalize this data before applying it to the note, although normalization would not be difficult to implement in the future. Since the roll varies from +180 to -180 degrees, the amount by which the frequency can be changed varies from +180 to -180 Hertz; that is, the amount of roll is added to the frequency of the note and the result is played.

Figure 4: Pitch Adjusting patcher (note: Polhemus patcher detail has been abstracted away for simplicity)

Sound Effects

Each player also has at his/her disposal a library of sound effects that can be loaded when the application starts. The sound effects are activated simply: each player wears a set of touch sensors on his/her body, and the sensors are connected to an icube that allows the signals to be interfaced with jMax. Upon touching a sensor, jMax sends a bang signal to play back the corresponding sound file.
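The volume and pitch mappings described above reduce to simple arithmetic. The following is a minimal sketch in plain Python (not jMax) under the stated assumptions: the normalizing distance of one arm's length (taken here as 1.0 m) and the function and variable names are illustrative choices, not part of the actual patchers.

```python
import math

# Minimal sketch of the two mappings described above (plain Python, not jMax).
ARM_LENGTH_M = 1.0  # assumed normalizing distance ("an arm's length or so")

def amplitude_factor(x: float, y: float, z: float,
                     max_distance: float = ARM_LENGTH_M) -> float:
    """Map sensor position to a 0.0-1.0 amplitude multiplier: distance from
    the transmitter, divided by the normalizing distance, clipped to range."""
    distance = math.sqrt(x * x + y * y + z * z)
    return max(0.0, min(1.0, distance / max_distance))

def shifted_frequency(base_hz: float, roll_deg: float) -> float:
    """Add the roll angle (degrees, -180..+180) directly to the note frequency
    in Hertz, matching the un-normalized mapping described in the text."""
    return base_hz + roll_deg

if __name__ == "__main__":
    # Hypothetical reading: sensor 0.5 m from the transmitter, rolled -30 degrees.
    amp = amplitude_factor(0.3, 0.4, 0.0)    # -> 0.5
    freq = shifted_frequency(440.0, -30.0)   # -> 410.0 Hz
    print(f"amplitude factor = {amp:.2f}, shifted frequency = {freq:.1f} Hz")
```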

Figure 5: Sound Effects Playback patcher

Combining Each Player's Notes

Combining notes is also simple. Each player's notes are reduced in amplitude by a factor of four, then the signals are summed, fed into an oscillator, and finally output via the DAC. Figure 6 shows how this was done in jMax.

Figure 6: Combining Player Notes patcher

Some consideration was given to whether the composer should be able to adjust the weight of his component of the music so that he could make himself sound more or less dominant over the other players. We have decided to leave this feature for a future revision of Jam Master, since it is fairly trivial to implement and does not contribute greatly to the overall description of the Jam Master interface.

8 TESTING

Informal testing was completed on the application. The application was broken into its constituent parts and each part's operation was verified individually. The Polhemus sensor was also tested during the development of the Polhemus patcher in jMax. The Polhemus patcher was verified simply by executing it in jMax and observing the values at the patcher outlets as the sensor was moved around in space. The (x, y, z) position coordinates were displayed and verified, along with the (roll, pitch, yaw) orientation outputs. We have not yet tried testing the application as a whole, and as such we have not tried the application with more than one player. It is also difficult to test because producing harmonious music with the application requires a lot of practice and/or musical aptitude, which we (the authors) do not have. The best the authors have produced is something that sounds more or less like volume-varied random noise. The intended use of the application involves the composer playing a tune (perhaps something classical), with the other players creating accompaniment sounds. When done correctly (and with enough practice), the end result should be very harmonious.

Problems Encountered

Over the course of this project we had many difficulties which prevented us from achieving our goal as quickly as we would have liked. One of these problems had to do with the sound card installed in the computer. Our access to the sound card was very sporadic, because our read and write access to the device in Linux changed constantly, possibly because of other users' tweaking. In order for jMax to work (so that sounds and music can be produced and heard), the read and write permissions for the sound card must be set correctly. Another problem was with the Polhemus sensor. The sensor is read via the serial port on the computer, but for reasons unknown, the read and write permissions on that device were also sporadic. The permissions for the serial port also need to be set properly for the Polhemus to function. We also had some trouble creating jMax objects from existing C source code. We felt this was due to the inadequate documentation included with jMax; however, the jMax mailing list proved to be of much assistance while figuring out some of the more obscure parts of jMax. Also on the subject of jMax objects and poor documentation, we felt that even creating patchers from existing sub-patchers was difficult because some of the sub-patchers lacked online help (or any help at all for that matter). For these sub-patchers, either trial and error was used to figure out the sub-patcher (its inlet/outlet types, and what each inlet/outlet represented), or the sub-patcher was worked around completely by implementing the function in a different way.

9 FUTURE EXTENSIONS / FUTURE WORK

We would like to add features to the application in the future, such as echo/reverb support, support for karaoke, and more signal processing features such as filtering. We feel it would be interesting to add support for even more players. More players and more features mean more variables, and we believe that having more variables available to the player increases the opportunity to express creativity. We also think it would be interesting to add a module for beat detection and beat modification. For example, one idea was that the players could form circles, with the radius of the circle determining the beat of the music being played (this implies that the beat is not controlled by the composer; think of it as a drum-machine type of feature where the composer follows the tempo of the drum beats while composing). Another interesting addition might be a visual display that is modulated by the players' positions and/or the music being played; the idea is somewhat similar to the output visualization plugins popular in most mainstream MP3 player programs, such as Nullsoft's Winamp. We mentioned above that each note generated by the players and the composer is given equal weight when combined. It would be interesting to give the composer control of this weighting so that players could be emphasized or de-emphasized as the composer saw fit. Finally, we think that adding MIDI instrument controls would be a good idea, giving the composer greater freedom in terms of which instruments he/she has available. If this can be done without entering numbers into a keypad (perhaps by touch sensors worn by the composer), it adds yet another dimension of creativity that can be expressed.

10 INFORMATION AND QUESTIONS

For more information, contact ssfels@ece.ubc.ca.

ACKNOWLEDGEMENTS

The authors would like to thank Prof. S. Fels, the members of his lab, and several members of the jMax mailing list for their technical support.

REFERENCES

1. Pennycook, B. Computer-Music Interfaces: A Survey. Computing Surveys, Vol. 17, No. 2, June 1985.
2. Borchers, J. WorldBeat: Designing a Baton-Based Interface for an Interactive Music Exhibit. ACM CHI '97 (Atlanta, GA, March 1997). ACM Press.
3. Mathews, M. GROOVE, a program for realtime control of a sound synthesizer by a computer. Proceedings of the 4th Annual Conference of the American Society of University Composers. ASUC.
4. Paradiso, J. Electronic Music: New Ways to Play. IEEE Spectrum.
5. Polhemus Fastrak.
