MusicGrip: A Writing Instrument for Music Control


The MIT Faculty has made this article openly available.

Citation: Gong, Nan-Wei, Laibowitz, Mat, and Paradiso, Joseph A. "MusicGrip: A Writing Instrument for Music Control." Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pp. 74-77, 2009.
Publisher: New Interfaces for Musical Expression, http://www.nime.org/proceedings/2009/nime2009_074.pdf
Version: Final published version
Citable link: http://hdl.handle.net/1721.1/62158
Terms of use: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.

MusicGrip: A Writing Instrument for Music Control

Nan-Wei Gong (nanwei@media.mit.edu), Mat Laibowitz (mat@media.mit.edu), Joseph A. Paradiso (joep@media.mit.edu)

Abstract

In this project, we have developed a real-time writing instrument for music control. The controller, MusicGrip, captures the subtle dynamics of the user's grip while writing or drawing and maps them to musical control signals and sonic outputs. This paper discusses the conversion of the common motor motion of handwriting into a new form of musical expression. The presented instrument can be used to integrate the composing aspect of music with painting and writing, creating a new art form from the resulting aural and visual representation of the collaborative performance process.

Keywords: Interactive music control, writing instrument, pen controller, MIDI, group performing activity.

1. Introduction

For many traditional musical instruments, the most complicated and difficult part for performers is training the reflexive motor motion of the fingers. It is common for people learning a new instrument to spend much of their time adjusting and adapting their finger movements to the scales, positions, and affordances specific to that instrument. Finding better ways to learn new physical motions and techniques for musical expression has always been an important aspect of the development of new music interfaces and instruments [1]. While such systems provide an easy way to get some sort of sonic output, the motions and gestures needed to play them expressively and creatively still require substantial practice. It is also often quite difficult to repeat a gesture exactly, limiting the use of these types of interfaces for composition.

Handwriting and drawing movements are skills that carry unique expression for every individual. Handwriting is both a representation of personality through graphic marks and a means to communicate, capture, and clarify ideas with other people. It differs from person to person, yet everyone can easily repeat the same general motion. Our writing and drawing patterns, the way we hold a pen, and the resulting handwriting can even be used to identify individuals. We have therefore developed a music controller that takes advantage of the uniqueness of this motion as a means of expressive real-time music control, and that takes advantage of the familiarity of writing for communication and the capture of ideas as a means for collaborative composition.

In this paper, we describe the development and implementation of MusicGrip, a system that creates electronic music in real time from writing and drawing motion. Pressure sensors are attached to a conventional pencil grip (Figure 1); signals are collected through three pressure sensors, one on each face of the triangular grip. The sensors provide three channels of sensing, allowing dynamic interaction between composing and drawing. By augmenting a palette of several pens in this way, the MusicGrip system can be used as a real-time, multi-user performance instrument.
Figure 1. Illustration of input channels and the design of the triangular pencil grips.

2. Pen as a Music Instrument

Various research projects have sought to use pens in musical applications, often for editing music notation and composing with a tablet input [5-7]. Some efforts have used commercial digital pens and tablets as musical instruments. For example, researchers at the Center for New Music and Audio Technologies (CNMAT) at UC Berkeley have over a decade of research employing tablet-based musical interfaces [2-3]; also of note is the use of tablets as performance instruments by Arfib and collaborators [4]. The x-y pen-tip coordinate data captured by the electronic tablets in these examples are analyzed to extract features or commands that can control or provide input to a musical system.

However, the detailed variations of one's finger movements on the writing implement, and the overall motion of the hand beyond the location of the pen tip, can hardly be captured this way. To capture the nuanced expression in handwriting and drawing movements, we looked at the components that make up a person's unique grip on the pen and at how the gripping pressures change throughout a writing session. In our design, force-sensing resistors are placed on each of the three main finger contact points on the writing implement by means of a triangular pencil grip (Figure 2a). Although some electronic batons have been implemented with multipoint pressure sensors that respond to dynamic grip [8], those were designed as free-gesture interfaces for the entire hand.

A music controller that uses the vector of finger-pressure variations from writing or drawing against a surface is novel in several ways. First, unlike conventional input methods that require a specific type of pen with an electronic tablet, a pencil grip can be attached to any writing instrument. Beyond this flexibility in the writing implement, it also allows writing on different surfaces. The same writing motion can have unlimited musical expressions through assorted combinations of pens and papers, as these specifics affect the pressures and angles of the grip. In Figure 2b, the red brush gives greater pressure sensitivity than the green pencil, whereas the green pencil needs more force to create similar graphics.

Figure 2. (a) Pencil grip with pressure sensors on each side. (b) Drawings from the same person with different writing instruments, indicating different pressure and angle sensitivities from the tip.

Since different types of pen generate different dynamic values from the sensors, they can provide different sonic output. This can be likened to differences among traditional musical instruments, such as the type of wood used for the body or the materials used for the strings. A palette of various writing instruments can be provided, each mapped to a different sonic output or compositional method, allowing the composer to pick different tones and mappings the same way a painter picks up a different color or texture. On the composing side, unlike traditional representations in notes and instruments, the composing process is documented in the drawing and writing itself and can be played back and analyzed through the optical input of the MusicGrip extension discussed later in this paper.

3. System Overview

The prototype consists of three pressure sensors attached to a pencil grip, serving as input signals. The signals are converted to MIDI by an ATmega168 microcontroller and a 6N139 optocoupler. The output signals are connected to a MIDI-USB converter through a MIDI adapter and transmitted to a PC. The sensor data is processed in real time and mapped to specific MIDI messages. In the first version of the design we use Reason, a digital music synthesizer platform, to generate sound and to create a mapping that triggers and modifies these sounds with the pen sensor data. The major components of the system are shown in Figure 4.

Figure 4. Components of the prototype: palette racks and the MIDI converter.

First, the input device is a pencil grip with three pressure sensors attached to a pen; the sensor signals are transmitted through a USB cable. Second, the palette rack (the red and blue boxes) contains the electronics that convert the input sensor signals to MIDI outputs and also provides a space to keep the various input pens ready for use. There are currently two inputs per unit, comprising six channels. Six LEDs are included (three per pen), which serve as indicators and light up with brightness proportional to the pressure input. The last component is the MIDI-USB interface, which takes the MIDI signals and connects the inputs to a PC. Here, we use an EDIROL 3-in/3-out USB MIDI interface from Roland.
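As a concrete illustration of this signal path, the following is a minimal host-side sketch in Python using the mido library. The 10-bit ADC range and the finger-to-controller assignments are our assumptions for illustration; the actual conversion runs in firmware on the ATmega168, and the mapping is configured in Reason.

```python
# Illustrative sketch only: a host-side version of the sensor-to-MIDI
# conversion. Assumes 10-bit ADC readings (0-1023) per finger and
# hypothetical CC assignments; the real firmware runs on the ATmega168.
import mido

CC_FOR_FINGER = {"index": 1, "middle": 11, "thumb": 7}  # placeholder CC numbers

def adc_to_midi(raw):
    """Scale a 10-bit ADC reading to the 7-bit MIDI value range."""
    return max(0, min(127, raw >> 3))

def send_grip_frame(port, readings):
    """Send one control-change message per finger channel."""
    for finger, raw in readings.items():
        port.send(mido.Message("control_change",
                               control=CC_FOR_FINGER[finger],
                               value=adc_to_midi(raw)))

# Example: forward a single frame of grip pressures to the default MIDI output.
with mido.open_output() as port:
    send_grip_frame(port, {"index": 512, "middle": 300, "thumb": 890})
```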
4. Mapping Strategies

4.1 Multiple Users

The pressure sensors on each pen for the three major fingers (index finger, middle finger, and thumb) are mapped to parametric MIDI control signals. The user can change the mapping completely by rotating the pen, switching which finger controls which parameter. Since people generally write with only one hand at a time, the selection of which parameter a particular sensor on a particular pen controls is done with multi-user collaboration in mind. An example of a mapping for a multi-user collaborative performance is shown in Figure 5 and described as follows.

Figure 5. Mapping for collaborative performance by two users.

One user is responsible for the melody of the music by selecting notes, adding modulation, and controlling the dynamic level of the sound: the index finger is mapped to pitch bend, the thumb to level/volume, and the middle finger to modulation. The other user is responsible for controlling different effects from the synthesizer (in other words, the timbre): the index finger is mapped to the phase of one oscillator, the thumb to the filter envelope, and the middle finger to the amplifier envelope. For more complicated group collaboration, the texture and color of each pen serve as an indication of the tones and effects mapped to that particular pen, just as an artist picks a unique pigment from a palette while painting, e.g. a warmer color for a warmer tone.
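The sketch below shows how this two-pen mapping could be realized on the host, under the same assumptions as the previous listing. Pitch bend is a dedicated 14-bit MIDI message; the CC numbers for the timbre pen are placeholders for whatever parameters the synthesizer patch exposes, not values taken from the paper.

```python
# Two-user mapping sketch (Figure 5). Pressures are assumed already scaled
# to 0-127; CC numbers other than modulation (1) and volume (7) are
# hypothetical.
import mido

def melody_pen(port, index, thumb, middle):
    """User 1: index -> pitch bend, thumb -> volume, middle -> modulation."""
    pitch = int(index / 127 * 16383) - 8192      # map onto signed 14-bit range
    port.send(mido.Message("pitchwheel", pitch=pitch))
    port.send(mido.Message("control_change", control=7, value=thumb))
    port.send(mido.Message("control_change", control=1, value=middle))

def timbre_pen(port, index, thumb, middle):
    """User 2: index -> oscillator phase, thumb -> filter envelope,
    middle -> amplifier envelope (hypothetical CC assignments)."""
    for cc, value in ((70, index), (71, thumb), (72, middle)):
        port.send(mido.Message("control_change", control=cc, value=value))
```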

4.2 Single User

Another way to map the signal input is to coordinate the input pressure with note pitch (Figure 6). The most controllable and dynamic of the three channels is the one associated with the index finger; thus, in the single-user mapping, we use this channel to control pitch. The second most manageable channel is the thumb, which is used for level control. The third channel can be assigned to a user-selectable tone control or effect.

Figure 6. Illustration of an alternative mapping strategy for a single user.

Primarily for the single-user mapping, but also usable by a group, we have built a standalone version that can run and perform without a computer. For this version we have added three speakers and three amplifiers to the electronics contained in the palette rack, and have created a small tone synthesizer that works directly from the analog signals from the sensors.
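The paper does not specify how the index-finger pressure is discretized into pitches; one plausible realization, sketched below, quantizes the pressure onto a fixed scale and uses the thumb pressure as note velocity.

```python
# Single-user mapping sketch: index pressure selects pitch, thumb sets level.
# Quantizing onto a pentatonic scale is our assumption for illustration.
import mido

SCALE = [60, 62, 64, 67, 69, 72, 74, 76]  # C major pentatonic, MIDI numbers

def pressure_to_note(index_pressure):
    """Quantize a 0-127 pressure value onto the scale."""
    return SCALE[index_pressure * len(SCALE) // 128]

def play(port, index_pressure, thumb_pressure):
    port.send(mido.Message("note_on",
                           note=pressure_to_note(index_pressure),
                           velocity=thumb_pressure))
```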

5. Group Evaluation / Collaborative Performance

The first user study and collaborative performance was conducted during the final project presentation of MAS.963, Principles of Electronic Musical Controllers, at the MIT Media Laboratory. Four people were asked to start drawing or writing on the same piece of paper without practicing beforehand. For this test, shown in Figure 7a, two of the input instruments were mapped to the pitch control of separate synthesizer instruments, and the other two were mapped to the control of rhythm and effects. From the written artifact shown in Figure 7b, we can clearly observe the different roles in this performance. The two pitch-controlling solo instruments are the yellow and blue lines; their drawings have fewer repeated patterns and are more free-form in shape. By contrast, the rhythm section, shown by the red and green lines, has distinct repetition.

Figure 7. (a) Participants in the collaborative performance. (b) Results from the performance, indicating the different roles in the composition process.

Thanks to their familiarity with the use of a pen, the users were able to collaboratively create cohesive audio moments after picking up the instruments. The users were also able to recreate similar sound patterns by recreating the same written patterns, allowing iteration and the development of musical parts.

6. Ongoing Extensions of MusicGrip

MusicGrip's functionality can be extended by adding additional input sensors, for example illuminated photodiodes and microphones. A small extension board is being designed to attach near the bottom of MusicGrip. The photodiodes collect reflected red, green, and blue light from a small region near where the pen tip contacts the writing surface (an onboard white light source can uniformly illuminate the paper beneath); this can be used to trace and scan written marks from previous performances or from existing printed material. In addition, the photodiodes can provide information about the current activity, such as the presence, color, and lightness of the marks as they are drawn. The data from the photodiodes can be mapped to musical outputs in order to create compositions directly from written marks. It can also be included in the mapping alongside the MusicGrip pressure sensors as an additional real-time parametric channel.

A contact microphone or piezo element mounted against the pen can capture the sounds created as the writing implement contacts and moves across the writing surface. This sound varies with the specific pen and paper used as well as with the user's specific gestures, providing information about texture variations of the contact surface. The collected audio from the microphone can be modified by a variety of digital effects (e.g., real-time convolution against a stored sample [9]) or used in conjunction with the sound generated from the MusicGrip sensors (e.g., the writing-sound amplitude could gate and/or filter the synthesized audio). These techniques greatly increase the compositional styles and methods that MusicGrip can support by including the live sounds created from the endless supply of writing surfaces. Accelerometers and/or gyros can also provide useful gestural input from pen tilt, twist, and dynamics.
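As a sketch of the proposed microphone extension, the block below gates a synthesized signal with the amplitude envelope of the writing sound; the one-pole follower and its constants are our assumptions, not a described implementation.

```python
# Envelope-gating sketch for the contact-microphone extension: the synth
# signal passes only while the pen is audibly moving on the paper.
import numpy as np

def envelope_gate(writing_sound, synth, smoothing=0.01, threshold=0.02):
    """Gate `synth` by the amplitude envelope of `writing_sound`.
    Both arguments are float arrays of equal length at the same sample rate."""
    env = np.empty_like(writing_sound)
    level = 0.0
    for i, x in enumerate(np.abs(writing_sound)):
        level += smoothing * (x - level)   # one-pole envelope follower
        env[i] = level
    return synth * (env > threshold)       # hard gate; could scale by env instead
```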
7. Conclusions and Future Work

We have presented MusicGrip, a music control interface that uses as input the different pressures and angles associated with writing and drawing motions. It helps performers relate past experiences in writing and drawing to musical control and expression. The MIDI output can be mapped onto various instruments and effects, providing innumerable possibilities for the creation of music. In addition to the musical output, the performers create a written or visual artifact, which serves as an additional creative output or as notation of the composition. In sum, MusicGrip provides a new platform for collaborative musical experimentation and integrates two forms of art, painting and music, into a single realm of artistic expression. In addition to the modifications suggested in the last section (and perhaps making the system wireless), we plan to test MusicGrip extensively with professional musicians and with people lacking musical backgrounds, to evolve a corpus of appropriate mappings.

8. Acknowledgments

This work was supported by. Special thanks to the participants in MAS 963 and to Alexander Reben for the filming and photography during our performance.

References

[1] Paradiso, J.A. "Electronic Music Interfaces: New Ways to Play," IEEE Spectrum, Vol. 34, No. 12, Dec. 1997, pp. 18-30.
[2] Zbyszynski, M., Wright, M., et al. "Ten Years of Tablet Musical Interfaces at CNMAT," in Proc. of New Interfaces for Musical Expression (NIME), New York, 2007.
[3] Wright, M., Wessel, D., and Freed, A. "New Musical Control Structures from Standard Gestural Controllers," in Proc. of the International Computer Music Conference, pp. 387-390, Thessaloniki, Greece: ICMA, 1997.
[4] Arfib, D., Couturier, J-M., Filatriau, J-J., and Kessous, L. "What sounds can I do with a joystick and a tablet?" in Proc. of the COST287-ConGAS 2nd International Symposium on Gesture Interface for Multimedia Systems (GIMS'06), Leeds, UK, 2006.
[5] Anstice, J., Bell, T., Cockburn, A., and Setchell, M. "The Design of a Pen-Based Musical Input System," in Proc. of the Sixth Australian Conference on Computer-Human Interaction, pp. 260-267, Los Alamitos, California, 1996.
[6] Ng, E., Bell, T., and Cockburn, A. "Improvements to a Pen-Based Musical Input System," in Proc. of the 8th Australian Conference on Computer-Human Interaction, pp. 178-185, 1998.
[7] Farbood, M., Pasztor, E., and Jennings, K. "Hyperscore: A Graphical Sketchpad for Novice Composers," IEEE Computer Graphics and Applications, 24(1), January-March 2004, pp. 50-54.
[8] Paradiso, J.A. "The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance," Journal of New Music Research, 28(2), 1999, pp. 130-149.
[9] Merrill, D., Raffle, H., and Aimi, R. "The Sound of Touch: Physical Manipulation of Digital Sound," in Proc. of CHI 2008, pp. 739-742.