THE "CONDUCTOR'S JACKET": A DEVICE FOR RECORDING EXPRESSIVE MUSICAL GESTURES

THE "CONDUCTOR'S JACKET": A DEVICE FOR RECORDING EXPRESSIVE MUSICAL GESTURES Teresa Marrin and Rosalind Picard Affective Computing Research Group Media Laboratory Massachusetts Institute of Technology Cambridge, MA 02139 USA marrin, picard@media.mit.edu Abstract We present the design and architecture for a device called the Conductor s Jacket, which we built to collect and analyze data from conductors. This device is built into normal clothing and designed in such a way as to allow for normal activity and not encumber the movements of the wearer. The jacket records physiological and motion information from musicians in order to better understand how they express affective and interpretive information while performing. We present our design considerations and some preliminary results from data collection trials that we conducted on a range of professional and student conductors in real-world situations. 1 Introduction A common issue encountered in the development of new instruments is the question of how to map the input control stream (often in the form of transduced gestures) to the output audio stream. That is, how to generate complex musical responses that coherently reflect the intentional information contained in gestural inputs. Often, designers of interactive performance systems choose mappings that apply one continuous axis of the input control stream to a single continuous parameter at the output stream. This method often yields a clear and useful result, but has limitations. For example, if there are characteristics or patterns in the performance of the gesture that require interpretation, then more complex methods are required. We are interested in building a system that recognizes and responds appropriately to expressive features in musical gestures. The first phase of this project involves designing and building a reliable system to collect data from musicians in real performance situations. The second phase involves analyzing the data to learn features that are natural indicators of musical expression. Both of these phases are aimed at learning more about how musicians naturally modulate their gestures and physiology when they perform music, so that better mappings can be developed between the input gestures and the output audio stream. The final phase of this project will involve constructing such mappings for the system described below. We chose to focus on conductors because they use a highly specific code of gestures in order to indicate the expressive and interpretive elements in the musical structure. Additionally, conducting gestures often indicate more global phenomena than the individual note events and are not constrained by the requirements of performing on a mechanical instrument. The data-acquisition system described in this paper, which we call the "Conductor's Jacket", is a wearable array of sensors, communications, and computation, which detects and records physiological changes and physical motion without interfering with the way that the gesture is naturally performed. This approach builds upon experimental and analytical techniques developed for situations involving human affective communication [Healey and Picard 1997, Picard 1997]. We present the system architecture and design considerations below, along with preliminary results from use of this system by three professional conductors and four conducting students.

2 The "Conductor's Jacket" System The most important human factor in the design of our system was the need to provide a device that would not constrain, encumber, or cause discomfort to a conductor during standard rehearsals and performances. We felt strongly that we should gather data in a professional context, as opposed to a laboratory simulation, in order to generate useful and significant results. Because of this choice, we had to conform to the demands of a rehearsal situation and be sensitive to the conductor's agenda. The outfit would have to be easy to put on and take off, simple to plug in, allow for free movement of the upper body, and be robust enough to withstand the lively, energetic movements of an animated subject. Given these constraints, we decided to focus on signals that could be detected at the surface of the skin in a relatively simple way. Drawing upon previous work from the M.I.T. Media Lab's Affective Computing Research Group, we chose to focus on physiological sensors that give clues to the wearer's emotional state and that provide information about the projected gestures of the conductor. The final system contains one electromyography (EMG) sensor on each bicep and forearm extensor, together with sensors for respiration, temperature, skin conductance, and heart rate. In addition, we used sensors that measure 3D position and orientation on each wrist, elbow, shoulder, nape of the neck, and small of the back. Figure 1. Placement and integration of sensors into the jacket.

The "jacket" is a specially adapted shirt designed to accommodate and hold the different sensors in place. The design shown here was originally intended to be worn unobtrusively under normal clothing, although an alternate version was also made. The alternate version was functionally equivalent to the original, but deliberately designed to "show all the wires." Hence, the appearance of the system (such as a traditional tux with no wires showing, or a "wired look") is up to the wearer, and the style of the clothing is functionally irrelevant. Elastic armbands hold the EMG sensors against the skin, with strain relief sewn into the cloth. Special channels sewn onto the outside of the arms, shoulders, and back of the outfit act as conduits for the cables, which join together and lead to a small plug that is in turn cabled to a nearby computer. All wires and cables are looped to minimize stress on the connectors as well as on the wearer.

The EMG sensors provide a differential, 1000x amplified measurement and are made by Delsys Inc., a company that specializes in electromyography. The heart rate sensor is a Polar heart monitor, available in stores that sell athletic training equipment. The other physiology sensors are manufactured by Thought Technology Inc., which develops portable systems for medical monitoring. The motion sensors were part of an UltraTrak motion capture system, which was loaned to us by the Polhemus Corporation. All the sensors are noninvasive and are held in place by means of elastics and straps; no messy gels or adhesives are used.

When worn and plugged in, the current version of the Conductor's Jacket provides approximately twenty feet of maneuvering space around the computer. While a fully wireless system remains future work, the twenty-foot radius was sufficient for both orchestral rehearsals and a live professional performance. This distance also ensures that the limited mouse-clicking and typing required to run the data acquisition utilities takes place far enough away from the conductor so as not to be disturbing.

In addition to the wearable sensor web, the Conductor's Jacket includes a data acquisition system running on a PC. For this we chose an open architecture, so as to be able to integrate and swap between numerous sensors from different manufacturers. In the current version, the raw voltage outputs from the eight physiological signals are routed through an external terminal box. The signals are then sent to a board inside a PC, where they are sampled at 1 kHz with a 16-bit A/D converter.

Figure 2. Hardware Architecture for the Conductor's Jacket.
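As a rough illustration of this acquisition stage, the Python sketch below reads eight analog channels continuously at 1 kHz and appends the samples to a file. The original utilities were written in LabVIEW; this sketch uses the separate nidaqmx Python package only to show comparable settings, and the device name "Dev1" and output filename are assumptions.

    # Minimal acquisition sketch: eight analog input channels sampled
    # continuously at 1 kHz and logged to a CSV file. Illustrative only;
    # the device name "Dev1" and file name are assumptions.
    import csv
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SAMPLE_RATE_HZ = 1000
    BLOCK_SIZE = 1000                      # read one second of data per block

    with nidaqmx.Task() as task, open("session.csv", "w", newline="") as f:
        writer = csv.writer(f)
        # Eight physiological channels: four EMG, respiration, temperature,
        # skin conductance, and heart rate.
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
        task.timing.cfg_samp_clk_timing(SAMPLE_RATE_HZ,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        for _ in range(10):                # ten one-second blocks for the example
            block = task.read(number_of_samples_per_channel=BLOCK_SIZE)
            # block is a list of eight per-channel lists; write one row per sample
            for row in zip(*block):
                writer.writerow(row)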

Utilities were written in National Instruments' LabVIEW software package to process, filter, graph, index, and write the data to files. These allow the computer operator to quickly manipulate the graphical output of the data for debugging purposes, as well as to trigger the A/D converter and to name and write the data files in various locations. A digital video camera was used at every data collection session in order to capture a high-resolution, time-indexed record of the events in the rehearsal. The video is useful for gathering accurate annotations, for example, to confirm that a particular gesture was indeed cueing the winds rather than waving to an observer. The video is not part of the wearable conducting system, although there exist wearable video systems that record their wearer's gestures through a camera [Starner, Weaver, and Pentland, 1997]. We chose to keep the equipment worn by the conductor as natural in appearance as possible; consequently, we used a freestanding camera only for labeling data.

3 Conclusions and Future Work

The "Conductor's Jacket" is an ongoing project, the first phase of which has been completed: the design of an accurate, robust, comfortable, and unobtrusive wearable sensing system, capable of recording not only gestural motion information but also expressive aspects of musical gesture. The system has been used successfully in data collection events during classes, rehearsals, and live concert-hall performances of seven conductor subjects over a four-month period. These subjects spanned a range of abilities, from conservatory students to prominent professionals. On average, it took six minutes to suit up each conductor, which comfortably met the requirements of even busy performers' schedules. Difficulties were occasionally encountered in adjusting the sensor placement on different individuals (resulting in certain sensors losing contact or recording motion artifacts and noise), but the design, if properly positioned initially, was able to gather good data over lengthy rehearsal and performance sessions (up to three hours) without encumbering the wearer.

The large data sets from these events are being analyzed and will be applied to build user-dependent models for real-time recognition of expressive gestures. Features of the data that have been identified so far include significant variations between the physiological signatures of different individuals, as well as the expressive prominence of the bicep EMG and respiration signals. These results will be reported in a forthcoming publication.

Work is underway to make future versions of the "Conductor's Jacket" entirely wireless, either by means of wired sensors connecting to a wearable computer, or via wireless transmission at each sensor itself. We are also exploring other applications of this interface in the areas of affect recognition and stress evaluation. It should be noted that while the "Conductor's Jacket" is designed only to sense the motions of the torso and arms, it might be combined with another hand-held interface to augment the amount of input control. Also, while this project does not sense the articulations of the face or hands, there could be great value in adding features for detecting such gestures. The "Conductor's Jacket" is an ongoing project, involving not only the design of a physical system, but also the learning of natural, intuitive, and expressive mappings for musical gestures.
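To give a sense of the kind of feature extraction such analysis involves, the Python sketch below computes a short-window RMS envelope of a single EMG channel sampled at 1 kHz, a common first step before building recognition models. It is our own illustration and not the authors' analysis code; the window length and the synthetic test signal are assumptions.

    # Illustrative sketch, not the authors' analysis code: a windowed RMS
    # envelope of one EMG channel sampled at 1 kHz.
    import numpy as np

    def rms_envelope(emg, fs=1000, window_s=0.1):
        """Return the RMS of `emg` over consecutive windows of `window_s` seconds."""
        win = int(fs * window_s)
        n = len(emg) // win
        trimmed = np.asarray(emg[: n * win]).reshape(n, win)
        return np.sqrt(np.mean(trimmed ** 2, axis=1))

    # Example with synthetic data: one second of low-level noise with a burst
    # in the middle standing in for a muscle activation.
    rng = np.random.default_rng(0)
    emg = rng.normal(0.0, 0.05, 1000)
    emg[400:600] += rng.normal(0.0, 0.5, 200)
    print(rms_envelope(emg).round(3))   # the burst appears as larger RMS values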
The development of the system described here has enabled accurate collection of musical gesture information from professional conductors in real performance situations. It is our hope that this system will not only provide a rich source of information on the physiology and mechanics of conducting, but also will be of creative use in expanding the potential of new expressive performance systems.

4 Acknowledgments

This project was funded in part by the Things That Think Consortium at the MIT Media Lab. The authors would like to thank Professor Tod Machover for his advice and support, Gregory Harman and Jennifer Healey for their hardware expertise, and the students and staff in the Affective Computing and Hyperinstruments research groups. We would also like to thank the people of the Polhemus Corporation for their generous donation and support of an UltraTrak motion capture system.

5 References

Campbell, D. G. "Basal emotion patterns expressible in music." American Journal of Psychology, 1942, 55, 1-17.

Clynes, Manfred. Sentics: The Touch of the Emotions. New York: Doubleday and Company, 1977.

Fels, S. and Hinton, G. "Glove-TalkII: An Adaptive Gesture-to-Formant Interface." Proceedings of the Conference on Computer Human Interface '95 (Denver), ACM Press, 456-463.

Healey, Jennifer and Rosalind Picard. "Digital Processing of Affective Signals." Proceedings of ICASSP '98, Seattle, Washington, May 12-15, 1998.

Krumhansl, Carol L. "Psychophysiology of Musical Emotions." Proceedings of the International Computer Music Conference, 1997, pp. 3-6.

Kurtenbach, Gordon and Eric A. Hulteen. "Gestures in Human-Computer Communication." In Brenda Laurel, ed., The Art of Human-Computer Interface Design. Reading, MA: Addison-Wesley Publishing Company, 1990, 309-317.

Morita, Hideyuki, Shuji Hashimoto, and Sadamu Ohteru. "A Computer Music System that Follows a Human Conductor." IEEE Computer, 24(7), July 1991, 44-53.

Mulder, A., S. Fels, and K. Mase. "Empty-handed Gesture Analysis in Max/FTS." Proceedings of Kansei - The Technology of Emotion, AIMI International Workshop, Genova, Italy, October 1997.

Paradiso, Joseph. "Electronic Music: New Ways to Play." IEEE Spectrum, 34(12), December 1997, 18-30.

Parncutt, Richard. "Modeling piano performance: Physics and cognition of a virtual pianist." Proceedings of the International Computer Music Conference, 1997, pp. 15-18.

Picard, Rosalind. Affective Computing. Cambridge: MIT Press, 1997.

Starner, Thad, Joshua Weaver, and Alex Pentland. "A Wearable Computing Based American Sign Language Recognizer." International Symposium on Wearable Computers, 1997.