ESP: Expression Synthesis Project

1. Research Team
Project Leader: Prof. Elaine Chew, Industrial and Systems Engineering
Other Faculty: Prof. Alexandre R.J. François, Computer Science
Graduate Students: Jie Liu
Undergraduate Students: Aaron Yang

2. Statement of Project Goals
The Expression Synthesis Project (ESP) aims to create a driving interface that enables non-experts to create expressive musical performances. Anecdotal evidence among musicians suggests that generating an expressive performance is very much like driving a car: not everyone can play an instrument, but almost anyone can drive a car.

3. Project Role in Support of IMSC Strategic Plan
The research supports IMSC's work in sensory interfaces and user-centered sciences through the design of a driving (wheel and pedals) interface for controlling and rendering expressive musical performances. It is an interdisciplinary undertaking that combines human-computer interaction with the performing arts. In addition, it augments IMSC's current projects in modeling, analyzing, and generating facial expression.

4. Discussion of Methodology Used
In the first instantiation of the ESP interface, the pedals allow the user to control the tempo (speed). Scientific studies have revealed human preferences for tempo smoothness (smooth changes in velocity) [2], an attribute that the driving interface enforces. The display will show a landscape that maps directly to musical content (extracted through computational analysis) so as to guide the user toward informed expressive decisions (see Figure 1). The computational analysis tools will include the pitch spelling, chord recognition, and key tracking algorithms developed as part of the MuSA and MuSA.RT projects (see the Volume Two reports on Pitch Spelling Technology and MuSA.RT).
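To make the enforced tempo smoothness concrete, the sketch below shows one way pedal input could be low-pass filtered before it drives playback tempo, so that even an abrupt pedal movement produces a gradual tempo change. This is a hypothetical illustration rather than ESP's actual implementation; all names and constants (SmoothTempoController, smoothing, the BPM range) are assumptions.

    # Minimal sketch: smoothing pedal input into gradual tempo changes,
    # in the spirit of the tempo-smoothness preference reported in [2].
    # Names and constants are illustrative, not taken from ESP itself.
    class SmoothTempoController:
        def __init__(self, base_bpm=60.0, max_bpm=240.0, smoothing=0.1):
            self.base_bpm = base_bpm      # tempo with the pedal released
            self.max_bpm = max_bpm        # tempo at full pedal depression
            self.smoothing = smoothing    # in (0, 1]; smaller = smoother
            self.current_bpm = base_bpm

        def update(self, pedal_position):
            """pedal_position in [0, 1]; returns the smoothed tempo in BPM."""
            target = self.base_bpm + pedal_position * (self.max_bpm - self.base_bpm)
            # Exponential smoothing keeps tempo changes gradual even when
            # the pedal position jumps abruptly.
            self.current_bpm += self.smoothing * (target - self.current_bpm)
            return self.current_bpm

    controller = SmoothTempoController()
    for pedal in (0.0, 1.0, 1.0, 1.0):             # sudden full-pedal press
        print(round(controller.update(pedal), 1))  # 60.0, 78.0, 94.2, 108.8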

Figure 1: The Expression Synthesis Interface. Terrain curvature corresponds to tonal patterns; the steering wheel navigates the turns; the pedals control acceleration and deceleration.

There is evidence to suggest that the dynamics (loudness) in expressive performance are often linked directly to acceleration [3]. In the current instantiation of ESP, we have made loudness directly proportional to the acceleration parameter.

5. Short Description of Achievements in Previous Years
N/A

5a. Detail of Accomplishments During the Past Year
We implemented a prototype of ESP using the SAI architectural style developed at IMSC (see the Volume Two report on SAI) [4]. Figure 2 shows the application graph for the ESP system. This prototype uses a Logitech MOMO Racing Force steering wheel with a sequential stick shifter and realistic gas and brake pedals. The wheel has six programmable buttons, two paddle shifters, and 240 degrees of rotation. The current capabilities include acceleration/deceleration control via the pedals, and a visual display of the current position along the terrain, the speed, and the acceleration; a simplified sketch of this control loop follows.
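The sketch below illustrates how such a control loop might tie the pieces together: pedal input updates velocity, velocity is integrated into position along the terrain, and loudness is derived from acceleration as described in Section 4. This is a hypothetical reconstruction, not ESP's actual SAI-based implementation; all names and constants (PhysicsModel, MAX_ACCEL, the MIDI velocity mapping) are assumptions, and the affine loudness mapping is only one plausible reading of "directly proportional."

    # Illustrative sketch of the ESP control loop: pedals update velocity,
    # velocity is integrated into position, and loudness tracks acceleration.
    # Not the actual SAI implementation; names and constants are assumptions.
    class PhysicsModel:
        def __init__(self, dt=0.01):
            self.dt = dt              # simulation time step in seconds
            self.position = 0.0       # distance travelled along the road/score
            self.velocity = 0.0       # maps to tempo
            self.acceleration = 0.0   # maps to loudness

        def step(self, gas_pedal, brake_pedal):
            """Pedal positions in [0, 1]; returns (position, velocity, acceleration)."""
            MAX_ACCEL = 5.0           # illustrative limits
            MAX_DECEL = 8.0
            self.acceleration = gas_pedal * MAX_ACCEL - brake_pedal * MAX_DECEL
            self.velocity = max(0.0, self.velocity + self.acceleration * self.dt)
            self.position += self.velocity * self.dt   # position integration
            return self.position, self.velocity, self.acceleration

    def loudness_from_acceleration(acceleration, max_accel=5.0):
        """Map acceleration to a MIDI velocity around a nominal base loudness."""
        BASE_VELOCITY = 64            # nominal loudness when coasting
        MAX_MIDI_VELOCITY = 127
        scaled = BASE_VELOCITY + (acceleration / max_accel) * (MAX_MIDI_VELOCITY - BASE_VELOCITY)
        return int(min(MAX_MIDI_VELOCITY, max(1, scaled)))

    model = PhysicsModel()
    pos, vel, acc = model.step(gas_pedal=0.8, brake_pedal=0.0)
    print(pos, vel, loudness_from_acceleration(acc))

In a real-time setting, the position would index into the score to decide which MIDI events to emit at each step, roughly corresponding to the path from the physics model to the audio process in Figure 2.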

Figure 2: Application graph for the ESP system. The driving interface (control) feeds a physics model (velocity update and position integration); the physics model drives the visualization (renderer, buffer out, and display) and the audio process (MIDI event rendering and MIDI out).

6. Other Relevant Work Being Conducted and How this Project is Different
ESP is unique in its use of a driving interface for expression control. The cognitive overhead in learning to use such a device to generate expression is low, as most of us already know how to drive a car. The ESP driving interface allows the user to make expressive choices based on structural knowledge mapped to the road curvature. Creating interfaces for controlling musical expression is not a new endeavor; there is an entire conference devoted to such creations, the International Conference on New Interfaces for Musical Expression. Other groups involved in such endeavors include:

The MIT Media Lab. Expressive control projects originating from the Media Lab include Teresa Marrin's Digital Baton [6], a study on synthesizing expressive music through the language of conducting, and Gil Weinberg's squeezable embroidered balls [9,10], which use hand squeezing and stretching as control mechanisms.

Roberto Bresin of the Music Group at the Department of Speech, Music and Hearing at KTH, who has used neural networks to learn a professional pianist's expressive gestures [1]. Rules are used to induce variations in performance timing and dynamics with respect to a nominal performance [8].

The Austrian Research Institute for Artificial Intelligence, led by Gerhard Widmer. In particular, Werner Goebl has focused on computational methods to discover general principles of expressive performance [3].

The MMM group, led by Henkjan Honing in the Netherlands, which has numerous projects devoted to studying and generating expressive performance, including POCO, an environment for analyzing, modifying, and synthesizing expression [5].

The MultiMedia Laboratory at the University of Zurich and the Mathematical Music Theory Group at TU Berlin, through their Rubato software [7]. Rule-based and manual (piecewise) control of tempo and dynamics is part of the software's capabilities.

7. Plan for the Next Year
Implement a graphical interface that generates a road and terrain corresponding to musical structures. The structures will map directly to the road curvature and surface, and will serve as cues for better decision making in expressive control. Currently, the user controls the acceleration and deceleration; future versions of ESP will include an autopilot option. Other future plans include user studies to test the effectiveness of the driving interface in generating expressive performances.

8. Expected Milestones and Deliverables
Create the autopilot option and study the effectiveness of the driving interface.

9. Member Company Benefits
A device for controlling musical expression, with numerous potential applications in the gaming, animation, and movie industries.

10. References
[1] Bresin, R. (1999). An artificial neural network model for analysis and synthesis of pianists' performance styles. Journal of the Acoustical Society of America, 105(2), 1056.
[2] Cambouropoulos, E., Dixon, S. E., Goebl, W., & Widmer, G. (2001). Human preferences for tempo smoothness. In H. Lappalainen (Ed.), Proceedings of the VII International Symposium on Systematic and Comparative Musicology, III International Conference on Cognitive Musicology, August 16-19, 2001, Jyväskylä, Finland, pp. 18-26.
[3] Dixon, S., Goebl, W., & Widmer, G. (2002). The Performance Worm: Real time visualization of expression based on Langner's tempo-loudness animation. Proceedings of the International Computer Music Conference, pp. 361-364, Göteborg, Sweden, September 2002.
[4] François, A. (2004). A hybrid architectural style for distributed parallel processing of generic data streams. Proceedings of the International Conference on Software Engineering, Edinburgh, Scotland, UK, May 2004.

[5] Honing, H. (1990). POCO: An environment for analyzing, modifying, and generating expression in music. Proceedings of the International Computer Music Conference, pp. 364-368, San Francisco, 1990.
[6] Marrin Nakra, T. (2001). The Digital Baton: A versatile performance instrument. Journal of New Music Research, June 2001.
[7] Rubato software: http://www.ifi.unizh.ch/groups/mml/musicmedia/rubato/rubato.html
[8] Sundberg, J., Friberg, A., & Bresin, R. (2003). Attempts to reproduce a pianist's expressive timing with Director Musices performance rules. Journal of New Music Research, 32(3), 317-325.
[9] Weinberg, G., & Gan, S. (2001). The Squeezables: Toward an expressive and interdependent multi-player musical instrument. Computer Music Journal, 25(2), 37-45. MIT Press.
[10] Weinberg, G., Orth, M., & Russo, P. (2000). The Embroidered Musical Ball: A squeezable instrument for expressive performance. Proceedings of CHI 2000. The Hague: ACM Press.
