Augmenting Virtual Worlds with Musical Robotics

Jason Long
Victoria University of Wellington
Wellington, New Zealand
jason.long@ecs.victoria.ac.nz

Abstract

This paper introduces the concept of augmenting the experience of interacting with virtual worlds by means of musical robotics. By creating or interpreting real-time control data from the music and sound-effect channels of interactive software while it is being used, signals for controlling robotic musical instruments and other acoustic sound-creating devices can be generated. These acoustic instruments add a physicality to virtual environments by bringing previously virtual sound into the real world. A proof of concept is described, making use of a set of custom-built robotic pitched and non-pitched percussion instruments playing in conjunction with a well-known vintage video game. The system is presented as an installation of kinetic art and sound for participants to experience. The design of the robotic sound-objects, their control systems and the software used is described, and the paper concludes with an outline of future work and a summary of some of the many potential applications for this technology.

Keywords

Virtual Reality, Video Games, Music, Sound Effects, Musical Robotics, Solenoid, MIDI

Introduction

Musical robotics is an emerging, multi-disciplinary field which combines the arts with engineering and involves actuating physical sound-making apparatus under computer control as a means of musical expression. By utilizing robotic musical instruments and sound objects, musicians are able to create compositions and installations in real acoustic space that were previously impossible to realize with human performers or loudspeakers. Whereas sampled sounds are reproduced identically on each playback, acoustic instruments exhibit slight variations with every actuation, even when activated at the same position and velocity. This naturally lends a sense of realism and authenticity to the experience of listening to them. Another important motivation for using musical robots is that they can be synchronized reliably with other instruments in an ensemble or installation, as well as with other types of media.

In this paper, an installation is described which demonstrates the synchronization of custom-built robotic musical apparatus with video game worlds, providing players with a novel experience that sees in-game sounds and music leap from the virtual world into the physical world. First, a brief background provides historical context for this work, followed by a description of the installation itself. Details regarding the design and construction of both the hardware and software elements of the installation are then discussed, and the paper concludes with plans for future work and a summary of potential applications for this technology.

Background

The field of automatically actuated musical apparatus has a long and rich history spanning hundreds of years, reaching its period of greatest popularity in the 19th century with the prevalence of sophisticated orchestrions and player pianos. In the early 20th century, primarily due to the rise of the loudspeaker, automated musical instruments experienced a period of decline.
However, in the 1970s the affordability of transistors and computing technology led musical robotics pioneers such as Trimpin and Godfried Willem Raes to create robotic sound objects and musical instruments capable of being controlled in real time for installations and concerts. The advent of the MIDI standard in 1983 brought a new level of interconnectivity and synchronization to these instruments, and in the more than 30 years since, the field has expanded to include countless practitioners worldwide. [1] and [2] provide further detail on the history of the field.

There are several historic examples of acoustic sounds augmenting the game-play experience. Since the gambling games and pinball machines of the 19th and early 20th centuries were mechanical by nature, the sounds of their various components, such as motors spinning, clicking and sliding, added to the aural experience of the game. A landmark development was Pacific Development's Contact pinball machine of 1933, which introduced an electromechanically activated bell to indicate game states favorable to the player. [3] As video games began to use exclusively loudspeaker-generated audio in the 1970s, much of the mechanically produced sound of previous generations of hardware was replaced with virtual reproductions and synthesized sound.

Musical Robotics and Virtual Worlds

In recent times, there have been renewed developments towards using physically actuated sound to augment the experience of virtual worlds. One striking example is OccultUs by Simon de Diesbach. [4] This installation makes use of the Oculus Rift virtual reality headset and asks the participant to sit on a chair, wearing the headset, in a room surrounded by various mechanical devices. The software guides the participant on rails through a virtual world filled with several rooms of strange machinery which are activated by the user's gaze. The various kinetic sound objects in the real room, such as a machine that drops glass to break it, a machine that spins a tube of marbles and one that drags chains across a metal floor, are synchronized with the machines in the virtual world to create the perception that the virtual objects are producing real sound. Though the effect achieved is vivid, the level of interactivity in this installation is limited, with the participants somewhat passively experiencing the content rather than truly interacting with or creating it.

Another recent project that is closer in scope to the installation presented in this paper is a setup by David Thompson. [5] Using two Raspberry Pis, a commercially produced Yamaha Disklavier and several solenoids, Thompson is able to have the sound of several Nintendo games reproduced during game-play on an automatic piano and several percussion instruments. Due to the process used to translate the game-play audio into control instructions, this setup has the drawbacks of introducing a half-second delay between in-game actions and the resulting sounds, and of offering incomplete control over the sound mapping. The installation described in this paper seeks to create an experience that builds on these previous efforts and improves on their various limitations.

The Installation

The installation consists of a television screen, a Nintendo Entertainment System (NES) controller, a Robotic Xylophone, a Robotic Glockenspiel, Robotic Egg Shakers and Robotic Castanets. When a participant picks up the controller, they are greeted with the welcome screen of the famous Super Mario Bros. NES game and are able to play the game freely on the television screen. During game-play, the original synthesized audio of the game is muted, and the sound effects and music are replaced and reproduced entirely by the array of robotic musical instruments decorating the installation. Super Mario Bros. was selected for the instant recognizability of its theme music and sound effects, even for non-gamers. By choosing a game with very well-known sounds, the effect of substituting them with physical-world alternatives makes a greater impact on participants and aims to let them re-imagine experiences from their earlier years in a new light. The designs of the various components of the system are described below.

Robotic Xylophone

This robotic xylophone was built around a commercially produced Yamaha xylophone, and elements of its design were inspired by instruments such as Godfried Willem Raes' <Xy> automated quartertone xylophone, [6] Trimpin's Conloninpurple installation [7] and Eric Singer's XyloBot. [8] The striking mechanism is based on the Trimpin Hammer design described in [9]. The two sections of the 30-key instrument are independent of each other and can be placed separately in the installation in order to enhance their spatial effect. The frame is constructed from laser-cut plywood (a CAD drawing is presented in figure 2), and the sections are connected to a control box via standard DB-25 cables.

Figure 2: CAD design of the Robotic Xylophone
The Robotic Xylophone is the primary melodic instrument in the installation, and actuates the majority of the game's background music and many of the sound effects. It is fully velocity sensitive, able to perform loudly and softly as appropriate, and can achieve more than 20 strikes per second per note with a total system latency of approximately 20 milliseconds.

Figure 1: The Robotic Xylophone
Figure 3: The Robotic Glockenspiel

Robotic Glockenspiel

The Robotic Glockenspiel repurposes a marching band glockenspiel as a computer-controlled instrument, inspired by other robotic metallophones such as the Karmetik Glockenbot [10] and Godfried Willem Raes' <Vibi> automated vibraphone. [11] It utilizes direct-striking tubular solenoid mechanisms similar to those of <Vibi>, which allow very low latencies, simple implementation and low levels of extraneous acoustic noise. Positioning the striking mechanisms below the keys of the instrument leaves open the possibility of it also being played simultaneously by a human player, and gives participants and audience an unobstructed view of the instrument. After trialing solenoid shaft caps of several different materials, wooden ones were chosen for their bright but not overly harsh sound. The instrument is built as a one-piece table-top unit, making it very portable and simple to set up, with a single DB-25 cable connecting it to the control box. It is used in the installation to provide supplementary effects to the music and to create sound effects where a metallic timbre is appropriate. The most common example of such a sound in Super Mario Bros. is the two-toned chime that sounds when the main character picks up a coin.

Figure 4: CAD design of the Robotic Glockenspiel

Other Robotic Percussion

To complement the pair of polyphonic pitched percussion instruments, a set of individual un-pitched percussion instruments is also used. The set consists of two Robotic Egg Shakers and up to four Robotic Castanets.

Robotic Castanets

As shown on the right of figure 5, the Robotic Castanets use tubular push-type linear solenoids to strike the bottom half of each castanet against the top half, which is fixed to a laser-cut MDF frame. Unlike some other automated castanet machines, such as Godfried Willem Raes' <Casta Uno> and <Casta Due>, [12] these are individual devices fitted with standard microphone-stand nuts, which allows the Robotic Castanets to be positioned separately around the installation space for maximum spatial effect. These instruments play some of the percussive elements of the installation's background music, as well as some in-game sound effects. A CAD drawing of the frame of the instrument is shown in figure 6.

Figure 6: CAD design of a Robotic Castanet

Robotic Egg Shakers

As shown on the left of figure 5, the Robotic Egg Shakers use rotary solenoids which generate a side-to-side motion that is stopped at one end by the termination of the solenoid's travel and at the other by a rubber damper. The rotary solenoids have adjustable internal springs that return the egg shakers to their original position after the controlling voltage is removed. As these units are somewhat heavier than the Robotic Castanets, an aluminum mounting bracket was created for added strength; it too connects directly to standard microphone stands for flexible positioning in the performance space. These instruments also perform some of the percussion parts of the installation's background music.

Figure 5: Robotic Egg Shakers (left) and Castanet (right)

Control Hardware

To control this ensemble of robotic musical instruments, custom controller hardware was created.

Figure 7: Modular (left) and Polyphonic Percussion (right) Control Boxes

Modular Control Box

A flexible control box, shown on the left in figure 7, sends both the transient control signals for the Robotic Castanets and the toggling control signals for the Robotic Egg Shakers via the array of 3.5 mm phono jacks on its front panel. The unit contains a 48 V switched-mode power supply, and its logic firmware is based around an Atmel ATmega8U2 microcontroller. This chip receives MIDI messages from either the 5-pin DIN connector or the USB socket on the panel of the unit, interprets them, and outputs the control signals needed to actuate the appropriate robotic musical apparatus.

Robotic Xylophone and Glockenspiel Controller

The polyphonic percussion control box, shown on the right side of figure 7, is responsible for interpreting the MIDI messages received at its 5-pin DIN or USB ports and outputting the relevant pulses to the solenoids of the Robotic Xylophone and Glockenspiel by way of the three DB-25 ports on its front and side panels. In this case, an ATmega8U2 microcontroller loaded with a recompiled HIDUINO firmware [13] carries out the USB-MIDI processing, and the primary on-board microcontroller is an ATmega1280. This chip was chosen for its high number of general-purpose input/output registers, which are used to control the 51 solenoids contained in the two instruments. The polyphonic percussion control box is powered by two transformers, one configured to supply 42 V for the Robotic Xylophone and the other to supply 58 V for the Robotic Glockenspiel.
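Both control boxes share the same basic task: receive MIDI note-on messages and convert them into timed solenoid pulses, with velocity governing the strength of each strike. The installation's own firmware is not reproduced in this paper; the following is only a rough, hypothetical Arduino-style C++ sketch of that general idea, in which the pin assignments, note range and pulse lengths are invented for illustration and one common approach (scaling pulse length with velocity) stands in for whatever strategy the actual firmware uses.

// Illustrative sketch only -- not the installation's firmware.
// Parses 3-byte MIDI note-on messages arriving on the serial port and
// pulses one solenoid driver pin per note, scaling pulse length with
// velocity so that higher velocities produce harder strikes.

const uint8_t FIRST_NOTE = 60;            // hypothetical lowest playable note
const uint8_t NUM_NOTES  = 30;            // 30-key instrument
const uint8_t SOLENOID_PIN[NUM_NOTES] = { 22, 23, 24, 25 /* remaining pins hypothetical */ };

const unsigned long MIN_PULSE_US = 2000;  // hypothetical soft strike
const unsigned long MAX_PULSE_US = 9000;  // hypothetical hard strike

bool          striking[NUM_NOTES];        // true while a solenoid is energized
unsigned long strikeEnd[NUM_NOTES];       // micros() time at which to release

void setup() {
  Serial.begin(31250);                    // standard MIDI (DIN) baud rate
  for (uint8_t i = 0; i < NUM_NOTES; i++) {
    pinMode(SOLENOID_PIN[i], OUTPUT);
    digitalWrite(SOLENOID_PIN[i], LOW);
    striking[i] = false;
  }
}

void loop() {
  // Read MIDI: wait for a note-on status byte (0x90), then note and velocity.
  static uint8_t dataBytes[2];
  static int     count = -1;              // -1 = waiting for a status byte
  while (Serial.available()) {
    uint8_t b = Serial.read();
    if (b & 0x80) {                       // status byte
      count = (b == 0x90) ? 0 : -1;       // only note-on on channel 1
    } else if (count >= 0) {
      dataBytes[count++] = b;
      if (count == 2) {
        count = 0;                        // running status: more note/velocity pairs may follow
        uint8_t note = dataBytes[0], vel = dataBytes[1];
        if (vel > 0 && note >= FIRST_NOTE && note < FIRST_NOTE + NUM_NOTES) {
          uint8_t idx = note - FIRST_NOTE;
          unsigned long len = MIN_PULSE_US +
              (MAX_PULSE_US - MIN_PULSE_US) * (unsigned long)vel / 127;
          digitalWrite(SOLENOID_PIN[idx], HIGH);
          striking[idx]  = true;
          strikeEnd[idx] = micros() + len;
        }
      }
    }
  }
  // Release any solenoid whose pulse time has elapsed.
  for (uint8_t i = 0; i < NUM_NOTES; i++) {
    if (striking[i] && (long)(micros() - strikeEnd[i]) >= 0) {
      digitalWrite(SOLENOID_PIN[i], LOW);
      striking[i] = false;
    }
  }
}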

Software

Since the control boxes outlined above require MIDI input in order to operate, software had to be written to convert the musical messages generated by the Super Mario Bros. game into MIDI data in real time. The Nintendo Entertainment System synthesizes the audio in its games with an Audio Processing Unit (APU) housed inside its main CPU. The APU is capable of generating four channels of synthesized audio: two pulse waves, one triangle wave and one noise channel. A very rudimentary sampler is also included, but since the contents of the sample change from game to game, that channel is not used in this project. The four synthesizer channels each have several controllable parameters, such as pitch, note length, duty cycle (for the pulse waves) and a sweep control. These parameters are accessed by setting various registers in the APU's memory. [14]

For the proof of concept of this installation, an NES emulator capable of outputting its audio data in the form of MIDI messages was employed. GNes and YoshiNES are two pieces of emulator software that were successfully trialed, though since there is more than one way to interpret the NES APU control information as MIDI, each emulator provides a different stream of MIDI information.

In order to distribute the musical information among the notes of the robotic musical instruments in an appropriate manner, a piece of software was written in the Max/MSP visual programming language to receive the MIDI messages from the emulator, filter and translate them, and output them to the hardware control boxes. Examples of this translation include assigning specific frequencies of the noise channel to trigger the various robotic egg shakers and castanets, transposing melodic content of the triangle and pulse wave channels that falls outside the instruments' ranges into a range they can play, and mapping certain key sound-effect notes to corresponding instruments. The signal flow from the participant's controller to the triggering of the robotic musical instruments is presented in figure 8.

Figure 8: A diagram charting the flow of the musical control signals from the participant's controller to the musical robots.
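The routing just described is implemented as a Max/MSP patch, which cannot be reproduced as text here. Purely as a hedged illustration of the same kind of logic in ordinary code, the C++ sketch below folds out-of-range melodic notes into an assumed xylophone range by octaves and routes noise-channel events to the un-pitched percussion. The channel numbers, note ranges and the noise-pitch split point are all hypothetical values chosen for the example, not those used in the installation.

// Illustrative remapping logic only -- the installation itself uses a Max/MSP patch.
#include <cstdint>
#include <cstdio>

// Hypothetical MIDI channel assignments produced by the emulator.
const int CH_PULSE1 = 0, CH_PULSE2 = 1, CH_TRIANGLE = 2, CH_NOISE = 3;

// Hypothetical playable range of the robotic xylophone (30 keys).
const int XYLO_LOW = 60, XYLO_HIGH = 89;

// Hypothetical output channels expected by the hardware control boxes.
const int OUT_XYLO = 0, OUT_SHAKER = 9, OUT_CASTANET = 10;

struct NoteEvent { int channel; int note; int velocity; };

// Fold an out-of-range pitch into the xylophone's range by octaves.
int foldIntoRange(int note) {
  while (note < XYLO_LOW)  note += 12;
  while (note > XYLO_HIGH) note -= 12;
  return note;
}

// Translate one emulator note-on into an event for the robots,
// or return false if the event should be filtered out.
bool translate(const NoteEvent& in, NoteEvent& out) {
  if (in.velocity == 0) return false;            // ignore note-offs
  if (in.channel == CH_NOISE) {
    // Route "noise" pitches to un-pitched percussion: higher noise settings
    // to the shakers, lower ones to the castanets (assumed split point).
    out = { in.note >= 8 ? OUT_SHAKER : OUT_CASTANET, in.note, in.velocity };
    return true;
  }
  if (in.channel == CH_PULSE1 || in.channel == CH_PULSE2 || in.channel == CH_TRIANGLE) {
    out = { OUT_XYLO, foldIntoRange(in.note), in.velocity };
    return true;
  }
  return false;                                  // anything else is dropped
}

int main() {
  // Tiny demonstration with made-up events.
  NoteEvent demo[] = { {CH_PULSE1, 48, 100}, {CH_NOISE, 12, 90}, {CH_TRIANGLE, 95, 80} };
  for (const NoteEvent& e : demo) {
    NoteEvent out;
    if (translate(e, out))
      std::printf("out channel %d, note %d, velocity %d\n", out.channel, out.note, out.velocity);
  }
  return 0;
}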
Future Work

Though the proof of concept, which uses emulator software to interpret in-game sound messages, successfully fulfills the intended purpose of the installation, it relies on the emulator author's interpretation of the sound data stream, which sacrifices some flexibility. For this reason, the next iteration of the work will use the original NES console, with a microcontroller mounted inside the unit to intercept the data lines of the APU directly and generate a MIDI output from inside the enclosure. This will bring three improvements. Firstly, it will remove the dedicated PC running the MIDI translation software from the installation, making the system simpler, more compact and more robust. Secondly, it will allow participants to interact with the genuine hardware of the console, complete with the genuine controller, adding to the authenticity of the installation. Lastly, it will allow the maximum amount of routing and interpretation flexibility by accessing the control data stream from the original game cartridge directly.

Other welcome additions to the capabilities of the system include the ability to automatically detect which game is being played and switch to a corresponding configuration, and further intelligence in the sound-effect recognition algorithms to make sense of combinations of MIDI notes that indicate specific in-game effects and route performance information to the robots accordingly.

Another limitation of the current system is that, in order to maintain a coherent cause-and-effect relationship between the actions of the participant and the sound actuators, only musical robots with very low latency can currently be employed. This prevents robots that require preparation in order to play notes, such as Eric Singer's GuitarBot [15] and James McVay et al.'s MechBass, [16] from being included in the system. Future development could enable the use of these types of musical robots in the background music of games by automatically recognizing specific sequences of notes and synchronizing pre-programmed musical sequences with the in-game music.

Conclusions

This project has resulted in the successful construction of an art installation that breaks new ground in the area of augmenting virtual worlds with robotic sound-generation devices. It utilizes several custom-built, novel robotic musical instruments and implements a mapping framework that allows in-game sounds to be realized by corresponding physical devices with minimal latency. The fact that a user's in-game action is immediately answered by a sound in real life creates a cause-and-effect relationship between the two and aids the user's suspension of disbelief while interacting with the system. Rather than seeing the robotic musical instruments as responding to the game, participants report perceiving the sounds of the game as breaking through into the real world. In the case of this particular game, it also leads them to re-imagine nostalgic experiences from their earlier years in a new way. Though this paper has described the technical details of an installation focused on a specific game, the instruments can be re-purposed for other NES games with little modification to the translation software. There are also many further possibilities in other domains, including applications in arcade gaming, virtual reality, cinema, education and potentially many more areas.

References

1. Ajay Kapur, "A History of Robotic Musical Instruments" (paper presented at the International Computer Music Conference, Barcelona, Spain, September 2005).
2. Jim Murphy, Dale A. Carnegie and Ajay Kapur, "Musical Robotics in a Loudspeaker World: Developments in Alternative Approaches to Localization and Spatialization," Leonardo Music Journal 22 (2012): 41.
3. Michael Sweet, Writing Interactive Music for Video Games: A Composer's Guide (Boston: Addison-Wesley Professional, 2014).
4. Simon de Diesbach, "OccultUs ECAL/Simon de Diesbach (2014)," Vimeo, accessed December 12, 2014, http://vimeo.com/107016236
5. David Thompson, "Nintendo audio played by player piano and robotic percussion," YouTube, accessed December 12, 2014, https://www.youtube.com/watch?v=t1wlaj7ryki
6. Godfried Willem Raes, "Xy: An Automated Quartertone Xylophone," Logos Foundation, accessed December 12, 2014, http://www.logosfoundation.org/instrum_gwr/xy.html
7. Anne Focke, Trimpin: Contraptions for Art and Sound (Seattle: Marquand Books, 2011).
8. Eric Singer, "XyloBot, ArtBots, NYC (2007)," Vimeo, accessed December 12, 2014, http://vimeo.com/17439855
9. Ajay Kapur, Trimpin, Eric Singer, Afzal Suleman and George Tzanetakis, "A Comparison of Solenoid-based Strategies for Robotic Drumming" (paper presented at the International Computer Music Conference, Copenhagen, Denmark, September 2007).
10. Dimitri Diakopoulos, Michael Darling and Ajay Kapur, "Glockenbot," Karmetik, accessed December 12, 2014, http://www.karmetik.com/robot/glockenbot
11. Godfried Willem Raes, "Vibi: An Automated Vibraphone," Logos Foundation, accessed December 12, 2014, http://www.logosfoundation.org/instrum_gwr/vibi.html
12. Godfried Willem Raes, "Casta, Automated Sets of Castanets," Logos Foundation, accessed December 12, 2014, http://www.logosfoundation.org/instrum_gwr/casta.html
13. Dimitri Diakopoulos and Ajay Kapur, "HIDUINO: A Firmware for Building Driverless USB-MIDI Devices Using the Arduino Microcontroller" (paper presented at the New Interfaces for Musical Expression Conference, Oslo, Norway, June 2011).
14. Luddy Harrison, "The Nintendo Entertainment System: CS433 Processor Presentation Series," University of Illinois, accessed December 12, 2014, http://www.cs.illinois.edu/~luddy/processors/nintendo.pdf
15. Eric Singer, Kevin Larke and David Bianciardi, "LEMUR GuitarBot: MIDI Robotic String Instrument" (paper presented at the New Interfaces for Musical Expression Conference, Montreal, Canada, May 2003).
16. James McVay, Dale A. Carnegie, Jim W. Murphy and Ajay Kapur, "MechBass: A Systems Overview of a New Four-Stringed Robotic Bass Guitar" (paper presented at the Electronics New Zealand Conference, Dunedin, New Zealand, December 2012).

Author Biography

Jason Long is a composer and sound artist from Christchurch, New Zealand. He completed his undergraduate study at the University of Canterbury and at the Utrecht Higher School of the Arts in the Netherlands. He was subsequently awarded a Japanese Government scholarship to undertake a Master's degree at the Tokyo University of the Arts, where he designed and constructed an ensemble of robotic musical instruments. With a number of his pieces performed internationally at festivals such as ISCM, ACL and ICMC, and music released internationally on vinyl records, CDs and through digital distribution, Jason is currently pursuing a PhD at Victoria University of Wellington, conducting research in the fields of musical robotics and live electronic music.