42Percent Noir - Animation by Pianist

http://dx.doi.org/10.14236/ewic/hci2016.50

Shaltiel Eloul, University of Oxford, OX1 3LZ, UK, shaltiele@gmail.com
Gil Zissu, UK, www.42noir.com, gilzissu@gmail.com

© Eloul et al. Published by BCS Learning and Development Ltd. Proceedings of British HCI 2016 Conference Fusion, Bournemouth, UK.

42 PERCENT NOIR is a new live performance project consisting of two music and visual artists who have collaborated closely for many years. The project combines acoustic piano playing with digital sound and visual art. On stage, the two sides are merged to form a performance that explores the relationship between sound, vision, and live interaction in the modern world.

Keywords: 42 Noir, piano, digital and visual art

1. INTRODUCTION

The piano is one of the most emotional and gesture-based instruments, and hence has a very dominant character in an intimate performance. This dominance presents an interesting challenge when the instrument is combined with visualization. The challenge arises from the fact that visualization frequently contains automatic elements that convey a lack of liveness, so the audience may lose part of the visual experience and engage solely with the pianist. By analogy, in a fast-developing world, the benefits of new technologies bring new challenges in the collision between the digital and the real world. In our performance, we explore this relationship, with the piano as the real world and the visualization as the digital world. The music-visual pieces were composed to confront current discussions that are heavily affected by technology, such as globalization, immigration, and AI.

In this performance, we wish to present our original compositions, combining piano music, visuals, and sounds as one piece, in order to bridge the acoustic and digitised elements on stage. We use our home-made programs to create and process the soundscape and visualscape in real time. The piano notes (tone, velocity, and signal) are detected and transmitted to form a visualization that is continuously influenced by the piano playing. A program was coded for each tune for real-time processing and visualization, which allows an interesting interaction between the pianist and the visual player on stage. For example, in one piece (see link (i) in Section 5), the acoustic notes played by the pianist control a virtual character's mood and movement, resulting in interesting feedback between the pianist and the virtual character (Eloul and Zissu (2016)). Additionally, we create virtual music performance interfaces that rely on human gestures, for example a stochastic piano machine (see link (ii) in Section 5), with which the performer can create piano phrases in real time by controlling simple parameters such as randomness and note density, in an experimental but surprising way.

The performance can be a very engaging, multi-sensory experience. We believe that the technological revolution need not be a frightening prospect for traditional acoustic music. In fact, we wish to show that the right use of technology in live performance can increase performativity, giving the artist/performer more ways to express themselves and to reach a variety of audiences across a larger spectrum of music genres.
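As a rough illustration of the note-to-visual mapping described above, the following sketch (not from the paper) assumes the piano is captured as MIDI-like pitch/velocity events; the specific visual parameters (hue, height, size, decay) are invented for the example.

```python
# Hypothetical sketch: mapping piano note events to visual parameters.
# Assumes MIDI-like (pitch, velocity) events; the mappings are illustrative,
# not the authors' actual program.
from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int      # MIDI note number, 0-127
    velocity: int   # key velocity, 0-127

def note_to_visual(event: NoteEvent) -> dict:
    """Map a single note event to parameters a visual engine could consume."""
    return {
        "hue": (event.pitch % 12) / 12.0,             # pitch class -> colour
        "height": event.pitch / 127.0,                # register -> vertical position
        "size": 0.2 + 0.8 * event.velocity / 127,     # louder notes -> larger shapes
        "decay_s": 0.5 + 2.0 * event.velocity / 127,  # louder notes linger longer
    }

if __name__ == "__main__":
    for ev in (NoteEvent(60, 90), NoteEvent(72, 40)):
        print(note_to_visual(ev))
```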
2. BIOGRAPHY

42 PERCENT NOIR is a project consisting of two artists, Gil Zissu and Shaltiel Eloul. They met five years ago in an electronic-rock music group, recorded the album Colourful Cows, and performed on tour in 2013. In the group they developed a home-made live performance method, which was presented at NIME 2014 (Eloul et al. (2016)). Gil and Shaltiel moved to London and Oxford in 2013 to continue their studies and, in the meantime, created the performance Dag is a DJ together.

They performed at the Kinetica Art Fair, London, in October 2014. The project was reviewed in VICE, and a scientific article relating to the project has been published in the Leonardo Music Journal, MIT Press (Eloul et al. (2016)).

3. IMAGES

Figure 1: Introducing Philip (a duet of a human and a stochastic piano machine program we developed for live performance). We have created an interface program that generates piano melodies from a stochastic algorithm. The performer controls the program parameters in real time to create interesting piano melodies. We will perform one song with this program, along with visualization and electronic samples.

Figure 2: Introducing Philip (a duet of a human and the stochastic piano machine program, as in Figure 1).
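A minimal sketch of a stochastic phrase generator of the kind described above, driven by the two live parameters randomness and note density. It is an illustration only, not the authors' Philip program; the scale and timing scheme are assumptions.

```python
# Illustrative stochastic piano-phrase generator driven by two live
# parameters, "density" and "randomness" (not the authors' program).
import random

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # assumed scale, purely for the example

def generate_phrase(density, randomness, beats=8, seed=None):
    """Return a list of (time_in_beats, midi_pitch) pairs.

    density    -- probability that a given beat carries a note (0..1)
    randomness -- 0 walks stepwise through the scale, 1 jumps freely
    """
    rng = random.Random(seed)
    phrase, idx = [], 0
    for beat in range(beats):
        if rng.random() > density:
            continue  # leave this beat silent
        if rng.random() < randomness:
            idx = rng.randrange(len(C_MAJOR))  # free jump anywhere in the scale
        else:
            idx = max(0, min(len(C_MAJOR) - 1, idx + rng.choice([-1, 1])))  # step
        phrase.append((float(beat), C_MAJOR[idx]))
    return phrase

if __name__ == "__main__":
    print(generate_phrase(density=0.8, randomness=0.3, seed=1))
```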

Figure 3: Exhibition Trial II. On stage, the piano notes are translated into shapes, curves, and surfaces by an algorithm that finds interesting points in a pre-recorded film and evolves them in real time. The algorithm works for any film; therefore, we record a film on the way to the performance, capturing cultural and local places, to create intimacy and engagement with the local audience.
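One plausible way the "interesting points" step could be implemented, shown purely as an illustration: it uses OpenCV corner detection as a stand-in for the authors' (undisclosed) algorithm, and the file name and note-driven selection rule are hypothetical.

```python
# Hypothetical sketch: extract "interesting points" from a pre-recorded film
# and let a note event choose which points to draw. OpenCV corner detection
# stands in for the authors' algorithm; "film.mp4" and the selection rule
# are illustrative assumptions.
import cv2
import numpy as np

def interesting_points(frame, max_points=200):
    """Detect corner-like 'interesting points' in one video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_points,
                                  qualityLevel=0.01, minDistance=10)
    return np.empty((0, 2)) if pts is None else pts.reshape(-1, 2)

def points_for_note(points, pitch, velocity):
    """Let one note select a subset of points; louder notes reveal more of them."""
    if len(points) == 0:
        return points
    count = max(1, int(len(points) * velocity / 127))
    ordered = points[np.argsort(points[:, 1])]   # sort by vertical position
    start = (pitch % 12) * len(points) // 12     # pitch class picks a band
    idx = [(start + i) % len(points) for i in range(count)]
    return ordered[idx]

if __name__ == "__main__":
    cap = cv2.VideoCapture("film.mp4")  # hypothetical pre-recorded film
    ok, frame = cap.read()
    if ok:
        pts = interesting_points(frame)
        print(points_for_note(pts, pitch=64, velocity=90)[:5])
    cap.release()
```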

4. PERFORMANCE FORMATS

Our main performance is a recital (20 minutes), which is our priority; we are also flexible about combining it with a presentation (an extra 8 minutes) or a workshop.

Technical and logistical requirements: piano; sound system with stereo output. A floor plan of the performance is sketched below.

5. LIST OF DEMOS

5.1. Documentation of the performance

(i) S. Eloul, G. Zissu (2016) Animation By Pianist, 2016, 42Percent Noir, audiovisual live performance. Available from: https://youtu.be/tbr8mspj4hu
(ii) S. Eloul, G. Zissu (2016) Philip, 2015, 42Percent Noir, audiovisual live performance. Available from: https://youtu.be/ukaahvudiv0
(iii) S. Eloul, G. Zissu (2016) Exhibition Trial II, 2016, 42Percent Noir, audiovisual live performance. Available from: https://youtu.be/qdhr3hgjms4

5.2. Online Activity

(iv) S. Eloul, G. Zissu (2016) 42Percent Noir, official website. Available from: http://www.42noir.com

REFERENCES

Amo, Y., Zissu, G., Eloul, S., Shlomi, E., Schukin, D., and Kalifa, A. (2014) Managing Live Music Bands via Laptops using Max/MSP. NIME-14.
Eloul, S. and Zissu, G. (2016) Animation By Pianist. 42Percent Noir, audiovisual live performance. https://youtu.be/tbr8mspj4hu
Eloul, S., Zissu, G., Amo, Y., and Jacoby, N. (2016) Motion Tracking of a Fish as a Novel Way to Control Electronic Music Performance. Leonardo, 49(3), 203-210.