CymaSense: A Real-Time 3D Cymatics-Based Sound Visualisation Tool

John McGowan (J.McGowan@napier.ac.uk), Grégory Leplâtre (G.Leplatre@napier.ac.uk), Iain McGregor (I.McGregor@napier.ac.uk)

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. DIS'17 Companion, June 10-14, 2017, Edinburgh, United Kingdom. Copyright 2017 held by the owner/author(s). ACM ISBN 978-1-4503-4991-8/17/06. http://dx.doi.org/10.1145/3064857.3079159

Abstract
What does music look like? Representations of music have taken many forms over time, from musical notation [16] through to random algorithm-based visualisations driven by the amplitude of an audio signal [19]. One aspect of music visualisation that has not been widely explored is Cymatics. Cymatics are physical impressions of music created in mediums such as water. Current Cymatic visualisations are restricted to 2D imaging, whilst 3D visualisations of music are generally based on arbitrary mappings of audio-visual attributes. This paper presents the design of CymaSense, an interactive tool based on Cymatics.

Author Keywords
Autism Spectrum Condition (ASC); Assistive Technologies; Music Therapy; Interactive Audio-Visual; Cymatics.

ACM Classification Keywords
H.5.2. Information interfaces and presentation: Graphical User Interfaces (GUI); Prototyping; Screen Design; User-Centered Design; Visualization Theory, Concepts and Paradigms.

Introduction
Sound visualisation takes many forms, which can be organised along a continuum from pragmatic or functional to artistic [11]. On the functional side, volume unit (VU) meters (see Figure 1) exemplify representations which, though abstract, have become standard: a VU meter provides an informative visualisation of loudness. Most software music players include a visualiser that falls on the artistic side of the continuum: the visualisation is driven by the audio data, but no information about the sound can be inferred from the visuals.

Figure 1: VU meter.
Figure 2: Cymatic image of sound vibrated through water.
Figure 3: Chladni plate.

Sound visualisation techniques can also be categorized according to the semantic link between a sound and its visualisation. A recognised categorization is used in Auralisation, the discipline in which information is conveyed through sound. Auditory Icons [5] are linked semantically to what they represent: for example, in a user interface, the action of deleting a file can be represented by the sound of an object thrown into a trashcan. Earcons, on the other hand, involve arbitrary mappings between abstract sounds and what they represent [2]; their meaning must therefore be learnt. It is difficult to apply the abstract/concrete categorization used in Auralisation to sound visualisation, for practical reasons: common sound analysis techniques used for real-time sound visualisation do not provide any information about the source of the sound. For example, a bird sound can only be represented by the image of a bird if the sound can be identified as being a bird sound. The data available are generally limited to loudness, pitch and spectral data.

Natural Sound Visualisation

Audiovisual mappings
Devising a natural visual representation of audio signals involves finding perceptually meaningful audiovisual correlates. Research into audiovisual correspondences has identified reliable mappings, such as size to loudness, vertical position to pitch, visual brightness to pitch, visual repetitiveness to sound dissonance, and texture granularity to sound compactness [6][7][13].
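As a minimal sketch of how such correspondences might be combined, the function below maps a normalised loudness and a fundamental frequency onto illustrative visual parameters. All names and ranges here are assumptions for illustration, not values drawn from the cited studies:

```python
import math

def map_audio_to_visuals(loudness, pitch_hz):
    """Illustrative audiovisual mapping (assumed names and ranges).

    loudness: normalised amplitude in 0.0-1.0
    pitch_hz: fundamental frequency in Hz
    """
    # Size to loudness: louder sounds produce larger shapes.
    size = 0.2 + 0.8 * loudness
    # Normalise pitch over an assumed six-octave range (55 Hz to 3520 Hz).
    p = math.log2(max(pitch_hz, 55.0) / 55.0) / 6.0
    p = min(max(p, 0.0), 1.0)
    # Vertical position and visual brightness both rise with pitch.
    return {"size": size, "y": p, "brightness": p}
```

Any such sketch only covers the low-level correlates; which visual object the parameters drive is a separate design decision, as discussed next.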
What visual objects these parameters should be applied to, for the representation of sound, remains an open question. There are countless metaphors and systems, mostly using 2D graphics, with an increasing number of sound representations using 3D and virtual reality [1].

Cymatics
Cymatics are physical impressions of sound created in mediums such as water (Figure 2) or through particulate material on a brass plate [9]. They are the result of diffraction and refraction of sound waves created within the visualising medium [4]. Sound propagates in a spherical manner from its source, contrary to typical 2D representations of sound waves. Until now, Cymatics have been viewed as quasi-3D patterns on the surface of water or on the surface of a Chladni plate (Figure 3).

The appeal of Cymatics as a sound visualisation technique is two-fold. Firstly, they present undeniable aesthetic qualities (see Figure 2), which make them an interesting proposition for artistic visualisation applications. Functionally, Cymatics are unique in that they are concrete representations of a sound. This does not mean, however, that they are necessarily informative. On the one hand, Chladni patterns can make an effective representation of pitch (see the vibration patterns of a circular plate vibrating at 2434 Hz and 3986 Hz in Figure 4 and Figure 5 respectively). On the other hand, the pattern changes resulting from small spectral differences can be quite dramatic and surprising. These changes are, however, deterministic, and are used by some instrument makers to visualise the tonal qualities of their instruments.

Figure 4: Circular plate vibrating at 2434 Hz.
Figure 5: Circular plate vibrating at 3986 Hz.
Figure 6: CymaSense sample output.

The aesthetic qualities, the combination of predictability and surprise that Cymatic patterns exhibit, and the fact that they are physically accurate representations of sounds make them a fascinating sound visualisation paradigm.

CymaSense
A prototype was developed to experiment with and evaluate the potential of Cymatics-based sound visualisations. CymaSense is an interactive application that generates real-time 3D graphics inspired by Cymatics, intended to encourage musical exploration through visual feedback. Given the complexity of the equations that describe the propagation of a sound wave in a fluid in three dimensions, a simpler Cymatics-inspired approach was chosen: high-definition Cymascope images (see Figure 2) of individual frequencies were used as templates for the creation of twelve 3D Cymatic shapes, one for each of the semitones within a musical octave. Figure 6 shows an image generated by CymaSense. Some of the properties of the Cymascope reference images have been preserved (translucence, symmetry), and new features were added: the use of colour, and particles emitted by the Cymatic shapes (Figure 7).

Mapping
The mapping between audio and visual attributes was based on the validated associations presented earlier. The novelty of the visualisation paradigm also afforded experimentation with the less obvious aspects of the mapping.

Amplitude to scale of Cymatic shape and particle size - mapping amplitude to scale is commonly referred to in the literature [12][18].

Pitch to Cymatic shape - the 3D shapes created were inspired by Cymascope reference images (see Figure 7), and this mapping is consistent with observed Cymatic shape behaviours.
Pitch to colour lightness - the lightness of colour is determined by the relative MIDI note or audio frequency, implementing the pitch-to-lightness relationship referred to in the synaesthesia and crossmodality literature [14][22]. The higher the pitch, the lighter the visual component. This also helps differentiate the same Cymatic shape over several octaves.

Sound brightness to Cymatic shape surface quality - colour is commonly associated with timbre in audiovisual mappings, but colour was used for other purposes in the application. We therefore decided to experiment with surface qualities as a means of representing the spectral qualities of a sound. The Cymatic shapes generated for a given frequency were modified using a 3D morphing technique as follows: the shapes of bright sounds were made to appear sharper, while those of dark sounds were made to appear more rounded.

Implementation
The application was implemented in Cycling '74 Max [3] for real-time audio and MIDI processing, and Unity [20] for real-time 3D graphics generation. The Open Sound Control (OSC) protocol provides the data communication channel between the two environments. The implemented CymaSense prototype therefore comprises: (1) an interface to control audio input for single- or multi-user interaction (Figure 8); and (2) a separate output screen (Figure 6).

Figure 7: CymaSense shapes.
Figure 8: CymaSense user interface.
Figure 9: Audio input analysis and visual output.

The sound analysis process is represented in Figure 9. If a MIDI input is chosen, both MIDI data and audio data are analysed: MIDI data can include note number, velocity, note on/off and bend. Audio data is analysed for its fundamental frequency, its partial (or harmonic) frequencies, and the amplitude of the signal. Audio spectral analysis is carried out using the iana~ object, which is part of IRCAM's Max Sound Box [8]. The processed audio data is sent via the OSC protocol [15] to Unity, where it is analysed and triggers the appropriate visual output (Figure 6).

In addition to the implementation of the mapping presented above, the tool includes several options that allow the user to customise the visual output: for example, the default Cymatic shape and particle colours can be modified by the user. A mode in which random rotations of the shapes are enabled is also available; this is consistent with observations made of real-world Cymatic shapes created in water.

Conclusions and Future Work

Possible improvements
One of the challenges of the implementation is to keep latency to a minimum and maintain a high frame rate. This places limitations on the computation that can be performed per frame. However, being able to generate Cymatic shapes in real time from physically correct equations, rather than from predefined shapes, would make conceptual and aesthetic sense. The mapping between the audio and visual parameters of the system may also be improved. The mapping presented in this paper is currently being evaluated.

Applications
Potential outlets for CymaSense include the use of the tool for musical creativity within commercial environments, from projected audio-visual art installations through to virtual or augmented reality applications.
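The per-note pipeline described above (frequency analysis, selection of one of the twelve semitone shapes, pitch-to-lightness, and transmission to Unity over OSC) can be sketched roughly as follows. This is an illustrative sketch only: the function names, A440 reference tuning, 0-1 lightness range and OSC address are assumptions, not CymaSense's actual code. The hand-rolled encoder follows the OSC 1.0 binary format (null-terminated strings padded to 4-byte boundaries, big-endian numbers).

```python
import math
import struct

A4_HZ = 440.0   # assumed reference tuning
A4_MIDI = 69

def frequency_to_shape(freq_hz):
    """Map a fundamental frequency to one of the twelve semitone
    Cymatic shapes and a colour lightness (higher pitch = lighter)."""
    midi = round(A4_MIDI + 12 * math.log2(freq_hz / A4_HZ))
    shape_index = midi % 12                      # which of the 12 shapes
    lightness = min(max(midi, 0), 127) / 127.0   # clamp to MIDI range
    return shape_index, lightness

def osc_message(address, *args):
    """Encode a minimal OSC 1.0 message with int and float arguments."""
    def pad(b):
        # Strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "".join("i" if isinstance(a, int) else "f" for a in args)
    msg = pad(address.encode()) + pad(tags.encode())
    for a in args:
        msg += struct.pack(">i" if isinstance(a, int) else ">f", a)
    return msg

# Example: an A4 note, sent to a hypothetical Unity-side address.
shape, lightness = frequency_to_shape(440.0)
packet = osc_message("/cymasense/note", shape, lightness)
```

In a running system the packet would be handed to a UDP socket; in the prototype the sending side is Max and the receiving side is Unity, so the Python above stands in purely for illustration.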
Additionally, CymaSense could be used as a therapeutic tool. The use of multi-sensory environments to improve experiences for sensory-impaired users, including those with Autism Spectrum Condition (ASC), has previously been identified [17]. ASC is a lifelong neurodevelopmental condition in which people share some of the following features in their diagnosis: difficulties in social communication and interaction, and problems in the use of language and verbal communication. Music therapy is considered an effective approach for addressing language and communication skills for children with ASC, and provides a non-verbal means of communication [10]. Music therapists use technology within their practice to achieve a greater sense of agency and to stimulate the senses of autistic clients. Previous work has demonstrated that the use of shared interfaces in a therapeutic setting can enhance communication and social interaction for autistic clients [21]. CymaSense is currently being evaluated as a means of augmenting music therapy for people with ASC in an 8-week study.

References
1. Florent Berthaut, Myriam Desainte-Catherine, and Martin Hachet. 2011. Interacting with 3D Reactive Widgets for Musical Performance. Journal of New Music Research 40, 3: 253–263.

2. Meera M. Blattner, Denise A. Sumikawa, and Robert M. Greenberg. 1989. Earcons and Icons: Their Structure and Common Design Principles. Human-Computer Interaction 4, 1: 11–44.
3. Cycling '74. 2017. Max. Retrieved February 14, 2016 from https://cycling74.com/products/max/.
4. Cymascope.com. 2015. Physics :: Cymascope Research. Retrieved December 30, 2015 from http://cymascope.com/cyma_research/physics.html.
5. William W. Gaver. 1986. Auditory Icons: Using Sound in Computer Interfaces. Human-Computer Interaction 2, 2: 167–177.
6. Kostas Giannakis. 2006. A comparative evaluation of auditory-visual mappings for sound visualisation. Organised Sound 11, 3: 297.
7. Kostas Giannakis and Matt Smith. 2001. Imaging Soundscapes: Identifying Cognitive Associations between Auditory and Visual Dimensions. Musical Imagery: 51–69.
8. Ircam. 2017. IRCAM Forumnet Max Sound Box. Retrieved March 20, 2017 from http://forumnet.ircam.fr/product/max-sound-box-en/.
9. Hans Jenny. 1968. Cymatics. The Sciences 8, 7: 12–18.
10. Ronna S. Kaplan and Anita Louise Steele. 2005. An analysis of music therapy program goals and outcomes for clients with diagnoses on the autism spectrum. Journal of Music Therapy 42, 1: 2–19.
11. Robert Kosara. 2007. Visualization criticism - the missing link between information visualization and art. Proceedings of the International Conference on Information Visualisation, 631–636.
12. Mats B. Küssner. 2014. Shape, Drawing and Gesture: Cross-Modal Mappings of Sound and Music.
13. Scott D. Lipscomb and Eugene M. Kim. 2004. Perceived Match Between Visual Parameters and Auditory Correlates: an Experimental Multimedia Investigation. Proceedings of the 8th International Conference on Music Perception & Cognition: 72–75.
14. Lawrence E. Marks. 1975. On Colored-Hearing Synesthesia: Cross-Modal Translations of Sensory Dimensions. Psychological Bulletin 82, 3.
15. opensoundcontrol.org. 2011. Introduction to OSC. Retrieved March 17, 2017 from http://opensoundcontrol.org/introduction-osc.
16. Richard Rastall. 1983. The Notation of Western Music: An Introduction. Travis and Emery Music Bookshop, London.
17. Kathryn E. Ringland, Rodrigo Zalapa, Megan Neal, Lizbeth Escobedo, Monica Tentori, and Gillian R. Hayes. 2014. SensoryPaint: A Multimodal Sensory Intervention for Children with Neurodevelopmental Disorders. Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing - UbiComp '14 Adjunct: 873–884.
18. S.M. Smith and G.N. Williams. 1997. A visualization of music. Proceedings of Visualization '97 (Cat. No. 97CB36155), IEEE, 499–503.
19. SoundSpectrum Inc. 2017. WhiteCap Screenshots - highly reactive visuals for your tunes. Retrieved December 2, 2015 from http://www.soundspectrum.com/whitecap/screenshots.html#.
20. Unity Technologies. 2017. Unity - Game Engine. Retrieved February 14, 2016 from https://unity3d.com/.
21. Lilia Villafuerte, Milena Markova, and Sergi Jorda. 2012. Acquisition of social abilities through musical tangible user interface. Proceedings of the 2012 ACM annual conference extended abstracts on Human Factors in Computing Systems Extended Abstracts - CHI EA '12, 745.
22. Mitchell Whitelaw. 2008. Synesthesia and Cross-Modality in Contemporary Audiovisuals. The Senses & Society 3, 3: 259–276.