
Sound visualization through a swarm of fireflies

Ana Rodrigues, Penousal Machado, Pedro Martins, and Amílcar Cardoso
CISUC, Department of Informatics Engineering, University of Coimbra, Coimbra, Portugal
{anatr,machado,pjmm,amilcar}@dei.uc.pt

Abstract. An environment to visually express sound is proposed. It is based on a multi-agent system of swarms and inspired by the visual nature of fireflies. Sound beats are represented by light sources, which attract the virtual fireflies. When fireflies are close to a light they gain energy and, as such, their bioluminescence is emphasized. Although real-world fireflies do not behave as a swarm, our virtual ones follow a typical swarm behavior. This departure from biological plausibility is justified by aesthetic reasons: the desire to promote fluid visualizations and the need to convey the perturbations caused by sound events. The analysis of the experimental results highlights how the system reacts to a variety of sounds, or sequences of events, producing a visual outcome with distinct animations and artifacts for different musical pieces and genres.

Keywords: Swarm Intelligence; Computer Art; Multi-Agent Systems; Sound Visualization

1 Introduction

Although sound visualization has been an object of study for a long time, the emergence of computers with graphic capabilities allowed the creation of new paradigms and creative processes in the area. Most of the initial experiments were done through analog processes. Since the advent of computer science, art has taken a significant interest in the use of computers for the generation of automated images.

In section 2 we present some of the main inspirations for our work, including sound visualization, generative artworks, computer art and multi-agent systems. Our research question concerns the possibility of developing a multi-agent model for sound visualization. We explore the intersection between computer art and nature-inspired multi-agent systems. In the context of this work, swarm simulations are particularly interesting because they allow the expression of a large variety of behaviors and tend to be intuitive and natural forms of interaction.

In section 3 we present the developed project, which is based on a multi-agent system of swarms and inspired by the visual nature of fireflies. In the scope of our work, visualization of music is understood as the mapping of a specific musical composition or sound into a visual language.

Our environment contains sources of light representing sound beats, which attract the fireflies. The closer a firefly is to a light, the more emphasized its bioluminescence is and the higher its chance of collecting energy (life). Using Reynolds' boids algorithm [?], fireflies interact with the surrounding environment by means of sensors, which they use to find and react to energy sources as well as to other fireflies.

In section 4 we present an analysis of the experimental results, describing the system's behavior in response to four different songs. Lastly, in section 5 we present our conclusions and further work to be done.

2 Related Work

Ernst Chladni studied thoroughly the relation between sound and image. One of his best-known achievements was the work that gave rise to cymatics: it geometrically showed the various modes of vibration of a rigid surface [?]. In the 1940s, Oskar Fischinger made cinematographic works exploring the images of sound by means of traditional animation [?]; his series of 16 studies was his major success [?]. Another geometric approach was made by Larry Cuba in 1978, this time with digital tools: 3/78 consisted of 16 objects performing a series of precisely choreographed rhythmic transformations [?].

Complex and self-organized systems have a great appeal for artistic practice since they can continuously change, adapt and evolve. Over the years, computer artifacts promoting emergent system behaviors have been explored [?] [?]. Artists became fascinated with the possibility of an unpredictable but satisfying outcome. Examples include the work of Ben F. Laposky, Frieder Nake and Manfred Mohr, among many others [?].

3 The Environment

Fig. 1. System behavior and appearance example. Best viewed in color.

In this section we present a swarm-based system of fireflies and all of its interactions. In this environment, fireflies are fed by the energy of sound beats (rhythmic onsets). While responding to the surrounding elements of the environment, they search for these energies (see Fig. 1).

The colors were chosen according to the real nature of fireflies. Since they are visible at night, we opted for a dark blue background and a brighter blue for the sound beats; for the bioluminescence we used yellow. The environment rules and behaviors, as well as the visualization, were implemented in Processing. The mechanism for extracting typical audio information was built with the aid of the Minim library, mainly because it contains a function for sound beat detection.

3.1 Sound (energy sources)

Sound Analysis. To visualize sound, a preliminary analysis is necessary. A sound is characterized by three main parameters: frequency, amplitude and duration. Frequency determines the pitch of the sound, amplitude determines how loud the sound is, and duration can define the rhythm of the music as well as the instants at which sound beats happen. We perform the sound analysis prior to the visualization in order to promote a fluid animation and convey the perturbations caused by sound events. We compute the main sound characteristics (pitch, volume, sound beats) and export them to a text file. Sound beats are detected note onsets; they are related to the temporal/horizontal position of a sound event. Although the mechanism used to extract audio information is not novel and remains simple, we think this approach is adequate to the goals of our system. It fits the amount of expressiveness that we intend to represent in our visualization, as visual simplicity characterizes the fireflies' natural environment.

Fig. 2. Graphical representation of sound objects: a - sound beat instants, b - amplitude, c - frequency, d - collision.

Sound's Graphic Representation. After the sound analysis, all the properties of sound are mapped into graphical representations. Sound beats are mapped into instants (t1, t2, t3, ...) which define the objects' horizontal positions, as shown in Fig. 2a. Each sound object has a pre-defined duration, meaning that it is removed from the environment at the end of its duration. Amplitude is translated into the object's size, i.e., the size is directly proportional to the amplitude (Fig. 2b). Lastly, frequency is mapped into the object's vertical position in the environment (Fig. 2c): high frequencies (HIF) are positioned at the top of the screen and low frequencies (LOF) emerge in lower positions of the vertical axis. A fourth characteristic present in the graphical representation of sound objects is collision (Fig. 2d). This last one is not directly related to sound, only to the sound objects' physics: when an object collides with another, an opposing force is applied between the two, separating them from each other.
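The paper does not include source code, so the following Processing sketch is only an illustrative reconstruction of this offline analysis step. It assumes Minim's sound-energy BeatDetect for onsets, the buffer's RMS level as the volume measure, and the spectral centroid as a rough pitch proxy; the file names track.mp3 and analysis.txt, and the exact choice of descriptors, are our assumptions rather than details taken from the paper.

import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioPlayer song;
BeatDetect beat;
FFT fft;
PrintWriter out;

void setup() {
  size(200, 200);
  minim = new Minim(this);
  song  = minim.loadFile("track.mp3", 1024);   // hypothetical input file
  beat  = new BeatDetect();                    // sound-energy onset detection
  fft   = new FFT(song.bufferSize(), song.sampleRate());
  out   = createWriter("analysis.txt");        // hypothetical output file
  song.play();
}

void draw() {
  beat.detect(song.mix);
  if (beat.isOnset()) {
    // volume: RMS level of the current buffer
    float volume = song.mix.level();
    // rough pitch proxy: spectral centroid of the current buffer (our assumption)
    fft.forward(song.mix);
    float num = 0, den = 0;
    for (int i = 0; i < fft.specSize(); i++) {
      num += fft.indexToFreq(i) * fft.getBand(i);
      den += fft.getBand(i);
    }
    float centroid = (den > 0) ? num / den : 0;
    // one line per detected beat: onset time (ms), volume, frequency estimate
    out.println(song.position() + " " + volume + " " + centroid);
  }
  if (!song.isPlaying()) {   // track finished: flush the file and quit
    out.flush();
    out.close();
    exit();
  }
}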

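Likewise, the mapping of the exported analysis data into sound objects could be sketched as below. The class is hypothetical: the four-second lifespan, the size range and the colors are placeholders, since the paper only states that the duration is pre-defined, that size is proportional to amplitude, and that high frequencies appear near the top of the screen.

// One light object per detected beat: instant -> x, amplitude -> size, frequency -> y.
class SoundObject {
  PVector pos;
  float   radius;
  int     bornAt;                        // millis() when the object appeared
  int     lifespan = 4000;               // assumed pre-defined duration (ms)

  SoundObject(float t, float amplitude, float freq,
              float trackLength, float maxAmp, float maxFreq) {
    float x = map(t, 0, trackLength, 0, width);     // instant -> horizontal position
    float y = map(freq, 0, maxFreq, height, 0);     // low freq -> bottom, high freq -> top
    pos     = new PVector(x, y);
    radius  = map(amplitude, 0, maxAmp, 4, 40);     // louder -> bigger (assumed range)
    bornAt  = millis();
  }

  boolean expired() {
    return millis() - bornAt > lifespan;            // removed at the end of its duration
  }

  // collision: push two overlapping objects apart with a small opposing displacement
  void separateFrom(SoundObject other) {
    PVector away = PVector.sub(pos, other.pos);
    if (away.mag() > 0 && away.mag() < radius + other.radius) {
      away.setMag(1.0);
      pos.add(away);
      other.pos.sub(away);
    }
  }

  void display() {
    noStroke();
    fill(90, 160, 255);                             // brighter blue for sound beats
    ellipse(pos.x, pos.y, radius * 2, radius * 2);
  }
}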
3.2 Agents (fireflies)

Agent Behavior. Because the sound beats are presented from left to right, fireflies are initially born on the left side of the screen, vertically centered. Agents are provided with a specific vision of the surrounding environment. A vision angle of 30° and a depth of 150 pixels were considered optimal values (Fig. 3), because with them the agents retain a high degree of independence while resembling their natural behavior. The agents' motion is based on the boids algorithm: they wander randomly until they find something that may affect their behavior, such as a source of light or other agents.

Fig. 3. Agent field of view: angle (A) and depth (D).

The closer agents are to a source of light, the more attracted they are to it, meaning that there is a force of attraction towards it. Along with that, agents have a swarming behavior: neighboring agents can see each other and follow one another through flocking behavior rules [?]. These rules were presented by Reynolds with a computational model of swarms exhibiting natural flocking behavior; he demonstrated how a particular computer simulation of boids could produce complex phenomena from simple mechanisms. The rules define how each creature behaves in relation to its neighbors: separation, alignment or cohesion [?].

Fig. 4. Left image: separation. Right image: cohesion.

The swarming behaviors present in this system are separation and cohesion (Fig. 4). Separation gives the agents the ability to maintain a certain distance from others nearby, in order to prevent agents from crowding together. Cohesion gives agents the ability to approach and form a group with other nearby agents [?]. No alignment force was applied: alignment is usually associated with flocking behavior, as in birds and fish, whereas swarm behavior like the one found in bees, flies and our fireflies does not imply alignment.
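As a rough sketch of the field-of-view test implied above, and assuming that the 30° value is the full opening of the vision cone (Fig. 3 does not make this explicit), a Processing helper could look like this; the function and parameter names are ours.

float VISION_ANGLE = radians(30);   // assumed full opening of the vision cone
float VISION_DEPTH = 150;           // pixels

// true if 'target' lies inside the agent's vision cone
boolean canSee(PVector position, PVector velocity, PVector target) {
  PVector toTarget = PVector.sub(target, position);
  if (toTarget.mag() > VISION_DEPTH) return false;         // too far away
  if (toTarget.mag() == 0) return true;                    // already on top of it
  float offHeading = PVector.angleBetween(velocity, toTarget);
  return offHeading < VISION_ANGLE / 2;                    // within half the opening
}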

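The separation, cohesion and light-attraction forces could then be combined roughly as below. The neighbour and light lists are assumed to have already been filtered with the canSee test above, and the weights are our guesses, not values reported by the authors.

// Boids-style steering with separation and cohesion (no alignment) plus attraction to lights.
PVector steering(PVector pos, PVector vel,
                 ArrayList<PVector> neighbours, ArrayList<PVector> lights) {
  PVector separation = new PVector();
  PVector cohesion   = new PVector();
  PVector attraction = new PVector();

  for (PVector n : neighbours) {
    PVector away = PVector.sub(pos, n);
    float d = away.mag();
    if (d > 0) separation.add(away.div(d * d));  // push harder when very close
    cohesion.add(n);
  }
  if (neighbours.size() > 0) {
    cohesion.div(neighbours.size());             // centre of visible neighbours
    cohesion.sub(pos);                           // steer towards that centre
  }
  for (PVector l : lights) {
    PVector toLight = PVector.sub(l, pos);
    float d = toLight.mag();
    if (d > 0) attraction.add(toLight.div(d * d)); // closer lights pull more strongly
  }

  PVector force = new PVector();
  force.add(PVector.mult(separation, 1.5));      // assumed weights
  force.add(PVector.mult(cohesion, 1.0));
  force.add(PVector.mult(attraction, 2.0));
  return force;
}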
Additionally, the life and death of each agent is determined by the way it interacts with the environment. The agent begins with an initial lifespan and loses part of its energy at each cycle. If the agent gets close to an energy source, it gains more energy and a longer lifespan; otherwise, it keeps losing energy until it dies. There are no mechanisms for the rebirth of agents, as we intend to keep the visualization and the interactions among agents clear and understandable.

Agent's Graphic Representation. Fireflies use bioluminescence to communicate and attract other fireflies. As an agent gradually approaches the light emitted by a sound object within its field of view, the more excited it gets and the more its bioluminescence is emphasized (Fig. 5, left image). This temporarily influences the agent's size, which becomes intermittent; the agent's actual size corresponds to the energy it holds at a given instant (Fig. 5, right image). When an agent dies, it disappears from the environment.

Fig. 5. Left image: agent approximation to an object (AG OB). Right image: agent growth (E).
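Since the paper states these rules but not the numbers, the following Firefly sketch illustrates the energy bookkeeping with assumed constants; the initial energy, decay per cycle, gain curve, size range and flicker period are placeholders.

class Firefly {
  PVector pos = new PVector(0, height / 2);   // born on the left, vertically centred
  PVector vel = PVector.random2D();
  float energy = 100;                          // assumed initial lifespan

  void updateEnergy(ArrayList<PVector> visibleLights) {
    energy -= 0.5;                             // lose a little energy every cycle
    for (PVector light : visibleLights) {
      float d = PVector.dist(pos, light);
      // the closer the firefly is to a light, the more energy it collects
      energy += map(d, 0, 150, 1.0, 0.1);      // 150 px = assumed vision depth
    }
  }

  boolean dead() { return energy <= 0; }       // dead agents are removed; no rebirth

  void display(boolean excited) {
    // bioluminescence: size follows current energy; flicker while excited by a light
    float s = map(energy, 0, 100, 2, 14);
    if (excited && frameCount % 4 < 2) s *= 1.5;
    noStroke();
    fill(255, 230, 80);                        // yellow glow
    ellipse(pos.x, pos.y, s, s);
  }
}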

4 Results and Discussion

Fig. 6. The music that generated this response is characterized by a variety of intensities and a high density of beats.

This section presents an analysis of the system's behavior in response to four different songs or melodic sequences (tracks 1 to 4). These tracks vary in rhythm, intensity and frequency, allowing us to illustrate and highlight how the system reacts to different sound stimuli. Unfortunately, conveying the overall feel of an animation in a paper has its difficulties (a demonstration video can be found at http://tinyurl.com/ky7yaql). To circumvent this issue and to ease our analysis, we first analyze a complete visualization of each track, so we can perceive the differences within it; secondly, we present the trajectory made by the agents for the corresponding music, to better analyze their behavior across the different tracks. Due to space constraints, we present only one example of those figures.

Track 1 corresponds to a piece with a high density of beats and a low contrast of intensities. This promotes a higher chance of a longer lifespan; however, the low contrast of intensities implies that the fireflies do not gather much energy at once. Track 2 (Fig. 6) is also characterized by a high density of beats, but in this case the contrast in intensities makes the swarms gain more energy. Track 3 has a low contrast of frequencies and a balanced density of beats. Lastly, Track 4, as opposed to almost all of the other examples described so far, has a strong contrast between high and low frequencies; in addition, its low density of beats results in a reduced lifespan for the swarms, since the agents have a short field of view.

From the observation of the patterns created by our system, we can conclude that: (i) fireflies have a tendency to follow the pattern created by the sound beats, as can be seen in the example depicted in Fig. 6; (ii) there is a larger concentration of fireflies around the sources that contain more energy; (iii) tracks with a lower contrast between frequencies promote a more balanced spread of the fireflies across the environment; (iv) tracks with a high density of beats give fireflies a longer lifespan because, even with their narrow vision field, the agents can collect more energy, even if in small amounts.

5 Conclusions and Future Work

We presented an environment to visualize audio signals. It was inspired by the visual nature of fireflies and based on a multi-agent system of swarms as proposed by Reynolds. In this environment, sound is mapped into light objects with energy, which attract the virtual fireflies. When fireflies are close to a light they gain energy and, as such, their bioluminescence is emphasized. The flocking behavior of the group emerges from simple rules of interaction. In real-life settings, the presented technique may help people with little understanding of music take part in musical events.

In further work we will expand our system by introducing more sophisticated mechanisms for sound analysis, which will allow the representation of higher-level concepts and musical events. We also wish to explore alternative visual representations, to offer the user a wider array of choices. Finally, a user study should be performed to assess the strengths and weaknesses of the different visualization variants and to evaluate the system.