Synaesthetic Effects Can Produce an Immersive Visual Music Chao-Chun Wu Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Fine Arts or Master of Arts in Motion Media Design at Savannah College of Art and Design November 2013, Chao-Chun Wu The author hereby grants SCAD permission to reproduce and to distribute publicly paper and electronic thesis copies of this document in whole or in part in any medium now known or hereafter created. Chao-Chun Wu Author Date Michael Betancourt Committee Chair Date Andre Ruschkowski Committee Date James Gladman Committee Date
Synaesthetic Effects Can Produce an Immersive Visual Music A Thesis Submitted to the Faculty of the Motion Media Design Department in Partial Fulfillment of the Requirements for the Degree of Master of Fine Arts or Master of Arts in Motion Media Design at Savannah College of Art and Design By Chao-Chun Wu Savannah, GA November 2013
TABLE OF CONTENTS List of Figures 1 Abstract 2 Introduction 3 Color Music and Visual Music 3 Synaesthesia 7 Immersive Performance 8 Visual Project 10 Bibliography 17
Wu 1 LIST OF FIGURES Figure 1. Alexander Wallace Rimington, Colour-Organ (1893) 4 Figure 2. Oskar Fischinger, still images, Radio Dynamics (1942) 6 Figure 3. Oskar Fischinger, still images, Radio Dynamics (1942) 8 Figure 4. Ryoichi Kurokawa, live performance, Weather in My Brain Sound-Visual Art Festival, Taipei, Taiwan (2006) 9 Figure 5. D-Fuse and Scanner, live performance, Elektra Festival 6, Montreal (2005) 10 Figure 6. Elektron Machinedrum 12 Figure 7. Max/MSP/Jitter patch used to generate videos in which loudness and brightness are synchronized 14 Figure 8. Max/MSP/Jitter patch used to generate videos in which loudness and saturation are synchronized 15 Figure 9. Post-production in Adobe After Effects 16
Wu 2 Abstract Synaesthetic Effects Can Produce an Immersive Visual Music Chao-Chun Wu November 2013 This thesis draws on the author's experience as a DJ and VJ to explore the possibility that synaesthetic effects can produce an immersive visual music, surveying the history of color music, visual music, and synaesthesia.
Wu 3 Synesthetic Effects Can Produce an Immersive Visual Music INTRODUCTION Since the late 1990s I have been influenced by electronic music, and I became a DJ playing music at outdoor events. Later, when I played music alongside a VJ playing video, I realized there seemed to be some relation between sound and vision. Eventually I began to create videos myself and synchronized them with music. As a VJ, I found that the movement, color, and speed of video can be synchronized with the tempo, pitch, and loudness of music, and when these connections are established, the videos are translated into music and the music into vision. Audiences appear immersed in this kind of atmosphere. Although the senses of hearing and vision are quite different, a relationship seems to exist between them. When I tried to confirm this unexplained relationship, I learned about visual music. Historically, many earlier pioneers had already examined how to define this relationship, but they could not map a specific colored light to a specific sound or pitch by scientific means. Instead, these pioneers each developed their own unique methods to explain the relation, including synesthetic effects. COLOR MUSIC AND VISUAL MUSIC The color music instruments invented by Father Castel and the painter A. Wallace Rimington reveal the desire to quantify the relationship between color and music. The French Jesuit Louis Bertrand Castel proposed an instrument he named the clavecin pour les yeux (ocular harpsichord) to demonstrate his ideas about the linkage of sound and light. The device Castel designed employed colored strips of paper lit from behind by a candle; when a specific key was hit, the colored strip of paper rose above the cover of the harpsichord. Father Castel's instrument, the clavecin pour les yeux, not only attempted to define the relationship between
Wu 4 sound and light or color, but could also perform this relationship as an audiovisual performance, or what was called color music. After Father Castel designed his color music instrument, more and more similar instruments were made; his was the earliest of these devices known to produce color music. In 1893, Alexander Wallace Rimington, a British inventor and professor of fine arts in London, used the term Colour-Organ in the patent application for his color music instrument. (Fig. 1) Rimington's Colour-Organ stood around three meters high, and its lamps could display different levels of lightness, color tone, and saturation, but it could not produce sound or music. Instead, it needed another performer to play the musical accompaniment. Although the Colour-Organ thus required two performers, one manipulating the colored light and the other playing music, the colored lights and the music were synchronized in performance. Color music produces a synesthetic connection between colored light and music. Fig. 1. Alexander Wallace Rimington, Colour-Organ, (1893)
Wu 5 In 1912, the art critic Roger Fry proposed the idea of visual music in his defense of Post-Impressionism: Give up all resemblance to natural form, and create a purely abstract language of form – a visual music. 1 Similar to color music, visual music is also established by visual elements producing a synesthetic feeling. The visual elements of visual music include not only colored light but also motion, scale, position, and so on. However, the initial use of the term visual music was not the same as it is today; Fry suggested that visual elements in abstract painting, such as lines and colors, can produce a synesthetic feeling like music, but in visual music it is the creation of these synesthetic effects, without a connection to musical notes, that is significant; visual music can be silent. Abstract film brought the aesthetic of abstract painting to animation, and it also brought the idea of synesthesia. Figure 2 shows still images from the German artist Oskar Fischinger's film Radio Dynamics (1942). Each frame looks like an abstract painting containing the basic visual components of visual music, such as shape, color, and scale. When these still images are played as animation, the differences between frames can produce a musical feeling of rhythm: if twenty-four frames are played in one second, audiences see twenty-four different shapes and colors and feel a rhythm of twenty-four beats per second. As a result, this abstract animation can produce synesthetic effects that convey a musical feeling of rhythm. 1 Fry, Roger. An Important Event of the Season: Recent Paintings of Mr. Alfred Maurer of Paris (New York: Folsom Galleries, 1913), p. 56.
Wu 6 Fig. 2. Oskar Fischinger, still images, Radio Dynamics (1942) Although the methods used to produce visual music are different today, synesthesia is still the fundamental principle for creating it. Consider a contemporary live concert performed by the German electronic music pioneers Kraftwerk in 2004. This performance has several similarities to Rimington's Colour-Organ: the video on the big screen is not generated by the instruments that create the music, and the first artist on the right plays video synchronized with the music by matching its tempo. In this performance the video produces a synesthetic musical feeling and can be understood as a translation of the music into visual form.
Wu 7 SYNAESTHESIA The term synesthesia is used in both art history and psychology. In art history and criticism, synesthesia describes artwork where visuals are organized in ways similar to music. Synesthetic art includes both still works, such as paintings, and motion works, such as animated films. Sound is optional, since it is how the visual elements produce an analogue to musical structure and feeling that is the essential part of their synesthetic character. In psychology, synesthesia describes the condition in which an individual receives a stimulus in one sense modality and experiences a sensation in another. 2 Psychological uses of synesthesia thus describe experiences where one can see colored shapes while hearing a specific sound (such as music). In art history, synesthesia is metaphoric. It describes artwork such as Oskar Fischinger's abstract animation Radio Dynamics (Fig. 2). The animation is completely silent; at the beginning it even has a title card stating Please! No Music, Experiment in Color-Rhythm. (Fig. 3) The different colored shapes animated on screen produce a musical rhythm for the eyes. Live music performances where a DJ and VJ perform together can also be immersive, and are typically designed to be synesthetic. The VJ manipulates the visuals, synchronizing colors, shapes, and movements to what the DJ plays by matching tempo, loudness, pitch, or a specific sound effect. The synesthetic effects occur when the music and videos are synchronized. 2 Ione, Amy, and Christopher Tyler. Journal of the History of the Neurosciences (Milton Park: Taylor & Francis Group, 2004), p. 58.
Wu 8 Fig. 3. Oskar Fischinger, still images, Radio Dynamics (1942) IMMERSIVE PERFORMANCE Immersion often refers to a physical experience in which one is surrounded by an artificial environment and isolated from reality. Figure 4 was taken in 2006 at the Weather in My Brain Sound-Visual Art Festival in Taipei, Taiwan. This festival used sixteen screens placed around the periphery of the concert room, producing an immersive environment. For most of the performances, the screens all showed the same video; only a few artists had work prepared to use the potential of this multi-screen projection system. The Japanese video artist Ryoichi Kurokawa (standing, center in Fig. 4) was one of the few artists to give a multi-screen performance at the festival. However, the feeling of immersion doesn't only occur in this type of special projection environment; it can also happen in more traditional projection situations, especially those that are synesthetic. Synesthetic effects are common in VJ performances. Figure 5 shows a performance by D-Fuse in 2005. The top half of the figure shows different lights and colors changing with the music. Although this event was held not in a room with panoramic screens but in a public park, the audience still feels immersed. The top left of the
Wu 9 figure has the most color and light, and the top right is the darkest. From left to right, the volume of the music gradually decreases, and the top half of the figure shows how the VJ synchronizes the video to the music. In the bottom half of Figure 5, the melody of the music has changed dramatically, so the VJ used different visual forms to reflect these changes in the music. The colored lights in the video are synchronized with the music to produce a feeling of speed. This strong synesthetic effect forces audiences to concentrate on the screen: the result is immersion. Fig. 4 Ryoichi Kurokawa, live performance, Weather in My Brain Sound-Visual Art Festival, Taipei, Taiwan (2006)
Wu 10 Fig. 5 D-Fuse and Scanner, live performance, Elektra Festival 6, Montreal (2005) VISUAL PROJECT The thesis project explores the potential for creating an immersive experience through synchronized audio-visual manipulations and a formal vocabulary based in my experiences as a VJ. The piece is called Carbon, and its imagery reflects the carbon cycle discussed by the Italian writer and chemist Primo Levi in his book The Periodic Table: Our character lies for hundreds of millions of years, bound to three atoms of oxygen and one of calcium, in the form of limestone: it already has a very long cosmic history behind it, but we shall ignore it. For it time does not exist, or exists only in the form of sluggish variations in temperature, daily or seasonal, if, for the good fortune of this tale, its position is not too far from the earth's surface. Its existence, whose monotony cannot be thought of
Wu 11 without horror, is a pitiless alternation of hots and colds, that is, of oscillations (always of equal frequency) a trifle more restricted and a trifle more ample: an imprisonment, for this potentially living personage, worthy of Catholic Hell. To it, until this moment, the present tense is suited, which is that of narration; it is congealed in an eternal present, barely scratched by the moderate quivers of thermal agitation. 3 Levi used eternity as a metaphor to describe the carbon cycle: no matter what changes, it is just another cycle, and nothing really changes for eternity; there is nothing new under the sun. This idea of eternity is the organizing theme of the visual project, which uses three different pieces of video footage to represent past, present, and future. Because the synchronization of image and sound is an essential part of this project, the music was created specifically for it using the Elektron Machinedrum. (Fig. 6) The Machinedrum is a well-known instrument introduced by the Swedish electronic musical instrument company Elektron in 2005. It is a sequencer and drum machine that produces drum beats for dance music, and the music made for this project uses the most common time signature, 4/4. The Machinedrum allows sixteen different sounds per track. Because the instrument is designed for live performance, it doesn't allow the recording or saving of an entire soundtrack, forcing me to record the music in separate parts. 3 Levi, Primo. The Periodic Table (New York: Alfred A. Knopf, 1996), p. 233.
Wu 12 Fig. 6. Elektron Machinedrum The visuals were generated live using Max/MSP/Jitter; the generated footage was captured off the monitor using Snapz Pro X, allowing post-production in Adobe After Effects. Figures 7 and 8 show the patches I used to generate synchronized footage. In Visual Music: Searching for an Aesthetic, the multi-media artist Tom DeWitt concluded: Perhaps because the monochrome retinal neurons, called rods, are more sensitive to light than the color-sensitive cones are, the transition from black and white to color can produce the sensation of awakening. As our level of visual perception grows in increased amplitudes of light, visual perception changes from black and white to color. 4 DeWitt suggests that human eyes are more sensitive to black-and-white than to colored vision. For color footage, manipulating the brightness can produce effects similar to those of black-and-white footage. Central to this process was a third-party object set called Sound Analyzer, created by the MIT student Tristan Jehan. This third-party object set contained an 4 DeWitt, Tom. Leonardo, Vol. 20, No. 2 (Cambridge: The MIT Press, 1987), p. 116.
Wu 13 object named loudness~, which translated the loudness of the sound into a number; because this output is always negative, it had to be remapped to a positive range. The scale object remaps numbers into another specified range, and I then used an object named jit.brcosa to control the brightness and saturation of the raw footage by connecting it to the numbers produced by loudness~ and scale. After capturing the videos rendered by Max/MSP/Jitter, I had three videos whose brightness was synchronized with the music and another three synchronized by saturation. The music contains low-frequency sound, especially from the kick drum, and the video is synchronized both with the overall loudness of the music and with this specific instrument. Adobe After Effects was used to produce a locked synchronization and to create the color-changing effects visible in the finished piece. Max/MSP/Jitter presented challenges for this process: when recording the footage, it had a tendency to drop frames, resulting in a loss of sync, and the application of effects in Max/MSP/Jitter only made this problem worse. By using layers in After Effects (Fig. 9) and setting keyframes to match the rhythm of the music, it was possible to create the same effects that were initially planned for production in Max/MSP/Jitter. Since the video files were already synchronized with the music, applying the Difference blend mode made the colors change with the sound. Although this thesis project was produced in a semi-live situation, the idea of synchronizing visual form with music is the same as in live performance. Live VJ performances in concerts exhibit synesthetic effects because they synchronize visual forms with music: these synesthesias create a feeling of immersion that isolates their audiences, whether they are standing in an immersive in-the-round projection
Wu 14 environment or not; it is the synesthetic effect that causes audiences to feel immersion. Fig. 7 Max/MSP/Jitter patch used to generate videos in which loudness and brightness are synchronized
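The signal flow in the Fig. 7 patch (loudness~ into scale into jit.brcosa) can be sketched outside Max/MSP/Jitter. The following minimal Python sketch assumes loudness readings in negative decibels with an assumed floor of -60 dB (the exact output range of loudness~ is an assumption here, not documented in the thesis) and linearly remaps them to a 0 to 1 brightness value, as the scale object does before the number reaches jit.brcosa:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap value from [in_lo, in_hi] to [out_lo, out_hi],
    in the manner of the Max 'scale' object."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def loudness_to_brightness(loudness_db, floor_db=-60.0, ceil_db=0.0):
    """Map a (negative) loudness reading to a 0..1 brightness value,
    clamping readings outside the assumed range. The result would feed
    a brightness control such as jit.brcosa's brightness attribute."""
    loudness_db = max(floor_db, min(ceil_db, loudness_db))
    return scale(loudness_db, floor_db, ceil_db, 0.0, 1.0)

# A loud kick-drum hit (near 0 dB) drives brightness toward 1;
# near-silence (around -60 dB) drives it toward 0.
print(loudness_to_brightness(-6.0))   # bright frame
print(loudness_to_brightness(-54.0))  # dark frame
```

The floor and ceiling values here are illustrative parameters; in the actual patch the input range would be set to whatever loudness~ reports for the recorded music.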
Wu 15 Fig. 8 Max/MSP/Jitter patch used to generate videos in which loudness and saturation are synchronized
Wu 16 Fig. 9 Post-production in Adobe After Effects
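The Difference blend mode described above computes, for each pixel, the per-channel absolute difference between the two layers. A minimal sketch of that operation, assuming 8-bit RGB tuples rather than After Effects' internal pixel representation:

```python
def blend_difference(top, bottom):
    """Per-pixel, per-channel Difference blend: abs(top - bottom).
    Pixels are (R, G, B) tuples with 0-255 integer channels."""
    return tuple(abs(t - b) for t, b in zip(top, bottom))

# Identical layers cancel to black, so any change in one synchronized
# layer shows up as a visible color shift in the blended result.
print(blend_difference((255, 255, 255), (40, 120, 200)))  # (215, 135, 55)
print(blend_difference((40, 120, 200), (40, 120, 200)))   # (0, 0, 0)
```

Because the layered videos were already synchronized with the music, the per-pixel differences between them change on the beat, which is why the colors in the finished piece appear to change with the sound.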
Wu 17 Bibliography DeWitt, Tom. Leonardo, Vol. 20, No. 2. Cambridge: The MIT Press, 1987. Print. Fry, Roger. An Important Event of the Season: Recent Paintings of Mr. Alfred Maurer of Paris. New York: Folsom Galleries, 1913. Print. Ione, Amy, and Christopher Tyler. Journal of the History of the Neurosciences. Milton Park: Taylor & Francis Group, 2004. Print. Levi, Primo. The Periodic Table. New York: Alfred A. Knopf, 1996. Print.