Singing accuracy, listeners' tolerance, and pitch analysis

Singing accuracy, listeners' tolerance, and pitch analysis. Pauline Larrouy-Maestri (Pauline.Larrouy-Maestri@aesthetics.mpg.de), Johanna Devaney (Devaney.12@osu.edu)

Musical errors: contour error, interval error, tonality error.

Musical errors: 166 performances rated with a computer-assisted method (Larrouy-Maestri & Morsomme, 2013) based on 3 criteria (http://sldr.org/sldr000774/en), and by judges on a 9-point scale from 1 (out of tune) to 9 (in tune).

Musical errors: F(3,165) = 231.51, p < .01, 81% (interval deviation, tonality modulations); F(3,165) = 104.44, p < .01, 66% (interval deviation). Larrouy-Maestri, P., Lévêque, Y., Schön, D., Giovanni, A., & Morsomme, D. (2013). The evaluation of singing voice accuracy: A comparison between subjective and objective methods. Journal of Voice. Larrouy-Maestri, P., Magis, D., Grabenhorst, M., & Morsomme, D. (under revision). Layman or professional musician: Who makes the better judge?

Musical errors: intervals are central to the definition of vocal pitch accuracy in a melodic context. Expert listeners pay attention to interval deviation and to the number of tonality modulations (see the sketch below). But how tolerant are listeners?
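
As an illustration of the interval-deviation criterion, here is a minimal sketch in Python. The note frequencies and helper names are hypothetical, not taken from the cited method; the sketch only shows how the deviation between sung and target intervals can be expressed in cents.

```python
# Illustrative sketch (not the published method): mean interval deviation, in cents,
# between the intervals a singer produced and the target intervals of the melody.
import numpy as np

def mean_interval_deviation(sung_f0, target_f0):
    """sung_f0, target_f0: per-note centre frequencies in Hz (same length).
    Returns the mean absolute deviation of successive intervals, in cents."""
    sung_f0 = np.asarray(sung_f0, dtype=float)
    target_f0 = np.asarray(target_f0, dtype=float)
    sung_intervals = 1200.0 * np.log2(sung_f0[1:] / sung_f0[:-1])
    target_intervals = 1200.0 * np.log2(target_f0[1:] / target_f0[:-1])
    return float(np.mean(np.abs(sung_intervals - target_intervals)))

# Hypothetical example: C4-E4-G4 sung with a slightly compressed major third
target = [261.6, 329.6, 392.0]
sung = [261.6, 325.0, 392.0]
print(round(mean_interval_deviation(sung, target), 1))  # ~24 cents
```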

Tolerance: pitch discrimination (e.g., http://www.musicianbrain.com/pitchtest/). In a melodic context: a semitone (100 cents) (Berkowska & Dalla Bella, 2009; Dalla Bella et al., 2007, 2009a, 2009b; Pfordresher et al., 2007, 2009, 2010) or a quartertone (50 cents) (Hutchins & Peretz, 2012; Hutchins, Roquet, & Peretz, 2012; Pfordresher & Mantell, 2014). The tolerance of layman listeners for non-familiar melodies is much less than a quartertone, whatever the type of error and the place and size of the interval. But is there an effect of familiarity? Yes (Kinney, 2009); no (Warrier & Zatorre, 2002). An effect of expertise? Yes (most of the literature); no (Larrouy-Maestri et al., under revision).
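
For reference, a frequency deviation is converted to cents with 1200·log2(f_sung/f_target); the quick sketch below (illustrative values only) shows how a mistuning compares with the semitone and quartertone thresholds mentioned above.

```python
# Illustrative: convert a frequency deviation to cents and compare it with the
# semitone (100 cents) and quartertone (50 cents) thresholds cited above.
import math

def deviation_in_cents(f_sung, f_target):
    return 1200.0 * math.log2(f_sung / f_target)

dev = deviation_in_cents(446.0, 440.0)   # a 440 Hz target sung at 446 Hz
print(round(dev, 1), dev < 50)           # ~23.5 cents: below a quartertone
```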

Tolerance, participants: 30 musicians and 30 non-musicians (5 women per group; mean age 41 years, SD = 11.85 and 12 respectively). Musicians: 20 strings, 11 winds, 4 percussion, 5 singers; mean of 30.7 years of training (SD = 12.32), starting at age 8.8 on average (SD = 4.63). Non-musicians: no history of choral singing and no formal musical training (max. 2 years, with no practice during the past 5 years). All participants had hearing thresholds below 20 dB HL (audiometry), were able to perform Happy Birthday with the appropriate melodic contour (production task), and showed no deficit in music perception (MBEA; Peretz et al., 2003).

Tolerance, material: familiar and non-familiar melodies, validated through an online questionnaire with 399 participants aged 13 to 70 years (M = 29.81). Familiarity ratings distinguished the two sets, t(398) = 20.92, p < .001, with no effect of expertise on the ratings (p > .05).

Tolerance, procedure: method of limits (Van Besouw, Brereton, & Howard, 2008), administered twice in a test-retest paradigm (a toy illustration of one series is sketched below).
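
The toy sketch below illustrates the general logic of one ascending series in a method-of-limits run; the step size, range, and simulated listener are assumptions for illustration, not the parameters used in the study.

```python
# Toy illustration of one ascending series in a method-of-limits procedure.
# Step size, range, and the simulated listener are assumptions, not study parameters.
import numpy as np

def ascending_series(reports_out_of_tune, step_cents=5.0, max_cents=100.0):
    """Increase the mistuning until the listener reports 'out of tune';
    the deviation at the switch point estimates the tolerance threshold."""
    deviation = 0.0
    while deviation <= max_cents:
        if reports_out_of_tune(deviation):
            return deviation
        deviation += step_cents
    return max_cents

rng = np.random.default_rng(0)
listener = lambda dev: dev + rng.normal(0.0, 3.0) > 35.0   # toy 35-cent threshold
runs = [ascending_series(listener) for _ in range(10)]
print(np.mean(runs))   # average switch point across series
```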

Tolerance, test-retest: highly significant correlation between test and retest tolerance values (in cents), r(60) = .91, p < .001. No effect of the direction of the deviation (i.e., enlargement vs. compression), t(59) = -.96, p = .34. No effect of expertise (p = .08), familiarity (p = .71), or their interaction (p = .65) on the change from test to retest. Training effect, t(59) = 2.92, p = .005.

Tolerance, effect of expertise and familiarity (in cents): effect of expertise, F(1, 116) = 139.11, p < .001, η² = .54; no effect of familiarity, F(1, 116) = 2.74, p = .10; no interaction, F(1, 116) = .60, p = .44.

Tolerance, effect of expertise and familiarity: all listeners show low tolerance when listening to melodies that are only slightly out of tune (deviations of less than a quartertone). The expertise effect is highly significant, even for a song well known by the participants (i.e., Happy Birthday). There is a training effect (mainly for the musicians). But is this the perceptual limit of musicians?

Pitch analysis

Historical methods: at the University of Iowa, Carl Seashore (1938) and colleagues studied timing, dynamics, intonation, and vibrato in pianists, violinists, and singers. Equipment: piano rolls, films of the movement of piano hammers during performance, and phonophotographic apparatus (Cary, 1922).

Historical methods: phonophotography technique and Henrici harmonic analyzer. Frequency was graphed in 10-cent units, intensity in decibels, and timing information as a function of linear space (Seashore, 1937).

Manual Annotation by Tapping

Manual Annotation with Software AudioSculpt + OpenMusic

Manual Annotation with Software PRAAT
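
Praat pitch tracks can also be extracted programmatically; the sketch below uses the praat-parselmouth Python bindings (the file name and pitch-range settings are placeholders, not values from the talk).

```python
# Scripting the Praat pitch tracker from Python with praat-parselmouth
# (pip install praat-parselmouth); file name and pitch range are placeholders.
import parselmouth

snd = parselmouth.Sound("singer_take1.wav")
pitch = snd.to_pitch(time_step=0.01, pitch_floor=75.0, pitch_ceiling=800.0)

times = pitch.xs()                         # frame times in seconds
f0 = pitch.selected_array['frequency']     # F0 in Hz, 0 where unvoiced
print(f"{(f0 > 0).sum()} voiced frames out of {len(f0)}")
```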

Manual Annotation with Software Audacity

Automatic Annotation Sonic Visualiser

Automatic Annotation TONY

Automatic Annotation Melodyne

Identify note onsets and offsets; fundamental frequency (F0) estimation; perceived pitch; evolution of F0.
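
A minimal sketch of this chain using librosa (the file name and parameter choices are assumptions; this is not the toolchain used by the authors): detect candidate note onsets, then estimate a frame-wise F0 track with probabilistic YIN.

```python
# Minimal sketch with librosa (placeholder file name and parameters): note onset
# detection plus frame-wise F0 estimation with probabilistic YIN (pYIN).
import numpy as np
import librosa

y, sr = librosa.load("melody.wav", sr=None, mono=True)

# 1. Candidate note onsets (offsets can be approximated by the next onset or an energy drop)
onset_times = librosa.onset.onset_detect(y=y, sr=sr, units='time')

# 2. Frame-wise fundamental frequency (F0) track
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz('C3'), fmax=librosa.note_to_hz('C6'), sr=sr)
times = librosa.times_like(f0, sr=sr)

print(len(onset_times), "onsets,", int(np.sum(voiced_flag)), "voiced frames")
```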

Score-guided performance data extraction, monophonic and quasi-polyphonic: timing information is available via MIDI/audio alignment, and fundamental frequency (F0) and amplitude can be reliably extracted. (Figure: soprano, alto, and tenor parts.) Devaney, Mandel, and Ellis (2009).
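
One common way to obtain such an alignment is dynamic time warping over chroma features computed from the audio and from the score. The sketch below (librosa and pretty_midi, with placeholder file names and hop size) illustrates that general idea, not the authors' implementation.

```python
# Sketch of score-to-audio alignment via DTW over chroma features (librosa + pretty_midi).
# File names, hop size, and feature choices are illustrative assumptions.
import librosa
import pretty_midi

hop = 512
y, sr = librosa.load("performance.wav", sr=22050, mono=True)
audio_chroma = librosa.feature.chroma_cqt(y=y, sr=sr, hop_length=hop)

midi = pretty_midi.PrettyMIDI("score.mid")
midi_chroma = midi.get_chroma(fs=sr / hop)           # same frame rate as the audio chroma
midi_chroma = midi_chroma / (midi_chroma.max() + 1e-9)

# DTW warping path: pairs of (audio frame, score frame)
D, wp = librosa.sequence.dtw(X=audio_chroma, Y=midi_chroma)
audio_times = librosa.frames_to_time(wp[:, 0], sr=sr, hop_length=hop)
print("alignment path length:", len(wp))
```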

Score-guided performance data extraction, polyphonic: timing information (including asynchronies between lines) is available from the alignment, but F0 and amplitude are harder to extract. Currently exploring the use of high-resolution methods with Roland Badeau for score-guided extraction of frequency and loudness information from polyphonic audio. Devaney and Ellis (2009); Devaney (2014).

Perceived pitch, possible calculation methods. Shonle and Horan (1980): geometric mean over the duration of the note. Iwamiya, Kosugi, and Kitamura (1983): center frequency between peaks and troughs in vibratos and symmetrical trills; in asymmetrical trills the pitch shifts according to the direction of the asymmetry. d'Alessandro and Castellengo (1994, 1995): the F0 at the end of the note matters more for pitch perception than the beginning; mean of the steady-state portion of the note rather than the mid-point between the maximum and minimum frequencies. Gockel, Moore, and Carlyon (2001): weighted mean based on the fundamental frequency's rate of change, with higher weightings for frames that have a smaller rate of change.
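
Two of these estimates are easy to sketch for a frame-wise F0 trace of a single note. The code below uses made-up values and a simplified weighting: the geometric mean over the note, and a mean weighted by the inverse frame-to-frame rate of change, in the spirit of Gockel, Moore, and Carlyon (2001).

```python
# Illustrative estimates of a note's perceived pitch from a frame-wise F0 trace:
# geometric mean over the note, and a mean weighted by the inverse frame-to-frame
# rate of change (in the spirit of Gockel, Moore, & Carlyon, 2001). Values are made up.
import numpy as np

f0 = np.array([438.0, 442.0, 447.0, 445.0, 441.0, 439.0, 440.0])   # Hz, one note

geometric_mean = np.exp(np.mean(np.log(f0)))        # geometric mean over the duration

rate_of_change = np.abs(np.gradient(f0))            # Hz per frame
weights = 1.0 / (rate_of_change + 1e-3)             # steadier frames weigh more
weighted_mean = np.sum(weights * f0) / np.sum(weights)

print(round(geometric_mean, 2), round(weighted_mean, 2))
```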

Evolution of F0, modeling note trajectories: characterizing F0 trajectories is under-studied. One option is to decompose the F0 trace with the Discrete Cosine Transform to estimate its slope and curvature. Devaney, Mandel, and Fujinaga (2011); Devaney and Wessel (2013).
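
A minimal sketch of the DCT idea (assumed note values, scipy implementation): the low-order DCT coefficients of the F0 trace summarize its mean level, overall slope, and curvature.

```python
# Sketch (assumed values): DCT decomposition of a note's F0 trace; the 0th, 1st,
# and 2nd coefficients summarize mean level, overall slope, and curvature.
import numpy as np
from scipy.fft import dct

f0_hz = np.array([438.0, 440.5, 443.0, 444.0, 443.5, 442.0])
f0_cents = 1200.0 * np.log2(f0_hz / 440.0)          # trace relative to A4, in cents

coeffs = dct(f0_cents, type=2, norm='ortho')
mean_term, slope_term, curve_term = coeffs[0], coeffs[1], coeffs[2]
print(round(mean_term, 2), round(slope_term, 2), round(curve_term, 2))
```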

AMPACT: Automatic Music Performance Analysis and Comparison Toolkit (www.ampact.org)

Thank you for your attention! Johanna Devaney Devaney.12@osu.edu Pauline Larrouy-Maestri Pauline.Larrouy-Maestri@aesthetics.mpg.de