AMD+ Testing Report
Compiled for Ultracomms, 20th July 2015

Table of Contents

1 Preface
2 Confidentiality
3 DJN Solutions Ltd Overview
4 Background
5 Methodology
6 Calculation of False Positive Rate
7 Results
8 Conclusion

1 Preface

Independent study underlines the benefits of AMD+, a breakthrough in Answering Machine Detection (AMD).

To support the recent launch of Ultracomms' AMD+ solution, the company commissioned DJN Solutions, an independent consultancy firm, to provide an impartial third-party test and assessment of the underlying, ground-breaking technology, which is designed to help call centres improve outbound agent productivity and meet Ofcom's requirements around the management of call abandonment. The rigorous test methodology involved analysis of over 3,000 calls over four days and found that AMD+ has a high accuracy of detection.

Ultracomms Chief Operating Officer Darren Sullivan, who spearheaded the development of AMD+ at Ultracomms, says: "We are delighted with the results of the report, which underline the ground-breaking benefits of AMD+, the result of several years' intensive R&D within our organisation and representing a new approach to managing call abandonment."

AMD+ is a patent-pending, cloud-based innovation in answering machine detection. Designed to increase contact centre agent productivity, AMD+ is also the first AMD technology to reduce the call centre administration required in monitoring abandoned outbound calls, helping to simplify Ofcom compliance.

Report highlights:

- Accuracy of over 99.9 per cent. Ultracomms' AMD+ uses an approach that favours high accuracy of detection, rather than trying to detect all answering machines. In the test, 28.9% of answering machines were detected, but for those detected the accuracy level was 99.9026%.
- Compliance. The author concludes that: "While Ofcom will always be the final arbiter in the interpretation of their own policies, using the above results we can see no reason why AMD+ would be considered non-compliant in a live environment."
- Ability to compare agent and system results. Because AMD+ can be switched on or off at any time, while the system still internally runs and logs the detection process, it is possible to compare agent and system results side by side for the same call.

2 Confidentiality

This document contains confidential and proprietary information of DJN Solutions Ltd ("DJN Solutions") and Ultra Communications Ltd ("Ultracomms"). This document should not be reproduced in part or in its entirety without the prior written consent of DJN Solutions or Ultracomms.

3 DJN Solutions Ltd Overview

DJN Solutions Ltd has over 19 years' experience with outbound dialling systems. We specialise in helping clients use their systems in compliance with Ofcom's policies. We are a member of the UK Direct Marketing Association's Telemarketing and Contact Centre council and were closely involved with the creation of the DMA guidelines that the UK regulator Ofcom later adopted in its own dialling policies. David Nicholls is also a co-author of the abandoned call calculation document cited by Ofcom in its current statement of policy.

4 Background

Ultra Communications have developed proprietary technology for Answer Machine Detection (AMD), known as AMD+, and commissioned DJN Solutions to carry out independent testing in order to calculate the Reasoned Estimate of False Positives required by Ofcom, and to examine the overall performance and compliance characteristics of the solution. To provide a blind test of the results, specific details of how the solution operates were not disclosed. The solution is the subject of an ongoing patent application, the contents of which are not currently public.

5 Methodology

To provide real-world information, testing was carried out using real historical data from live contact centre campaigns, rather than a purpose-built test suite. The Ultracomms solution permits analysis by the AMD+ technology of historical campaign data, even where AMD+ was not in use by the contact centre at the time. This functionality permits all calls that were passed to an agent to be listened to and independently assessed for the presence of an answerphone versus a live person. This mode was used for the testing.

Data provided for test purposes included:

1) Metadata, in the form of an Excel spreadsheet, giving details of each call, including whether AMD+ detected an answering machine and a link to the file containing the associated call recording

DJN Solutions requested data covering three weeks in March. Once the data were received they were analysed and specific days were chosen for analysis. Days were chosen randomly with the following constraints:

1) Overall they had to include enough calls for the test results to be statistically robust
2) To meet Ofcom's requirements, they had to cover different days of the week and different times of day

It was decided to use four days' worth of data. All were different days of the week, and a Saturday was included. (The specific customer chosen to provide the data does not dial on Sundays.) These days were chosen so that the numbers of calls made in the mornings, afternoons and evenings were close to the proportions of the overall data set.

For the days chosen, all calls marked by AMD+ as a machine were manually listened to so that the technology's assessment could be checked. Since it is also necessary to know how many live calls were handled during the test period, further sampling was undertaken as follows:

1) Calls marked by agents as answer machines that AMD+ did not classify as answering machines were listened to, in order to confirm that they were not live calls miscategorised by the agents.
3) Calls marked by agents with non-specific outcomes (e.g. "Other") were listened to, in order to determine whether any answering machines were included.

During the first listening pass most outcomes were immediately clear from the call recording. A small number (fewer than 30 of over 3,000) were set aside for subsequent, deeper analysis.

6 Calculation of False Positive Rate

Correct calculation of dialler abandonment and false positive rates is vital. Early Ofcom documents gave theoretical calculations that were difficult to relate to the actual statistics generated by diallers. In the most recent statement of policy Ofcom clarified the situation with new examples. They also specifically cite a DMA document that gives the mathematical background and specific formulae for both AMD users and non-AMD users.

In a call centre not using AMD the following figures arise:

- Agents handle 600 live calls
- Agents also deal with 400 answering machine calls
- Another 20 calls are abandoned

Using live calls in the calculation gives:

    Abandoned Calls / (Abandoned Calls + Live Calls) * 100

i.e. 20 / (20 + 600) * 100 = 3.22%

However, if AMD is not being used then calls are abandoned before it has been determined whether they are live or an answering machine. It is therefore very likely that some of the abandoned calls were not live and, as such, should not be counted. These are known as False Negatives. It can never be known exactly how many abandons were actually answering machines but, historically, the assumption has been that, statistically, the ratio of live calls to machines among the abandons should be the same as the ratio among the calls handled by agents. For this example:

    1000 calls handled = 600 live and 400 answering machines

therefore

    20 abandons = 12 live and 8 answering machines

Therefore, the true abandonment rate is:

    12 / (12 + 600) * 100 = 1.96%

Rather than performing this apportionment every time, it can be shown that the following formula gives the correct result:
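The worked example above can be replayed with a short Python sketch (an illustrative aid, not part of the original report):

```python
# Illustrative sketch: abandonment rate without AMD, before and after
# adjusting for the false negatives hidden among abandoned calls.
live_handled = 600    # live calls handled by agents
am_handled = 400      # answering machine calls handled by agents
abandoned = 20        # abandoned calls

# Naive rate, treating every abandoned call as live:
naive_rate = abandoned / (abandoned + live_handled) * 100    # ~3.2%

# Apportion abandons in the same live:machine ratio as agent-handled calls:
live_abandons = abandoned * live_handled / (live_handled + am_handled)  # 12.0

# True rate, counting only the live share of abandons:
true_rate = live_abandons / (live_abandons + live_handled) * 100        # ~1.96%

print(round(naive_rate, 2), live_abandons, round(true_rate, 2))
```

The adjustment matters: of the 20 abandons, statistically only 12 were live people, so the effective abandonment rate drops from roughly 3.2% to 1.96%.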

    Abandoned Calls / (Abandoned Calls + Live and Answering Machine Calls passed to a live Operator) * 100

i.e. 20 / (20 + 1000) * 100 = 1.96%

When AMD is introduced it is necessary to also include the reasoned estimate of false positives. To do this we need the false positive rate (FPR). Again, there has been some confusion as to how this is arrived at. Many people count the number of answering machine calls, see how many of them are false positives, and use this to calculate the rate. For example:

- 1000 answering machines detected
- 20 false positives within the sample
- False Positive Rate is therefore 20 / 1000 * 100 = 2%

What this does not take into account is that a false positive is a live call, so the number of false positives is sensitive to the number of live connects. In the above calculation no account is taken of this: if the test were run again with data where the contact rate was much higher, there would be more connects, which would increase the likely number of false positives and change the FPR. Instead, the FPR should be related to the number of live connects and calculated as follows:

    FPR = False Positives / Live Connects

Note that live connects must include agent-detected live calls, abandons, and false positives. Without going into mathematical detail, this leads to a new abandonment rate calculation:

    (Abandoned Calls + (Live and AM Calls passed to a live Operator) * FPR) * 100 / (Abandoned Calls + Live and AM Calls passed to a live Operator)

This formula deals with both the inclusion of false positives and the effect of false negatives, where abandoned calls were actually answering machines and not live calls. When AMD is not used the FPR will always be zero, in which case the formula automatically reverts to the non-AMD formula given above.
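The reversion property can be checked with an illustrative Python sketch (assuming the formula as written above; this code is not from the report):

```python
def abandonment_rate(abandoned: float, handled: float, fpr: float) -> float:
    """Abandonment rate (%) per the formula above (illustrative sketch).

    `handled` is all live and answering machine calls passed to a live
    operator; `fpr` is the false positive rate, i.e. false positives
    divided by live connects.
    """
    return (abandoned + handled * fpr) * 100 / (abandoned + handled)

# With AMD off, fpr = 0 and the formula reverts to the non-AMD
# calculation 20 / (20 + 1000) * 100:
print(round(abandonment_rate(20, 1000, 0.0), 2))  # 1.96
```

Setting fpr to zero makes the loading term vanish, so the non-AMD example from earlier in this section comes out unchanged.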

7 Results

A total of 3079 calls identified by AMD+ as answering machines during the test period were analysed. The following table shows the result of the analysis:

    Result                                          Count
    AMs correctly identified                         3076
    AMs incorrectly identified (False Positives)        3
    Total                                            3079

During analysis some calls had to be rechecked because the outcomes were not initially clear. Most were satisfactorily classified after further analysis because short snippets of audio could be recognised as voicemail greetings or other non-live results. Of the three calls classified as False Positives, one was clearly a live voice answer, but the other two were less clear because the calls were terminated early by the agent, resulting in recordings that were quite short with limited audio information. It was decided to err on the side of caution and classify them as live answers, on the basis that clear evidence to the contrary could not be found; however, it is possible that they were correctly identified by AMD+. The count of three false positives therefore represents an upper limit.

In order to calculate the FPR we need the number of live calls passed to agents and the number of abandoned calls. These were taken from the dialler data covering the test period, i.e. the same time periods as for the analysed calls. It was discovered that agents were using an outcome code of "OTHER" for many calls. Sampling of these calls showed that most were live answers for which no other outcome available to the agent was more appropriate; however, some calls were voicemail, automated attendant, or other non-live connections. Original figures for agent-handled calls during the test period were:

    Calls with outcomes indicating live contact: 1011
    Calls with outcome of "Other":               1184

Sampling showed that approximately 10% of calls marked as "Other" were non-live. For the purposes of the testing it was decided to use the upper bound of the margin of error, which was 12%, when adjusting the figures.
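As a quick arithmetic check (an illustrative Python sketch, not part of the report), the headline accuracy figures follow directly from the table:

```python
# Accuracy of detection from the results table above.
correct = 3076             # AMs correctly identified
false_positives = 3        # upper-limit count, per the discussion above
total = correct + false_positives        # 3079 calls analysed

accuracy = correct / total * 100
print(round(accuracy, 4))                # 99.9026

# If the two borderline calls were in fact answering machines,
# only one false positive would remain:
print(round((total - 1) / total * 100, 4))   # 99.9675
```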

Adjusted figures for calculation purposes are as follows:

    Result                                      Count
    Live calls passed to agents                  1011
    Adjusted "Other" calls passed to agents      1042
    Total                                        2053

During the test period 315 calls were marked as abandoned, and 10501 answering machines were passed to agents. In accordance with Ofcom's example calculations we adjust the abandonment figure for False Negatives by apportioning the abandoned calls in the same ratio as live calls to answering machines:

Live calls:
1) Live calls passed to agents: 2053
2) False positives: 3

Non-live calls:
1) Answering machine calls passed to agents (adjusted for "Other" calls): 7567
2) Answering machines handled by AMD+: 3076

    315 * (3 + 2053) / (3 + 2053 + 7567 + 3076) = 315 * 2056 / 12699 = 50.999 ~ 51

Calculating the FPR from these figures gives:

    3 / (3 + 2053 + 51) = 0.001424
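The report's figures can be replayed with a short Python sketch (illustrative only, not code from the report):

```python
# Reproducing the FPR calculation from the adjusted figures above.
false_positives = 3
live_to_agents = 1011
other_adjusted = 1042      # "Other" calls, adjusted to live share
live_total = live_to_agents + other_adjusted        # 2053
am_to_agents = 7567        # AM calls passed to agents, "Other"-adjusted
am_by_amdplus = 3076       # AMs handled by AMD+
abandoned = 315

# Apportion abandoned calls between live and non-live outcomes:
live_share = false_positives + live_total                    # 2056
grand_total = live_share + am_to_agents + am_by_amdplus      # 12699
live_abandons = abandoned * live_share / grand_total         # ~50.999 -> 51

# FPR = false positives / live connects (live calls + live abandons + FPs):
fpr = false_positives / (false_positives + live_total + round(live_abandons))
print(round(fpr, 6))       # 0.001424
```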

8 Conclusion

From the testing it is clear that Ultracomms' AMD+ uses an approach that favours high accuracy of detection, rather than trying to detect all answering machines. In the test 28.9% of answering machines were detected, but for those detected the accuracy level was 99.9026%.[1]

When using AMD, Ofcom require that the dialler's abandonment rate (which can be a maximum of 3%) must include a Reasoned Estimate of false positives. Using the above FPR is equivalent to loading the abandonment rate by 0.1424%, meaning that the dialler abandonment rate can reach up to 2.8576% before breaching the 3% policy. This allows a large degree of latitude in managing dialler campaigns, which in turn means that regulatory requirements can be easily met.

Ofcom also require that calls are handled in a timely manner. Specifically: "In the event of an abandoned call (other than an AMD false positive), a very brief recorded information message must start playing no later than two seconds after the telephone has been picked up or within two seconds of the call being answered."

It is understood that AMD+ does not use a cadence method, in which audio information from the call is analysed to determine whether a machine or person has been reached. During testing, a sample of calls from a different live campaign where AMD+ was enabled was listened to. Calls passed to agents sounded very similar to those where AMD+ was switched off. For the tested calls AMD+ did not introduce any discernible delay in the calls that were passed to agents. There should therefore be no issue in adhering to Ofcom's two-second policy.

While Ofcom will always be the final arbiter in the interpretation of their own policies, using the above results we can see no reason why AMD+ would be considered non-compliant in a live environment.

During testing we became aware of one other operational advantage of AMD+. Because AMD+ can be switched on or off at any time, while the system still internally runs and logs the detection process, it is possible to compare agent and system results side by side for the same call. Although this process relies on the accuracy of agent results, it does provide an accessible way to check system accuracy with minimal effort. Users could test in this way to track any variance in accuracy or performance over time, which may highlight potential problems. We would still suggest that, for maximum safety, periodic full tests are carried out.

[1] Note: As previously explained, two of the three calls classified as false positives were included on the basis that there was not enough information for them to be clearly identified as non-live, and the nature of this report dictated the decision to err on the side of caution. Since they could also not be positively identified as live calls, it is possible that AMD+ had correctly identified them. In that case, with only one false positive, the accuracy would have been 99.9675%.