Standardization of Field Performance Measurement Methods for Product Acceptance


Standardization of Field Performance Measurement Methods for Product Acceptance
Greg Twitty, R&D Project Manager, Product Test Factory, Nokia Mobile Phones

Overview
- Current state of product acceptance testing
- Proposal 1: Reduce high test duplication of CDG3
- Proposal 2: Standardize call performance methods
- Summary
- Questions

Field Testing for Product Acceptance

Development timeline: Design Concept, R&D Testing, Prep for Lab Entry, Product Acceptance (CDG stage 3).
- Development testing: test with internal methods; verify compliance to standards.
- Customer-specific testing: (1) pre-run of CDG stage 3; (2) demonstration in front of each carrier.

Product acceptance is a significant portion of the schedule. The approval cycle is significantly longer for CDMA than for other protocols, which impacts product cost and reduces the competitiveness of CDMA.

Challenges Facing CDG Stage 3

Time-consuming and costly approval process:
- Significant resource and travel costs.
- CDG64 test cases are loosely defined; each carrier has its own procedures, not balanced for best error detection.
- Each CDG3 duplicates tests from CDG2 and from other CDG3s.
- Lack of standardized test methods and criteria impedes data sharing.

Unbalanced test reliability in call performance tests:
- Results are often not repeatable in weak signal areas; false failures are often detected, and analysis is very time consuming.
- Excessive testing in strong signal areas.

Goals: reduce product acceptance time and cost, test where the errors are most likely, and improve the reliability of results.

Standardization of CDG3 Test Methods

Proposal 1: Reduce high test duplication of CDG stage 3.
- Phone software is very mature in many areas of feature test.
- Use data from the vendor, other CDG3s, and CDG2 to replace tests in generic areas.

Proposal 2: Standardize call performance methods and criteria.
- Improve test accuracy in weak signal areas.
- Reduce unnecessary testing on strong signal routes.

Proposal 1: Reduce High Test Duplication of CDG Stage 3

Return on Investment Over Time
- Infrastructure deployments of a common vendor differ little between carriers.
- The number of errors found drops off significantly after the first CDG3.
- The CDG3 test time required by carriers continues as the product matures.

[Chart: total testing and errors found on infrastructure X over time, spanning R&D testing, pre-lab entry, and the 1st, 2nd, and 3rd carrier approvals.]

CDG Stage 3 Breakdown (CDG 64 test cases)
- Call Performance (4.2.1, 4.2.2, 4.3): terminations, originations, and maintenance under strong, mixed, and weak signal conditions.
- Mobile Tests: system acquisition (4.1); 2G data / HSPD (4.11).
- Feature Testing: provisioning (4.10): OTASP, OTAPA, IOTA; location determination (N/A); call types (4.8): POTS, 3-way, call waiting, voice mail; authentication (4.6, 4.7); static tests and other (4.5, 4.9): SMS, MMS, browser, Java, Brew.

Effort vs. errors, as reported: 3-99 test cases, 70% of test effort, 30% of errors reported; 37 test cases, 30% of test effort, 20% of errors reported; 8 test cases, 30% of effort, 50% of errors reported.

CDG Stage 3 Breakdown
- Call Performance (tested in weak signal conditions; high variability): terminations, originations, maintenance; RF performance (Rx: sensitivity, IMD, self-jamming; Tx: rho, power control, max power); BB & DSP (signal acquisition, SHO, searcher, finger assignment); strong, mixed, and weak signal routes.
- Mobile Tests (tested in good signal conditions; high duplication): system acquisition; interoperability with infrastructure: channels (ACH, DPCH, FTCH, RTCH), messaging (layer 1, layer 2), handoffs (SHO, interband HHO, interfreq HHO); 2G data / HSD.
- Feature Testing (typically tested in generic, good signal conditions; high duplication): location determination; provisioning (OTASP, OTAPA); static tests and other (SMS, MMS, browser, Java, Brew); call types (POTS, 3-way, call waiting, voice mail). These tests have commonality with CDG2 tests and other CDG3s.

Reduce High Test Duplication of CDG Stage 3

Proposal: identify and utilize common test results.
1. Carriers and infrastructure vendors should define clear CDG3 test procedures that apply to generic network conditions (common to all infrastructure deployments): strong signal drive routes; 2G data / HSD in static conditions; provisioning; location determination (non-CDG3).
2. Tests that are duplicated by CDG2 should also be identified: call types.
3. Test results from CDG2 and previous CDG3s should be considered during product acceptance. Software changes made between test events should be evaluated.
4. Carrier and vendor continue to perform tests unique to the carrier's network: system determination; mobile IP / HSPD; messaging; browser, content downloads, etc.

Proposal 2: Standardize Call Performance Methods and Criteria

Issues with Call Performance Tests

[Examples of the spread in carrier test cases and criteria: 100-200 originations; 100-200 terminations; 1-2 hr long call; strong, mixed, and weak signal routes; 1-12 cities; two infrastructures; criteria ranging from 95% absolute, to 0.75-2 drops/hr, to within 2% of a reference phone.]

Carriers institute a wide range of test cases and criteria.

Weak signal routes: test results have high variation and are not reliable.
- The outcome is dependent on the time of day.
- Misleading results cause unnecessary searches for errors and retesting.
- Significant pre-testing is needed by the vendor to gauge lab entry readiness.
- Reducing the variability of results will decrease acceptance time and cost.

Strong signal routes: constitute mostly generic conditions.
- Test variance is very low; results are repeatable.
- Few errors are detected in this test.
- Test length can be reduced for quicker acceptance, or data can be accepted from other CDG3s using the same infrastructure (Proposal 1).

Mixture of Test Conditions
1. Test special conditions separately from call performance:
- Pilot pollution: severity is affected by cell breathing.
- Zone boundary (zone 1 / zone 2): a phone can miss a page when the registration timer is active.
- Coverage fringe area: determination between a competitor or AMPS system involves too many variables; failures are hard to distinguish from other weak signal failures.
- Loaded sector: calls can be blocked during the busy hour.
2. Measure or control non-deterministic factors where possible.

Example of Weak Signal Variability

[Chart: failed terminations per 100 attempts over five days for two reference phones (Reference Phone A and B: same make, model, and software) captured on a weak signal route; a line marks the 2% passing criterion. Deltas between the phones: +1, 0, -7, -2, -1, +2, +7; average = 0.]
- The reference phone is intended to track network conditions.
- For each individual test, there is no correlation between the two phones.
- Correlation becomes apparent only after 700 calls.
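The sample-size effect above can be illustrated with a short Monte Carlo sketch. The ~9% per-call failure rate is an assumption (taken from the mean of 9.4 failures per 100 attempts reported on the next slide); the slide itself shows measured field data, not a simulation:

```python
import random

random.seed(1)

def failures(p, attempts):
    """Count failed calls out of `attempts`, each failing independently
    with probability p."""
    return sum(1 for _ in range(attempts) if random.random() < p)

# Two identical phones on the same weak-signal route (assumed 9% true
# per-call failure rate).
P_FAIL = 0.09

# Five 100-attempt tests: per-test deltas between the phones swing widely...
deltas_100 = [failures(P_FAIL, 100) - failures(P_FAIL, 100) for _ in range(5)]

# ...but over 700 calls the two phones' failure rates converge.
delta_700 = failures(P_FAIL, 700) - failures(P_FAIL, 700)

print(deltas_100)       # individual tests disagree by several failures
print(delta_700 / 700)  # per-call rates are nearly equal
```

With identical true failure rates, single 100-attempt tests still disagree by several failures, mirroring the lack of per-test correlation seen in the field data.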

Probability of Failure

Creating a histogram from the previous slide's data reveals a probability distribution. The shape, mean (µ), and variance (σ²) of the distribution are determined by many factors:
- RF conditions at test time
- Test sample size (number of call attempts)
- Phone performance
- The weak signal drive route

[Histogram of failures per 100 attempts, built from the previous slide's data: µ = 9.4, σ² = 8.3.]
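The reported µ = 9.4 and σ² = 8.3 are close to what a simple binomial model of independent call attempts would predict. This is a sketch of that model (an assumption; the slide does not name a distribution):

```python
# If each of n call attempts fails independently with probability p, the
# failure count is binomial(n, p) with mean n*p and variance n*p*(1-p).
n = 100        # call attempts per test
p = 9.4 / 100  # per-call failure rate implied by the slide's mean of 9.4

mean = n * p
variance = n * p * (1 - p)

print(mean)      # 9.4
print(variance)  # ~8.52, close to the observed variance of 8.3
```

The closeness of 8.52 to the measured 8.3 suggests the weak-signal failure counts behave roughly like independent trials, which is what makes sample-size arguments on the following slides meaningful.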

Call Performance Criteria

Variance determines the effectiveness of a call performance test. For a fixed criterion, a large variance means a higher chance of failing a good phone, so results are less reliable and repeatable. Hence, cell breathing and sample size affect test reliability.

Observation: 100 call attempts gives a large variance on weak signal routes.

[Figure: failure-count distributions after 100 attempts for the test phone and reference phone, under high variance and low variance; with a fixed 2% margin, the high-variance case has a much larger chance of failing a good phone.]
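To see how sample size drives the chance of failing a good phone, one can compute the binomial tail probability of exceeding a "reference + 2 percentage points" criterion. This is a hypothetical reading of the "2% of ref." rule, and the 9.4% per-call rate is borrowed from the earlier weak-signal data:

```python
from math import comb

def p_fail_criterion(p, n, margin):
    """Probability that a good phone (true per-call failure rate p) exceeds
    the reference rate by more than `margin` over n attempts, assuming the
    failure count is binomial(n, p)."""
    threshold = (p + margin) * n  # allowed failures under a "+2% of ref" rule
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1) if k > threshold)

P = 0.094  # per-call failure rate from the slide's weak-signal data

# With 100 attempts a genuinely good phone still fails the +2% criterion
# fairly often; with 700 attempts the chance shrinks considerably.
print(p_fail_criterion(P, 100, 0.02))
print(p_fail_criterion(P, 700, 0.02))
```

This quantifies the slide's point: the same 2% margin is far less likely to reject a good phone once the sample is large enough to shrink the variance.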

Standardize Call Performance

Proposal: drive tests should be created with a target accuracy in mind, and an agreed method is needed to measure a test's repeatability.
1. Drive routes should focus on a specific signal profile (i.e., weak or strong signal; no mixed signal routes). Other test cases should be executed in a separate test.
2. Standard guidelines for determining test accuracy and repeatability are needed; methods should have a statistical basis.
3. Standard methods for determining sample size and pass criteria are needed, including measuring network conditions at the time of test.

Example: adaptive pass criteria. Create an index of signal conditions (Ec/Io and RSSI) vs. reference performance, define test criteria for each index based on a standard, and measure Ec/Io and traffic loading at test time.

  Ec/Io    RSSI      Sample size   Criteria
  -15 dB   -90 dBm   300           4% of ref.
  -13 dB   -90 dBm   250           4% of ref.
  -11 dB   -90 dBm   200           2% of ref.
  ...
  -11 dB   -70 dBm   50            95% abs.
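One possible statistical basis for the sample-size column is the normal approximation to the binomial: choose enough call attempts that the confidence interval on the measured failure rate is narrower than the pass margin. A minimal sketch, where the 95% confidence level and the target half-widths are assumptions rather than values from the slide:

```python
from math import ceil

def sample_size(p, half_width, z=1.96):
    """Call attempts needed so a 95% confidence interval on the failure
    rate has the requested half-width, using the normal approximation
    n = p*(1-p)*(z/half_width)^2 (one illustrative choice of method)."""
    return ceil(p * (1 - p) * (z / half_width) ** 2)

# Hypothetical targets: estimate a ~9% weak-signal failure rate to within
# +/-2 and +/-4 percentage points.
print(sample_size(0.09, 0.02))  # a few hundred attempts
print(sample_size(0.09, 0.04))  # far fewer attempts suffice
```

A calculation of this kind would let each row of the adaptive criteria table trade sample size against the width of its pass margin in a defensible way.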

Test Methods to Standardize
- Different RF channels have different quality of service: channels carry different traffic loading, and BTS antennas, combiners, and RF equipment vary over frequency. Carrier and vendor should ensure phones are provisioned to the same channels.
- Inconsistent phone handling in the test van affects antenna performance: use a phone cradle and rotate positions each lap.
- Controlled call cycle time: ensure the phone is in the idle state between call attempts. All steps in the test should be defined and standardized.
- A laptop PC in the test van can induce interference: test the laptop in advance; rules are needed for proximity to the test phones.

Common test methods are needed to ensure control over sources of variability.

Summary

Product acceptance time and associated costs can be reduced by a unified effort of the CDMA community.

Proposal 1: Reduce high test duplication of CDG stage 3.
- Significant areas of test duplication exist.
- Test time can be reduced by intelligent reuse of test results from previous CDG3s and CDG2.

Proposal 2: Standardize call performance methods and criteria.
- More repeatable results can be achieved by controlling sources of variance and defining pass criteria accordingly.
- Similar analysis can be applied to strong signal tests to reduce test time.
- Standard test methods in the vehicle are needed to further reduce non-deterministic events.

Questions