Standardization of Field Performance Measurement Methods for Product Acceptance
Greg Twitty, R&D Project Manager, Product Test Factory, Nokia Mobile Phones
Overview
- Current state of product acceptance testing
- Proposal 1: Reduce high duplication of CDG3
- Proposal 2: Standardize call performance methods
- Summary
- Questions
Field Testing for Product Acceptance
- Development timeline: Design Concept -> R&D Testing -> Prep for Lab Entry -> Product Acceptance (CDG stage 3)
- Development testing: test with internal methods; verify compliance to standards.
- Field test activities occupy a significant portion of the schedule.
- The approval cycle is significantly longer for CDMA than for other protocols; this impacts product cost and reduces the competitiveness of CDMA.
- Customer-specific testing:
  1. Pre-run CDG stage 3.
  2. Demonstration in front of each carrier.
Challenges Facing CDG Stage 3
- Time-consuming and costly approval process, with significant resource and travel costs.
- CDG64 test cases are loosely defined; each carrier has its own procedures, which are not balanced for best error detection.
- Each CDG3 duplicates tests with CDG2 and with other CDG3s.
- Lack of standardized test methods and criteria impedes data sharing.
- Unbalanced test reliability in call performance tests:
  - Results are often not repeatable in weak signal areas.
  - False failures are often detected, and analysis is very time consuming.
  - Excessive testing in strong signal areas.
- Goals: reduce product acceptance time and cost, test where errors are most likely, and improve reliability of results.
Standardization of CDG3 Test Methods
- Proposal 1: Reduce high test duplication of CDG stage 3.
  - Phone software is very mature in many areas of feature test.
  - Use data from the vendor, other CDG3s, and CDG2 to replace tests in generic areas.
- Proposal 2: Standardize call performance methods and criteria.
  - Improve test accuracy in weak signal areas.
  - Reduce unnecessary testing on strong signal routes.
Proposal 1: Reduce High Test Duplication of CDG Stage 3
Return on Investment Over Time
- Infra deployments of a common vendor differ little between carriers.
- The number of errors found drops off significantly after the first CDG3.
- CDG3 test time required by carriers continues as the product matures.
[Chart: cumulative testing on Infra X vs. errors found over time, from R&D testing and lab entry through 1st, 2nd, and 3rd carrier approvals]
CDG Stage 3 Breakdown (CDG 64 test cases)
- Feature Testing (test cases 3-99): 70% of test effort, 30% of errors reported.
  - System Acquisition (4.1), Maintenance, 2G Data / HSPD
  - Provisioning (4.10): OTASP, OTAPA, IOTA
  - Location Determination (N/A)
  - Call Types (4.8): POTS, 3-way, Call Waiting, Voice Mail
  - Authentication (4.6, 4.7)
  - Static Tests / Other (4.5, 4.9): SMS, MMS, Browser, Java, Brew
- Mobile Tests: 37 test cases, 30% of test effort, 20% of errors reported.
- Call Performance (4.2.1, 4.2.2, 4.3): 8 test cases, 30% of effort, 50% of errors reported.
  - Terminations, Originations; Strong Signal, Mixed Signal, Weak Signal
CDG Stage 3 Breakdown
- Call Performance (tested in weak signal conditions; high variability):
  - Terminations, Originations, Maintenance
  - RF Performance: Rx (Sensitivity, IMD, Self-jamming); Tx (Rho, Power control, Max power)
  - BB & DSP: Signal acquisition, SHO, Searcher, Finger assignment
  - Strong Signal, Mixed Signal, Weak Signal
- Mobile Tests (tested in good signal conditions; high duplication):
  - System Acquisition; interoperability with infra
  - Channels: ACH, DPCH, FTCH, RTCH
  - Messaging: Layer 1, Layer 2
  - Handoffs: SHO, Interband HHO, Interfreq HHO
  - 2G Data / HSD
- Feature Testing (typically tested in generic, good signal conditions; high duplication, since these tests have commonality with CDG2 tests and other CDG3s):
  - Location Determination; Provisioning (OTASP, OTAPA)
  - Static Tests / Other: SMS, MMS, Browser, Java, Brew
  - Call Types: POTS, 3-way, Call Waiting, Voice Mail
Reduce High Test Duplication of CDG Stage 3
Proposal: Identify and utilize common test results.
1. Carriers and infra vendors should define clear CDG3 test procedures that apply to generic network conditions (common to all infra deployments):
   - Strong signal drive routes
   - 2G Data / HSD in static conditions
   - Provisioning
   - Location Determination (non-CDG3)
2. Tests that are duplicated by CDG2 should also be identified:
   - Call Types
3. Test results from CDG2 and previous CDG3s should be considered during product acceptance. Software changes made between test events should be evaluated.
4. Carrier and vendor continue to perform tests unique to the carrier's network:
   - System Determination
   - Mobile IP / HSPD
   - Messaging
   - Browser, content downloads, etc.
Proposal 2: Standardize Call Performance Methods and Criteria
Issues with Call Performance Tests
Current test parameters vary widely across carriers: 100-200 originations, 100-200 terminations, 1-2 hr long call, strong/mixed/weak signal routes, 1-12 cities, two infras, and criteria ranging from 95% absolute, to 0.75-2 drops/hr, to within 2% of a reference phone.
- Carriers institute a wide range of test cases and criteria.
- Weak signal routes: test results have high variation and are not reliable.
  - Outcome is dependent on time of day.
  - Misleading results cause unnecessary searches for errors and retesting.
  - Significant pre-testing is needed by the vendor to measure lab entry readiness.
  - Reducing variability of results will decrease acceptance time and cost.
- Strong signal routes: constitute mostly generic conditions.
  - Test variance is very low; results are repeatable.
  - Few errors are detected in this test.
  - Test length can be reduced for quicker acceptance time, or data accepted from other CDG3s using the same infra (Proposal 1).
Mixture of Test Conditions
1. Test special conditions separately from call performance:
   - Pilot pollution: severity is affected by cell breathing.
   - Zone boundary (Zone 1 / Zone 2): a page can be missed when the registration timer is active.
   - Coverage fringe area.
   - Loaded sector: calls can be blocked during the busy hour.
   - Competitor or AMPS system determination: too many variables; hard to distinguish from other weak signal failures.
2. Measure and control non-deterministic factors where possible.
Example of Weak Signal Variability
[Chart: failed terminations per 100 attempts over Days 1-5 for two reference phones (A and B) of the same make, model, and software, captured on a weak signal route; a line indicates the 2% passing criterion. Deltas between phones: +1, 0, -7, -2, -1, +2, +7; average = 0.]
- The reference phone is intended to track network conditions.
- For any single test, there is no correlation between the two phones.
- Correlation becomes apparent only after about 700 calls.
Probability of Failure
- Creating a histogram from the previous slide's data, a probability distribution can be seen.
- The shape, mean (µ), and variance (σ²) of the distribution are determined by many factors:
  - RF conditions at test time
  - Test sample size (number of call attempts)
  - Phone performance
  - Weak signal drive route
[Histogram of failures after 100 attempts, built from the previous slide's data: µ = 9.4, σ² = 8.3]
Call Performance Criteria
- Variance determines the effectiveness of the call performance test.
- For fixed criteria, a large variance gives a higher chance of failing a good phone; results are less reliable and repeatable.
- Hence, cell breathing and sample size affect test reliability.
- Observation: 100 call attempts yields a large variance on weak signal routes.
[Chart: distributions of failed calls after 100 attempts for the test phone and reference phone, high-variance vs. low-variance cases; the overlap beyond the 2% margin shows the chance of failing a good phone.]
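The chance of failing a good phone can be made concrete with a binomial tail calculation. This is a sketch with assumed numbers: a phone that genuinely matches a 9% reference failure rate is judged against a "reference + 2% margin" criterion, and the false-failure probability shrinks as the sample size grows.

```python
from math import comb

def p_false_fail(n_calls, p_true, max_fail_rate):
    """Probability that observed failures exceed the allowed count even
    though the phone's true per-call failure rate is p_true (binomial tail)."""
    allowed = round(max_fail_rate * n_calls)
    return sum(comb(n_calls, k) * p_true**k * (1 - p_true)**(n_calls - k)
               for k in range(allowed + 1, n_calls + 1))

# Good phone at a 9% true failure rate, criterion = reference (9%) + 2% margin.
for n in (100, 300, 700):
    print(n, round(p_false_fail(n, 0.09, 0.11), 3))
```

This is the statistical basis the proposal calls for: for a fixed margin, the test's reliability is a direct function of sample size, so 100 attempts on a weak signal route leaves a substantial chance of flagging a good phone.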
Standardize Call Performance
Proposal: A drive test should be created with a target accuracy in mind, with an agreed method to measure the test's repeatability.
1. Drive routes should focus on a specific signal profile (i.e., weak or strong signal; no mixed signal routes). Other test cases should be executed in a separate test.
2. Standard guidelines for determining test accuracy and repeatability are needed. Methods should have a statistical basis.
3. Standard methods for determining sample size and pass criteria are needed, including measuring network conditions at the time of test.
Example: Adaptive pass criteria
- Create an index of signal conditions (Ec/Io and RSSI) vs. reference performance.
- Define test criteria for each index based on a standard.
- Measure Ec/Io and traffic loading at test time.

  Ec/Io    RSSI      Sample size   Criteria
  -15 dB   -90 dBm   300           4% of ref.
  -13 dB   -90 dBm   250           4% of ref.
  -11 dB   -90 dBm   200           2% of ref.
  -11 dB   -70 dBm   50            95% abs.
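The adaptive-criteria idea amounts to a lookup from measured signal conditions to a sample size and pass criterion. A minimal sketch, assuming the example table's values (which are illustrative, not standardized numbers):

```python
# Index rows ordered from weakest to strongest signal conditions:
# (min Ec/Io in dB, min RSSI in dBm, sample size, pass criterion).
# Values mirror the example table on the slide and are illustrative only.
CRITERIA_INDEX = [
    (-15, -90, 300, "within 4% of reference phone"),
    (-13, -90, 250, "within 4% of reference phone"),
    (-11, -90, 200, "within 2% of reference phone"),
    (-11, -70, 50, "95% absolute success rate"),
]

def select_criteria(ec_io_db, rssi_dbm):
    """Return (sample size, criterion) for the strongest-signal row whose
    thresholds the measured conditions meet, or None if conditions fall
    below every index entry."""
    best = None
    for min_ec_io, min_rssi, sample_size, criterion in CRITERIA_INDEX:
        if ec_io_db >= min_ec_io and rssi_dbm >= min_rssi:
            best = (sample_size, criterion)
    return best

print(select_criteria(-12.0, -85.0))  # mid-strength conditions
print(select_criteria(-10.0, -60.0))  # strong signal
```

Selecting the last matching row means weaker measured conditions automatically get the larger sample size and looser margin, which is exactly the trade the slide's table encodes.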
Test Methods to Standardize
- Different RF channels have different quality of service:
  - Different channels have different traffic loading.
  - BTS antennas, combiners, and RF equipment vary over frequency.
  - Carrier and vendor should ensure phones are provisioned to the same channels.
- Inconsistent phone handling in the test van affects antenna performance:
  - Use a phone cradle; rotate positions each lap.
- Controlled call cycle time:
  - Must ensure the phone is in the idle state between call attempts.
  - All steps in the test should be defined and standardized.
- A laptop PC in the test van can induce interference:
  - Test the laptop in advance; rules are needed for proximity to test phones.
Common test methods are needed to ensure control over sources of variability.
Summary
Product acceptance time and associated costs can be reduced by a unified effort of the CDMA community.
- Proposal 1: Reduce high test duplication of CDG stage 3.
  - Significant areas of test duplication exist.
  - Test time can be reduced by intelligent reuse of test results from previous CDG3s and CDG2.
- Proposal 2: Standardize call performance methods and criteria.
  - More repeatable results can be achieved by controlling sources of variance and defining pass criteria.
  - Similar analysis can be applied to strong signal tests to reduce test time.
  - Standard test methods in the vehicle are needed to further reduce non-deterministic events.
Questions