DSL Forum
Technical Report
TR-023

Overview of ADSL Testing

Source: ADSLF Testing & Interoperability Working Group

May 26, 1999

Abstract:

This document provides an overview of static interoperability, dynamic interoperability, and conformance testing of ADSL equipment.

Notice:

This Working Text represents work in progress by the DSL Forum and must not be construed as an official DSL Forum Technical Report. Nothing in this document is binding on the DSL Forum or any of its members. The document is offered as a basis for discussion and communication, both within and without the DSL Forum.

© 2000 Digital Subscriber Line Forum. All Rights Reserved.

DSL Forum technical reports may be copied, downloaded, stored on a server or otherwise redistributed in their entirety only.

Notwithstanding anything to the contrary, the DSL Forum makes no representation or warranty, expressed or implied, concerning this publication, its contents or the completeness, accuracy, or applicability of any information contained in this publication. No liability of any kind shall be assumed by the DSL Forum as a result of reliance upon any information contained in this publication. The DSL Forum does not assume any responsibility to update or correct any information in this publication.

The receipt or any use of this document or its contents does not in any way create by implication or otherwise any express or implied license or right to or under any patent, copyright, trademark or trade secret rights which are or may be associated with the ideas, techniques, concepts or expressions contained herein.
PROJECT: ADSL Forum Testing & Interoperability Working Group

SOURCE: ADSL Forum Testing & Interoperability Working Group

TITLE: Working Text 27 - Overview of ADSL Testing

DATE: May 26, 1999

DISTRIBUTION: ADSL Forum Testing & Interoperability Working Group members.

Revision History

Date (M/D/Y)  Version  Major Changes
6/1/1998      1        First draft
9/18/98       2        First straw ballot
11/28/98      3        Second straw ballot
3/16/99       4        Incorporate comments from second straw ballot and
                       editing during the March 4/5, 1999 meetings, editing
                       session, and suggestions from members.
5/26/99       5        Incorporate comments from third straw ballot and
                       editing during the May 26-28, 1999 meetings, and
                       editing session.
CONTENTS

1 Introduction
1.1 Purpose and Scope
1.2 References
1.3 Terminology
1.3.1 Basic Terms
1.3.2 Terminology for Conformance Testing
1.3.3 Terminology for Static Interoperability Testing
1.3.4 Terminology for Dynamic Interoperability Testing
1.4 List of Acronyms
2 Overview of Testing
3 Test Conditions
3.1 Electrical Specifications
3.2 Basic Features and Capabilities
3.3 Test Support and Management Requirements
4 Conformance Testing
4.1 Levels of Conformance Testing
5 Interoperability Testing
5.1 Static Interoperability Testing
5.2 Dynamic Interoperability Testing
5.2.1 Performance Metrics for Dynamic Interoperability Testing
5.2.2 Performance Test Conditions for Dynamic Interoperability Testing
5.2.3 End-to-End Performance Parameters for Dynamic Interoperability Testing
5.3 Levels of Interoperability Testing
6 Summary
1 Introduction

The goal of the ADSL Forum Testing & Interoperability Working Group is to assist in generating interoperability test standards that ensure interoperability among various implementations. The success of ADSL deployment depends on a fully standardized technology and complete cross-vendor interoperability. Current ADSL standards focus on system-level specifications and do not fully address individual product specifications, the basis for cross-product interoperability.

This document is designed to be a living document that changes to adapt to the current technology and the market. The objective is to create a skeleton of test requirements and configurations for all major phases of tests reflecting ADSL operation. It is envisioned that release of this document will foster additional discussion regarding when testing should occur and test methods that are both adequate and cost effective.

In the following sections, the various test requirements, as well as the rationale for these tests, are highlighted. Future work will address test configurations and suites for ADSL in controlled environments. Field testing may be a subset of the requirements in the laboratory.
1.1 Purpose and Scope

This document provides an overview of the different areas of testing. The areas of testing include conformance testing, static interoperability testing, and dynamic interoperability testing. This is a generic introduction document that establishes a skeleton for the various testing needs.

1.2 References

1. ETR-212, Methods for Testing and Specification (MTS), Implementation Conformance Statement (ICS) Proforma Style Guide. ETSI, December 1995.

2. ADSL Forum TR-002, ATM over ADSL Recommendations.

3. ADSL Forum TR-003, Framing and Encapsulation Standards for ADSL: Packet Mode.

4. ATM Forum Test Specifications, af-test-0022.000, December 1994.
1.3 Terminology

The following definitions are used in this document:

1.3.1 Basic Terms

Abstract Test Case: A complete and independent specification of the action required to achieve a specific test purpose (or a specific combination of test purposes), defined at the level of abstraction of a particular Abstract Test Method, starting at a stable testing state and ending in a stable testing state.
Abstract Test Method: The description of how a Unit Under Test (UUT) is to be tested, given an appropriate level of abstraction to make the description independent of any particular realization of a Means of Testing, but with enough detail to enable tests to be specified for this test method.

Abstract Test Suite: A complete set of abstract test cases, possibly combined into nested test groups, that is necessary to perform conformance testing or IOP testing.

ADSL System: An ATU-C and an ATU-R.

Conformance Testing: Testing the extent to which a unit under test (UUT) conforms to a specification.

End-to-End Test: Tests the UUT in the context of a network and its architecture.

Executable Test Case: A realization of an abstract test case.

Executable Test Suite: A complete set of executable test cases, possibly combined into nested test groups, that is necessary to perform conformance testing or IOP testing.

Implementation Conformance Statement (ICS): A statement made by the supplier of the system implementation or system claimed to conform to a given specification, stating the capabilities and options implemented, and those that have been omitted. For the purposes of this document, the three types of ICS are ADSL ICS, ADSL Electrical ICS, and ADSL Protocol ICS.

Interoperability (IOP) Testing: Testing the degree of compatibility between an ATU-C and an ATU-R based on the features that both have implemented.

Means of Testing: The combination of equipment and procedures that can perform the derivation, selection, parameterization and execution of test cases, in conformance with a reference standardized Abstract Test Suite, and can produce a conformance log.

Test Case: Either an abstract or an executable test case.

Test Group: A named set of related test cases.

Test Suite: Either an abstract or an executable test suite.

Unit Under Test (UUT): The part of the system that is to be tested.
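
To make the relationships among these defined terms concrete, the following sketch models test cases, test groups, and test suites in Python. It is illustrative only: the class and field names are assumptions of this sketch, not definitions drawn from any ADSL standard.

```python
# Hypothetical sketch of the relationships among the terms defined above.
# Class and field names are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import Callable, List, Union

@dataclass
class AbstractTestCase:
    # A specification of the action required to achieve a test purpose,
    # independent of any particular realization of a Means of Testing.
    identifier: str
    purpose: str

@dataclass
class ExecutableTestCase:
    # A realization of an abstract test case: the abstract specification
    # plus a concrete procedure that can be run against a UUT.
    abstract: AbstractTestCase
    run: Callable[[], str]  # returns a verdict string (simplified)

@dataclass
class TestGroup:
    # A named set of related test cases; groups may nest.
    name: str
    members: List[Union["TestGroup", AbstractTestCase, ExecutableTestCase]] = field(default_factory=list)

@dataclass
class TestSuite:
    # The complete, possibly nested, set of test cases necessary to
    # perform conformance testing or IOP testing.
    name: str
    groups: List[TestGroup] = field(default_factory=list)
```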
1.3.2 Terminology for Conformance Testing

Conforming Implementation: An implementation that satisfies both the static and dynamic conformance requirements in accordance with the specifications stated in the ADSL ICS.

Static Conformance Requirement: A requirement that specifies the values with associated limits of the implemented capabilities permitted in a system that is claimed against the specifications.

Dynamic Conformance Requirement: A requirement that specifies the observable behavior permitted by the specification.
1.3.3 Terminology for Static Interoperability Testing

Static Interoperability: An ATU-C and an ATU-R are statically interoperable if they implement a common and compatible set of features, functions and options and can demonstrate satisfactory mutual communication in a benign environment (i.e., over a NULL loop with no noise intrusions). "Compatible" means that there are no conflicting requirements that will prevent the ADSL system from achieving interoperability. Static interoperability testing is often referred to as interoperability testing in other standards [4].
1.3.4 Terminology for Dynamic Interoperability Testing

Dynamic Interoperability: An ATU-C and an ATU-R are dynamically interoperable if they implement a common and compatible set of features, functions and options and can demonstrate satisfactory mutual communication in a real network architecture environment as performance test conditions are varied and exercised. "Compatible" means that there are no conflicting requirements that will prevent the ADSL system from achieving interoperability. Dynamic interoperability testing is often referred to as performance testing in other standards [4].

Performance: The behavior of a system with respect to time and resources.

Performance Parameters: The performance aspects of a system that can be measured.

Performance Metrics: A set of parameters that characterize the performance of a system.
1.4 List of Acronyms

ADSL     Asymmetric Digital Subscriber Line
ANSI     American National Standards Institute
ATM      Asynchronous Transfer Mode
ATU-C    ADSL Terminal Unit at the Central Office
ATU-R    ADSL Terminal Unit at the Remote site
BER      Bit Error Ratio
BR-ISDN  Basic Rate ISDN
HDSL     High Bit-rate Digital Subscriber Line
ICS      Implementation Conformance Statement
IOP      InterOPerability
ISDN     Integrated Services Digital Network
LB       Longitudinal Balance
LOV      Longitudinal Output Voltage
POTS     Plain Old Telephone Service
PSD      Power Spectral Density
PSTN     Public Switched Telephone Network
RL       Return Loss
TC       Transmission Convergence
UUT      Unit Under Test
2 Overview of Testing

The end goal for the proposed testing is to create confidence that successfully-tested products will provide reasonable performance when connected to similar successfully-tested equipment and when deployed "in the field" on reasonable copper loops.

In order to achieve this goal, it must be determined if the products meet the specifications and if they can interoperate without observable problems. Furthermore, the products should be able to perform under various load conditions. Three types of testing (static interoperability, dynamic interoperability, and conformance testing) are used to provide the users with some level of confidence that the products meet these requirements. Each of these types of tests can be quite extensive. They are also independent, and one is not necessarily a prerequisite to the other two. In addition, success or failure of one type of testing is neither a prerequisite for, nor indicative of the expected results of, the others. The combination of all three types of testing will provide the highest degree of confidence that tested equipment will interoperate at acceptable levels of performance when deployed on real loops.

- Conformance testing attempts to evaluate an implementation against a specification.

- Static interoperability testing attempts to evaluate two interconnected implementations, regardless of how well they conform to a specification or how well they perform in a network.

- Dynamic interoperability testing attempts to evaluate an implementation in a real network environment under varying conditions to see how it performs.

The common requirement for all three areas of testing is the ADSL Implementation Conformance Statement (ICS). The ADSL ICS is used to determine which tests are necessary; what modifications, if any, are needed; and, in some cases, which tests can be omitted.

Companion documents will define requirements for ADSL Implementation Conformance Statements (ADSL ICS). The information in the ADSL ICS can be used for selecting test cases. The ADSL ICS is constructed using the guidelines proposed in ETR-212 [1].
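
As a rough illustration of how a completed ICS can drive test-case selection, the following Python sketch filters a test suite down to the cases whose required capabilities the supplier has declared. The ICS representation and the capability names are invented for illustration; real ICS proformas (see ETR-212 [1]) are considerably richer.

```python
# Hypothetical sketch: selecting test cases from a completed ICS. The ICS is
# reduced here to a set of capability identifiers the supplier declares as
# implemented; each test case lists the capabilities it requires.

def select_test_cases(ics_capabilities, test_cases):
    """Split test cases into (selected, omitted) given the declared ICS."""
    selected, omitted = [], []
    for case in test_cases:
        if case["requires"] <= ics_capabilities:  # all required capabilities declared
            selected.append(case)
        else:
            omitted.append(case)  # capability not declared, so the test is omitted
    return selected, omitted

# Example with invented capability and test-case names:
ics = {"POTS_SPLITTER", "TRELLIS_CODING"}
cases = [
    {"id": "ELEC-01", "requires": {"POTS_SPLITTER"}},
    {"id": "PHY-07", "requires": {"ECHO_CANCELLATION"}},
]
run_these, skip_these = select_test_cases(ics, cases)  # ELEC-01 runs, PHY-07 is skipped
```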
3 Test Conditions

3.1 Electrical Specifications

To evaluate any implementation, it is necessary to have a completed ADSL Electrical ICS. The electrical specifications and permissible limits must be provided. This includes requirements such as POTS splitter characteristics, line termination, ADSL electrical characteristics, etc.

The ADSL Electrical ICS is constructed using the guidelines proposed in ETR-212 [1].
3.2 Basic Features and Capabilities

The ICS may contain the following:

1. Information needed by the testing laboratory in order to run an appropriate test suite on the specific system;

2. Information to help determine which supported capabilities are testable and which are not testable.

In order to test an implementation, the test laboratory requires information relating to the UUT and its testing environment in addition to that provided by the ICS.
3.3 Test Support and Management Requirements

The test support and management requirements are the test hooks and UUT control mechanisms for placing the product in the appropriate test condition needed to facilitate the test. This includes a common and simple physical access (e.g., a serial interface) to the UUT, access to the relevant UUT performance monitoring registers needed to evaluate performance, control of the UUT to enable it to produce or inhibit stimuli for measuring electrical conformance, etc. Test access is required for the majority of test phases. For example, during conformance testing of a stand-alone ATU-R, one needs to provision the UUT to transmit test stimuli during a power spectral density (PSD) mask test or a longitudinal output voltage (LOV) test, and also to inhibit transmission during a return loss (RL) and longitudinal balance (LB) test.
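
The following Python sketch illustrates what such test hooks might look like for a UUT that accepts line-oriented commands over a serial interface. The command vocabulary and the use of the pyserial package are assumptions of this sketch; actual products expose vendor-specific control interfaces.

```python
# Hypothetical sketch of UUT test-support hooks over a serial interface.
# The command strings (TX ON/OFF, GET PM) are invented for illustration;
# real products expose vendor-specific control interfaces.
import serial  # the pyserial package

class UUTControl:
    def __init__(self, port: str = "/dev/ttyS0", baudrate: int = 9600):
        self.link = serial.Serial(port, baudrate, timeout=2)

    def _command(self, text: str) -> str:
        self.link.write((text + "\r\n").encode("ascii"))
        return self.link.readline().decode("ascii").strip()

    def enable_transmit(self) -> str:
        # Provision the UUT to produce stimuli, e.g. for a PSD mask or LOV test.
        return self._command("TX ON")

    def inhibit_transmit(self) -> str:
        # Silence the UUT, e.g. for a return loss or longitudinal balance test.
        return self._command("TX OFF")

    def read_performance_registers(self) -> str:
        # Read the performance-monitoring registers needed to evaluate results.
        return self._command("GET PM")
```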
4 Conformance Testing

Conformance testing is the verification of the capabilities and options implemented by UUTs in a system architecture configuration when compared to specifications defined by standards development organizations (as declared in the ICS completed with the requirements of these standards). A conformance test suite can target a specific protocol layer or protocol. A product can meet conformance at one protocol layer, but not at another. Conformance testing can also be used to localize the nature of a problem after failure of an interoperability test. Conformance testing is accomplished with conformance test equipment connected to the UUTs. Based on the filled-out ICS, conformance test suites are executed, and the results are analyzed.
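
A conformance run of this kind might be recorded as in the following Python sketch, which executes a list of test procedures and accumulates a conformance log. The verdict names follow common conformance-testing practice; the log structure is an assumption of this sketch rather than anything specified by this document.

```python
# Hypothetical sketch: executing a conformance suite and producing a
# conformance log. Verdict names follow common conformance-testing practice.
from datetime import datetime, timezone

VERDICTS = ("PASS", "FAIL", "INCONCLUSIVE")

def run_conformance_suite(executable_cases):
    """Run (case_id, procedure) pairs; each procedure returns a verdict."""
    log = []
    for case_id, procedure in executable_cases:
        verdict = procedure()
        if verdict not in VERDICTS:
            verdict = "INCONCLUSIVE"  # guard against malformed results
        log.append({
            "case": case_id,
            "verdict": verdict,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
    return log
```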
4.1 Levels of Conformance Testing

There are three general levels of conformance testing:

- Electrical conformance

- Physical conformance

  Electrical and physical conformance are closely related. Examples of areas evaluated during electrical and physical conformance testing include DC characteristics, voiceband characteristics, POTS splitter characteristics, and ADSL band characteristics.

- Higher layers of conformance

  Protocols supported between the ATU-R and ATU-C above the physical layer may need to be tested to validate conformance with architectural specifications such as ADSL Forum TR-002 [2] and TR-003 [3]. Areas for which test suites may need to be specified include:

  - PPP over ATM on ADSL

  - Frame-based architectures (e.g., FUNI on ADSL or HDLC on ADSL)

  Other areas may need test suites defined as the ADSL Forum specifies additional architectural issues for ADSL networks.

  It may not be practical to test conformance at higher layers. Verifying higher layers is more easily accomplished as part of interoperability testing.
5 Interoperability Testing

5.1 Static Interoperability Testing

Static interoperability testing verifies the ability of a pair of modems to operate in a benign environment (i.e., NULL loop and no noise intrusions).

The purpose of static interoperability testing is to confirm the degree to which two units of equipment can communicate with each other, based on the set of implemented features derived from the ADSL ICS. Static interoperability enables end-users to interconnect equipment from different manufacturers with a certain confidence level that these pieces of equipment can satisfactorily communicate with each other in a stable laboratory environment.

An interoperability test is performed in an ADSL system with an ATU-C and an ATU-R that implement the same mandatory features and functions, yet may differ with regard to their optional implementations. In some cases, their ability to interoperate depends on these optional features.

Static interoperability testing verifies the system behavior of two connected systems and can be limited to specific protocols within the stack. It involves testing both the capabilities and the behavior of an implementation in an interconnected environment and verifying whether or not an implementation can communicate satisfactorily with another implementation of the same or of a different type.
Static interoperability testing does not include assessment of the performance, robustness, or reliability of an implementation. Such evaluation is the result of dynamic interoperability testing, discussed in Section 5.2.

Static interoperability testing does not include the assessment of the conformance of an implementation relative to the standards. This assessment is the result of conformance testing, discussed in Section 4. Note that two implementations can be non-standard but still interoperate. Static interoperability testing does not necessarily check each mandatory feature defined in a standard against the implementation of the UUTs.
5.2 Dynamic Interoperability Testing

Dynamic interoperability testing verifies the ability of a pair of modems to interoperate in an environment related to a real network architecture. It is intended to measure the joint operability of the pair of UUTs over an appropriate range of operating parameters and is an evaluation of the robustness, reliability, and adherence to defined performance metrics of UUTs in a system architecture configuration. Dynamic interoperability testing entails a measurement procedure during which the ability of the UUTs to communicate with each other is evaluated as the performance test conditions are varied and exercised.

The extent of dynamic interoperability testing is variable and is defined for each dynamic interoperability test suite.

The traditional parameter of physical layer performance has been the Bit Error Ratio (BER), the ratio of errored bits to the total number of bits transmitted. ADSL has many provisions that correct limited bursts of errors and adapt the overall bandwidth of the line to maintain a specific BER with a set noise margin and latency. The classic BER alone may therefore not be sufficient as a sole performance metric, and additional elements, such as the transmission rate achieved on a test link, the noise margin attained at a given rate, or the number of errored seconds, may be needed.
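
For reference, the classic metric is a simple ratio, as the following Python sketch shows; the supplementary figures attached alongside it use invented example values.

```python
# BER as defined above: the ratio of errored bits to total bits transmitted.
def bit_error_ratio(errored_bits: int, total_bits: int) -> float:
    if total_bits <= 0:
        raise ValueError("total_bits must be positive")
    return errored_bits / total_bits

ber = bit_error_ratio(42, 10**9)  # 42 errored bits in 1e9 bits -> 4.2e-08

# BER alone may not suffice; supplementary elements might be reported
# alongside it (the keys and values below are invented examples):
supplementary = {
    "downstream_rate_kbit_s": 6144,  # transmission rate achieved on the test link
    "noise_margin_db": 6.0,          # noise margin attained at that rate
    "errored_seconds": 3,            # errored seconds over the observation window
}
```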
5.2.1 Performance Metrics for Dynamic Interoperability Testing

The following are examples of performance metrics used during dynamic interoperability testing of ADSL equipment (a possible record structure for collecting them is sketched after this list):

- Upstream and downstream transmission rates

- Noise margin

- ADSL line status

- Transmitted FEC blocks

- Corrected FEC blocks

- Transmitted superframes

- Uncorrectable superframes

- Counters for current and previous loss of signal, loss of frame, loss of power, and errored seconds

- Bits per carrier

- Interleave delay

- Rate adaptation (e.g., dynamic rate repartitioning, dynamic rate adaptation, and fast retrain)
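
The following Python sketch collects one sample of the metrics above for a pair of UUTs during a run. The field names, types, and units are illustrative assumptions, not a format defined by this document.

```python
# Hypothetical sketch: one sample of the metrics listed above, taken from
# a pair of UUTs during a dynamic interoperability run.
from dataclasses import dataclass
from typing import List

@dataclass
class DynamicIopSample:
    upstream_rate_kbit_s: int        # upstream transmission rate
    downstream_rate_kbit_s: int      # downstream transmission rate
    noise_margin_db: float           # noise margin
    line_status: str                 # ADSL line status, e.g. "SHOWTIME" (assumed label)
    fec_blocks_transmitted: int      # transmitted FEC blocks
    fec_blocks_corrected: int        # corrected FEC blocks
    superframes_transmitted: int     # transmitted superframes
    superframes_uncorrectable: int   # uncorrectable superframes
    loss_of_signal_count: int        # counters for loss of signal,
    loss_of_frame_count: int         # loss of frame,
    loss_of_power_count: int         # and loss of power
    errored_seconds: int             # errored seconds
    bits_per_carrier: List[int]      # bits loaded on each DMT sub-carrier
    interleave_delay_ms: float       # interleave delay
```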
5.2.2 Performance Test Conditions for Dynamic Interoperability Testing

Following are examples of performance test conditions used during dynamic interoperability testing of ADSL equipment (a sketch that expands such condition axes into a test matrix follows the list):

- Loop characteristics (length, wire gauge, bridged taps, attenuation, etc.)

- Customer premises wiring

- PSTN conditions (ringing, ring-trip, battery feed, signaling, etc.)

- Co-channel noise interference (BR-ISDN, HDSL, ADSL, etc.)

- Other interference (impulse noise, RFI, etc.)
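
The following Python sketch illustrates how condition axes like those above might be expanded into a matrix of performance test conditions to vary and exercise. The axis names and values are invented examples.

```python
# Hypothetical sketch: expanding performance test condition axes like those
# above into a test matrix. Axis names and values are invented examples.
from itertools import product

loop_lengths_ft  = [0, 9000, 12000]             # 0 represents a NULL loop
wire_gauges_awg  = [24, 26]                     # wire gauge
noise_intrusions = ["none", "HDSL", "BR-ISDN"]  # co-channel noise interference

test_matrix = [
    {"loop_ft": loop, "gauge_awg": gauge, "noise": noise}
    for loop, gauge, noise in product(loop_lengths_ft, wire_gauges_awg, noise_intrusions)
]
# 3 x 2 x 3 = 18 combinations over which the UUT pair is varied and exercised.
```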
5.2.3 End-to-End Performance Parameters for Dynamic Interoperability Testing

To network providers, performance through the network to the user is critical. Following are examples of parameters that may be measured:

- Service quality

- Latency

- Bandwidth and data rate availability
5.3 Levels of Interoperability Testing

There are several levels of interoperability that affect both the vendor and the service provider. On a case-by-case basis, interoperability at a particular layer can be evaluated in either a static or a dynamic interoperability test scenario. Some examples are:

- Physical layer

- Interoperability at the TC layer

- ATM over ADSL layer

- STM over ADSL

- Higher layers of interoperability

  Protocols supported between the ATU-R and ATU-C above the physical layer may need to be tested to validate interoperability with architectural specifications such as ADSL Forum TR-002 [2] and TR-003 [3]. Areas for which tests may need to be specified include:

  - PPP over ATM on ADSL

  - Frame-based architectures (e.g., FUNI on ADSL or HDLC on ADSL)

  Other areas may need test suites defined as the ADSL Forum specifies other architectural issues for ADSL networks.
6 Summary

This document has summarized each of the different testing areas. As stated earlier, each area serves its own purposes, and the use of all three will give the end-user the highest level of confidence that the equipment meets the appropriate specification and will interoperate.
