When Fig. 1 is examined, it is seen that the specification is written by simply replacing the blocks with their keywords and their input-output signals. These signals are "declared" at the top of the specification. Assertions have also been introduced at various points that use the '/$' facility; for example, the variable 'a4' is declared to be less than 488.3.

Conclusions

In this paper, the problem of automatically generating simplified models or benchmarks of DFCS modules has been addressed. A specification language has been proposed which makes it easy for the system designer to describe flight software. This language specifies the DFCS in a form that is machine translatable. It also embodies constructs that enable the designer to express facts about system performance that can later be employed to set up executable assertions for software verification. The language can be readily updated to express new features of the DFCS by the addition of keywords. The salient features that the language translators should have are described, and the language is applied to a typical DFCS module.

References

1) Ludewig, Jochen, "ESPRESO: A System for Process Control Software Specification," IEEE Trans. on Software Engineering, Vol. SE-9, July 1983, pp. 427-436.

2) Beichter, Friedrich W., Herzog, Otthein, and Petzsch, Heiko, "SLAN-4: A Software Specification and Design Language," IEEE Trans. on Software Engineering, Vol. SE-10, March 1984, pp. 155-161.

3) Rajan, N., de Feo, P. V., and Saito, J., "Stress Testing of Digital Flight Control System Software," IEEE/AIAA 5th Digital Avionics Systems Conference, Seattle, Wash., Nov. 1983.

4) Andrews, Dorothy M. and Benson, Jeoffrey P., "An Automated Program Testing Methodology and Its Implementation," Proceedings, 5th International Conference on Software Engineering, 1981, pp. 254-261. Also reprinted in "Tutorial: Software Testing and Validation Techniques," 2nd edition, 1981, pp. 349-356.
`
`
Fig. 1 Block diagram for the speed-comp module (signals include cas.ms, u.dot.comp, u.dot.avrg and o.cas.ms; iteration rate 10/sec).
`
`
Fig. 2 Cross connection of keywords (SUMMER and HISTORY blocks): a) without history keywords; b) with history keywords introduced.
`
LOW PASS FILTER REPRESENTATION

DIFFERENCE EQN.:

PRESENT OUTPUT = [(TIME-CONSTANT - SAMPLING TIME/2) / (TIME-CONSTANT + SAMPLING TIME/2)] * PAST OUTPUT
                 + [(SAMPLING TIME/2) / (TIME-CONSTANT + SAMPLING TIME/2)] * (PRESENT INPUT + OLD INPUT)

Fig. 3 Representation of a low-pass filter.
`
Fig. 4 Representation of a limiter (signals a7, a8).
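For readers who wish to experiment with the relation in Fig. 3, the short sketch below implements the same first-order low-pass filter update, together with a simple limiter of the kind represented in Fig. 4. It is an illustrative rewrite only; the variable names and the example time constant, sampling time and limit values are assumptions, not values taken from the DFCS module.

# Illustrative sketch of the Fig. 3 low-pass filter difference equation and a
# Fig. 4-style limiter. Names and numeric values are assumed for illustration.

def low_pass_step(present_input, old_input, past_output, time_constant, sample_time):
    """One update: y_n = a*y_{n-1} + b*(x_n + x_{n-1}),
    with a = (tau - T/2)/(tau + T/2) and b = (T/2)/(tau + T/2)."""
    denom = time_constant + sample_time / 2.0
    a = (time_constant - sample_time / 2.0) / denom
    b = (sample_time / 2.0) / denom
    return a * past_output + b * (present_input + old_input)

def limiter(value, lower, upper):
    """Clamp a signal between lower and upper limits."""
    return max(lower, min(upper, value))

# Example: filter a step input at an assumed 10/sec iteration rate (T = 0.1 s).
tau, T = 1.0, 0.1            # assumed time constant and sampling time
y, x_old = 0.0, 0.0
for _ in range(20):
    x = 1.0                  # unit step input
    y = low_pass_step(x, x_old, y, tau, T)
    x_old = x
y_limited = limiter(y, -0.5, 0.5)   # assumed limits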
`
`
`
`
`AUTOMATED SOFTWARE TEST SYSTEM
`FOR THE 737-300 FLIGHT MANAGEMENT COMPUTER
`
`84-2665
`
`Steven C. Runo*
`
`Senior Engineer
`The Boeing Commercial Airplane Company
`Seattle, Washington
`
`Abstract
`
`Traditional manual approaches to software validation
`could not produce the required testing rigor within
`the inflexible 737-300 Flight Management Computer
`System program schedule.
`In response, an integrated
`and highly automated validation test system was
`developed based on experience gained from previous
`validation efforts. The "user-based" system consists
of three basic elements: 1) a specialized simulation
`
`for generating expected results, 2) an automated test
`bench
`that
`interrogates
`the
`operational
`flight
`program "in situ", and 3) a program to document and
`compare test results to expected results. The entire
`procedure is automated from test case design through
`final analysis of the test results. The system has
`proved to be an efficient and rigorous validation of
`the flight software within a tight time schedule and
`limited budget. Further,
`the user is released from
`tedious laboratory testing and allowed to concentrate
`on test analysis.
`
Introduction
`
`In recent years, automated software testing methods
`have been suggested as a means to provide rigorous
`verification and validation of operational
`flight
`programs without
`the drain on time and manpower
`that manual techniques normally require. Suggested
`methods include: specialized simulations, theoretical
`
`"best" approaches to test design, result comparators
`and automated test benches. However,
`in many cases
`these automated methods reduce the effort required
`in a specific area only to increase the overall testing
`effort by expanding the number of test conditions
`measured or the required degree of analysis.
`
The time and budget constraints of the 737-300 Flight Management Computer System (FMCS) project would not permit manual testing methods or inefficient automated methods. For example, approximately one third of the validation program for
`
`
*Member AIAA
Copyright © American Institute of Aeronautics and Astronautics, Inc., 1984. All rights reserved.
`
`
`the 737-300 Flight Management Computer was testing
`of
`the performance functions.
`This was initially
`expected to require at
`least
10,000 separate test
`points and hundreds of laboratory test hours for each
`new software release. As a result, validation of the
`performance functions was considered "risky" and it
`was determined that the required testing could only
`be
`accomplished with an
`integrated,
`automated
`system for preparing expected results, conducting the
`laboratory testing and finally, analyzing the results.
`
`The system that was developed is largely the result
`of several years of experience in similar testing
`efforts. Appropriately therefore,
`this paper begins
`with a discussion of
`the previous Boeing Co.
`
`validation programs that significantly shaped the
`737-300
`Flight Management Computer
`System
validation effort. Next is a detailed discussion of the specific elements of the performance function validation testing, including: the plan of test, the simulation software for generating expected results, the automated laboratory test system and the methods used to compare and archive the test results.
`The paper concludes with a brief description of the
`
`experience-to-date utilizing this approach.
`
History
`
Performance Data Computer System (PDCS). The
`Performance Data Computer was originally developed
`in the mid-seventies in response to rapidly increasing
`fuel costs. The system was designed by Boeing and
`the hardware / software vendor, Lear Siegler, Inc. of
`Grand Rapids, Michigan, to optimize the performance
of 727 and 737 aircraft via a mixture of stored and computed speed schedules and throttle setting targets based on current flight conditions.
`
`Validation testing of the PDCS was one of the first
`efforts of its type and thus has played a significant
`role in shaping the development of
`later validation
`efforts. The earliest test cases were designed to
`test each of the functions of the PDCS at conditions
`
`that were
`
`likely to be
`
`encountered in normal
`
`
`
`
operations. All early testing of the PDCS was
`accomplished via manual entries through the Cockpit
`Display Unit
`(CDU) keypad or
`through test bench
`discrete
`switches
`and
`variable
`potentiometers.
`
`Results were manually recorded from the CDU
`responses of the PDCS. This test approach was highly
`time-consuming and thus, necessarily limited the
`scope and detail of practical
`testing to about 1000
`total test points.
`
`Late in the PDCS development program an automated
`test bench was developed. The automated test system
`would enter Cockpit Display Unit and aircraft system
`inputs, wait an appropriate length of time, read the
`CDU responses and compare the
`test
`results to
`pre-stored expected results. Because this system
`was somewhat inflexible and required a great deal of
`pre-test effort to convert the manual
`test cases to
`the automated format, it was primarily used to test
`only the uniformly-formatted propulsion test cases.
`However, the system did provide valuable experience
`for development of the 737-300 test system.
`
Performance Navigation Computer System (PNCS).
`The Performance Navigation Computer System was
`developed
`as
`one
`of
`the
`earliest
`"flight
`management"-type systems. The original
`intent was
`to integrate PDCS performance information with
`navigation and guidance
`capability in
`a
`single
`computer
`for 737-200 aircraft.
`Limited sales
`interest and technical problems forced cancellation
`of PNCS development prior to the anticipated system
certification.
`Again,
`the accumulated experience
`would prove to be valuable.
`
`This system introduced the complexity of navigation
`and guidance computations overlayed on the basic
`performance information. Using a "flight plan buffer
`dump" developed by
`Lear Siegler
`for
`the PNCS
`program, much of the performance information could
`be captured at each waypoint in the predicted flight
`plan. The data could then be analyzed to determine if
`aircraft performance was being computed correctly.
`However, if an error was found, it was often difficult
`to trace it to its origin because the flight plan buffer
`dump could not
`include all
`of
`the performance
`variables
`and their intermediate values. The PNCS
`
`performance plan of test did not actually increase the
`total number of test points acquired, partly because
`of confidence in
the previous Performance Data Computer aircraft and engine models and partly because the higher-order functions of the PNCS
`required significantly greater test and analysis time.
`The experience of the PNCS test program suggested an
`entirely new approach to validation testing of flight
`computer software was required.
`
757/767 Flight Management System (FMS). When the
`new generation airliner programs were launched in
the late seventies, it was recognized that testing of
`the
`757 and 767 would be the most rigorous ever
`attempted. This especially applied to the new "glass"
`cockpits
`and
`flight management
`systems.
In
`response, major projects were undertaken to develop
`automated testing techniques
`for validating the
`performance
`functions
`of
`the
`757/767
`Flight
`Management System. The first major project was the
`creation of a series of programs to generate flight
`management system expected results. These series
`of programs became the Boeing Standard Programs
`(BSP) and are detailed below. Other test tool projects
`included the development of an automated test bench
`and a test report comparator program.
`The latter is
`also detailed in the discussion below.
`
`The Performance Algorithm Test System (PATS) was
`designed to set test conditions and extract results
`from the operational flight program (OFP). The OFP
`source code is first prepared by adding input and
`output routines to translate between the unique FMS
`software structure and
`the PATS driver.
`This
`modified code is then loaded and executed in the
`
`Flight Management Computer hardware according to
`commands given in the PATS driver file. Test results
`are recorded in a file formatted identically to the
`expected results generated by the Boeing Standard
`Programs simulation software.
`The 757/767 work
`was a major influence on the development of
`the
`737-300 Flight Management Computer performance
`function test program and many of the features of the
`system described below were originally developed for
`the 757/767 FMS test program.
`
The 737-300 Test System
`
Development of the 737-300 Flight Management Computer performance function test program began with the design of an overall plan of test and the development of requirements for the individual elements of the test system. The three major
`The three major
elements required under the plan of test were: 1) the
`Boeing Standard Programs (BSP) expected results
`generator, 2) the Performance Validation Test System
`(PVTS) automated test bench and 3) the COMPEX test
`results comparator
`/
`report generator.
`The BSP
`simulation software would require major revisions to
`reflect unique 737-300 Flight Management Computer
`design requirements and the 737-300 airframe /
`engine combination. The Performance Validation Test
`System would have to be developed from the "ground
`up"
`to
`interact with the
`new flight computer
`hardware. However,
`the COMPEX report generator
`
`
`
`
`are checked against expected results derived from the
`Boeing Standard Programs. Tolerances at this level
`are
`a
`combination
`of
`the database
`and
`FMCS
`
`requirements-specified tolerances.
`
The mission level of performance function testing is
`designed to validate the complete integration of a
`flight plan or "mission". This "top-level" testing also
`includes
`examination
`of
`performance
`function
`interactions with
`the
`other Flight Management
`System components. However,
the scope of these tests is limited by
`separately defined "flight
`scenario"
`tests that are designed to check the
`complete
`range
`of
`Flight Management System
`component
`interactions.
`The intermediate results
`examined in the flight phase tests are also available
`on this level, but testing tolerances are now based
`only on tolerances specified in FMCS requirements.
`
`The Boeing Standard
`Programs (BSPs) are a set of mainframe computer
`programs
`originally
`developed
`to validate
`the
`performance management functions
`of the 757/767
`Flight Management Computer (FMC) system. The BSPs
`consist of nine separate, but inter-related programs
`that generate expected results in a format permitting
automated comparison with test results.
`Previous
`Boeing performance programs were generalized to
`support a wide variety of uses and not specifically
`tailored to the requirements of flight management
`software validation. The programs emulate the nine
functions shown in Table 1.
`
`
`
A524  PROPEX   Propulsion data base
A525  ATMUSX   Atmosphere and wind prediction
A527  ALTEX    Altitude limits calculation
A528  SPEEDX   Speed generation
A529  LEGEX    Leg integration calculation
A530  STEPEX   Step-cruise optimization
A531  BSPEX    Full flight path trajectory

Table 1 Boeing Standard Programs
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`The BSPs are designed to share common modules
`with each other. This is best illustrated in Figure 2.
`The solid-lined boxes represent a related set of
`routines;
`the solid-lined boxes
`inside the larger
`dash-lined
`box
`(e.g.
`speed
`generators,
`leg
`integrators, etc.) are termed "functional" modules
`and represent sets of routines used by more than one
`BSP. The solid-lined boxes outside the dash-lined
`
`
`developed as part of the 757/767 BSP project would
`only require user experience.
`
The Plan of Test. The performance plan of test was developed given two major considerations: 1) the experience gained in the PDCS, PNCS and
`757/767 FMS test programs, and 2)
`the timing of
`required software deliveries from Lear Siegler. The
`plan of test calls for a "bottom-up" approach starting
`with module-level validation of
`the aerodynamic,
`propulsion and atmospheric data bases. The "flight
`phase"
`testing level
`is designed to validate the
`integration of data base information.
`The
`final
`"mission"-level testing is directed at validating the
complete performance function system.
`Each
`subsequent
`level of
`testing is dependent on the
`successful validation of the lower levels of testing.
`This scheme not only simplifies pinpointing errors,
`but also blends well with 737-300 program schedules
`that called for the delivery of
`increasingly capable
`software in distinct packages. The plan of test is
`depicted graphically in Figure 1.
`
Figure 1 Performance Plan of Test (aerodynamic, propulsion and atmospheric data bases; flight envelope).
`
The data base level is designed to check the individual software modules by varying each of the input parameters over, and outside, the full range of possible values, with particular attention to regions where errors are likely to occur. Testing tolerances are limited to computer "round-off" errors since the polynomial nature of the data bases eliminates inexact interpolated or approximate extractions. This level also verifies the correct computation of the complete flight envelope defined by the combination of aerodynamic, propulsion and atmospheric data bases.
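As a rough illustration of this kind of data base level sweep, the sketch below generates test points that step one input over, and outside, its specified range. The parameter name, range and step size are assumptions for illustration only, not values from the FMCS test plan.

# Illustrative sketch: sweep an input parameter across and beyond its valid
# range to produce data-base-level test points. Names and ranges are assumed.

def sweep_points(lo, hi, step, margin=0.1):
    """Return test values covering [lo, hi] plus points outside the range."""
    span = hi - lo
    points = []
    x = lo - margin * span          # start below the valid range
    while x <= hi + margin * span:  # finish above the valid range
        points.append(round(x, 6))
        x += step
    return points

# Example: Mach number swept from slightly below 0.0 to slightly above 0.84.
mach_points = sweep_points(lo=0.0, hi=0.84, step=0.02)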
`
The flight phase level of testing examines the computation of each separate mode of flight (e.g., maximum gradient climb, long range cruise, economy descent). Intermediate results are compared at each performance integration step, rather than simply at each navigation-defined waypoint. Further, the values of intermediate parameters, such as thrust and drag,
`
`
`
`
results and even development of the performance data base equations.
`
Performance Validation Test System. The Performance Validation Test System (PVTS) is an
`automated test bench designed to accept
`Boeing
`Standard
`Program format
`inputs,
execute the corresponding software modules in the Flight Management Computer, and
`record the results in a
`format
`suitable for
automated comparison with
`
`BSP-derived expected results. The system is unique
`in several
`respects.
`First,
`the vendor—supplied
`operational
flight program (OFP)
`is
`interrogated
`without modification while it
`is resident
`in the
`
vendor-delivered hardware.
`
`This
`
`is
`
`significant
`
`because it ensures testing of the flight software in
`nearly actual operational form. Second, PVTS utilizes
`the same inputs to drive the execution of the OFP that
`were used to execute the BSP simulation software.
`
`This allows the test engineer to rapidly redesign a
`given test case to meet changing requirements and
`obtain both the test results and expected results
`
`without having to formulate a separate, and possibly
`different, test condition file.
`Finally, PVTS allows
`testing of a single function or a complete series of
`functions without user interface except at startup.
This feature relieves the tester from monotonous and error-prone manual testing, while producing repeatable results much faster than manually possible.
`
The PVTS hardware consists of a VAX 11/780 computer connected to a Lear Siegler, Inc.-supplied
`Computer Control Unit (CCU) via an eight-bit parallel
`bus.
`The Lear Siegler-developed interface system
`allows the VAX to set and examine variables internal
`to the Flight Management Computer and execute the
`operational flight program (OFP) from breakpoint to
`breakpoint. Lear Siegler "symbolic debug" software
`looks up the memory addresses of variables allowing
`the variables to be accessed by name. The system is
`shown in Figure 3.
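To make the interaction concrete, the sketch below shows roughly what one PVTS test step might look like when expressed in code. The set_variable, run_to_breakpoint and read_variable calls are hypothetical wrappers standing in for the Lear Siegler symbolic-debug interface described above; they are not the actual vendor API, and the variable and breakpoint names are invented for illustration.

# Hypothetical sketch of a single PVTS test step. The ccu methods stand in
# for the vendor-supplied symbolic-debug interface and are NOT the real API.

def run_test_step(ccu, conditions, breakpoint_label, outputs):
    """Set named FMC variables, run the OFP to a breakpoint, read results."""
    for name, value in conditions.items():
        ccu.set_variable(name, value)        # write by symbolic name
    ccu.run_to_breakpoint(breakpoint_label)  # execute OFP up to the breakpoint
    return {name: ccu.read_variable(name) for name in outputs}

# Example usage with assumed names:
# results = run_test_step(ccu,
#                         conditions={"ALTITUDE": 35000.0, "MACH": 0.74},
#                         breakpoint_label="END_OF_SPEED_CALC",
#                         outputs=["TARGET_CAS", "TARGET_MACH"])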
`
`
`
`
`
Figure 3 PVTS Overall System Diagram (FMCS test station; VAX 11/780 test driver, command processor and input/output format conversion; mainframe computing system with the Boeing Standard Programs, test cases, test conditions and data analysis).
`
`
box are termed "driver" modules and represent
`routines which are used by only one BSP (e.g. SPEEDX,
`LEGEX, etc.). Arrows point from calling routines to
`the routines called.
`
Figure 2 BSP System Architecture (functional and driver modules, e.g. SPEEDX A528 and FNLIMX A526).
`
Execution of the BSPs requires files containing the
`
`necessary aerodynamic and propulsion coefficients,
`and a performance
`input
`file
`that
`selects the
`particular
`test
`conditions
`and
`functions
`to be
`
`executed. As flight testing of the 737-300 updated
`the available performance data, this new information
`
`could be periodically added to the propulsion and
`aerodynamic data bases by editing the coefficient
`files.
`Updated expected results could then be
`generated without
`redesigning the test cases or
`recoding the BSP routines.
`
`For the 737-300 FMCS performance validation effort,
`
the BSPs required a new engine model, an enhanced
`aerodynamic model, polynomial speed generators and
`additional
`logic in the phase predictors, and leg
`integrators for
`the specific requirements of
`the
`737-300 FMCS. Since each version of
the BSPs is
`
`based on a single engine model, a new BSP version
`was created for
`the 737-300 FMCS
`program.
`However,
`the other changes were added to the
`existing BSPs as selectable options available to all
`versions of the BSPs. For example, all versions of
`
`the BSPs currently allow the user to select speed
`generation based on 757/767 table look-up methods,
`737-300
`polynomial
`equations
`or
`generalized
`computational methods.
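A minimal sketch of how such a user-selectable option might be structured is shown below. The method names, dispatch table and placeholder arithmetic are assumptions for illustration; they are not the actual BSP speed-generation routines.

# Illustrative sketch of selectable speed-generation methods. The three
# entries mirror the options described above; the math is placeholder only.

def speed_table_lookup(altitude):
    return 280.0                       # stand-in for 757/767 table look-up

def speed_polynomial(altitude):
    return 250.0 + 1.5e-3 * altitude   # stand-in for 737-300 polynomial fit

def speed_generalized(altitude):
    return 0.5 * (speed_table_lookup(altitude) + speed_polynomial(altitude))

SPEED_METHODS = {
    "757/767_TABLE": speed_table_lookup,
    "737-300_POLYNOMIAL": speed_polynomial,
    "GENERALIZED": speed_generalized,
}

def target_speed(method, altitude):
    """Dispatch to the user-selected speed-generation option."""
    return SPEED_METHODS[method](altitude)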
`
Despite these relatively major changes, the BSPs were ready for use in a short period of time because of the extensive 757/767 development work that had already been completed and validated. In addition, a generalized version of the BSPs was also available that permitted preliminary generation of expected
`
`
`
`
`percentage or an absolute value. For each test run a
complete set of comparison statistics is generated
`and test points failing the tolerance criteria are
`highlighted.
`In addition, COMPEX produces a file
`containing the test conditions, the respective results
`and the amount out-of-tolerance in the same format
`as the test results and expected results. This file
`format is unique because it allows nearly immediate
`graphical analysis of the data at Interactive Graphics
`and Data Analysis sites located throughout
`the
`company.
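A sketch of the kind of point-by-point tolerance check described above is given below. Whether a percentage or an absolute tolerance applies, the tolerance value and the record layout are assumptions for illustration, not the actual COMPEX format.

# Illustrative sketch of a COMPEX-style point-by-point tolerance check.
# Tolerance values and the record layout are assumed, not the actual format.

def out_of_tolerance(test_value, expected_value, tol, tol_is_percent):
    """Return the signed out-of-tolerance amount, or 0.0 if within tolerance."""
    allowed = abs(expected_value) * tol / 100.0 if tol_is_percent else tol
    error = test_value - expected_value
    return 0.0 if abs(error) <= allowed else error

def compare_run(test_points, expected_points, tol=0.5, tol_is_percent=True):
    """Flag failing points; each point is a (condition, value) pair."""
    failures = []
    for (cond, test_val), (_, exp_val) in zip(test_points, expected_points):
        excess = out_of_tolerance(test_val, exp_val, tol, tol_is_percent)
        if excess != 0.0:
            failures.append((cond, test_val, exp_val, excess))
    return failures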
`
`The test case inputs, BSP execution output, expected
results, test results and comparisons are archived in
`summary hardcopies, and in detail on microfiche and
`magnetic tape.
`All previous test points can be
`accessed and reviewed to analyze software changes
`or to re-create previous test conditions. The basic
`flow of data through the performance validation
system is shown in Figure 4. With the exception of
`operator commands to initiate the individual steps,
`the procedure is fully automated from entry of
`the
`BSP-format
`inputs into the mainframe computing
`system until
`the test
`report
`is ready for
`final
`analysis.
`
Figure 4 Test Procedure Flow (the BSP input file drives BSP execution, producing execution outputs and expected results; the PVTS bench test produces test results; COMPEX execution compares the two).
`
`Validation testing of the 737-300 Flight Management
`Computer has been at
`least
`a
`six-day-a-week,
`sixteen-hour-a-day effort
`since early in
1984.
`
`
`Actual operation of the Performance Validation Test
`System is in two steps. The first stage reads the
`Boeing Standard Program-formatted input files and
`produces a command file that describes the test
`conditions in a language familiar to the second stage.
`A PVTS data base maintains information on the name
`and numerical conversions required to translate the
`inputs. This first step is typically accomplished off
`the test bench prior to the actual test session.
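The sketch below illustrates the flavor of that first conversion stage: a small translation table supplies the bench variable name and scale factor for each BSP-format input, and a command line is emitted for each one. The table contents and command syntax are assumptions for illustration, not the actual PVTS data base.

# Illustrative sketch of PVTS stage one: translate BSP-format inputs into
# bench commands. Variable names, scale factors and syntax are assumed.

TRANSLATION = {
    # BSP name -> (FMC symbolic name, scale factor to internal units)
    "ALT_FT":   ("FMC_ALTITUDE", 1.0),
    "CAS_KT":   ("FMC_CAS", 1.6878),    # knots -> ft/sec, example only
    "GROSS_WT": ("FMC_WEIGHT", 1.0),
}

def make_command_file(bsp_inputs):
    """Produce one SET command per translated input."""
    lines = []
    for bsp_name, value in bsp_inputs.items():
        fmc_name, scale = TRANSLATION[bsp_name]
        lines.append(f"SET {fmc_name} {value * scale:.4f}")
    return "\n".join(lines)

# Example: make_command_file({"ALT_FT": 35000.0, "CAS_KT": 280.0})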
`
`The second PVTS stage is executed on one of two
`737-300 FMCS test benches using the previously-
`generated command file.
Computer Control Unit (CCU) commands set the appropriate program counters, breakpoints and test conditions in the
`Flight Management Computer for each test procedure.
`The OFP is then executed and the test
`results
`recorded. The memory locations of FMC variables and
`modules are determined from the Lear Siegler symbol
`table routines mentioned previously.
`
`The development of the Performance Validation Test
`System (PVTS) was in three phases to support
`the
`three phases of
`the performance function plan of
`test. Phase I development was designed to execute
`individual data base modules
`and therefore
`is
`specialized for each test procedure that
`tests a
`separate
`operational
`flight
`program
`(OFP)
`computation.
`Phase II of
`the PVTS development
concentrated on the flight phase test cases
`where a single module calls the execution of several
`other modules. This required additional PVTS logic
`to
`capture
`and
`interpret
`the
`integration—
`by-integration results of the OFP. The final phase of
`PVTS work required a separate Cockpit Display Unit
`(CDU) keypush driver. The keypush driver enters the
`required flight plan information via the CDU and the
`PVTS captures the mission information as it is built
`in the OFP.
`
Test Results Comparison and Reporting. After the
`test results are obtained from the Performance
`Validation Test System,
`they are transferred via
`tape back to the mainframe computing system where
`the expected results are stored after being generated
`by the Boeing Standard Programs. The test results
`are compared to the BSP expected results by COMPEX.
`COMPEX is a point—by-point comparator and report
`writer program developed as part of
`the 757/767
BSP project. On a database level, the comparisons
`are one-to-one and required to be exact (allowing
`only for computer round-off errors). For the flight
`phase and mission test cases, COMPEX interpolates
`the more dense expected result file to compare the
`test
`and
`expected
`results
`at
`the
`same
`test
`conditions. Tolerances can be defined as either a
`
`
`
`
Acknowledgments
`
`While it is not possible to thank everyone involved in
`this project, the efforts of B. Blair in developing the
`Boeing Standard Programs, and the efforts of K. Singh
and J. Weston in Performance Validation Test System
`development must be recognized.
`The author is
`grateful
`to them and the scores of others that
`contributed to the success of
`this program.
`The
`author would also like to thank the FMC vendor, Lear
`Siegler, Inc., for their outstanding support.
`
`Nearly one hundred performance function tests, each
`with about ten separate cases and over one hundred
`test points, have been designed,
`executed,
`and
`analyzed in this period. The sheer volume of testing
`accomplished to date
`clearly demonstrates
`the
`capabilities of the system. But this level of detail
`and test rigor has not come at the expense of scarce
`engineering or laboratory test bench time. While
`nearly a
`third of
`the planned 737-300 Flight
`Management Computer testing is in the performance
`functions,
`less than a tenth of
`the scheduled test
`
bench time has been assigned to performance
`function testing. Using the Performance Validation
`Test System,
`the
`complete Flight Management
`Computer performance data base
`(including all
`aerodynamics, propulsion, and atmospheric data) can
`be interrogated in less than eight hours.
`No user
`interaction is required to execute the over fifty
`complete and separate tests designed to rigorously
`exercise the operational flight program performance
`data base software. Often this testing is completely
`unattended and is accomplished between the hours of
`midnight and 8:00 AM, while the test bench would
`normally be unused.
`
`Test analysis has kept pace with the rapid rate and
`high volume of test results. Test reports are nearly
`all complete the week after a series of twenty or
`more tests are run. Further, the test documentation
`is far more detailed and complete than has been
`possible on previous programs. This is expected to
`have a
`large payoff
`in the future in update or
`modification validation efforts.
`
Conclusions
`
`The above described procedures and testing tools
`have been highly successful
`in the 737-300 Flight
`Management Computer validation program. By taking
`a "user-based" and integrated approach to the test
`system design, validation
`of
`the
`performance
`functions has been transformed from the "most
`
`risky" area of the 737-300 FMCS testing to one of
`the
`"least
`risky."
`Further,
`testing rigor
`and
`documentation have been enhanced while the tester
`
`is freed from the laboratory to concentrate on test
`analysis. More traditional approaches to software
validation could not have succeeded under the time
`
and budget constraints of the 737-300 program.
`Even without those constraints,
`it
`is unlikely that
`manual testing methods could have provided even a
`small fraction of the results already produced. The
`above described approach has proved to be a very
`efficient and effective technique for the validation
`testing of the 737-300 Flight Management Computer.
`
`
`
`
`A METHOD FOR TESTING A
`DIGITAL FLIGHT CONTROL SYSTEM WITHOUT
`THE USE OF GROUND SUPPORT EQUIPMENT
`
`84-2664
`
`H. E. HANSEN, LEAD ENGINEER, ELECTRONICS
`MCDONNELL DOUGLAS CORP.
`ST. LOUIS, MO.
`
Figure 1 Block Diagram Man-In-Loop Simulator (cockpit stick, rudder and power inputs; CYBER equations of motion; surface commands).
`
`limited research project would have been too costly
`and time consuming. Nevertheless, it was extremely
`important
`to verify proper integration of
`the DFCS
`and the input signals.
`
`During lab testing, a Digital Development Set
`(DDS) was used to interrogate memory to check the
`input signals.
`The DDS consisted of a computer
`(PDP 11/34), a Keyboard to input commands, software
`(operating system)
`to process those commands, an
`output device (CRT)
`to observe the contents of
`memory and a link (data bus) between the computer
`and the DFCS.
The F-15 test bed had a computer, the Central Computer
`(CC). Functionally,
`the DDS
`could have been used for DFCS checkout at the
`aircraft. However,
`the DDS consisted basically of
`commercial equipment designed for a laboratory
`environment.
`To ruggedize this equipment for
`flight line use would have been too costly and time
`consuming. Consequently it was necessary to use a
`different approach.
`
It was observed that the standard F-15
`equipment could be adapted to serve as the controls
`and displays to a self-contained DDS;
`the
`Navigation Control Indicator (NCI) panel could
`serve as a keyboard and the Vertical Situation
`Display (VSD) could serve as an output device,
`while the MIL-STD-1553A multiplex bus was
`the data
`link between the CC and the DFCS.
`The only missing
`element was
`the software which could use this
`equipment
`to verify the integration of
`the DFCS
`with the aircraft. This paper describes how,
`through use of standard F-15 equipment and the
`addition of
`the proper software,
`the F-15 test
aircraft was transformed into its own self-tester.
`The software that uses the resident equipment
`to
`convert
`the airplane into a self-tester is referred
`to as "Maintenance BIT" and "Preflight BIT". That
`software will be described in detail.
`
`
`ABSTRACT
`
`On 26 May 1983, a McDonnell Douglas F-15 Eagle
`fighter became the first to fly with a Digital
`Flight Control System (DFCS) programmed in a higher
`order language.
`The topic of this paper is the
`testing of
`the F-15 DFCS prior to that historic
`flight. Laboratory testing procedures and test
`equipment are described. Testing at the airplane
`is the major emphasis, describing the equipment
`resident on the airplane and the software, dubbed
`Maintenance BIT,
`that was used to transform the
`airplane into its own self-tester. Also covered
`are actual examples of how Maintenance BIT was used
`to solve problems at the airplane that might have
`been difficult or impossible with special
`test
`equipment. Other accomplishments resulting from
`this approach are also enumerated.
`
`Introduction
`
Recognizing a number of technological advancements were occurring which could importantly
`affect the direction of digital flight control
`system (DFCS) design in the near future,
`the
`McDonnell Aircraft Company initiated an Independent
`Research and Development
`(IRAD) program in the
`summer of 1981 designed to assess advancements in
`the following technology areas: microprocessors,
`higher order languages (HOL's), floating point
arithmetic, and parallel processing.[1]
`
`The particular hardware configuration that
`served as the supporting system for technology
evaluation is built around the present F-15 Eagle
`Dual Control Augmentation System (CAS). This was
`done for two reasons:
`the electronic portion of
`the
`CAS could be modified at minimum cost/time to
`provide all of
`the required digital features; and,
`the system could be flight tested with minimum
`aircraft modification.
`
The software testing of the DFCS included
`static gain checks,
`frequency responses,
`integration in the F-15 hydraulics lab and
`evaluation by experienced F-15 test pilots in a
`simulator. This latter test, depicted in Figure 1,
`allows the pilot - should he suspect an anomaly in
`the DFCS -
`to immediately make a comparison with
`the production flight control system. These tests
`demonstrated that an F-15 with the DFCS would
`perform exactly like production F-15's that have
`analog Flight Control Computers, if the DFCS
`performed in the aircraft as it had during testing.
`This could happen only if the more than 100
`discrete and analog signals were brought into the
`DFCS in the memory locations expected by the
`Software.
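As a rough illustration of what such a check involves, the sketch below compares the values read from a set of DFCS memory locations against the signals expected there. The signal names, addresses, tolerances and read function are hypothetical placeholders, not the actual Maintenance BIT design described later in this paper.

# Hypothetical sketch of verifying that input signals reach the DFCS memory
# locations the software expects. Names, addresses and tolerances are assumed.

EXPECTED_SIGNALS = {
    # symbolic name: (address, expected value, tolerance)
    "STICK_PITCH":  (0x1200, 0.0, 0.05),
    "RUDDER_PEDAL": (0x1202, 0.0, 0.05),
    "WOW_DISCRETE": (0x1210, 1.0, 0.0),   # weight-on-wheels discrete
}

def check_signals(read_word):
    """read_word(address) -> value; return the signals that fail the check."""
    bad = []
    for name, (addr, expected, tol) in EXPECTED_SIGNALS.items():
        value = read_word(addr)
        if abs(value - expected) > tol:
            bad.append((name, addr, value, expected))
    return bad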
`
`Traditionally, special support equipment has
been used to check for proper operation at the
`aircraft. Developing such equipment for this
`
Copyright © American Institute of Aeronautics and Astronautics, Inc., 1984. All rights reserved.
`
`
`
`System Description
`
`The flight control hardware configuration that
`served as the supporting system for this project is
`built around the present F-15 Eagle Dual Control
`Augment