NUREG/CR-6316
SAIC-95/1028
Vol. 3

Guidelines for the
Verification and Validation of
Expert System Software and
Conventional Software

Survey and Documentation of Expert System
Verification and Validation Methodologies

Prepared by
E. H. Groundwater, L. A. Miller, S. M. Mirsky

Science Applications International Corporation

Prepared for
U.S. Nuclear Regulatory Commission

and

Electric Power Research Institute

CONFIGIT 1033

DISTRIBUTION OF THIS DOCUMENT IS UNLIMITED
`

`

AVAILABILITY NOTICE

Availability of Reference Materials Cited in NRC Publications

Most documents cited in NRC publications will be available from one of the following sources:

1. The NRC Public Document Room, 2120 L Street, NW., Lower Level, Washington, DC 20555-0001

2. The Superintendent of Documents, U.S. Government Printing Office, P. O. Box 37082, Washington, DC 20402-9328

3. The National Technical Information Service, Springfield, VA 22161-0002

Although the listing that follows represents the majority of documents cited in NRC publications, it is not intended to be exhaustive.

Referenced documents available for inspection and copying for a fee from the NRC Public Document Room include NRC correspondence and internal NRC memoranda; NRC bulletins, circulars, information notices, inspection and investigation notices; licensee event reports; vendor reports and correspondence; Commission papers; and applicant and licensee documents and correspondence.

The following documents in the NUREG series are available for purchase from the Government Printing Office: formal NRC staff and contractor reports, NRC-sponsored conference proceedings, international agreement reports, grantee reports, and NRC booklets and brochures. Also available are regulatory guides, NRC regulations in the Code of Federal Regulations, and Nuclear Regulatory Commission Issuances.

Documents available from the National Technical Information Service include NUREG-series reports and technical reports prepared by other Federal agencies and reports prepared by the Atomic Energy Commission, forerunner agency to the Nuclear Regulatory Commission.

Documents available from public and special technical libraries include all open literature items, such as books, journal articles, and transactions. Federal Register notices, Federal and State legislation, and congressional reports can usually be obtained from these libraries.

Documents such as theses, dissertations, foreign reports and translations, and non-NRC conference proceedings are available for purchase from the organization sponsoring the publication cited.

Single copies of NRC draft reports are available free, to the extent of supply, upon written request to the Office of Administration, Distribution and Mail Services Section, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001.

Copies of industry codes and standards used in a substantive manner in the NRC regulatory process are maintained at the NRC Library, Two White Flint North, 11545 Rockville Pike, Rockville, MD 20852-2738, for use by the public. Codes and standards are usually copyrighted and may be purchased from the originating organization or, if they are American National Standards, from the American National Standards Institute, 1430 Broadway, New York, NY 10018-3308.

DISCLAIMER NOTICE

This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, expressed or implied, or assumes any legal liability or responsibility for any third party's use, or the results of such use, of any information, apparatus, product, or process disclosed in this report, or represents that its use by such third party would not infringe privately owned rights.
`

`

DISCLAIMER

Portions of this document may be illegible in electronic image products. Images are produced from the best available original document.
`
`

`

NUREG/CR-6316
SAIC-95/1028
Vol. 3

Guidelines for the
Verification and Validation of
Expert System Software and
Conventional Software

Survey and Documentation of Expert System
Verification and Validation Methodologies

Manuscript Completed: February 1995
Date Published: March 1995

Prepared by
E. H. Groundwater, L. A. Miller, S. M. Mirsky

Science Applications International Corporation
1710 Goodridge Drive
McLean, VA 22102

Prepared for
Division of Systems Technology
Office of Nuclear Regulatory Research
U.S. Nuclear Regulatory Commission
Washington, DC 20555-0001
NRC Job Code L1530

and

Nuclear Power Division
Electric Power Research Institute
3412 Hillview Avenue
Palo Alto, CA 94303

DISTRIBUTION OF THIS DOCUMENT IS UNLIMITED
`
`

`

`

`

ABSTRACT

This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project, which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V.

The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak, sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.
`
`

`

`

`

TABLE OF CONTENTS

ABSTRACT ......................................................................... iii

EXECUTIVE SUMMARY ................................................................. ix

1.0 INTRODUCTION .................................................................... 1
    1.1  Purpose and Scope of Activity 2 ............................................ 1
    1.2  Report Organization ........................................................ 1

2.0 TECHNICAL APPROACH .............................................................. 3
    2.1  Overall Approach ........................................................... 3
    2.2  Telephone Interviews and Data Collection ................................... 3
    2.3  Site Selection and Visits ................................................. 12
    2.4  Technique Characterization and Analysis ................................... 17

3.0 REFERENCE LIFECYCLE FOR THIS ACTIVITY .......................................... 23

4.0 EXPERT SYSTEM V&V TECHNIQUE DESCRIPTIONS ....................................... 25
    4.1  Types of Techniques ....................................................... 25
    4.2  Requirements and Design Testing ........................................... 25
    4.3  Static Testing ............................................................ 29
    4.4  Dynamic Testing ........................................................... 40
    4.5  The State-of-the-Art ...................................................... 53

5.0 AUTOMATED TOOLS ................................................................ 55
    5.1  Syntax Checking Tools ..................................................... 55
    5.2  Semantic Checking Tools ................................................... 63
    5.3  Knowledge Acquisition/Refinement Tools .................................... 64
    5.4  Intelligent Compilers ..................................................... 65
    5.5  Other Dynamic Testing Tools ............................................... 65
    5.6  Summary of Automated Tools ................................................ 66

6.0 CATEGORIZATION AND ANALYSIS OF TECHNIQUES AND TOOLS ........................... 67
    6.1  Components of Expert Systems .............................................. 67
    6.2  Expert System Faults: Anomalies or Invalidities ........................... 70

7.0 CONCLUSIONS .................................................................... 75

APPENDIX A. BIBLIOGRAPHY ........................................................... 77
`

`

List of Figures

Figure ES-1   Classes of V&V Techniques Which Have Been Applied to
              Conventional Systems ................................................ x

Figure 2.1-1  USNRC/EPRI Expert System V&V Project
              Detailed Activity 2 Work Plan .......................................

Figure 2.2-1  USNRC/EPRI V&V Interview Questionnaire ............................... 9

Figure 2.2-2  Survey Form: Nuclear Industry Expert Systems
              that Have Been Tested ............................................... 13

Figure 2.2-3  Activity 2 Telephone Interviews ..................................... 14

Figure 2.4-1  Tools/Techniques Worksheet .......................................... 20

Figure 3.0-1  Relationship of V&V Activities to
              Generic Project Activities .......................................... 24

Figure 4.1-1  Classification of V&V Testing Techniques ............................ 26

Figure 6.2-1  Comprehensive Expert System V&V Matrix .............................. 71
`
`

`

List of Tables

Table 2.2-1   Persons Contacted for Telephone Interviews ........................... 5

Table 2.3-1   Preliminary Site List for Visit Evaluation ...........................

Table 2.3-2   Site Selection Criteria ............................................. 16

Table 2.3-3   Relative Site Rankings Using Site Selection
              Criteria ............................................................ 18

Table 2.3-4   Recommended Site Visits Based on Site
              Selection Criteria Ranking .......................................... 19

Table 4.2-1   Requirements and Design V&V Techniques
              Applied to Expert Systems ........................................... 27

Table 4.3-1   Static Testing V&V Techniques Applied to
              Expert Systems ...................................................... 30

Table 4.3-2   Types of Syntactic Errors Found by Automatic
              Rule Base Syntax Checkers ........................................... 39

Table 4.4-1   Dynamic Testing V&V Techniques Applied to
              Expert Systems ...................................................... 41

Table 5.0-1   Description of Automated Tools for Expert
              System V&V Techniques ............................................... 56

Table 6.1-1   Components and General Features of Knowledge-
              Based Systems ....................................................... 68

Table 6.1-2   Components and Typical Testing-Related
              Features of Knowledge-Based Systems, with
              Testing Recommendations ............................................. 69
`
`

`

`

`

EXECUTIVE SUMMARY

This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project, which was jointly sponsored by the Nuclear Regulatory Commission (USNRC) and the Electric Power Research Institute (EPRI). The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of Activity 2 was to survey and document techniques for expert system V&V. The survey used the results of Activity 1, a survey of techniques for conventional software V&V, to determine which of these techniques are being applied to expert systems, and what new techniques have been developed solely for expert system V&V.

The survey effort included: 1) an extensive telephone interviewing campaign to over 130 points of contact, 2) site visits to nine institutions conducting research in or applying expert system V&V, and 3) the collection of an extensive library of well over 300 bibliographic references. The survey encompassed work done both within the nuclear power industry and in other industries as well. Contacts included corporations, universities, government agencies, and utilities. Within the last four to five years, there has been an explosive growth of interest and work in the field. It has now reached a level of maturity where expert system V&V techniques are being implemented in automated tools and being applied to operational expert systems development and maintenance efforts.
`
As can be seen in Figure ES-1, many of the classes of V&V techniques identified in Volume 2 as being applied to conventional software systems are also being researched for, or applied to, expert systems. This is particularly true in the areas of Static Testing (tests performed directly on the code itself) and Dynamic Testing (tests performed by running the code and evaluating the results). Fewer formal techniques are applied during the Requirements and Design phases of expert systems development (only five out of ten possible methods) and then only infrequently. This is primarily because the activities performed during these phases for expert systems are usually informal themselves. Requirements and Design documents for expert systems are often not written at all, or written after-the-fact, and thus cannot be used as a basis for V&V activities. When they are written, usually no more is done with them than to review them and, possibly, trace requirements to design elements.
`
Fifteen of 58 possible Static Testing techniques were researched or applied for expert systems (including four new ones). Most of the work in Static Testing of expert systems has focused on the development of automated tools to perform sophisticated syntactic checking of rule bases. The types of errors that may be found by such checkers include redundant or subsumed rules (one rule's conditions are a subset of another's), rule cycles (there is a path from a rule back to itself), unreachable or dead-end rules, inconsistent rules, and incompleteness (e.g., not all possible input values are covered). Some of the rule base checkers will perform semantic checks of the rule base using meta-constraints defined by the programmer, and others will perform checking on the fly during knowledge acquisition and/or refinement of the rule base. Other work in Static Testing has included conducting various kinds of inspections (e.g., structured walk-throughs and expert panel reviews), performance of dependency analyses of the output values on the inputs, and attempts at applying program proving techniques. A point of view becoming strongly accepted is that it may not be as vital to prove that a safety-critical expert system is totally error-free as it is to prove that if it fails, it will not fail badly (i.e., compromise safety).
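Anomaly types of this kind lend themselves to mechanical detection. As a purely illustrative sketch, not drawn from any checker surveyed in this report, the toy rule base and rule names below are invented; the two functions flag subsumed rules and rule cycles in the manner just described:

```python
# Illustrative sketch of two static rule-base checks: subsumption and cycles.
# The rule base, rule names, and fact names are hypothetical.
from itertools import permutations

# A rule: IF all conditions hold THEN conclusion.
RULES = {
    "R1": ({"temp_high", "pressure_high"}, "alarm"),
    "R2": ({"temp_high"}, "alarm"),   # subsumes R1: fewer conditions, same conclusion
    "R3": ({"alarm"}, "log_event"),
    "R4": ({"log_event"}, "alarm"),   # forms a cycle with R3
}

def find_subsumed(rules):
    """Rule A is subsumed by B if B has the same conclusion and a strict subset of A's conditions."""
    out = []
    for a, b in permutations(rules, 2):
        (conds_a, concl_a), (conds_b, concl_b) = rules[a], rules[b]
        if concl_a == concl_b and conds_b < conds_a:  # B fires whenever A would
            out.append((a, b))
    return out

def find_cycles(rules):
    """Report facts reachable from themselves through the rule dependency graph."""
    # Edge fact -> fact whenever some rule derives the target from the source.
    edges = {}
    for conds, concl in rules.values():
        for c in conds:
            edges.setdefault(c, set()).add(concl)
    def reachable(start):
        seen, stack = set(), [start]
        while stack:
            fact = stack.pop()
            for nxt in edges.get(fact, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen
    return sorted(f for f in edges if f in reachable(f))

print(find_subsumed(RULES))   # [('R1', 'R2')]
print(find_cycles(RULES))     # ['alarm', 'log_event']
```

Production checkers of the period worked on far richer rule languages, but the core idea, a set-containment test and a graph search over the rule dependency structure, is the same.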
`
In Dynamic Testing, there is a wide range of activities: 38 of 67 techniques have been researched or applied to expert systems. The state-of-the-art in the operational expert system world is still Ad Hoc Testing, or defining test cases at whim, with no systematic guidance. Newer work has focused on more systematic methods for specifying test case sets, such as Structural Testing (attempting to cover all rules or rule paths in the expert system), Random Testing (attempting to cover a representative sample of the possible inputs), and Performance Testing (to assure timing, memory, and other constraints are met). Some operational expert systems, such as those developed for safety-related
`
`

`

[Figure ES-1. Classes of V&V Techniques Which Have Been Applied to Conventional Systems. Legend: bold marks technique classes with evidence of use for expert systems; the remainder have no such evidence. Technique classes shown include Formal Requirements Review, Fault/Failure Analysis, Performance Testing, Requirements Tracing, Design Compliance Analysis, Timing/Flow Analysis, Design Simulation, Formal Design Review, Program Description Language, Inspections, Structural Testing, Interface Testing, Execution Testing, Competency Testing, Defect Analysis (new), Stress Testing, Language Analysis, Requirements Language Processing, Mathematical Verification, Algorithm Analysis, Data Analysis, Control Analysis, Realistic Testing, Functional Testing, and General/Statistical techniques.]
`
`
`

`

functions (e.g., NASA space shuttle diagnostics), do undergo various forms of Realistic Testing using scenario files, simulators, or actual field conditions. Lastly, there are a few automated tools to support generating, managing, or scoring test cases.
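The flavor of Random Testing scored against a Structural (rule-coverage) criterion can be sketched as follows; the miniature rule base, fact names, and coverage measure below are hypothetical, not taken from any surveyed tool:

```python
# Sketch: Random Testing (sampling the input space) scored by a Structural
# criterion (fraction of rules that fired at least once). All names are invented.
import random

# Miniature rule base: IF all conditions hold THEN conclusion.
RULE_BASE = {
    "R1": ({"temp_high"}, "alarm"),
    "R2": ({"alarm", "operator_absent"}, "auto_shutdown"),
    "R3": ({"temp_low"}, "heater_on"),
}
INPUT_FACTS = ["temp_high", "temp_low", "operator_absent"]

def forward_chain(facts, rules):
    """Fire rules to a fixed point; return the names of the rules that fired."""
    facts, fired = set(facts), set()
    changed = True
    while changed:
        changed = False
        for name, (conds, concl) in rules.items():
            if name not in fired and conds <= facts:
                facts.add(concl)
                fired.add(name)
                changed = True
    return fired

def rule_coverage(n_cases, seed=0):
    """Run randomly sampled input vectors; score the fraction of rules ever fired."""
    rng = random.Random(seed)
    covered = set()
    for _ in range(n_cases):
        case = {f for f in INPUT_FACTS if rng.random() < 0.5}  # random input vector
        covered |= forward_chain(case, RULE_BASE)
    return len(covered) / len(RULE_BASE)

print(f"rule coverage after 50 random cases: {rule_coverage(50):.2f}")
```

A coverage score below 1.0 after many samples is itself diagnostic: it points at rules whose conditions the sampled input distribution rarely or never satisfies, which is exactly the gap Structural Testing is meant to expose.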
`
Upon analysis of the V&V techniques being applied to expert systems, it was found that there is sufficient coverage across all the components of expert systems and across all error types (static vs. dynamic, anomalies vs. invalidities). The challenge is in selecting the appropriate combination of techniques to use for performing V&V on a particular expert system that is both effective and cost efficient.
`
`
`
`
`
`
`

`

1.0 INTRODUCTION

This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project. The ultimate objective is the formulation of guidelines for V&V of expert systems for use in nuclear power applications. This work is jointly sponsored by the Nuclear Regulatory Commission (USNRC) and the Electric Power Research Institute (EPRI).
`
1.1  Purpose and Scope of Activity 2

The purpose of Activity 2 was to survey and document techniques for expert system V&V. This report is a companion to Volume 2, which surveys techniques for conventional software V&V. As will be seen, there is (and should be) considerable overlap in the techniques being applied to both types of software. Thus, this report will reference and draw upon the contents of Volume 2 considerably.

The survey included both techniques being applied in the field to operational expert systems and those being researched in Artificial Intelligence (AI) laboratories. With the help of Dr. John Bernard, from the Massachusetts Institute of Technology (MIT), we surveyed V&V techniques being applied to expert systems for nuclear power applications (Bernard & Washio, 1989). However, the survey also encompassed work in other fields such as space operations, manufacturing, military, and other utilities. We contacted a diverse range of organizations including government agencies and laboratories, universities, contractors and other commercial concerns, and power utilities. We attempted to comprehensively cover the work being performed in the United States and opportunistically included work done abroad.

As in Activity 1, we covered both lifecycle management and testing techniques, focusing primarily on the testing techniques. Again, as in Activity 1, we examined V&V techniques applied to all phases of the development lifecycle, versus just to the testing phase. Finally, we examined both manual and automated techniques, providing a separate description of detailed automated tools.

As part of the survey effort, nine sites, where work was being performed in V&V of expert systems, were selected and visited.
`
1.2  Report Organization

The next section, 2.0, describes our technical approach to the Activity 2 survey, beginning with a description of our overall approach, then telephone surveys and site visits, followed by a description of our characterization and analysis of the techniques. Section 3.0 describes the reference lifecycle to be used for discussing and characterizing the techniques found in the survey. Section 4.0 presents a brief description of each of the techniques found. Section 5.0 describes separately the automated tools for expert system V&V that were found. A categorization and analysis of the techniques and tools follows in Section 6.0. This is followed by a summary in Section 7.0, which primarily contains recommendations for how the Activity 2 results can be applied in subsequent activities. Appendix A contains the Bibliography of materials collected over the course of the survey.
`
`
`

`

`

`

2.0 TECHNICAL APPROACH

2.1  Overall Approach

The detailed work plan for Activity 2 is shown in Figure 2.1-1. Three main "threads" can be seen in the Activity 2 work plan diagram. The first, in the left and lower middle, involves telephone interviewing and reference document collection, and will be described below in Section 2.2. The second, in the upper middle, involves site selection and survey, and will be described in Section 2.3. The last, on the right, involves characterizing and analyzing the techniques, and will be discussed in Section 2.4.

2.2  Telephone Interviews and Data Collection

The first step in conducting telephone interviews was to develop a list of people to call. Names, addresses, and phone numbers of knowledgeable practitioners and researchers came from a number of sources throughout the activity period:
`

`

`

`

`
•  Team members' existing professional contacts,

•  Referrals from Dr. Bernard of people involved in nuclear power expert systems development and testing,

•  Authors of papers on operational nuclear power expert systems (Artificial Intelligence and Other Innovative Computer Applications in the Nuclear Industry, 1988; EPRI 1989b, 1988a, d, f, 1987d; Motoda, 1990; Moradian et al.; Nelson, 1989; Osborne, 1986; Proceedings of the International Workshop on Artificial Intelligence for Industrial Applications, 1988),

•  Attendees and speakers at the 1988, 1989, and 1990 AAAI and IJCAI Workshops on V&V of Expert Systems,

•  Members of standards organizations,

•  Authors of papers collected from automated bibliographic search,

•  Other references and acknowledgements in the papers we collected, and

•  Referrals from other telephone contacts.

The list of names was organized into a Point of Contact (PoC) List, which was continuously updated and distributed to team members during the activity period. A list of the 97 names and organizations of the contacts is shown in Table 2.2-1.
`
Interview forms were prepared for collecting information from the telephone interviewees. After a few trial calls with the first draft of the form, it was shortened and modified to the one shown as Figure 2.2-1. The first page was followed by a totally blank page, on which answers to the discussion points on the bottom of the first page could be transcribed. The Activity 2 team members were trained in structured interviewing and the use of the form, and the
`
`
`
`
`

`

`
`
`
`
[Figure 2.1-1. USNRC/EPRI Expert System V&V Project Detailed Activity 2 Work Plan. The diagram traces three threads: preparing the interview questionnaire, conducting phone interviews, and collecting reference documents; developing site selection criteria, selecting sites, and scheduling visits; and developing the technique categorization, listing and describing the tools and techniques found, developing the applicability matrix and bibliography, and producing the draft and final Activity 2 report.]
`

`

Table 2.2-1 Persons Contacted for Telephone Interviews

Point of Contact

Affiliation
`
`Adelman, Leonard
`
`George Mason University
`
`Bahill, Terry
`
`University of Arizona
`
`Bartschat, Steffen
`
`Ultrasystems
`
`Basti, D.W.
`
`Bayse,Al
`
`Bernard, John
`
`Bloom, Howard
`
`Bond, David
`
`Boose, John H.
`
`Bray, Mike
`
`Forschungsgelande (Germany)
`
`Federal BureauofInvestigation (FBI)
`
Massachusetts Institute of Technology
`
National Institute of Standards & Technology (NIST)
`
`SAIC, COMSYSTEMSDivision
`
`Boeing Computer Services
`
EG&G Idaho Inc.
`
`Buchanan, Bruce G.
`
`University of Pittsburgh
`
Rensselaer Polytechnic Institute
`
`Carbonara, Joe
`
`Chee, Christine
`
`Cohen, Paul R.
`
`Combs, Jacqueline
`
`Cragun,Brian J.
`
`Cross, Steve
`
`Culbert, Chris
`
`Duckworth, Jim
`
`Edwards, Robert
`
`Fausett, Mark
`
`Franklin, Randolph
`
`Consolidated Edison - Indian Point 2
`
`BD Systems,Inc.
`
`University of Massachusetts
`
`Lockheed Missiles and Space Company, Inc.
`
`IBM
`
`Defense Advanced Research Projects Agency (DARPA)
`
`NASA/Johnson Space Center
`
`Worcester Polytechnic Institute
`
`Pennsylvania State University
`
`Rome Laboratory/COES
`
`
`

`

`Table 2.2-1 (Continued).
`
`Point of Contact
`
`Freeman, Michael
`
`Friedland, Peter
`
`Fujii, Roger U.
`
`Fussel, Louise
`
`NASA
`
`NASA-Ames Research Center
`
`Logicon, Inc.
`
`Rockwell Space Operations Company
`
`Gabrielian, Armen
`
Thomson-CSF, Inc./Pacific Rim
`
`Geissman,Jim
`
`Gelperin, David
`
`Garrett, Randy
`
`Gilstrap, Lewey
`
`Ginsberg, Allen
`
`Gowens, Jay
`
`Griebenow, Ronald
`
`Griesmer, James
`
`Hajek, Brian K.
`
`Hamilton, David
`
`Harder, Bob
`
`Harrison, Patrick
`
`Abacus Programming Corporation
`
`Software Quality Engineering
`
`Institute for Defense Analysis
`
`Computer Science Corporation
`
`AT&T Bell Labs
`
U.S. Army Institute for Research in Management Information
`
`NUS Corporation
`
`Thomas Watson Research Center
`
`The Ohio State University
`
`IBM
`
`USAEPG; STEEP-ET-S
`
`U.S. Naval Academy
`
`Hayes-Roth, Frederick
`
`Cimflex TeknowledgeInc.
`
`Klein Associates
`
`U.S. Army Missile Command Research, Development &
`Engineering Center
`
`Heindel, Troy
`
`Hirschberg, Morton
`
`Holmes, Willard
`
`Johnson, Sally C.
`
`Kiguchi, Takashi
`
`Kiss, Peter
`
`Klein, Gary A.
`
`NASA/Johnson Space Center
`
U.S. Army Ballistic Research
`
`NASA
`
`Hitachi, Ltd.
`
Sentar, Inc.
`
`
`

`

`Table 2.2-1 (Continued).
`
`Point of Contact
`
`Affiliation
`
`
`
`Intellicorp, Inc.
`
`University of Michigan
`
`George Mason University
`
`Microelectronics and Computer Corp.
`
`Rochester Gas & Electric Company
`
`George Washington University
`
`Advanced Decision Systems,Inc.
`
`University of Southern Louisiana
`
`Chalk River Nuclear Laboratories
`
`Digital Equipment Corporation (DEC)
`
`Dupont Corporation
`
`George Mason University
`
`Westinghouse Electronic Corporation
`
`University of Wisconsin-Milwaukee
`
`Georgia Power Company
`
Rensselaer Polytechnic Institute
`
`University of Southern California
`
`Data Systems Technology
`
`Westinghouse Electric Corporation
`
`University of California, Irvine
`
`Laning, David
`
Lee, John C.
`
`Lehner, Paul
`
`Lenat, Doug
`
`Leoni, Nicholas
`
`Liebowitz, Jay
`
`Linden, Theadore
`
`Loganantharaj, R.
`
`Lupton, Lawrence
`
`Lutsky, Patty
`
`Mahler, Ed
`
`Michalski, R.S.,
`
`Moradian,Ali
`
`Nazareth, Derek
`
`Nelson, Robert
`
`O'Keefe, Robert
`
`O'Leary, Daniel
`
`Odubiyi, Jide B.
`
`Osborne, Robert
`
`Owens,Jerry
`
`Owre,Fridtjov
`
`Parsaye, Kamran
`
`Pazzani, Michael
`
Navy Center for Applied Research in Artificial Intelligence
`
Institutt for Energiteknikk
`
IntelligenceWare
`
`
`
`
`

`

Table 2.2-1 (Continued).

Point of Contact

Affiliation

Plant, Robert T.

University of Miami

Preece, Alun

Concordia University
`
`Rossomando,Philip J.
`
`General Electric Corporation
`
`Rousset, Marie-Christine
`
L.R.I. - Université d'Orsay
`
`Rushby, Dr. John
`
`St. Clair, Daniel
`
`Sharma, Ravi S.
`
`Sizemore, Nick L.
`
`Stewart, Tammy
`
`Sudduth, Al
`
`Surko, Pam
`
`Sztipanovits, Dr.
`
`Takahaski, Makoto
`
`Terano, Takoa
`
`Touchton, Robert
`
`Ulvila, Jacob
`
`Vesonder, Gregg
`
SRI International
`
McDonnell Douglas Corporation
`
`University of Waterloo
`
`COMARCO,Inc.
`
`USAEPG
`
`Duke Power Company
`
`Vanderbilt University
`
`Tohoku University
`
The University of Tsukuba, Tokyo
`
`Pathfinder Advanced Computing,Inc.
`
`Decision Sciences Consortium,Inc.
`
AT&T Bell Labs
`
`Japan Atomic Energy Research
`
`Science Applications International Corporation (SAIC)
`
`Vignollet, Laurence
`
`University of Savoie
`
`Watson, David
`
`Williams, Robert
`
`Williamson, Keith
`
`Yen, John
`
`Yokobayaski, Masao
`
`Martin Marietta
`
`U.S. Army Electronic Proving Ground
`
`Boeing Computer Services
`
`Texas A&M University
`
`
`

`

`Figure 2.2-1 USNRC/EPRI V&V Interview Questionnaire
`
I am X from SAIC, working under contract to the NRC and EPRI, on a survey task. We are interested in finding out what, if anything, you might be doing in the area of verification, validation or testing of expert systems or knowledge-based systems (ES/KBS V&V).
`
`SAIC Interviewer:
`
`DATE/TIME:
`
`Person(s) Interviewed:
`
Contact list entry correct? Yes ___  No ___
`
`FAX:
`
`E-MAIL:
`
`PHONE:
`
`Title/Role:
`
`Type of Work: Research ___ ES Development ____ Services ___
`
`Study ____ Standards
`
`Project/System Name:
`
`Length of Work:
`
`Numberof People:
`
`Customers? Yes___ (see referrals) No___
`
`Funding source:
`
`Can we visit? Yes___-~No__
`
`Project/System Description: (next page)
`
- Development/product plans
`
- Who should be interested (industry/ES type)
`
`- Problem areas encountered
`
- Tool/technique needs identified
`
`- Success?
`
`
`

`

`Figure 2.2-1 (Continued).
`
Expert Systems Tested:

SYSTEM NAME | PLATFRM: 1)PC 2)Apple 3)SUN 4)VAX 5)Other | OP SYS: 1)DOS 2)Apple 3)Unix 4)IBM 5)Other | SOFTWARE ENV.: 1)LISP, Prolog 2)Shell 3)Other | TYPE: 1)Real Time 2)Embed 3)Stand Alone | SIZE (Rules): 1)Small (<50) 2)Med (<500) 3)Lge (<3000) 4)Very Lge (>3000)

Testing Techniques:

TECH NAME: 1)EVA 2)random testing 3)other | ES COMPON: 1)KB 2)InfEng 3)MMI 4)Shell 5)Other | TYPE ERRORS: Stat(1) Dynam(2) Anom(3) Valid(4) | AUTOMATED TOOLS? (Y or N) | EASE of SET-UP: 1(lo) 7(hi) | POWER (ability to find errors): 1(lo) 7(hi)

Automated Tools:

TOOL NAME | AVAIL: (Y or N) | SOURCE | PLATFORM: 1)PC 2)Apple 3)SUN 4)VAX 5)Other | ENV.: 1)DOS 2)Apple 3)Unix 4)IBM 5)Other | SOFTWARE: 1)LISP, Prolog 2)Shell 3)Other
`
`
`

`

Figure 2.2-1 (Continued).

Referrals (Colleagues/Customers):

Name    Affiliations    Topic    Address    Phone

Publication/Documentation References:

Action Items:

PERSON    REQUIRED ACTION    DATE REQUIRED
`
`
`

`

contacts were distributed among them. Weekly meetings were held during the heavy period of telephoning to collect referrals and other changes to the PoC list and to share information and interviewing hints.

A separate form was prepared for Dr. Bernard to fill out on operational or nearly operational expert systems within the nuclear industry. This form is reproduced as Figure 2.2-2. The aim was to draw upon his experience in writing his book, Expert Systems Within the Nuclear Industry, to gain an understanding of the state-of-the-art of expert system V&V within the nuclear industry. He sent along references with the forms, and, if needed, follow-up contacts were made.

In all, 138 PoCs were contacted, which yielded the 97 doing current work mentioned above. However, many more people were called to generate these PoCs. This is because we would often have one name as an entry into the organization, and would chase through a number of referrals to obtain the best and most knowledgeable PoC in that organization. Also, some people were not doing work in the field themselves, but gave referrals to those who were. Then there were referrals by the referrals. We got to the point in the survey where PoCs were referring to each other and we had both the funder/sponsor and contractor/university PoCs of funding relationships on our list. This fact, and our limited resources, led us to limit the telephone survey, except for PoCs we knew were important, at some point so we could move on with the activity. A breakdown of the PoCs is shown in Figure 2.2-3.
`
Publications were collected from a number of sources. These included:

- Keyword-based search of the DIALOG and Defense Technical Information Center (DTIC) on-line computerized bibliographic services, followed by obtaining the most suitable publications,

- Conference and workshop proceedings and reports already on hand,

- Publications in Dr. Bernard's possession,

- Publications sent to us by PoCs after telephone interviews, and

- Publications collected at site visits.

The result was a very extensive library of materials on expert system V&V (well over 300 references). The bibliography for this library is included as Appendix B.
2.3  Site Selection and Visits

As a result of the telephone interview and data collection process described in Section 2.2, the project team determined that a number of sites offered the potential for obtaining significant additional information on expert system V&V techniques and tools. This preliminary list of sites was chosen after analyzing the telephone interview data sheets and papers that were collected. Only those locations with robust ongoing expert system V&V activities that required an onsite, face-t
