ARTIFICIAL INTELLIGENCE

Third Edition

Patrick Henry Winston
Professor of Computer Science
Director, Artificial Intelligence Laboratory
Massachusetts Institute of Technology

ADDISON-WESLEY PUBLISHING COMPANY
Reading, Massachusetts | Menlo Park, California
New York | Don Mills, Ontario | Wokingham, England
Amsterdam | Bonn | Sydney | Singapore | Tokyo
Madrid | San Juan | Milan | Paris
Library of Congress Cataloging-in-Publication Data

Winston, Patrick Henry.
    Artificial intelligence / Patrick Henry Winston. — 3rd ed.
        p.  cm.
    Includes bibliographical references (p.  ) and index.
    ISBN 0-201-53377-4
    1. Artificial intelligence.  I. Title.
    Q335.W56  1992
    006.3—dc20                                        91-41385
                                                      CIP

Reproduced by Addison-Wesley from camera-ready copy supplied by the author.

Reprinted with corrections May, 1993

Copyright © 1992 by Patrick H. Winston. All rights reserved. No part of this
publication may be reproduced, stored in a retrieval system, or transmitted in
any form or by any means, electronic, mechanical, photocopying, recording, or
otherwise, without prior written permission. Printed in the United States of
America.
Contents

ACKNOWLEDGMENTS

SOFTWARE

PREFACE

PART I
Representations and Methods

CHAPTER 1
The Intelligent Computer

The Field and the Book
This Book Has Three Parts 6 | The Long-Term Applications Stagger the
Imagination 6 | The Near-Term Applications Involve New Opportunities 7
| Artificial Intelligence Sheds New Light on Traditional Questions 7 |
Artificial Intelligence Helps Us to Become More Intelligent 8

What Artificial Intelligence Can Do
Intelligent Systems Can Help Experts to Solve Difficult Analysis Problems 8
| Intelligent Systems Can Help Experts to Design New Devices 9 |
Intelligent Systems Can Learn from Examples 10 | Intelligent Systems Can
Provide Answers to English Questions Using both Structured Data and Free
Text 11 | Artificial Intelligence Is Becoming Less Conspicuous, yet More
Essential 12

Criteria for Success

Summary

Background

CHAPTER 2
Semantic Nets and Description Matching

Semantic Nets
Good Representations Are the Key to Good Problem Solving 16 | Good
Representations Support Explicit, Constraint-Exposing Description 18 | A
Representation Has Four Fundamental Parts 18 | Semantic Nets Convey
Meaning 19 | There Are Many Schools of Thought About the Meaning
of Semantics 20 | Theoretical Equivalence Is Different from Practical
Equivalence 22

The Describe-and-Match Method
Feature-Based Object Identification Illustrates Describe and Match 24

The Describe-and-Match Method and Analogy Problems
Geometric Analogy Rules Describe Object Relations and Object
Transformations 25 | Scoring Mechanisms Rank Answers 30 | Ambiguity
Complicates Matching 32 | Good Representation Supports Good
Performance 32

The Describe-and-Match Method and Recognition of Abstractions
Story Plots Can Be Viewed as Combinations of Mental States and Events 33
| Abstraction-Unit Nets Enable Summary 36 | Abstraction Units Enable
Question Answering 41 | Abstraction Units Make Patterns Explicit 42

Problem Solving and Understanding Knowledge

Summary

Background

CHAPTER 3
Generate and Test, Means-Ends Analysis, and Problem Reduction

The Generate-and-Test Method
Generate-and-Test Systems Often Do Identification 48 | Good Generators
Are Complete, Nonredundant, and Informed 49
The Means-Ends Analysis Method
The Key Idea in Means-Ends Analysis Is to Reduce Differences 50 | Dendral
Analyzes Mass Spectrograms 51 | Difference-Procedure Tables Often
Determine the Means 53

The Problem-Reduction Method
Moving Blocks Illustrates Problem Reduction 54 | The Key Idea in Problem
Reduction Is to Explore a Goal Tree 56 | Goal Trees Can Make Procedure
Interaction Transparent 57 | Goal Trees Enable Introspective Question
Answering 59 | Problem Reduction Is Ubiquitous in Programming 60
| Problem-Solving Methods Often Work Together 60 | Mathematics
Toolkits Use Problem Reduction to Solve Calculus Problems 61

Summary

Background

CHAPTER 4
Nets and Basic Search

Blind Methods
Net Search Is Really Tree Search 64 | Search Trees Explode
Exponentially 66 | Depth-First Search Dives into the Search Tree 66 |
Breadth-First Search Pushes Uniformly into the Search Tree 68 | The Right
Search Depends on the Tree 68 | Nondeterministic Search Moves Randomly
into the Search Tree 69

Heuristically Informed Methods
Quality Measurements Turn Depth-First Search into Hill Climbing 70 |
Foothills, Plateaus, and Ridges Make Hills Hard to Climb 72 | Beam Search
Expands Several Partial Paths and Purges the Rest 73 | Best-First Search
Expands the Best Partial Path 75 | Search May Lead to Discovery 75 |
Search Alternatives Form a Procedure Family 78

Summary

Background

CHAPTER 5
Nets and Optimal Search

The Best Path
The British Museum Procedure Looks Everywhere 81 | Branch-and-Bound
Search Expands the Least-Cost Partial Path 82 | Adding Underestimates
Improves Efficiency 84
Redundant Paths
Redundant Partial Paths Should Be Discarded 90 | Underestimates and
Dynamic Programming Improve Branch-and-Bound Search 91 | Several
Search Procedures Find the Optimal Path 94 | Robot Path Planning
Illustrates Search 94

Summary

Background

CHAPTER 6
Trees and Adversarial Search

Algorithmic Methods
Nodes Represent Board Positions 101 | Exhaustive Search Is
Impossible 102 | The Minimax Procedure Is a Lookahead Procedure 103 |
The Alpha-Beta Procedure Prunes Game Trees 104 | Alpha-Beta May Not
Prune Many Branches from the Tree 110

Heuristic Methods
Progressive Deepening Keeps Computing Within Time Bounds 114 |
Heuristic Continuation Fights the Horizon Effect 114 | Heuristic Pruning
Also Limits Search 115 | DEEP THOUGHT Plays Grandmaster Chess 117

Summary

Background

CHAPTER 7
Rules and Rule Chaining

Rule-Based Deduction Systems
Many Rule-Based Systems Are Deduction Systems 120 | A Toy Deduction
System Identifies Animals 121 | Rule-Based Systems Use a Working Memory
and a Rule Base 124 | Deduction Systems May Run Either Forward or
Backward 126 | The Problem Determines Whether Chaining Should Be
Forward or Backward 128

Rule-Based Reaction Systems
Mycin Diagnoses Bacterial Infections of the Blood 130 | A Toy Reaction
System Bags Groceries 132 | Reaction Systems Require Conflict Resolution
Strategies 137

Procedures for Forward and Backward Chaining
Depth-First Search Can Supply Compatible Bindings for Forward
Chaining 138 | XCON Configures Computer Systems 139 | Depth-First
Search Can Supply Compatible Bindings for Backward Chaining 142 |
Relational Operations Support Forward Chaining 147 | The Rete Approach
Deploys Relational Operations Incrementally 152

Summary

Background

CHAPTER 8
Rules, Substrates, and Cognitive Modeling

Rule-Based Systems Viewed as Substrate
Explanation Modules Explain Reasoning 163 | Reasoning Systems Can
Exhibit Variable Reasoning Styles 164 | Probability Modules Help You to
Determine Answer Reliability 167 | Two Key Heuristics Enable Knowledge
Engineers to Acquire Knowledge 167 | Acquisition Modules Assist
Knowledge Transfer 168 | Rule Interactions Can Be Troublesome 171 |
Rule-Based Systems Can Behave Like Idiot Savants 171

Rule-Based Systems Viewed as Models for Human Problem Solving
Rule-Based Systems Can Model Some Human Problem Solving 172 |
Protocol Analysis Produces Production-System Conjectures 172 | SOAR
Models Human Problem Solving, Maybe 173 | SOAR Searches Problem
Spaces 173 | SOAR Uses an Automatic Preference Analyzer 175

Summary

Background

CHAPTER 9
Frames and Inheritance

Frames, Individuals, and Inheritance
Frames Contain Slots and Slot Values 180 | Frames May Describe Instances
or Classes 180 | Frames Have Access Procedures 182 | Inheritance
Enables When-Constructed Procedures to Move Default Slot Values from
Classes to Instances 182 | A Class Should Appear Before All Its
Superclasses 185 | A Class's Direct Superclasses Should Appear in Order 187
| The Topological-Sorting Procedure Keeps Classes in Proper Order 190

Demon Procedures
When-Requested Procedures Override Slot Values 197 | When-Read and
When-Written Procedures Can Maintain Constraints 198 | With-Respect-to
Procedures Deal with Perspectives and Contexts 199 | Inheritance and
Demons Introduce Procedural Semantics 199 | Object-Oriented
Programming Focuses on Shared Knowledge 201
Frames, Events, and Inheritance
Digesting News Seems to Involve Frame Retrieving and Slot Filling 202 |
Event-Describing Frames Make Stereotyped Information Explicit 206

Summary

Background

CHAPTER 10
Frames and Commonsense

Thematic-Role Frames
An Object's Thematic Role Specifies the Object's Relation to an Action 209
| Filled Thematic Roles Help You to Answer Questions 212 | Various
Constraints Establish Thematic Roles 214 | A Variety of Constraints Help
Establish Verb Meanings 215 | Constraints Enable Sentence Analysis 216
| Examples Using Take Illustrate How Constraints Interact 218

Expansion into Primitive Actions
Primitive Actions Describe Many Higher-Level Actions 221 | Actions Often
Imply Implicit State Changes and Cause-Effect Relations 222 | Actions
Often Imply Subactions 223 | Primitive-Action Frames and State-Change
Frames Facilitate Question Answering and Paraphrase Recognition 224 |
Thematic-Role Frames and Primitive-Action Frames Have Complementary
Foci 226 | CYC Captures Commonsense Knowledge 229

Summary

Background

CHAPTER 11
Numeric Constraints and Propagation

Propagation of Numbers Through Numeric Constraint Nets
Numeric Constraint Boxes Propagate Numbers through Equations 231

Propagation of Probability Bounds Through Opinion Nets
Probability Bounds Express Uncertainty 234 | Spreadsheets Propagate
Numeric Constraints Through Numeric-Constraint Nets 235 | Venn
Diagrams Explain Bound Constraints 237 | Propagation Moves Probability
Bounds Closer Together 241

Propagation of Surface Altitudes Through Arrays
Local Constraints Arbitrate between Smoothness Expectations and Actual
Data 242 | Constraint Propagation Achieves Global Consistency through
Local Computation 244 | GENINFER Helps Counselors to Provide Precise
Genetic Advice 246

Summary

Background

CHAPTER 12
Symbolic Constraints and Propagation

Propagation of Line Labels through Drawing Junctions
There Are Only Four Ways to Label a Line in the Three-Faced-Vertex
World 250 | There Are Only 18 Ways to Label a Three-Faced Junction 254
| Finding Correct Labels Is Part of Line-Drawing Analysis 257 | Waltz's
Procedure Propagates Label Constraints through Junctions 262 | Many Line
and Junction Labels Are Needed to Handle Shadows and Cracks 266 |
Illumination Increases Label Count and Tightens Constraint 267 | The Flow
of Labels Can Be Dramatic 270 | The Computation Required Is Proportional
to Drawing Size 272

Propagation of Time-Interval Relations
There Are 13 Ways to Label a Link between Interval Nodes Yielding 169
Constraints 272 | Time Constraints Can Propagate across Long
Distances 275 | A Complete Time Analysis Is Computationally
Expensive 276 | Reference Nodes Can Save Time 278

Five Points of Methodology

Summary

Background

CHAPTER 13
Logic and Resolution Proof

Rules of Inference
Logic Has a Traditional Notation 284 | Quantifiers Determine When
Expressions Are True 287 | Logic Has a Rich Vocabulary 288 |
Interpretations Tie Logic Symbols to Worlds 288 | Proofs Tie Axioms to
Consequences 290 | Resolution Is a Sound Rule of Inference 292

Resolution Proofs
Resolution Proves Theorems by Refutation 293 | Using Resolution Requires
Axioms to Be in Clause Form 294 | Proof Is Exponential 300 |
Resolution Requires Unification 301 | Traditional Logic Is Monotonic 302
| Theorem Proving Is Suitable for Certain Problems, but Not for All
Problems 302

Summary

Background

CHAPTER 14
Backtracking and Truth Maintenance

Chronological and Dependency-Directed Backtracking
Limit Boxes Identify Inconsistencies 305 | Chronological Backtracking
Wastes Time 306 | Nonchronological Backtracking Exploits
Dependencies 308

Proof by Constraint Propagation
Truth Can Be Propagated 309 | Truth Propagation Can Establish
Justifications 315 | Justification Links Enable Programs to Change Their
Minds 316 | Proof by Truth Propagation Has Limits 319

Summary

Background

CHAPTER 15
Planning

Planning Using If-Add-Delete Operators
Operators Specify Add Lists and Delete Lists 324 | You Can Plan by
Searching for a Satisfactory Sequence of Operators 326 | Backward
Chaining Can Reduce Effort 327 | Impossible Plans Can Be Detected 331
| Partial Instantiation Can Help Reduce Effort Too 336

Planning Using Situation Variables
Finding Operator Sequences Requires Situation Variables 338 | Frame
Axioms Address the Frame Problem 343

Summary

Background
PART II
Learning and Regularity Recognition

CHAPTER 16
Learning by Analyzing Differences

Induction Heuristics
Responding to Near Misses Improves Models 351 | Responding to
Examples Improves Models 354 | Near-Miss Heuristics Specialize; Example
Heuristics Generalize 355 | Learning Procedures Should Avoid Guesses 357
| Learning Usually Must Be Done in Small Steps 358

Identification
Must Links and Must-Not Links Dominate Matching 359 | Models May
Be Arranged in Lists or in Nets 359 | ARIEL Learns about Proteins 360

Summary

Background

CHAPTER 17
Learning by Explaining Experience

Learning about Why People Act the Way They Do
Reification and the Vocabulary of Thematic-Role Frames Capture
Sentence-Level Meaning 366 | Explanation Transfer Solves Problems Using
Analogy 367 | Commonsense Problem Solving Can Generate Rulelike
Principles 372 | The Macbeth Procedure Illustrates the Explanation
Principle 373 | The Macbeth Procedure Can Use Causal Chains to Establish
Common Context 374

Learning about Form and Function
Examples and Precedents Help Each Other 377 | Explanation-Based
Learning Offers More than Speedup 380

Matching
Stupid Matchers Are Slow and Easy to Fool 380 | Matching Inexact
Situations Reduces to Backward Chaining 381 | Matching Sheds Light
on Analogical Problem Solving 383

Summary

Background

CHAPTER 18
Learning by Correcting Mistakes

Isolating Suspicious Relations
Cups and Pails Illustrate the Problem 385 | Near-Miss Groups Isolate
Suspicious Relations 386 | Suspicious Relation Types Determine Overall
Repair Strategy 388

Intelligent Knowledge Repair
The Solution May Be to Explain the True-Success Suspicious Relations 388
| Incorporating True-Success Suspicious Relations May Require Search 391
| The Solution May Be to Explain the False-Success Suspicious Relations,
Creating a Censor 393 | Failure Can Stimulate a Search for More Detailed
Descriptions 395

Summary

Background

CHAPTER 19
Learning by Recording Cases

Recording and Retrieving Raw Experience
The Consistency Heuristic Enables Remembered Cases to Supply
Properties 397 | The Consistency Heuristic Solves a Difficult Dynamics
Problem 398

Finding Nearest Neighbors
A Fast Serial Procedure Finds the Nearest Neighbor in Logarithmic Time 403
| Parallel Hardware Finds Nearest Neighbors Even Faster 408

Summary

Background

CHAPTER 20
Learning by Managing Multiple Models

The Version-Space Method
Version Space Consists of Overly General and Overly Specific Models 411
| Generalization and Specialization Leads to Version-Space Convergence 414

Version-Space Characteristics
The Version-Space Procedure Handles Positive and Negative Examples
Symmetrically 420 | The Version-Space Procedure Enables Early
Recognition 421
Summary

Background

CHAPTER 21
Learning by Building Identification Trees

From Data to Identification Trees
The World Is Supposed to Be Simple 423 | Tests Should Minimize
Disorder 427 | Information Theory Supplies a Disorder Formula 427

From Trees to Rules
Unnecessary Rule Antecedents Should Be Eliminated 432 | Optimizing a
Nuclear Fuel Plant 433 | Unnecessary Rules Should Be Eliminated 435 |
Fisher's Exact Test Brings Rule Correction in Line with Statistical Theory 437

Summary

Background

CHAPTER 22
Learning by Training Neural Nets

Simulated Neural Nets
Real Neurons Consist of Synapses, Dendrites, Axons, and Cell Bodies 444
| Simulated Neurons Consist of Multipliers, Adders, and Thresholds 445
| Feed-Forward Nets Can Be Viewed as Arithmetic Constraint Nets 446
| Feed-Forward Nets Can Recognize Regularity in Data 447

Hill Climbing and Back Propagation
The Back-Propagation Procedure Does Hill Climbing by Gradient Ascent 448
| Nonzero Thresholds Can Be Eliminated 449 | Gradient Ascent Requires
a Smooth Threshold Function 449 | Back Propagation Can Be Understood
Heuristically 451 | Back-Propagation Follows from Gradient Descent and
the Chain Rule 453 | The Back-Propagation Procedure Is Straightforward 457

Back-Propagation Characteristics
Training May Require Thousands of Back Propagations 458 | ALVINN
Learns to Drive 459 | Back Propagation Can Get Stuck or Become
Unstable 461 | Back Propagation Can Be Done in Stages 462 | Back
Propagation Can Train a Net to Learn to Recognize Multiple Concepts
Simultaneously 463 | Trained Neural Nets Can Make Predictions 464 |
Excess Weights Lead to Overfitting 465 | Neural-Net Training Is an Art 467
Summary

Background

CHAPTER 23
Learning by Training Perceptrons

Perceptrons and Perceptron Learning
Perceptrons Have Logic Boxes and Stair-Step Thresholds 471 | The
Perceptron Convergence Procedure Guarantees Success Whenever Success Is
Possible 474 | Ordinary Algebra Is Adequate to Demonstrate Convergence
When There Are Two Weights 477 | Vector Algebra Helps You to
Demonstrate Convergence When There Are Many Weights 480

What Perceptrons Can and Cannot Do
A Straight-Through Perceptron Can Learn to Identify Digits 482 | The
Perceptron Convergence Procedure Is Amazing 484 | There Are Simple
Tasks That Perceptrons Cannot Do 486

Summary

Background

CHAPTER 24
Learning by Training Approximation Nets

Interpolation and Approximation Nets
Gaussian Functions Centered on Samples Enable Good Interpolations 492
| Given Sufficient Nodes, Nets Can Interpolate Perfectly 494 | Given
Relatively Few Nodes, Approximation Nets Can Yield Approximate Results
for All Sample Inputs 496 | Too Many Samples Leads to Weight
Training 497 | Overlooked Dimensions May Explain Strange Data Better
than Elaborate Approximation 499 | The Interpolation-Approximation Point
of View Helps You to Answer Difficult Design Questions 501

Biological Implementation
Numbers Can Be Represented by Position 501 | Neurons Can Compute
Gaussian Functions 501 | Gaussian Functions Can Be Computed as
Products of Gaussian Functions 502

Summary

Background

CHAPTER 25
Learning by Simulating Evolution

Survival of the Fittest
Chromosomes Determine Hereditary Traits 506 | The Fittest Survive 507
Genetic Algorithms
Genetic Algorithms Involve Myriad Analogs 507 | The Standard Method
Equates Fitness with Relative Quality 510 | Genetic Algorithms Generally
Involve Many Choices 512 | It Is Easy to Climb Bump Mountain Without
Crossover 513 | Crossover Enables Genetic Algorithms to Search
High-Dimensional Spaces Efficiently 516 | Crossover Enables Genetic
Algorithms to Traverse Obstructing Moats 516 | The Rank Method Links
Fitness to Quality Rank 518

Survival of the Most Diverse
The Rank-Space Method Links Fitness to Both Quality Rank and Diversity
Rank 520 | The Rank-Space Method Does Well on Moat Mountain 523
| Local Maxima Are Easier to Handle when Diversity Is Maintained 526

Summary

Background

PART III
Vision and Language

CHAPTER 26
Recognizing Objects

Linear Image Combinations
Conventional Wisdom Has Focused on Multilevel Description 531 | Images
Contain Implicit Shape Information 532 | One Approach Is Matching
Against Templates 533 | For One Special Case, Two Images Are Sufficient
to Generate a Third 536 | Identification Is a Matter of Finding Consistent
Coefficients 537 | The Template Approach Handles Arbitrary Rotation and
Translation 539 | The Template Approach Handles Objects with Parts 542
| The Template Approach Handles Complicated Curved Objects 545

Establishing Point Correspondence
Tracking Enables Model Points to Be Kept in Correspondence 547 | Only
Sets of Points Need to Be Matched 547 | Heuristics Help You to Match
Unknown Points to Model Points 549

Summary

Background

CHAPTER 27
Describing Images

Computing Edge Distance
Averaged and Differenced Images Highlight Edges 553 | Multiple-Scale
Stereo Enables Distance Determination 557
Computing Surface Direction
Stereo Analysis Determines Elevations from Satellite Images 563 |
Reflectance Maps Embody Illumination Constraints 564 | Making Synthetic
Images Requires a Reflectance Map 567 | Surface Shading Determines
Surface Direction 567

Summary

Background

CHAPTER 28
Expressing Language Constraints

The Search for an Economical Theory
You Cannot Say That 576 | Phrases Crystallize on Words 576 |
Replacement Examples Support Binary Representation 578 | Many Phrase
Types Have the Same Structure 579 | The X-Bar Hypothesis Says that All
Phrases Have the Same Structure 583

The Search for a Universal Theory
A Theory of Language Ought to Be a Theory of All Languages 586 | A
Theory of Language Ought to Account for Rapid Language Acquisition 588
| A Noun Phrase's Case Is Determined by Its Governor 588 | Subjacency
Limits Wh- Movement 592

Competence versus Performance
Most Linguists Focus on Competence, Not on Performance 596 | Analysis
by Reversing Generation Can Be Silly 596 | Construction of a Language
Understanding Program Remains a Tough Row to Hoe 597 | Engineers
Must Take Shortcuts 597

Summary

Background

CHAPTER 29
Responding to Questions and Commands

Syntactic Transition Nets
Syntactic Transition Nets Are Like Roadmaps 600 | A Powerful Computer
Counted the Long Screwdrivers on the Big Table 601

Semantic Transition Trees
A Relational Database Makes a Good Target 604 | Pattern Instantiation
Is the Key to Relational-Database Retrieval in English 604 | Moving from
Syntactic Nets to Semantic Trees Simplifies Grammar Construction 605 |
Count the Long Screwdrivers 609 | Recursion Replaces Loops 612 |
Q&A Translates Questions into Database-Retrieval Commands 615

Summary

Background

APPENDIX
Relational Databases
Relational Databases Consist of Tables Containing Records 617 | Relations
Are Easy to Modify 617 | Records and Fields Are Easy to Extract 618
| Relations Are Easy to Combine 619

Summary

EXERCISES
Exercises for Chapter 1 627 | Exercises for Chapter 2 628 | Exercises
for Chapter 3 630 | Exercises for Chapter 4 633 | Exercises for Chapter
5 635 | Exercises for Chapter 6 635 | Exercises for Chapter 7 639 |
Exercises for Chapter 8 643 | Exercises for Chapter 9 647 | Exercises
for Chapter 10 649 | Exercises for Chapter 11 650 | Exercises for
Chapter 12 652 | Exercises for Chapter 13 656 | Exercises for Chapter
14 658 | Exercises for Chapter 15 659 | Exercises for Chapter 16 663 |
Exercises for Chapter 17 667 | Exercises for Chapter 18 669 | Exercises
for Chapter 19 671 | Exercises for Chapter 20 674 | Exercises for
Chapter 21 675 | Exercises for Chapter 22 677 | Exercises for Chapter
23 678 | Exercises for Chapter 24 679 | Exercises for Chapter 25 680 |
Exercises for Chapter 26 682 | Exercises for Chapter 27 684 | Exercises
for Chapter 28 689 | Exercises for Chapter 29 690

BIBLIOGRAPHY

INDEX

COLOPHON
14  Backtracking and Truth Maintenance

In this chapter, you see how logic serves as a foundation for other
problem-solving methods. In particular, you learn to prove statements by
constraint propagation. Although ordinary resolution theorem proving is
more powerful in some ways, proof by constraint propagation offers
distinctive benefits. For one thing, proof by constraint propagation makes
it easy to keep track of justifications, enabling truth-maintenance
procedures to withdraw assumptions simply, without deriving still-true
expressions all over again.

First, however, you learn what dependency-directed backtracking is, and
how it differs from chronological backtracking. The key idea is illustrated
through a scheduling example involving exercise, entertainment, and study
time.

CHRONOLOGICAL AND DEPENDENCY-DIRECTED BACKTRACKING

In Chapter 11, you learned how arithmetic constraints can be enforced in
nets. In this section, you learn about dependency-directed backtracking,
which makes it possible to find the sources of inconsistencies.
Limit Boxes Identify Inconsistencies

Imagine that each day of the week involves a set of choices. Some choices
involve entertainment, others involve exercise, and still others involve study.
The choices are constrained by weekly objectives: There must be enough
entertainment, enough exercise, enough study, and not too much money
spent. To make things specific, let us assume these are the options:

- Tuesday, Wednesday, and Thursday are study days. You can study 2,
  4, or 6 hours, or not at all.
- Monday and Friday are exercise days. One choice is to take a walk,
  producing 5 exercise units; another is to jog 10 kilometers, producing
  10 exercise units; another is to work out at a health club, producing
  15 exercise units; and still another is to do nothing. The health club
  costs $20 per use in fees and taxi fares.
- Monday and Friday are also entertainment days. One choice is to go
  out to eat, which costs $20 and returns 2 pleasure units. Another is to
  read a library book, which costs nothing and returns 1 pleasure unit.
  Another is to do nothing, which costs nothing and returns 0 pleasure
  units.

Further assume that, each week, to stay stimulated and healthy, you must
have 6 hours of study, 2 units of pleasure, and 20 units of exercise. To keep
your budget in line, you must limit your expenses to $30 per week.

All these assumptions can be expressed in the slightly generalized
numeric constraint net shown in figure 14.1. The daily choices are expressed
in choice boxes. Numbers are propagated through adder boxes. The
weekly needs are expressed in limit boxes. If the value at the terminal
of a limit box is outside the allowed range, the plan is unacceptable. Now
suppose you plan the week as follows:

- On Monday, you expect snow, making walking and jogging difficult, so
  you plan to go to the health club. This takes time, so you will not have
  time for entertainment.
- You plan to study 2 hours on Tuesday.
- You plan to study 2 more hours on Wednesday.
- You plan to study 2 more hours on Thursday.
- On Friday, you plan to take a walk and to read a library book.

Now, propagating these choices through the numeric constraint net to the
limit boxes, you see that you have met your study needs, for you have
studied 6 hours, yielding 6 study units. Exercise is also under control, for
you have managed to get 20 exercise units. On the other hand, you are in
trouble with entertainment. Worse yet, you cannot correct the problem by
going out to dinner because that would put you over your budget, because
you have already spent $20 on visiting your health club.

You must alter a choice to fix the plan. The question is, How can you
find a good plan quickly, preserving as much of the bad plan as possible?
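
To make the propagation concrete, here is a minimal sketch, not taken from the book, of the same weekly net in Python: the choice boxes become lists of alternatives, the adder boxes become running totals, and the limit boxes become simple tests. The names CHOICES, LIMITS, and violated_limits are illustrative inventions, not Winston's.

```python
# A sketch only: the weekly planning net as choice points, adders, and
# limit checks.  Each contribution is a tuple of
# (study hours, exercise units, pleasure units, dollars spent).
CHOICES = {
    "Mon exercise": [("none", (0, 0, 0, 0)), ("walk", (0, 5, 0, 0)),
                     ("jog", (0, 10, 0, 0)), ("health club", (0, 15, 0, 20))],
    "Mon entertainment": [("none", (0, 0, 0, 0)), ("library book", (0, 0, 1, 0)),
                          ("dinner out", (0, 0, 2, 20))],
    "Tue study": [("study %d h" % h, (h, 0, 0, 0)) for h in (0, 2, 4, 6)],
    "Wed study": [("study %d h" % h, (h, 0, 0, 0)) for h in (0, 2, 4, 6)],
    "Thu study": [("study %d h" % h, (h, 0, 0, 0)) for h in (0, 2, 4, 6)],
    "Fri exercise": [("none", (0, 0, 0, 0)), ("walk", (0, 5, 0, 0)),
                     ("jog", (0, 10, 0, 0)), ("health club", (0, 15, 0, 20))],
    "Fri entertainment": [("none", (0, 0, 0, 0)), ("library book", (0, 0, 1, 0)),
                          ("dinner out", (0, 0, 2, 20))],
}

# Limit boxes: each checks one summed quantity against a weekly objective.
LIMITS = {
    "study":    (0, lambda v: v >= 6),    # at least 6 hours of study
    "exercise": (1, lambda v: v >= 20),   # at least 20 exercise units
    "pleasure": (2, lambda v: v >= 2),    # at least 2 pleasure units
    "budget":   (3, lambda v: v <= 30),   # at most $30 spent
}

def violated_limits(plan):
    """Propagate the chosen contributions through the adders and report
    which limit boxes complain.  plan maps choice-point name -> label."""
    totals = [0, 0, 0, 0]
    for point, label in plan.items():
        contribution = dict(CHOICES[point])[label]
        totals = [t + c for t, c in zip(totals, contribution)]
    return [name for name, (index, ok) in LIMITS.items() if not ok(totals[index])]

plan = {"Mon exercise": "health club", "Mon entertainment": "none",
        "Tue study": "study 2 h", "Wed study": "study 2 h",
        "Thu study": "study 2 h", "Fri exercise": "walk",
        "Fri entertainment": "library book"}
print(violated_limits(plan))   # -> ['pleasure']: only 1 pleasure unit so far
```

Run on the plan just described, the only limit box that complains is the pleasure box, matching the analysis above: study and exercise are satisfied, the budget holds, and entertainment falls one unit short.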
[Figure 14.1: A numeric constraint net expressing activity-planning constraints.
Choice boxes feed adder boxes and limit boxes. Monday's and Friday's exercise
choices: 0, 5, or 10 exercise units at $0, or 15 exercise units at $20.
Monday's and Friday's entertainment choices: 0 pleasure units at $0, 1 pleasure
unit at $0, or 2 pleasure units at $20. Tuesday's, Wednesday's, and Thursday's
study choices: 0, 2, 4, or 6 study units each.]
Chronological Backtracking Wastes Time

One way to work on a faulty plan is to withdraw the most recently made
choice, and its consequences, to select an alternative at that choice point,
and to move ahead again. If all the alternatives at the last choice point have
been explored already, then go further back until an unexplored alternative
is found.

The whole process resembles what you do when you work your way
through a maze: you come to dead ends, you retreat, you go forward
again. As you learned in Chapter 4, this procedure is called depth-first
search. That part of the procedure that responds to dead ends is called
chronological backtracking, to stress that everything is undone as you
move back in time.

Chronological backtracking normally begins as soon as a dead end is
detected, as stipulated in the following procedure description:
To backtrack chronologically,

> Whenever you reach a dead end,
  > Until you encounter a choice point with an unexplored alternative,
    > Withdraw the most recently made choice.
    > Undo all consequences of the withdrawn choice.
  > Move forward again, making a new choice.

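The boxed procedure amounts to depth-first search that backs up from dead ends. The rough sketch below, which reuses the hypothetical CHOICES and violated_limits names from the earlier sketch, makes choices in day order and, at a dead end, withdraws the most recently made choice first.

```python
# A sketch only, reusing CHOICES and violated_limits from the sketch above.
# Choices are made in day order; a dead end sends the search back to the
# most recently made choice that still has unexplored alternatives.

def chronological_backtrack(order, plan=None, stats=None):
    """Depth-first search over the choice points in the given order.
    Returns a complete plan that satisfies every limit box, or None."""
    plan = dict(plan or {})
    stats = stats if stats is not None else {"dead ends": 0}
    if len(plan) == len(order):
        if violated_limits(plan):          # a limit box complains: dead end
            stats["dead ends"] += 1
            return None
        return plan
    point = order[len(plan)]               # next choice point, chronologically
    for label, _ in CHOICES[point]:
        plan[point] = label
        result = chronological_backtrack(order, plan, stats)
        if result is not None:
            return result
        del plan[point]                    # withdraw the choice and move back
    return None

order = ["Mon exercise", "Mon entertainment", "Tue study", "Wed study",
         "Thu study", "Fri exercise", "Fri entertainment"]
stats = {"dead ends": 0}
solution = chronological_backtrack(order, stats=stats)
print(stats["dead ends"], "dead ends before reaching:", solution)
```

Because the search withdraws choices strictly in reverse chronological order, the dead-end count grows with every irrelevant alternative it has to retry, which is exactly the inefficiency described next.
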
The problem with chronological backtracking is clear: Many of the
withdrawn choices may have nothing whatever to do with why the dead end
is a dead end. In the activity-planning example, chronological backtracking
would begin on Friday, when the problem of meeting the entertainment
objective is detected. After quickly withdrawing and trying both of the
remaining entertainment choices, chronological backtracking works its way
back in time, trying other exercise alternatives for Friday, and then going
through all of Thursday's, Wednesday's, and Tuesday's study alternatives,
none of which have anything to do with the entertainment problem that
initiated the backup. Chronological backtracking must work its way through
4³ = 64 study combinations before reaching back to the choice point that
matters.

Thus, chronological backtracking can be inefficient. In real problems,
the inefficiency can make a problem-solving system absurdly impractical.

Nonchronological Backtracking Exploits Dependencies

Another way to work on the faulty plan is to withdraw the choices that
matter. To identify the choices that matter, you need only to keep track of
how propagation reaches from choice boxes to the limit boxes that announce
dead ends.
The highlighted lines in figure 14.2 show that the entertainment limit
box complains because the choices made on Monday and Friday are
incompatible with the weekly entertainment requirement.
