INTERNATIONAL JOURNAL OF COGNITIVE ERGONOMICS, 2001, 5(1), 37–57
Copyright © 2001, Lawrence Erlbaum Associates, Inc.

On the Design of Adaptive Automation for Complex Systems

David B. Kaber
Department of Industrial Engineering
North Carolina State University

Jennifer M. Riley and Kheng-Wooi Tan
Department of Industrial Engineering
Mississippi State University

Mica R. Endsley
SA Technologies
Marietta, Georgia

Requests for reprints should be sent to David B. Kaber, Department of Industrial Engineering, North Carolina State University, Raleigh, NC 27695–7906. E-mail: dbkaber@eos.ncsu.edu

ABSTRACT

This article presents a constrained review of human factors issues relevant to adaptive automation (AA), including designing complex system interfaces to support AA, facilitating human–computer interaction and crew interactions in adaptive system operations, and considering the workload associated with AA management in the design of human roles in adaptive systems. Unfortunately, these issues have received limited attention in earlier reviews of AA. This work is aimed at supporting a general theory of human-centered automation advocating humans as active information processors in complex system control loops to support situation awareness and effective performance. The review demonstrates the need for research into user-centered design of dynamic displays in adaptive systems. It also points to the need for discretion in designing transparent interfaces to facilitate human awareness of modes of automated systems. Finally, the review identifies the need to consider critical human–human interactions in designing adaptive systems. This work describes important branches of a developing framework of AA research and contributes to the general theory of human-centered automation.

1. INTRODUCTION

Adaptive automation (AA) has been described as a form of automation that allows for dynamic changes in control function allocations between a machine and human operator based on states of the collective human–machine system (Hilburn, Byrne, & Parasuraman, 1997; Kaber & Riley, 1999). Interest in dynamic function allocation (DFA, or flexible automation) has increased within the recent past as a result of hypothesized benefits associated with the implementation of AA over traditional technology-centered automation. Purported benefits include alleviating operator out-of-the-loop performance problems and associated issues, including loss of situation awareness (SA) and high mental workload. Though the expected benefits of AA are encouraging, there are many unresolved issues regarding its use. For example, there is currently a lack of common understanding of how human–machine system interfaces should be designed to effectively support implementation of AA.
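
To make the notion of dynamic function allocation concrete, the sketch below shows one way such a scheme could be structured in software: the state of the joint human–machine system is assessed, and individual control functions are reassigned accordingly. This is a minimal illustration under invented assumptions (the state measure, thresholds, and function names are all hypothetical), not a description of any system cited here.

```python
# Minimal sketch of dynamic function allocation (DFA). The assessed state of
# the joint human-machine system drives reassignment of control functions
# between the human and the automation. All names/thresholds are hypothetical.
from enum import Enum

class Agent(Enum):
    HUMAN = "human"
    AUTOMATION = "automation"

def assess_operator_load(sensor_data: dict) -> float:
    """Return a scalar operator-load estimate (0 = idle, 1 = saturated).
    A real system might fuse performance, physiological, and task measures."""
    return sensor_data.get("workload_estimate", 0.5)

def allocate(functions: dict, load: float,
             shed_at: float = 0.8, reclaim_at: float = 0.3) -> dict:
    """Shed functions to automation under high load; return them when low."""
    for name in functions:
        if load > shed_at:
            functions[name] = Agent.AUTOMATION
        elif load < reclaim_at:
            functions[name] = Agent.HUMAN
        # Between the two thresholds, current allocations are left unchanged.
    return functions

# One pass through the adaptive loop:
tasks = {"tracking": Agent.HUMAN, "fault_monitoring": Agent.HUMAN}
print(allocate(tasks, assess_operator_load({"workload_estimate": 0.9})))
```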
In this article, current AA literature is reviewed in the context of a theoretical framework of human-centered automation research, with the objective of identifying critical factors for achieving human–automation integration to support the effective application of AA to complex systems. We describe branches of a research framework supporting human-centered automation that seem to have been neglected by previous literature reviews, including the implications of the design of AA for operator workload and the effects of AA on human–computer interaction (HCI) and crew interaction. This work is important because an optimal approach to AA remains elusive. Developing a unified perspective on the aforementioned issues may serve as a basis for additional design guidance to structure AA applications beyond that previously provided.

1.1. Human-Centered Automation Theory and AA

A theory of human-centered automation closely related to AA states that complex systems should be designed to support operator achievement of SA through meaningful involvement of operators in control operations (Endsley, 1995b, 1996; Kaber & Endsley, 1997). Involvement may occur through intermediate levels of automation (LOAs) or through AA. Both techniques may be effective for increasing operator involvement in control operations as compared to full automation. Human-centered automation is concerned with SA because it has been found to be critical to successful human operator performance in complex and dynamic system operations (cf. Endsley, 1995a). AA has been proposed as a vehicle for moderating operator workload, or maintaining it within predetermined acceptable limits based on task or work environment characteristics, to facilitate and preserve good SA (Hilburn et al., 1997; Kaber & Riley, 1999). Therefore, AA might be considered a form of human-centered automation. Unfortunately, the relation between SA and workload presents a conundrum to those designing automation. Optimization of both SA and workload in the face of automation can prove difficult. Under the low workload conditions associated with high levels of system automation, operators may experience boredom and fatigue due to lack of cognitive involvement in, or interest in, control tasks. Operators of autonomous systems are often forced into the task of passive monitoring of computer actions rather than active task processing. Even when attending to the monitoring task, decreased task involvement can compromise operator SA (Endsley & Kaber, 1999; Endsley & Kiris, 1995; Pope, Comstock, Bartolome, Bogart, & Burdette, 1994). This is an important issue because operators with poor SA may find it difficult to reorient themselves to system functioning in times of system failure or unpredicted events. Therefore, automated system performance under failure modes may be compromised.
Conversely, cognitive overload may occur when operators must perform complex, or a large number of, tasks under low levels of system automation (e.g., complete manual control). High workload can lead directly to low levels of SA and task performance, as operators struggle to keep up with the dynamically changing system. Increasing task requirements beyond what the human is cognitively capable of managing can also lead to feelings of frustration and defeat, as well as a loss of confidence in the ability to complete the task. The operator may then become detached from the task, resulting in loss of SA. Again, the loss of SA can lead directly to poor human–machine system performance.

The first situation described above may be due to system and task design. The second situation may result from operator reactions to a difficult task. It should be noted that, between these two extremes, SA and workload have been found to vary independently (Endsley, 1993). The challenge for AA research is to identify the optimal workload, or functional range, under which good levels of operator SA and total system performance will be possible.
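
Read operationally, the "functional range" idea treats workload as a controlled variable: the system steps through an ordered ladder of LOAs to hold workload inside a target band. The following sketch illustrates that reading with invented band limits and level names; it is not a validated adaptation policy.

```python
# Hypothetical illustration of holding workload within a "functional range"
# by stepping through an ordered ladder of levels of automation (LOAs).
LOA_LADDER = ["manual", "decision_support", "supervisory", "full_automation"]

def adjust_loa(current: str, workload: float,
               band: tuple = (0.4, 0.7)) -> str:
    """Raise the LOA when workload exceeds the band; lower it when workload
    falls below the band (guarding against boredom and disengagement)."""
    low, high = band
    i = LOA_LADDER.index(current)
    if workload > high and i < len(LOA_LADDER) - 1:
        return LOA_LADDER[i + 1]  # offload work to the automation
    if workload < low and i > 0:
        return LOA_LADDER[i - 1]  # re-involve the operator
    return current                # workload is in range: no change

print(adjust_loa("decision_support", workload=0.85))  # -> "supervisory"
print(adjust_loa("supervisory", workload=0.25))       # -> "decision_support"
```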
The key issues that must be addressed to meet this need include determining how the design of automation or AA methods affects operator workload and how system information should be communicated to operators to facilitate SA under AA. Several studies have demonstrated positive results in terms of operator SA when applying AA as an approach to human-centered automation of complex systems. For example, Kaber (1997) observed improvements in SA in a simulated dynamic control task ("radar" monitoring and target elimination) when using a preprogrammed schedule of periodic shifts of task control between intermediate- and high-level automation and manual control, as compared to fully autonomous or completely manual control. Although important for establishing preliminary system design guidelines and providing insights into methods of AA, this work and other recent studies (e.g., Kaber & Riley, 1999) have been conducted using specific task and operational scenarios and, therefore, the results may have limited generalizability to a broad range of systems.
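
The preprogrammed schedule used by Kaber (1997) can be pictured as a simple time-based cycle over control modes. The sketch below shows the general pattern only; the cycle durations and mode labels are invented, not the study's actual protocol.

```python
# Sketch of a preprogrammed (time-based) AA schedule that periodically cycles
# task control among manual, intermediate, and high-level automation.
# Durations and ordering are invented for illustration.
SCHEDULE = [                      # (mode, duration in seconds)
    ("manual", 120),
    ("intermediate_automation", 120),
    ("high_level_automation", 120),
]
CYCLE = sum(duration for _, duration in SCHEDULE)

def mode_at(elapsed_s: float) -> str:
    """Return the control mode in force at a given elapsed task time."""
    t = elapsed_s % CYCLE
    for mode, duration in SCHEDULE:
        if t < duration:
            return mode
        t -= duration
    return SCHEDULE[-1][0]  # unreachable; kept for completeness

print(mode_at(150))  # -> "intermediate_automation"
```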
Unfortunately, at this point there exists no theory of AA that can optimally address SA and workload tradeoffs across all types of complex systems (e.g., air traffic control, production control, and telerobotic systems). This article seeks to address this issue by supporting the concept of human-centered automation and presenting an understanding of aspects of the relation of AA to SA and workload not previously explored in detail.

1.2. Previous Research

Preliminary or casual reviews of AA research have been published (cf. Parasuraman, Mouloua, Molloy, & Hilburn, 1996; Scerbo, 1996) that summarize empirical studies of the concept and make inferences toward a general theory of AA. For example, Scerbo's work includes a brief review of traditional automation, proposed AA mechanisms and strategies, and potential benefits of and concerns with the implementation of AA. Our work complements this effort by discussing some new issues, such as

1. Failures of AA designs to consider the operator workload associated with managing dynamic control allocations between operators and automated systems, in addition to maintaining system task responsibilities.
2. The need to determine how human–computer interfaces should be designed to support effective human–automation communication under AA.
3. The need to evaluate the impact of the implementation of AA on human crew interactions in systems control.
These issues are considered in the context of the human-centered automation theory, with the intent of developing a more complete knowledge of AA.

2. WORKLOAD AND AA

Unfortunately, it has been observed through empirical study of AA that operators of many complex, dynamic systems may experience workloads above desired levels as a result of concentrating on control function allocations and maintaining task responsibilities simultaneously (Kaber & Riley, 1999; Scerbo, 1996). An increase in human operator workload associated with the introduction of automation in complex systems is not a new issue. Selcon (1990) observed that fighter aircraft pilots' perceptions of flight workload increased significantly with the introduction of automated decision aids into aircraft cockpits.

There are two general cases in which perceived workload increases may occur in applications of AA. First, operators may perceive increased cognitive load in monitoring computer management of function allocations between themselves and automated subsystems (Endsley, 1996). This may be due in part to operator anxiety about the timing of allocations and the need to complete a particular task during system operations. It may also be attributed to an additional load on the visual channel in perceiving task-relevant information on "who is doing what."

The second case involves implementation strategies of AA in which the human has the task of managing function allocations in addition to performing routine operations. Under these circumstances, workload increases may be even greater than those associated with monitoring computer-based dynamic control allocations (Selcon, 1990). Additional reports indicate operators may have trouble identifying when they need to switch from manual to automated modes or vice versa (Air Transport Association, 1999). Failures to invoke automation or manual control have been attributed to operator overload, incapacitation, unawareness of the need for a different LOA, or poor decision making (Endsley, 1996).
Kaber and Riley (1999) studied the effect of AA on operator workload during dual-task performance involving a primary dynamic control task and an embedded secondary monitoring task. Participants in this study were provided with a computer decision aid that either suggested or mandated DFAs between manual and automated control of the primary task based on participant performance in the secondary task. The authors' objective was to maintain secondary task performance within 20% of the optimal performance observed during testing in the absence of primary task control. Average secondary task performance levels during dual-task functioning were within approximately 30% of optimal secondary task performance. It is important to note that when the primary task was fully automated, secondary task performance was within 5% of optimal. However, automated primary task performance may not have been superior to AA of the task. Kaber and Riley attributed the observed decrease in performance (indicative of increased workload) to the need for individuals to monitor automated dynamic control allocations or to manage them, which was not considered in establishing optimal secondary task performance baselines or in the design of the dual-task paradigm. This is an important issue that needs to be considered by future research to ensure that AA achieves the objectives of human-centered automation (i.e., moderating workload and maintaining SA). Methods for dealing with AA-induced workload must be devised. A critical step toward developing such techniques would be to evaluate the operator workload associated with the implementation of general AA strategies separately from system task workload. These workload components could then be used to drive AA design.
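
As one reading of the Kaber and Riley (1999) triggering logic, the sketch below encodes a rule of the kind the study describes: when secondary-task performance drifts beyond a 20% tolerance from its single-task baseline, the decision aid recommends reallocating the primary task. The variable names and the rule for returning control are assumptions for illustration, not the study's actual software.

```python
# Hedged sketch of a performance-based DFA trigger: secondary-task performance
# drifting more than 20% below its single-task baseline prompts handing the
# primary task to automation; recovery to baseline hands it back.
def dfa_recommendation(secondary_score: float, baseline: float,
                       primary_mode: str, tolerance: float = 0.20) -> str:
    """Recommend a primary-task allocation from secondary-task drift."""
    shortfall = (baseline - secondary_score) / baseline
    if shortfall > tolerance and primary_mode == "manual":
        return "automated"   # degraded monitoring suggests overload
    if shortfall <= 0 and primary_mode == "automated":
        return "manual"      # monitoring at/above baseline: resume control
    return primary_mode      # within tolerance: leave the allocation alone

print(dfa_recommendation(secondary_score=70, baseline=100,
                         primary_mode="manual"))  # -> "automated"
```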

3. INTERFACE DESIGN FOR AA

In addition to considering the effects of AA on workload, the effects on operator SA must also be considered. Implementation of AA may introduce added complexity into system functioning and control. Consequently, operators require advanced interfaces that help them deal with this complexity and that enhance, rather than hinder, system performance. AA will require extra attention to developing interfaces that support operator SA needs at varying LOAs, in ways that support operators' ability to transition between manual and automated control and back again.

Scerbo (1996) suggested that the success of AA will in large part be determined by system interface designs that include all methods of information exchange (e.g., visual, auditory, haptic, etc.). With this in mind, one goal of interface design for AA systems is akin to that of HCI research: to facilitate the transmission of information to and from the human and the system without imposing undue cognitive effort on the operator in translating the information. There are many other general human factors interface design principles for complex systems that may have applicability to interfaces for AA, including, for example, the list provided by Noah and Halpin (see Rouse, 1988). However, what is needed at this point are high-level and specific interface design recommendations presented in the context of the systems in which AA is most common, such as aircraft.

3.1. AA and Cockpit Interfaces

Although aircraft systems currently support a crude level of AA (pilots may shift between manual and automated control at will), a number of problems with this process have been noted. For instance, today's automated flight management systems do not adequately support pilots in coordinating between information meant to support manual flight and that meant to support automated flight (Abbott, Slotte, & Stimson, 1996). For example, the aircrew of American Airlines Flight 965, which crashed in Cali, Colombia, in 1995, was forced to struggle with paper maps and displays that used different nomenclatures and provided different reference points, making it very difficult to coordinate between manual and automated operations (Endsley & Strauch, 1997). The crew furthermore had only partial information provided through any one source and, therefore, were required to integrate cryptic flight plan information in working memory. These discrepancies leave pilots faltering in trying to work with systems that do not support their operational needs. The system interfaces are poorly designed in terms of providing the SA needed for understanding the behavior of the aircraft in automated modes, and pilots' predictions of what a system may do in any given situation have proven erratic. Attempts by pilots to make dynamic shifts in LOAs in situationally appropriate ways have been shown to be fraught with problems (Air Transport Association, 1999; Endsley & Strauch, 1997), and aircraft interfaces do not allow pilots to track shifts and to effectively and efficiently adapt to them.
At a very basic level, system displays for supporting manual and automated control need to be consistent and coordinated to allow smooth transition from one mode of operation to another. In the context of aviation systems, Palmer, Rogers, Press, Latorella, and Abbott (1995) stated that interface design should

1. Foster effective communication of activities, task status, and mission goals, as well as the development of useful and realistic conceptual models of system behavior.
2. Enhance operator awareness of his or her own responsibilities, capabilities, and limitations, as well as those of other team members.
3. Support DFA that is quick, easy, and unambiguous.

The latter recommendation is directed at AA and at supporting pilot performance when shifts in LOAs occur. These are important recommendations because the way in which an interface presents information to the user will impact what is perceived, how accurately information is interpreted, and to what degree it is compatible with user needs or models of task performance (all of which may critically influence operator development of good SA on modes of operation of a complex system).
Unfortunately, the application of AA to complex systems like aircraft often increases rather than decreases the amount of information an operator must perceive and use for task performance, including data on system automation configuration and schedules of control function allocations. On the basis of Palmer et al.'s (1995) recommendations, interfaces for AA must support integration of such data regarding "who is doing what" with task-relevant data, and they should ensure that all information is presented in a cohesive manner; function allocation information should therefore have meaning for current task performance. For example, aircraft automated vertical flight control modes should provide guidance on the operation of different types of speed control (e.g., speed controlled via elevators with maximum thrust or idle thrust) and altitude control (e.g., vertical speed or altitude controlled via the elevators and speed controlled via throttles) on the basis of the current phase of flight and current flight segment, as well as the current LOA for flight control (Feary et al., 1998).
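
The context-sensitive guidance described above amounts to a lookup from the current flight phase and flight-control LOA to a plain-language statement of how speed and altitude are being controlled. The toy mapping below illustrates the idea; the phases, mode names, and guidance strings are invented and do not reproduce Feary et al.'s (1998) design.

```python
# Toy mapping from (phase of flight, flight-control mode) to plain-language
# guidance about the active speed/altitude control strategy. All entries are
# invented for illustration.
GUIDANCE = {
    ("climb", "vnav"): "Speed controlled via elevators, maximum climb thrust.",
    ("descent", "vnav"): "Speed controlled via elevators, idle thrust.",
    ("cruise", "alt_hold"): "Altitude via elevators, speed via autothrottle.",
}

def mode_guidance(phase: str, mode: str) -> str:
    return GUIDANCE.get((phase, mode),
                        "No stored guidance for this phase/mode combination.")

print(mode_guidance("descent", "vnav"))
```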
In addition to these recommendations, interfaces are needed to facilitate the development of strong mental models of how such a complex system will function across many classes of situations. Lehner (1987) stated that accurate mental models are important because HCI can remain effective even when there is significant inconsistency between the problem-solving processes of the human and the decision support system, although system error conditions may occur in which recovery is possible by only one method of operation. Cockpit interfaces for supporting mental models of automated systems in aircraft operations have been found to be very poor, leading to significant difficulties in understanding system behavior (McClumpha & James, 1994; Wiener, 1989).

In particular, mental model development can be affected by system response feedback on a user's actions through an interface, in addition to consistently displayed system state information. Feedback allows the operator to evaluate the system state in relation to his or her control actions, goals, and expectations of system functioning. Both individual and team feedback of knowledge of system states and responses have been shown to optimize human–machine performance (Krahl, LoVerde, & Scerbo, 1999). Lack of feedback forces the human into an open-loop processing situation in which performance is generally poor (Wickens, 1992).
Although good SA and good mental models are fundamental to the operation of automated systems in general, achieving them can be even more challenging with the added complexity of AA. System interfaces need to support the understanding of not just one system but multiple systems, in that at different levels of AA the system may operate in very different ways.

3.2. Dynamic (Cockpit) Displays for AA

Morrison, Gluckman, and Deaton (1991) also raised general interface design issues that should be considered when implementing AA in the airplane cockpit. They stated that automated tasks may require new interfaces and cues so that (a) the status of the automation is clearly indicated to the human, (b) effective coordination of task performance is facilitated, (c) monitoring of the automated task by the human is encouraged, and (d) manual performance of the task after automation is not negatively affected. These interface characteristics are similar to the design recommendations made by Palmer et al. (1995). Unfortunately, these recommendations do not offer specific interface design guidelines for AA. However, like many other AA researchers, Morrison et al. are proponents of adaptive interfaces, or displays that change dynamically according to changes in AA control allocations, to ensure the effective coordination of task performance.

Introducing dynamic displays into adaptive system interface design is currently a critical research issue. Dynamic displays can allow for consideration of operator information requirements, as well as styles of interaction, through their configuration. For example, dynamic displays implemented in the aircraft cockpit can present specific interface features based on different modes of automated flight and the functional roles of pilots under different modes. They can also allow pilots to select or deselect features according to their individual information needs and styles of flying the aircraft (Wiener, 1988). By allowing for flexible configuration of displays and meeting pilot information requirements, SA may be enhanced and performance made effective across modes of aircraft operation.
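
Such a dynamic display can be thought of as a baseline feature set keyed to the active automation mode, merged with the pilot's own selections and deselections. The sketch below shows that composition; the mode names and feature sets are hypothetical.

```python
# Sketch of a mode-keyed dynamic display: the visible feature set is the
# baseline for the active automation mode, adjusted by pilot preferences.
# Mode names and features are hypothetical.
MODE_FEATURES = {
    "manual": {"airspeed_tape", "attitude_indicator", "raw_nav_data"},
    "autopilot": {"airspeed_tape", "attitude_indicator",
                  "mode_annunciator", "flight_plan_overlay"},
}

def display_config(mode: str, selected: set, deselected: set) -> set:
    """Compose the visible feature set for the current automation mode."""
    base = set(MODE_FEATURES.get(mode, set()))
    return (base | selected) - deselected

print(display_config("autopilot",
                     selected={"wind_vector"},
                     deselected={"flight_plan_overlay"}))
```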
Dynamic displays have, however, been noted to cause human–machine system performance problems depending on how they are implemented. If dynamic displays are optimized to include just the information that supports a particular mode of operation, the global SA needed to support operators' knowledge of when to switch modes may be lacking. That is, operators also need information that will alert them to the need to switch from automated to manual control, and information that will support such a transition smoothly. Display interfaces that are optimized for automated control may lack sufficient information to allow operators to build up this level of understanding. The same can be said of the transition from manual to automated control, although this may not be as difficult. Norman (1990) noted that designers often leave critical information out of automated displays in the belief that operators no longer need that information.
From the opposite perspective, Wiener (1988) pointed out that there is a potential for display clutter and for ill-considered symbols, text, and color in many dynamic display designs for complex systems. This is brought about by the designer attitude that if something can be included in the interface, then it should be. This approach to interface design strays from the theory of human-centered automation (Billings, 1997). A number of AA research studies have been conducted to establish interface design approaches that address this tendency. For example, direct manipulation interface design was proposed by Jacob (1989) as an interface style for use in AA systems to offset some performance disadvantages associated with dynamic displays that are linked to different modes of automation and to address the lack of transparency of system functions through interfaces under high LOAs. The lack of function transparency has been associated with mode awareness problems (Sarter, 1995).
Ballas, Heitmeyer, and Perez (1991) studied the direct manipulation interface style to determine whether it would be particularly effective in an intelligent cockpit implemented with AA. Two important components of direct manipulation that were anticipated to improve performance included (a) reduced information processing distance between the users' intentions and the machine states and (b) direct engagement, without undue delay in system response and with a relatively transparent interface. Ballas et al. found that using direct manipulation and maintaining a consistent interface style could offset the negative effects of changing controls and displays (dynamic displays). They also speculated that direct manipulation would enhance SA in assessment tasks in particular and, consequently, would have the potential to reduce automation- and dynamic display-induced performance disadvantages.
In support of these findings, other research has shown that adaptive systems providing indirect manipulation and opaque interfaces have negative effects on human–computer communication and overall system performance, as they may restrict human interaction with the system (Scerbo, 1996). Sarter and Woods (1994) observed an automation opacity problem with adaptive systems and claimed that user data interpretation becomes a cognitively demanding task rather than a mentally economical one.

On the basis of this research in the context of adaptive systems, Scerbo (1996) encouraged designers of interfaces for AA to include as many information formats as possible to allow data to flow more freely between the human and the system. In this way, operators may be able to communicate more naturally because information translation would not be limited to one or two formats. However, it is important to ensure that the multimodal interface capabilities of contemporary complex systems are not exploited to the extent of causing information overload, as Wiener (1988) observed in earlier dynamic displays.

3.3. Summary of Interface Design Research for AA

Some general guidelines for AA have been presented in context, but they do not offer the specificity needed to fully support design. Further applied work is needed in this area to evaluate the degree to which the designs of dynamic displays support human performance without increasing cognitive and perceptual loading. In addition, work should be done to explore the effects of using multiple display formats, as some researchers have suggested, for meeting specific operator information requirements while simultaneously ensuring global awareness of system states and of changes among modes of operation. In particular, careful attention needs to be paid to the extra demands associated with detecting the need for, and effecting, smooth transitions between AA modes.

4. AA AND HCI

Communication is a critical factor in achieving effective human–automation integration. Most researchers agree that effective communication among complex system components is critical for overall system success (see Scerbo, 1996). This is because each individual component or member of the system (the human or the computer) possesses knowledge and information that other members may not. Thus, each member must share information to make decisions and carry out actions.
Within the context of human–human teams, this need has been termed shared SA. Shared SA is defined as the degree to which team members have the same awareness of the information requirements for team performance. It is related to team SA, which is "the degree to which each team member has the information needed for his/her job" (Endsley & Jones, 1997, p. 47). Shared SA incorporates not only information on system states, but also the effect of the task status and actions of other team members on one's own goals and tasks (and vice versa) and projections of the future actions of other team members. For a human–machine team, the same need exists. The machine will have certain expectations of human behavior built into it and needs to ascertain what actions have or have not been taken in relation to its programming. The human operator needs to understand not only what the machine has done, but also what it is doing and will do next. Failures in this shared SA among humans and machines are well documented (Wiener, 1989).

The tendency for information sharing between parties may change with changes in system function allocation and the LOAs associated with AA. This must be considered in AA design and interface design. The question can be raised as to whether the human operator will be able to continue communicating with automation effectively, without performance implications, when the mode of system automation changes dynamically, regardless of the quality of the interface design. The mode of system automation, the structure of the operator's role, and operator workload may inhibit critical information flow and, in the worst case, allow the human only to observe the system. Because of the manner in which automation is structured in a supervisory control system, human operators are not permitted involvement in active decision making on a routine basis during system operations. Process control interventions can be used for error prevention, but they do not provide for regular communication between the operator and the system automation. This is unlike other forms of automation, such as batch processing systems, in which operators are involved in active control of the system and communicate with the automation in planning and decision-making tasks on a regular basis, although the communication may relate to future processes.

Active versus passive decision making has been identified as a critical factor in operator out-of-the-(control)-loop performance problems, including a loss of SA (Endsley & Kiris, 1995). Under supervisory control, operators are normally provided with high-level summaries of the system functions handled by automation (Usher & Kaber, 2000). This form of feedback may be sufficient for monitoring the safety of system states, but it is usually inadequate for decision making toward planning operations and so forth. This problem extends beyond interface design, as it is rooted in the adaptive structuring of the system and the natural behavior of human operators, although it may be affected by interface design changes. Research needs to identify how effective human–automation interaction can be maintained across LOAs, regardless of changes in the role of the operator, to promote SA and performance when DFAs occur.

4.1. Establishing a Human–Automation Relationship and Potential Problems

To ensure effective human–automation communication under AA, Bubb-Lewis and Scerbo (1997) stated that a working relationship between the human and the system must be developed. Muir (1987) offered some suggestions for developing this relationship in adaptively automated systems, including providing operators with information such as the machine's areas of competence, training operators in how the system works, providing them with actual performance data, defining criterion levels of acceptable performance, and supplying operators with predictability data on how reliable the system is. Appropriate feedback mechanisms and reinforcement during training are also important ingredients in developing a relationship and creating an effective human–computer team.
Key problems in training and performance that can serve to undermine the human–automation relationship and interaction under AA include human misinterpretation of information. This problem may stem from the inability of the human to assess the intention of the computer system (Bubb-Lewis & Scerbo, 1997; Mosier & Skitka, 1996). As a result of these misinterpretations, Suchman (see Bubb-Lewis & Scerbo, 1997, p. 96) claimed that systems can lead humans "down the garden path," sometimes never reaching a solution. To prevent this type of problem, Woods, Roth, and Bennett (1990; also see Bubb-Lewis & Scerbo, 1997) suggested presenting the machine's current state, goals, knowledge, hypotheses, and intentions to the human in a clear and

This document is available on Docket Alarm but you must sign up to view it.


Or .

Accessing this document will incur an additional charge of $.

After purchase, you can access this document again without charge.

Accept $ Charge
throbber

Still Working On It

This document is taking longer than usual to download. This can happen if we need to contact the court directly to obtain the document and their servers are running slowly.

Give it another minute or two to complete, and then try the refresh button.

throbber

A few More Minutes ... Still Working

It can take up to 5 minutes for us to download a document if the court servers are running slowly.

Thank you for your continued patience.

This document could not be displayed.

We could not find this document within its docket. Please go back to the docket page and check the link. If that does not work, go back to the docket and refresh it to pull the newest information.

Your account does not support viewing this document.

You need a Paid Account to view this document. Click here to change your account type.

Your account does not support viewing this document.

Set your membership status to view this document.

With a Docket Alarm membership, you'll get a whole lot more, including:

  • Up-to-date information for this case.
  • Email alerts whenever there is an update.
  • Full text search for other cases.
  • Get email alerts whenever a new case matches your search.

Become a Member

One Moment Please

The filing “” is large (MB) and is being downloaded.

Please refresh this page in a few minutes to see if the filing has been downloaded. The filing will also be emailed to you when the download completes.

Your document is on its way!

If you do not receive the document in five minutes, contact support at support@docketalarm.com.

Sealed Document

We are unable to display this document, it may be under a court ordered seal.

If you have proper credentials to access the file, you may proceed directly to the court's system using your government issued username and password.


Access Government Site

We are redirecting you
to a mobile optimized page.





Document Unreadable or Corrupt

Refresh this Document
Go to the Docket

We are unable to display this document.

Refresh this Document
Go to the Docket