
Issues Related to Design of Automatic Systems – from a Human Factors Perspective
Human-Centered Automation
Ann Britt Skjerve, Institute for Energy Technology

Content
Part I: Overview
• Background and definition
• The operators' role
• Key research issues:
  1. Task allocation
  2. Human-system interface design
  3. Effects on the individual and the organisation
Domain: high-risk systems.
Part II: Two examples of interface design studies
• Human-Centred Automation experiments
• Extended Teamwork study

Background
• 1952: The term "automation" was applied in an article in Scientific American
• Mechanization of human labour
  – Overcome human capacity problems
  – Automation of "physical" tasks
  – Routine tasks → production increase
• Complex and safety-critical tasks
  – Automation of "control" tasks (Crossman, 1974)
  – Automation of "management" tasks (Billings, 1991)

Definition(s)
Oxford English Dictionary: automation
• Automatic control of the manufacture of a product through a number of successive stages;
• The application of automatic control to any branch of industry or science;
• By extension, the use of electronic or mechanical devices to replace human labour.

Why are Humans still in High-Risk Systems?
• Not all tasks can be automated…
  – Degree of proceduralization
  – Automation may fail
  – Technology
• Cost effectiveness
• Legal requirements
• Public opinion

Classification Systems: Ten Levels of Automation – an example (Sheridan, 1980)
Degree of computer participation, from LOW to HIGH:
1. Human considers alternatives, makes and implements decision.
2. Computer offers a set of alternatives which human may ignore in making decision.
3. Computer offers a restricted set of alternatives, and human decides which to implement.
4. Computer offers a restricted set of alternatives and suggests one, but human still makes and implements final decision.
5. Computer offers a restricted set of alternatives and suggests one, which it will implement if the human approves.
6. Computer makes decision, but gives human option to veto before implementation.
7. Computer makes and implements decision, but must inform human after the fact.
8. Computer makes and implements decision, and informs human only if asked to.
9. Computer makes and implements decision, and informs human only if it feels this is warranted.
10. Computer makes and implements decision if it feels it should, and informs human only if it feels this is warranted.
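Not part of the original slides: as an illustration only, Sheridan's taxonomy lends itself to a simple ordered data structure, e.g. for tagging functions during a design review. A minimal Python sketch, where the names `AutomationLevel` and `human_decides_before_execution` are hypothetical:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Sheridan's (1980) levels, ordered by degree of computer participation."""
    HUMAN_DECIDES = 1        # human considers, decides and implements
    COMPUTER_OFFERS = 2      # computer offers alternatives; human may ignore them
    RESTRICTED_SET = 3       # computer offers a restricted set; human decides
    SUGGESTS_ONE = 4         # computer suggests one; human decides and implements
    HUMAN_APPROVES = 5       # computer implements its suggestion if human approves
    HUMAN_VETOES = 6         # computer decides; human may veto before execution
    INFORM_AFTER = 7         # computer acts, must inform human afterwards
    INFORM_IF_ASKED = 8      # computer acts, informs human only on request
    INFORM_IF_WARRANTED = 9  # computer acts, informs if it deems it warranted
    FULL_AUTONOMY = 10       # computer decides whether to act and whether to inform

def human_decides_before_execution(level: AutomationLevel) -> bool:
    """At levels 1-6 the human is still in the loop before anything is executed."""
    return level <= AutomationLevel.HUMAN_VETOES

print(human_decides_before_execution(AutomationLevel.HUMAN_APPROVES))  # True
```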

Effects of Automation 1/2
Some positive effects:
– Increased production levels
  • Automatic train control (ATC): trains run faster and with shorter distances between them
– Each new generation of commercial aircraft has improved on the safety record of its predecessors
– Automation as a "key" element of competitiveness

Aircraft Generations
[Figure: Accident rate for three generations of aircraft (Airbus Industry Safety Department "Hangar Flying", June 1997, as cited in Pariès and Amalberti, 1999).]

Effects of Automation 2/2
Some negative effects associated with automation use:
– Increased complexity for the human operator
– Reduced safety margins
– Operators are left to deal with automation malfunctions
– Reduced possibility for practising operational skills

Key Research Issues – From a HF Perspective
1. Task allocation – How should tasks be allocated between humans and machines?
2. Design of the human-system interface – How should the human-system interface be designed to support the operators' performance?
3. Effects on the individual and the organisation – How are the individual and the organisation affected by automation?

Task Allocation
How should tasks be allocated between humans and machines?
Three strategies for task allocation:
• The Left-Over Principle
• The Comparison Principle
• The Complementary Principle

The Left-Over Principle 1/2
• Operators are the most unreliable element
• To the extent possible, operators should be eliminated from the production process
• Automate everything that can be automated
• The tasks that cannot be automated (i.e., fully proceduralised) are left over to the operators.
Example: "To improve the reliability of NPPs, it is primarily effective to automate the hardware as much as possible and to eliminate to the maximum extent human intervention by recognition, judgement and response to information." (Inoue et al., 1991, p. 449)

The Left-Over Principle 2/2
Critique:
– Tasks are left over to the operators without considering human capacity issues
  • Vigilance
  • Workload
  • Cognitive requirements
"…the designer who tries to eliminate the operator still leaves the operator to do the tasks which the designer cannot think how to automate." (Bainbridge, 1993)

The Comparison Principle 1/2
• Human operators and automatic systems have different capabilities
• Allocate each task to the 'agent' that is better suited to perform it
Fitts' List (1951)

The Comparison Principle 2/2
Critique:
– Tasks are allocated without considering the human operators' overall task-performance process
– The overall operator tasks may not correspond to human capacity
– Etc.

The Complementary Principle 1/2
• Critique of the Left-Over and the Comparison principles: consideration is given to how the different task elements should be allocated, not to how the human and the automatic system should perform the task together.
• Optimal task allocation is achieved by ensuring that the performance of the operators and the automatic system complement each other
  – How will the automatic system and the human operators most efficiently perform the operational task?

The Complementary Principle 2/2
Critique:
– Task performance is a dynamic process
  • It can be difficult to foresee in advance how a task-performance process will progress, and thus how humans and automation may most efficiently complement each other
– The limits of technology vs. the apparent adaptability of humans.

Design of the Human-System Interface
How should the human-system interface be designed to support the operators' performance?
Changed operator role:
– From primarily involving operation to primarily involving supervision and deviation handling.

Human-System Interface Design Issues 1/2
Difficulties associated with human-automation interaction:
– Monitoring load
– Vigilance
– Workload distribution
– Silent automation
– 'Automation surprises'
"After three decades of highly prolific research on human vigilance, we are still making the same seemingly contradictory statement: a human being is a poor monitor, but that is what he or she ought to be doing." (Wickens, 1992)

Human-System Interface Design Issues 2/2
• Representation of the system's activity, current problems:
  – Physical and mental isolation (Norman, 1990)
    • Isolated from the moment-to-moment activity
  – Workload distribution: too high or too low workload
  – Increased complexity: understanding what happens in situations with deviations
  – "Out-of-the-loop"
• Technical design, current issues in terms of Human Factors:
  – Compensatory activity may hide deviations from the operators
  – Reduced time-span to handle deviations

Human-Centred Design (Rouse, 1991)
• Three central attributes:
  – It focuses on the roles of humans in complex systems
  – Design objectives are elaborated in terms of humans' roles
  – Specific design issues follow from these objectives
• Three primary objectives:
  – To enhance human abilities
  – To help overcome human limitations
  – To foster user acceptance
• Example of the approach: "…the purpose of a pilot is not to fly the airplane that takes people from A to B – instead, the purpose of the airplane is to support the pilot who is responsible for taking people from A to B." (Rouse, 1991)

Human-Centred Automation 1/2
Human-Centred Automation (HCA)
• Definition: "Automation designed to work cooperatively with the human operators in the pursuit of stated objectives." (Billings, 1991)
• Assumption: The human operator should always constitute the starting point in a design process, because the operator is ultimately responsible for the performance outcome.

Human-Centred Automation 2/2
The HCA design principles:
The human operator must be in command:
1) To command effectively, the human operator must be involved.
2) To be involved, the human operator must be informed.
3) The human operator must be able to monitor the automated systems.
4) Automated systems must be predictable.
5) The automated systems must also be able to monitor the human operator.
6) Each element of the system must have knowledge of the others' intent.
(Billings, 1991, 1997)

The Gap between User-Centred Intentions and Technology-Centred Development
Some causes (on the developers' side):
• Oversimplify the pressures and task demands from the users' perspective
• Assume that people can and will call to mind all relevant knowledge
• Are overconfident that they have taken into account all meaningful circumstances and scenarios
• Assume that machines never err
• Make assumptions about how technology impacts human performance without checking for empirical support, or despite contrary evidence
• Define design decisions in terms of what it takes to get the technology to work
• Sacrifice user-oriented aspects first when tradeoffs arise
• Focus on building the system first, then try to integrate the results with users.
(Sarter, Woods and Billings, 1997)

Effects on the Individual and the Organisation
How are the individual and the organisation affected by automation?
Changed operator role:
– New design → new ways of working…

Individual Skills
• Manual control skills
  – Gradual decay
• Cognitive skills
  – Frequency of use, retrieval
  – Reduced feedback, memory
• Vigilance
  – Decrement after about 30 minutes of monitoring

Organizational Issues
• Changes in work content
• Changes in work practices
  – Changes in the lines of authority
  – Changes in the responsibility of staff members
• Changes related to status (→ self-esteem)
These changes affect motivation, job satisfaction and safety.
• Education and training
• Possibility for intervening
• Willingness to intervene
Will the system in practice fulfil the goals it was designed to fulfil?

TWO EXAMPLES – Focusing on Interface Design (Part II)
Interface Design
– Starting point: Task allocation has been decided (see the ISO model)
– Question: How should the human-system interface be designed to support human-automation transaction?
Two research programs:
– Human-Centred Automation
– Extended Teamwork
[Figure: Control Centre Design and Modification Process, based on ISO Std. 11064-1, 2000.]

IFE's Human-Centered Automation (HCA) Research Program

Introduction to the HCA Program, cont.
Motivation: To provide a better understanding of how operators' performance is influenced by automation, in order to reduce the negative effects of automation.
Main issues:
– To develop theories on how automation may influence operator performance, based on experimental studies.
– To develop measures for studying human-automation interaction.
Specific goal: To develop HCA design support.

The HCA-2000/2001 Experiments
Research question: How do operators handle two types of automation malfunctions when operating from human-machine interfaces that contain either explicit or implicit information about the activities of the automatic system?
Automation defined as: interlocks, limitations, protections, controllers, programs.
Independent variables:
(1) Automation Malfunction Type
(2) Automation-Information Presentation Type

Design of the Human-System Interface
Basic representation types:
• Implicit: Representation of a device's activity through its effect on something else
• Explicit: Direct representation of a device's activity
Basic representation forms (include):
• Text
• Graphics
• Sound
• Etc.

Automation-Information Presentation
How may the activity of the automatic system be represented explicitly?
– Explicit presentation of main activities
– Graphic feedback
– Verbal, semantically meaningful feedback
– Intentional agent

Two Automation-Information Presentation Types
Conventional interface:
• Main process components
• Main process flow
• Control formats
Experimental interface (conventional, plus):
+ Main automatic devices
+ Computerized logic diagrams
+ Verbal feedback (e.g., "Program 'A3' is starting up")

The Conventional Interface

The Experimental Interface

Overview of the HCA-2001 Experiment
• Study performed in HAMMLAB (NORS)
• Licensed operators from the Loviisa NPP
• Six crews of three operators (RO/TO/SS)
• Four scenarios – a basic scenario combined with two sets of automation malfunctions
• Two experimental manipulations
• Two breaks in each scenario

Measurement Techniques
System-performance measures:
– Operator Response Time (ORT)
– OPAS MF, intervening actions (ACT)
Human-performance measures:
– OPAS MF, detection (DET)
– Situation Awareness Rating Technique
– Halden Complexity Questionnaire
– Halden Cooperation Quality Questionnaire
– Halden Trust in Automation Questionnaire

Experimental Design
Main characteristics:
• 2 × 2 (× 3) within-subject design
• Counterbalancing of the presentation order and the combination of the experimental conditions across crews
• Psychometric evaluation of response data before hypothesis testing [construct validity, inter-item reliability, criterion validity]
• ANOVA for statistical hypothesis testing
• Hypothesis tests performed at the crew-scenario level
Changes introduced (relative to HCA-2000):
• Malfunctions reset after each scenario period (20 min)
• Basic scenario: one turbine synchronised
• Inclusion of a shift supervisor
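For illustration only (the slides do not specify the analysis software): a minimal sketch of how such a 2 × 2 within-subject ANOVA could be run at the crew level with statsmodels, assuming hypothetical long-format data with columns `crew`, `malfunction`, `interface` and a dependent variable such as `workload`:

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per crew per scenario,
# with both experimental factors varied within each crew.
df = pd.read_csv("hca2001_scores.csv")  # hypothetical file name

# Repeated-measures ANOVA; aggregate_func averages the repeated
# scenario observations within each crew x condition cell.
result = AnovaRM(
    df,
    depvar="workload",
    subject="crew",
    within=["malfunction", "interface"],
    aggregate_func="mean",
).fit()
print(result)
```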

The Effects of the Interface Manipulation
[Chart: Workload]

Interpretation
The beneficial effects of the experimental AIP interface are significant:
• In both HCA-2000 and HCA-2001
• Detailed information about the automatic system's activity
  – Graphical feedback
  – Verbal feedback
• Complexity ≠ the number of items represented!

The Extended Teamwork 2004/2005 Exploratory Study – Focusing on Human-Automation Co-operation

Background: Upcoming Industry Needs
New operational concepts are currently being debated in the nuclear power plant domain:
• Multi-unit operations by a single operator
• A substantial increase in the automation level
• Remote unit operations by a single crew
• Changed staff roles / staff reduction?
  – No full-time, on-duty operations staff; occasional operations tasks performed by other functional units (e.g., engineering or maintenance)
  – Reduced staff, with an individual for multiple reactors and decentralized functional groups for maintenance and emergency
• Etc.
Implied: new requirements associated with operation of the NPP process, increased levels of autonomy and authority, and new tools.

Research Question
The purpose of the Extended Teamwork Research Program: To contribute knowledge on how new operational concepts may affect the quality of teamwork in an operational team.
The purpose of the Extended Teamwork Study: To assess how familiarity with operation in a subset of a particular new operational environment (see previous slide) may affect teamwork.
[Timeline: home plant (field visits – background for understanding the point of transition) → training on the experimental control-room solution → exploratory study, from first scenario to last scenario – a period of increasing familiarization.]

Teamwork – Theory
Co-operation Theory (Social Interdependence Theory): how people believe their goals to be related to the goals of other people is a useful way to understand the dynamics of the interaction between humans and its consequences…
Attributes of teamwork:
• Share information openly
• Take one another's perspective
• Communicate and influence each other
• Exchange resources
• Assist and support one another
• Handle conflicts efficiently

Types of Teamwork
Types of teamwork considered:
– Teamwork between humans
  • Co-operation Across Distances
– "Teamwork" between humans and automation
  • Human-Centred Automation
– Extended teamwork
  • Teamwork-knowledge framework
Extended teamwork: a distinguishable set of, at a minimum, two human agents and a machine agent who interact dynamically, interdependently and adaptively toward a common goal.

Main Characteristics of the Study
Experimental facilities: HAMMLAB and the VR-lab
Participants: six crews of licensed NPP operators from Swedish NPPs; each crew:
– Reactor operator (RO)
– Shift supervisor (SS)
– Field operator (FO)
Experimental team composition:
  Real role  | Experimental role     | Assumed location (lab)
  RO or SS   | Control-Room Operator | Remote control centre (HAMMLAB)
  FO         | Site Co-ordinator     | On-site (VR-lab)
  –          | Technician            | On-site (VR-lab)
Scenarios:
• 12 scenarios, 40 minutes each, minor disturbances, requiring co-operation
• The presentation order of the scenarios was randomised
(Preliminary)

Key Measures: Human-Automation
• Interviews
  – Following training and following completion of the 12 scenarios
  – Expectations and lessons learned
• Human-Automation Co-operation Quality and Trust in Automation
  – Questionnaires: operators' subjective judgements
• Teamwork quality
  – Process expert's rating of teamwork
• Operators' ability to detect critical events
  – Detection of predefined critical events

The Control-Room Operator at Work

The Automatic Agents
[Interface overview: main program; part program; condition for execution; list of sequences (have performed / currently performs / will be performed); selection of agent; latest voice message; action suggestion.]

Results: Familiarization
[Charts: Human-Human Co-operation, Human-Automation Co-operation, Collective Efficacy, Trust in Automation, Teamwork (expert rating).]

Results: Human-Agent Co-operation
Interviews
• The CROs' view on the Agents
  – Initially a rather negative view of the Agents: (1) usefulness and (2) co-operability
  – After the 12 scenarios, a much more positive view: (1) necessary, (2) co-operative, but (3) context sensitivity should be increased
• In general, the CROs felt lonely
  – They missed support, in particular in situations with deviations
Human-Agent Transactions
• The level of Agent use is scenario dependent
• Total Freeze Time demonstrated a significant correlation with the operators' subjective judgement of human-automation co-operation quality
• The higher the level of human-agent transaction, the better the operator is able to detect critical occurrences
[Photo: A Control-Room Operator (CRO) at work.]
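Not from the slides: a sketch of the kind of correlation reported above, with invented numbers purely for illustration (the study's actual data are not given here):

```python
from scipy.stats import pearsonr

# Invented per-crew values, for illustration only: total time the
# "freeze" function was active (s) and the crew's mean rating of
# human-automation co-operation quality.
freeze_time = [310, 150, 420, 260, 380, 200]
coop_quality = [4.1, 3.2, 4.6, 3.8, 4.4, 3.5]

r, p = pearsonr(freeze_time, coop_quality)
print(f"r = {r:.2f}, p = {p:.3f}")
```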

Implications: Design of Automatic Agents
When designing Automatic Agents, the following would be useful:
• Application of verbal feedback
  – A similar result was obtained in the Human-Centred Automation experiments
• An "action suggestions" function
  – Particularly useful in situations that the operators do not encounter often
• A "freeze" function
  – A must for the operators to remain in control
• A "repeat message" function
  – A must when verbal feedback is applied
The operators' need of the Agents seems to facilitate their acceptance of the Agents.
In addition, from the HCA experiments: complexity is not determined by the number of items represented at the interface…
[Figure: The Human-Agent Interface.]

Summary of the Main Issues
Three task allocation principles:
– Left-Over
– Comparison
– Complementary
Human-System Interface Design:
– Silent interfaces
– Representation, feedback
– Human-centred designs / Human-Centred Automation
Effects on the individual and the organisation:
– Deskilling
– Organisational changes
– These effects shape how, and the extent to which, a system will be used
Examples:
– IFE's Human-Centred Automation Program
– IFE's Extended Teamwork Program – teamwork where automatic agents are team members?