



HUMAN FACTORS IN BRIDGE OPERATIONS: DECISION-SUPPORT AT FUTURE SHIPBRIDGES

H. Schuffel, TNO Institute for Human Factors, Soesterberg, The Netherlands

ABSTRACT

Under contract to the Netherlands Foundation for the Coordination of Maritime Research, investigations were started, in collaboration with UK research institutes, to provide guidelines for the design of user interfaces, training programmes and watch procedures, given the cognitive abilities of the human operator. The study focuses on future shipbridges conceived of as operation centres where the system functions navigate, maintain status of platform, propulsion, passengers, cargo and crew are supervised and, to a small extent, controlled. The main theme is that the watch officer monitors the status of these functions by means of a sensor database, anticipates deviations from plans, and incidentally compensates disturbances manually. This user-centred design approach should improve safety and efficiency. The first phase of the study will define critical conditions through function analysis, function allocation and accident analysis. These conditions will be used to test, in laboratory experiments and in more realistic simulator experiments, expectations on cognitive behaviour in interaction with various possibilities of computer support systems. As a result of this first phase, function descriptions, expectations on decision-making, and a possible structure for the interaction with decision-support systems are elucidated.

1. INTRODUCTION

The development of technology and the cost of personnel have led to a major interest in automation. Rather than distinct functions being manually controlled by several persons, the trend is towards a single person supervising all main functions simultaneously, i.e. "navigate", "platform", "cargo", "crew" and "passengers" (e.g. Schuffel, 1992). However, it has long been recognized that humans are not very good at process monitoring in order to detect very infrequent signals. Furthermore, in such a complex environment, with human control at a high system level, error may result in disastrous consequences. For this reason, the ongoing automation of functions is increasingly questioned in favour of a more active role for the operator in the integral system (Wiener, 1985). In addition to reconsidering function allocations, much effort has been spent on the development of intelligent decision-support systems at sea, which might involve several assessment and reasoning modules regarding the functions "navigate", "platform", "cargo", "crew" and "passengers". Recent systems do not only involve electronic chart systems (ECDIS), but contain relational databases


with both geographical and alpha-numeric attribute values (Fawcett et al., 1992; Vijlbrief & Van Oosterom, 1992). Situation assessment and action selection require a great deal of knowledge, taking into account environmental factors such as weather conditions and a careful weighting of action consequences. To some extent, support may be given by incorporating these knowledge structures in the interface, by preprogramming status judgment in the context of an operation. However, by their very nature, nonroutine, critical situations are unique and unforeseen, and the question remains how to support the operator in these critical situations in order to minimize the effects of task complexity, time pressure and stress. To optimize decision-support in these situations, insight is required into the bottlenecks of human information processing under critical conditions. Human error has consistently been identified as a major factor underlying accidents (Margetts, 1976; Wagenaar & Groeneweg, 1987; Schuffel, 1987). Within this research area many cognitive and ergonomic factors were identified that contributed to the accidents, such as problem identification, information interpretation, and the match of the interface with human characteristics. There is literature with critical elements deduced from observations on board ships, indicating the wide variety of risk factors. These accident analyses and observations are based, however, on the existing situation, and their results might not be generalizable to the future shipping situation. Automation changes the tasks to be performed and, as a result, also the potential risk factors and the consequences of human error. Human information processing in a supervisory control situation will be taxed most in critical situations. Furthermore, because of the uniqueness of these situations, support does not directly result from eliciting expert knowledge that has been gained under normal conditions. The present study is aimed at getting insight into decision-making under critical circumstances, in order to tune function allocation and decision-support to worst-case scenarios in which information processing is affected by time pressure and stress. The results provide well-founded hypotheses for designing and testing effective interface structures.

Figure 1: The systems ergonomics approach (after Döring, 1983). The figure pairs design and development activities (mission and scenario analysis, function analysis, function allocation, task analysis, performance prediction, interface and workspace design) with the information generated (mission and system requirements, functions, required human and machine task performance, workload evaluation, display and control requirements, workstation design requirements, working environment, personnel selection and training).

2. FUNCTION DESCRIPTION

2.1 System design

The starting point of the system development process, for both the specification of equipment and of personnel, is the identification of operational needs. To transform the operational needs into a system description, systems engineering follows a series of steps, involving various types of analyses, trade-off studies, simulation and other experimental tests. The sequence of human factors design steps follows the same general pattern as systems engineering, including mission analysis, function analysis, function allocation, task analysis and performance prediction (Beevis, 1992). These steps are repeated several times in the course of the design process (see Figure 1).

By analyzing the mission, system functions are determined (see Figure 2). The analysis of system functions leads to functional requirements, which are the basis for allocating each function to human or machine.

Figure 2: Example of a hierarchical decomposition of functions into tasks as a preparation for the function allocation process. The mission "conduct operation" decomposes into the functions navigate, maintain platform status, maintain cargo status, maintain status crew/passengers, communicate, and maintain ship routine; "navigate" decomposes further into tasks such as perceive heading error, adjust heading and control heading.
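To make the decomposition concrete, the hierarchy of Figure 2 can be sketched as a small tree structure. The following Python fragment is illustrative only: the node names are taken from the figure, everything else is an assumption.

    # Hypothetical sketch of the mission -> function -> task tree of Figure 2.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        children: list = field(default_factory=list)

    mission = Node("conduct operation", [
        Node("navigate", [
            Node("perceive heading error"),
            Node("adjust heading"),
            Node("control heading"),
        ]),
        Node("maintain platform status"),
        Node("maintain cargo status"),
        Node("maintain status crew/passengers"),
        Node("communicate"),
        Node("maintain ship routine"),
    ])

    def leaves(node):
        """Collect leaf tasks: the units handed to function allocation."""
        if not node.children:
            return [node.name]
        return [leaf for child in node.children for leaf in leaves(child)]

    print(leaves(mission))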


The detailed function analysis identifies the task performance required of the operator and of the machine. Finally, the analysis of the operator and machine performance gives the data for interface and workstation design, environment design, workload evaluation, and personnel selection and training (see also Booher, 1990). Despite the similarity in aims and procedure, human factors analyses are not always conducted concurrently with other systems engineering activities. In current practice, the system concept is often developed well beyond the point of function allocation before human factors issues are considered. This makes the human factors function allocation analyses of little value. Yet the increasing levels of automation in current systems make it more important that the roles and functions of the human operators be analyzed in detail. When adopting this functional approach in the design process, there is a significant drawback. Systems engineers and designers often show a strong bias towards thinking in terms of specific components rather than functions, which limits the development of new ideas and promotes the ineffective application of new techniques. In the shipping industry a well-known example of such a bias is the transfer from riveted to welded ship constructions. Riveting of steel plates needs overlap. With the introduction of welding, plates were initially welded with overlap, as was usual with riveting. After some time, the plates were welded directly, without overlap, improving the economy and quality of the construction. The transfer was made directly from old to new components without thinking about the functionality of the construction. The parallel with introducing computers on board ships is to design user interfaces that are no longer based on the traditional stand-alone equipment of a specific manufacturer, providing a sensor, wiring, processor and display, but to think in terms of functional requirements. The function view improves the effectiveness of the design process. It is recommended that an existing system is taken to translate the meaning of old components into old functions, for instance from "curtains on the bridge" to "screening light sources on the bridge for using charts at night". Then the new, future mission requirements are considered, leading to new functions, such as (unobstructed) view of the ship's surroundings and (unhampered) visual look-out at day and night time, which governs the selection of new system components such as the electronic chart display adaptable for day and night time.
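As a purely illustrative sketch of this translation step, the curtains example can be written as two look-up tables: old component to old function, and function to new component. The second entry is a hypothetical addition, not from the text.

    # Illustrative only: old component -> function -> new component.
    old_component_to_function = {
        "curtains on the bridge":
            "screening light sources on the bridge for using charts at night",
        "paper chart and chart light":                      # hypothetical entry
            "presenting chart information at day and night",
    }

    function_to_new_component = {
        "screening light sources on the bridge for using charts at night":
            "electronic chart display adaptable for day and night",
        "presenting chart information at day and night":
            "electronic chart display adaptable for day and night",
    }

    for old, function in old_component_to_function.items():
        print(f"{old} -> {function} -> {function_to_new_component[function]}")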

2.2 Critical elements of functions

In a study on causal factors in shipping accidents, Schuffel (1987) analyzed 100 shipping accidents involving Netherlands ships over the period 1982 to 1985. In these 100 accidents, 276 causal factors were involved. These factors were divided over the system elements as follows: 209 human-related factors, 24 equipment-related factors, 9 procedure-related factors and 34 environment-related factors. The 209 human-related causal factors were analysed with regard to the improvement of safety obtainable by a better match between human characteristics and bridge interfaces (an ergonomic bridge), promising a reduction of causal factors of 68 % overall (see Table I): from 209 human-related factors on the conventional bridge to 47 on the ergonomic bridge. These findings were in line with those of Drager (1981) concerning shipping accidents along the Norwegian coast. It was recommended to support the watch officer with information processing capabilities in coastal areas. Although information processing appears from these studies as a weak point, the nature of human error has not yet been analysed in detail in the maritime context. Wagenaar et al. (1987) mention behavioural habits, personality (see also Boer, 1990 and Veltman, 1992), wrong hypotheses, ergonomics, training and information processing as contributing elements to human error. Further exploration of human error, preferably by means of statistical analysis (e.g. Kristiansen, 1980) and to a certain extent by observation, is of course possible but has limitations with regard to predicting error with new equipment and new generations of mariners.
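A quick check of the quoted figures, on the assumption that the 68 % refers to the Table I totals over all system elements (276 versus 88), while the human-related factors alone drop from 209 to 47:

    # Reduction figures reported from Table I.
    total_conventional, total_ergonomic = 276, 88
    human_conventional, human_ergonomic = 209, 47

    print(f"all factors: "
          f"{(total_conventional - total_ergonomic) / total_conventional:.0%}")
    # -> 68%, the figure quoted in the text
    print(f"human-related: "
          f"{(human_conventional - human_ergonomic) / human_conventional:.0%}")
    # -> 78%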

2.3 Discussion

For the allocation of functions, the system design approach offers a suitable framework. The function decomposition provides details of the kind of system activities. With regard to the planned research there is the need to describe to the full extent the generic behaviour of a new system with new technology and new functions. Statistical analysis, however, does not reveal deficiencies of human activity in great detail. Moreover, the extrapolation from old to new functions is questionable and also requires a generic description of human and equipment failures. The efforts for observations at sea and the analyses of accidents will be limited, to enlarge the effort to complete a generic failure description, partly based on existing material. Theories on cognitive behaviour will help to describe and categorize errors in decision-making under routine and critical conditions. The outcome constitutes the scenarios for testing new interface concepts.

Table I: Reduction of the number of human-related causal factors due to a wheelhouse concept designed according to ergonomic principles. The table breaks the causal factors down by function (NAVIGATE: prepare voyage, conduct mission, communicate; PLATFORM: propel, steer, monitor ship condition, generate electricity, generate supply; CARGO) and by system element (humans, equipment, procedures, environment), for the conventional versus the ergonomic bridge. The row-level figures are not reliably recoverable from the scan; the totals are 209 versus 47 human-related factors and 276 versus 88 factors overall.

3. OPERATOR CONCEPTS

3.1 Introduction

In this chapter on operator concepts we will discuss features of supervisory control and, as the major task is to intervene whenever system performance does not meet prespecified criteria, decision-making behaviour in non-routine situations. Supervisory control is characterised by the fact that the task is actually performed by machines while the operator monitors whether performance remains within its normal limits. However, even though automation has unquestionably improved the performance of some system functions, for example position fixing by means of satellite, the qualitative changes in the human task require an elaborate analysis of the strong and weak aspects of human information processing in relation to the task requirements, particularly under critical conditions.

3.2 Supervisory control

As stated above, a main characteristic of supervisory control is the mediation of machines between the operator and the actual ship system functions. Figure 3 shows a model of supervisory control according to Moray (1986).

Figure 3: The four levels of supervisory control (after Moray, 1986): level 1, the components of the process; level 2, the component-interactive system (CIS) with its sensors and actuators; level 3, the human-interactive system (HIS) with its displays and controls; level 4, the human operator.

Moray distinguishes four hierarchical levels of control. The lowest levels, constituting the component-interactive system (CIS), control the hardware components such as engines, pumps and valves. The control of these components is executed by low-level controllers that are no more than negative feedback servoloops. One level up contains an "intelligent" computer that provides an interface between the human and the low-level controllers. The operator gives commands on goals, set-points and performance criteria in a high-level language through this human-interactive system (HIS). The HIS can use stored knowledge to issue commands to the CIS that will optimize the performance criteria specified by the operator. The HIS also provides information on the underlying mechanisms to the operator through its display. One feature distinguishing humans from machines is the human ability to cope with nonroutine (emergency) situations (Price, 1985). In critical, unforeseen situations, the operator will have to take over control of the ship because of his unique abilities at the so-called "knowledge-based level". The term "knowledge-based" has been borrowed from the skill-rule-knowledge framework of Rasmussen (1987). At the skill-based level, human performance is governed by stored patterns of preprogrammed responses to stimuli. Errors at this level are incorrect responses to signals in time and space, due to the inherent variability in dynamic environments. The rule-based level applies to familiar problems in which solutions are governed by stored rules. These rules are of the type <if (state) then (diagnosis)> or <if (state) then (remedial action)>. Errors at this level are typically associated with misclassifying situations, leading to the application of the wrong rule or to the recall of incorrect procedures. In novel situations the knowledge-based level comes into play. At this level controlled reasoning processes are used, based on stored knowledge. Errors at this level arise from resource limitations and incomplete or incorrect knowledge. To date, the high-level control functions have mainly been allocated to the human component of the system. Not all aspects of this task, however, match human capabilities. First, even though some functions are better done by machines, the human has to check whether they are actually performing up to the standards. Numerous experiments have indicated, however, that operators are not good at vigilance performance, i.e. at detecting occasional and unpredictable changes in an operating environment (see, for example, Koelega, 1992). Second, because of the automation the operator is not trained in dealing with the system, and yet in most critical situations the operator is required to take over control. Thus, on automating the system the operator is deprived of training on what is actually one of his most


important responsibilities. Bainbridge (1987) described this contradiction as one of the "ironies of automation". In our interviews the same problem was indicated: by using advanced radar support, skills to interact with charts are lost. These observations clearly show the challenges with regard to maximizing system performance by an optimal allocation of functions to human and machine. Even on accepting the superiority of humans at the knowledge level, some features of the supervisory control task further complicate adequate performance in dealing with the actual system. With the possibility for automation, systems also became more complex: more functions, dependent relations and a greater dependence on accurate communication.

Reason (1990) distinguishes two main factors underlying mistakes at the knowledge level which will have greater impact in systems with a high complexity level: bounded rationality, and the fact that knowledge relevant to the problem is nearly always incomplete and often inaccurate. Automation has increased the distance from the operator to the actual system to be controlled. The supervisor interacts with the system at a high level (through the HIS), without the need to know system performance at a detailed level. Because of this, the operator may lack a correct mental model of the system in critical situations. Thus, in such cases the human has to gather information in order to create a mental model of the actual system state. Or, in other words, it takes some time to get into the loop again, time which the critical situation may not allow. The difficulty of getting into the loop is also illustrated by aviators who switch from automation mode to manual control mode long before the actual landing (Wiener and Curry, 1980). Distance to the actual system to be controlled, in combination with complexity, will increase the delays in feedback loops. The master executes an action, for example altering course or ordering the engineer to raise the anchor, and it takes some time before the consequences of this action become manifest. Coping with feedback (delays) is a major factor behind non-optimal performance in complex, dynamic environments (Brehmer, 1992; Sterman, 1989). On centralizing control functions this aspect will be emphasised.
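The rule types quoted above lend themselves to a compact illustration. The following minimal sketch shows the two stored-rule forms and the escalation to the knowledge-based level when no rule matches; all states, diagnoses and actions are invented for the example.

    # Minimal sketch of the rule-based level (Rasmussen, 1987), invented data.
    diagnosis_rules = {
        # <if (state) then (diagnosis)>
        "lubricating oil pressure low": "oil pump failure suspected",
        "heading deviates from planned track": "steering gear fault suspected",
    }

    remedial_action_rules = {
        # <if (state) then (remedial action)>
        "oil pump failure suspected": "start standby pump and reduce shaft load",
        "steering gear fault suspected": "switch to manual steering",
    }

    def rule_based_response(state):
        """Apply stored rules; with no matching rule the situation is novel
        and must be escalated to the knowledge-based level."""
        diagnosis = diagnosis_rules.get(state)
        if diagnosis is None:
            return "escalate to knowledge-based reasoning"
        return remedial_action_rules[diagnosis]

    print(rule_based_response("lubricating oil pressure low"))
    print(rule_based_response("unforeseen combination of failures"))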

3.3 Decision-making

Our study will be directed towards decision-making in critical situations, which are typically characterised by time pressure, uncertainty, stress and information overload. In terms of mental load these critical situations seem to be orthogonal to the normal situation, in which most functions are performed automatically, with occasional interventions by the supervisor. Under these conditions mental underload may occur; in other words, operators may get bored. In a study by Veltman and Gaillard (1992) it was concluded that under changing circumstances information underload can cause information overload. If the manning reacts to underload by doing extra tasks, less attention is paid to the main task, which may eventually lead to risky situations. Furthermore, underload may strengthen the effects of fatigue.

Figure 4: A general model of decision-making. (The diagram itself is not recoverable from the scan; as described in the text below, the phases run from problem identification via information gathering, interpretation and cause selection to the weighting of action consequences, action selection and execution, with feedback loops.)

However, even though mental underload should receive more attention, in this chapter we will restrict ourselves to decision-making in critical situations, i.e. situations of information overload. The intervention of the supervisor in a non-routine critical situation starts with the detection of a discrepancy between the actual state and the goal state. A very general model of decision-making is provided in Figure 4. The overall goal of the supervisor is to keep the values of all functions, i.e. "navigate", "platform", "cargo", "crew" and "passengers", within their normal range. This implies that in case of a problem in one of these functions, his task is to solve this problem while continuing to monitor the other functions. The strategy that will be employed in dividing attention over these functions plausibly depends on both the seriousness of the problem and the risks of disturbances in other functions (as a result of, for example, dependencies between functions). However, a potential threat in such a situation is that all attention is paid to solving a problem in one function at the expense of monitoring the others. The phases that are distinguished in Figure 4 can refer to one or several detected problems. The decision process starts after the identification of a problem, i.e. a discrepancy is detected between the actual state of the system and its goal state in one or more functions. This may, for example, be read from an annunciator. The problem and its solution may be immediately apparent to the operator, in which case he can directly select an action, or information is needed to construct an accurate mental model of the precise system values. After the information is gathered a reasoning process is started, in which the information is interpreted and integrated, resulting in the selection of a cause. Then an important phase may follow for the operator: weighting the consequences of alternative actions. He has to consider various attributes such as time, safety and comfort. Then the operator can select and execute an action in order to solve the problem. Through feedback loops the operator receives information on the effect of the executed action. As already indicated, the decision phases distinguished in Figure 4 need not necessarily appear in a strict order. Especially in critical situations the operator will first try to keep safe before action is taken at a more detailed level. The following observation provides an example of a strategy in which a more global action is taken before the actual problem is dealt with: the master on the ferry who was confronted with the stabilizer problem first reduced speed (an action), in order to create more time for the engineers to repair the system.
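The phases of Figure 4 can be summarized in a schematic loop. The sketch below is a hypothetical rendering, not an implementation from the study; the stubs and the toy data, inspired by the ferry example, are assumptions.

    # Hypothetical rendering of the decision phases of Figure 4.
    def decision_cycle(actual, goal, gather, diagnose, alternatives, utility, execute):
        for function in goal:
            if actual[function] == goal[function]:
                continue                          # no discrepancy: keep monitoring
            info = gather(function)               # build a mental model of the state
            cause = diagnose(function, info)      # interpret and integrate information
            options = alternatives(cause)         # generate alternative actions
            best = max(options, key=utility)      # weigh time, safety, comfort, ...
            actual[function] = execute(best)      # feedback on the executed action

    actual = {"platform": "stabilizer fault"}
    goal = {"platform": "nominal"}
    decision_cycle(
        actual, goal,
        gather=lambda f: "stabilizer oscillating",
        diagnose=lambda f, info: "stabilizer defect",
        alternatives=lambda cause: [("reduce speed, then repair", 0.9),
                                    ("continue at full speed", 0.1)],
        utility=lambda option: option[1],
        execute=lambda option: "nominal",
    )
    print(actual)  # {'platform': 'nominal'}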

3.4 Discussion

Supervisory control can induce major problems in the human information processing system, because of the amount of information to be processed, the updating of a correct mental model, and failures in communication, both in inter-human relations and in alarm signals. In addition, critical situations are typically characterised by time pressure and stress, taxing the information processing system even more. Solutions to these problems can be sought in

two directions. First, as errors at the knowledge-based level are induced by mental capacity limitations and lack of knowledge, one could support this reasoning process by decision-support systems or expert systems (Sheridan, 1988). Research activities in this direction should be directed towards the identification of the knowledge structures and decision strategies that are employed under various conditions, and the evaluation of the effects of system characteristics on human performance (such as feedback delay and complexity). Second, recently a different division of functions over man and machine has been advocated (see for example Wiener, 1985). Rather than the human operator serving as a monitor of automatic devices, the human could be brought back into a more active role in the control loop, possibly supported by systems. Research activities in this direction should focus on task features that remain invisible at a level at which single low-level functions are allocated to human and machine, such as problems getting into the loop in case of emergencies (it has been shown that in difficult situations people perform better in manual conditions than in automatic conditions (Thornton, Braun, Bowers and Morgan, 1992)), coping with large amounts of information, effects of underload, and the vulnerability of the system to single human errors. However, human error cannot be considered as the only factor contributing to (near-)accidents. As analyses have indicated, management and design failures can make major contributions to accident proneness, but may be hidden for quite a long time.

4. DECISION-SUPPORT STRUCTURE

4.1 Supervision concept

Given the functions which have to be supervised on a ship's bridge, two elements can be distinguished: a "within-function" element, referring to monitoring and control activities with respect to a single function (e.g. "navigate"), and a "between-function" element, referring to the overall performance, the scheduling and the monitoring of the different functions.

Within-function element. Various systems on board can be depicted as compensatory control systems, which minimize the influence of disturbances on the basis of feedforward and feedback control. To adjust these primary control loops to varying disturbances, on a secondary, more long-term level adjustments are calculated from knowledge about these disturbances. The results of this adaptive level may either be fed directly into the primary level (additive, feedforward-like), or be used to tune the settings (multiplicative, gain scheduling) of the primary controllers (Figure 5).

Figure 5: The primary and secondary levels of an adaptive control system. Primary level: information about the goal(s) to be achieved, the reference value R of the controller, and information about the actual process output, the controlled variables. Secondary level: restrictions, requirements and disturbances; user requirements feed the adaptation mechanism.

In relation to the model of supervisory control after Moray (Figure 3), the primary-level control functions are typically allocated to the "component-interactive system", whereas the secondary, adaptive level is allocated to the human operator, assisted by the "human-interactive system". However, as a result of modern technology, at the adaptive level too an increasing number of functions are performed by "intelligent" distributed control systems, which have their origin in adaptive control theory (Van Amerongen, 1982; Passenier, 1989) in combination with the development of embedded expert systems. Given these developments, the level of direct involvement of the human operator in the "within-function" element is decreasing. The supervision of multiple functions, however, becomes more and more the role of a single human operator, the supervisor, assisted by a highly centralized, integrated bridge-information system. The implications of this increasing "between-function element" for the human supervisor will be discussed in more detail in the next section.

Between-function element. Given the fact that an increasing number of functions are performed in parallel by different navigation and platform systems (distributed control), the issue of limited (computer) resources, calling for task scheduling at the "component-interactive" level, becomes less and less of an issue. However, for the human supervisor the coordination and supervision of different functions becomes more important, while on the other hand the "division of attention" is not something which the human operator is particularly good at: he is relatively slow and cannot shift attention rapidly from one function to another. According to Sheridan (1988), attention sharing by the supervisory operator may be described on the basis of attention (task) demands, according to four attributes:

- What resources need be assigned to perform whatever needs to be done (these could be human resources such as particular senses, or motor capabilities, or memory; they can be computer or peripheral mechanical resources, etc.).

- How long will it take to perform the function, or how much effort is required.

- How much time is available to get the function done.

- What is the reward for successfully completing the function, or what is the cost of doing it.

Figure 6: Different elements of bridge supervision at different levels. HIS: Human-Interactive System; DCS: Distributed Control System. At the top, the human operator with the HIS supervises between functions (scheduling, guided by the normative element of the mission/operation); below, DCSs supervise within functions (adaptation and control); at the bottom are the components of the process (e.g. platform, cargo) and the states of the process and surroundings (the situation).
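Sheridan (1988) gives no single formula for combining these attributes; the following sketch shows one assumed way to turn the four attributes into a priority score for attention scheduling. The weighting and the example numbers are illustrative only.

    # Assumed scoring of attention demands on the four attributes above.
    def priority(demand):
        slack = demand["time_available"] - demand["effort"]
        urgency = 1.0 / max(slack, 0.1)           # urgency grows as slack shrinks
        return demand["reward"] * urgency / demand["resources"]

    demands = [
        {"name": "collision avoidance", "resources": 1.0,
         "effort": 2.0, "time_available": 3.0, "reward": 10.0},
        {"name": "cargo temperature check", "resources": 1.0,
         "effort": 1.0, "time_available": 30.0, "reward": 2.0},
    ]

    for demand in sorted(demands, key=priority, reverse=True):
        print(demand["name"], round(priority(demand), 3))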

An interesting observation regarding experiments on this topic of human attention allocation and timing (Tulga and Sheridan, 1980) is that as task demands increased (increasing load and pace), and "planning ahead" finally had to be abandoned in favour of "do what has an immediate deadline", subjective mental workload actually decreased. From a control point of view, at this between-function level a "robust" approach (resulting in a single control action which is acceptable for a range of processes or situations) rather than an "adaptive" approach (resulting in a single control action for an individual process or situation) is more appropriate. Summarizing, supervision at the bridge can be described at three levels according to Figure 6, with the within-function element (adaptation and control) and the between-function element (scheduling) allocated to the component-interactive system and the human-interactive system.
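The within-function element described above (Figure 5) can be illustrated by a minimal two-level loop: a primary negative-feedback controller whose gain is retuned by a secondary adaptive level. The dynamics and the adaptation rule below are assumptions for illustration only.

    # Minimal sketch of the two-level scheme of Figure 5, assumed dynamics.
    def primary_control(reference, output, gain):
        return gain * (reference - output)        # negative feedback servoloop

    def secondary_adaptation(disturbance_level, base_gain=1.0):
        return base_gain * (1.0 + disturbance_level)   # gain scheduling, slow time scale

    heading, reference = 0.0, 10.0                # degrees, illustrative values
    for _ in range(20):
        sea_state = 0.5                           # measured disturbance level
        gain = secondary_adaptation(sea_state)    # secondary, long-term level
        rudder = primary_control(reference, heading, gain)
        heading += 0.1 * rudder                   # crude assumed ship response
    print(round(heading, 2))                      # approaches the reference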

4.2 Decision-support

For the decision-making processes on the bridge several information sources are available, varying from real-time (sensor data) to non-real-time (knowledge base), which need to be selected, integrated and analyzed in the context of the actual decision-making problem. Analogous to the description of the "within-function" and the "between-function" element, the role of decision-support may be characterized according to a "situation-driven" and

-an "operation-driven" element. The situation-driven element refers lo support in idenification and solution of problems, caused by unexpected external influences (disturbances), resulting in attcntional demands al the human-interactive level. The operation-driven eleiient refers lo assistance iii overall plan preparation. execution and rnonitoing, thus rniniuizing the attentional demands required for the various scheduling activities. Situation-driven decision-support. Driven l)y unexpected disturbances. this type of

problem solving requires activity at the adaptation level of the

coinncni-interactive system. resulting in. for instance. controller adjustments or replacement of mal-functioning components. The adaptation process may be described as updating a model of the cunent situation on the basis of state infonnation. Typical phases in this process are: situation assessment, on hic basis of sensor observations, the system state is identified; problem identifi-cation, given the system state, possible causes for malfunctioning are

identified; problem solution, on the basis of the identiied problem, possible

consequences of actions are evaluated and effective countermeasures are taken. Besides explicit on-line decision-support tools. both expert system

("if then") like and prognosis ("what it") like, the role of the human-machine interface in this decision-making process includes the following steps: problem identification: effective means for information selection and integration for the transformation of database contents to display contents and support in the visualisation of possible causes and user concepts; problem solution: effective tools kw generating alternative solutions (action patterns) and testing the robustness of solutions found and efficient presenta-tion for comparison of several oppresenta-tions. Operapresenta-tion-driven decision-support. For the operation-driven element of decision-making, the human supervisor may be assisted in the preparation, coordinated execution and monitoring of different functions at the component-interactive level in order to achieve "higher-order goals" according to external requirements (e.g. sale

ranspoilt-Lion of cargo). The result of the planning ÍOCCSSprovides the human

supervisor for the different system functions at various levels of detail with desired system states at speciíìc times, which are required to successfully complete the mission (transportation from A to B). During voyage-execu-tion, this collection of desired states may serve as a reference model tor the monitoring of system functions. Again, decision-support may be "if then" (presentation of reference states according to stored procedures in relation to actual states) and "what if' like (effect of decisions on one function with regard to other functions). Furthermore, for the design of the

human-machine interface, an integrated presentation format according to the structure of the reference model may provide the human supervisor with a coherent picture of the various subfunctions in relation to the overall goal (mission). On the basis of this model-reference approach, implications for

3-142 r

i

I supetvjsion of system r L supo Vision between functions supervisions within fonCtions components of the proces DSS piaf tor nl Human Operator OSS -1-cargo DSS normative element i of the mission loperationl

---i

role of DSS: J whOm ii optinv,ltmO(l .1 mm.,,. pt000th,eS --t states 01 the p,oCeS su,moundings Isiteationl -J

Figure 7: Decision-suppoil at d[Je reni levels of bridge supervision. HM!: Hlwran Machine Interface. DSS: Decision-Support System.

flexibility of dialogue mode and information presentation (the use of overview versus detailed infonimatiun; the creation, manipulation and organization of overlays and multiple windows) may be derived. In Figure 7 the two forms of decision-support, situation-driven versus operation-driven, are summarized in relation to the different levels of bridge supervision as

presented in Figure 6. 4.3 Discussion
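The model-reference idea for operation-driven support can be sketched as a comparison of planned and actual states; the voyage-plan data and state names below are invented for illustration.

    # Sketch of model-reference monitoring; plan and states are assumptions.
    reference_model = {
        # time (h) -> desired states per function
        0:  {"navigate": "depart port A", "platform": "manoeuvring power"},
        6:  {"navigate": "coastal leg",   "platform": "full sea power"},
        48: {"navigate": "arrive port B", "platform": "manoeuvring power"},
    }

    def monitor(time, actual):
        """Compare actual states with the reference model at the nearest
        planned time; discrepancies are reported to the supervisor."""
        planned = reference_model[min(reference_model, key=lambda t: abs(t - time))]
        return {f: (actual.get(f), desired)
                for f, desired in planned.items() if actual.get(f) != desired}

    print(monitor(6, {"navigate": "coastal leg", "platform": "reduced power"}))
    # -> {'platform': ('reduced power', 'full sea power')}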

4.3 Discussion

Different levels of bridge supervision, ranging from high-level task scheduling, via adaptation, to low-level control, can be allocated to both human and machine, with the human-machine interface as the essential link between the human supervisor and the automation. From a supervisory-control point of view, at the human-machine interface a comparison is made between operation-driven information (procedures and planned actions, related to the mission and to criteria for safety and efficiency) and situation-driven information (observations, preprocessed according to generic expert rules, like sensor fusion), in order to reduce a discrepancy between the goal state and the actual state of the system. According to these two aspects of decision-support, the following research issues concerning the human-machine interface can be formulated in terms of information selection and integration.

Information selection: given a single attention/task demand, what are effective tools for the selection of relevant information from the database, which enable the human supervisor to become involved in an interactive way in the problem-solving process and minimize the amount of effort related to "attentional switching" between different processes (the "adaptive" element of bridge supervision)?

Information integration: given the different attention/task demands, how should the related information be presented in order to assist the human supervisor in constructing an adequate mental model of the overall situation, in such a way that the "attention allocation and timing" process results in an acceptable overall system performance (the "robust" element of bridge supervision)?







5. REFERENCES

Amerongen, J. van (1982). Adaptive Steering of Ships: a Model-reference Approach to Improved Manoeuvring and Economical Course Keeping. Ph.D. thesis, Delft University of Technology, The Netherlands.

Bainbridge, L. (1987). Ironies of automation. In: J. Rasmussen, K. Duncan and J. Leplat (eds.), New Technology and Human Error. Chichester: Wiley.

Beevis, D. (1992). Analysis Techniques for Man-Machine Systems Design. Final report from NATO RSG.14 AC/243 (Panel 8/RSG.14) TR/7. Brussels: NATO Defense Research Group.

Boer, J.P.A. (1990). Personality and skillfulness as variables at the origins of shipping accidents. Report IZF 1990 C-18. TNO Institute for Perception, Soesterberg, The Netherlands.

Booher, H.R. (ed.) (1990). MANPRINT: an Approach to Systems Integration. New York: Van Nostrand Reinhold.

Brehmer, B. (1992). Dynamic decision-making: human control of complex systems. Acta Psychologica, 81, 211-241.

Coad, P. and Yourdon, E. (1990). Object Oriented Analysis. New Jersey: Yourdon Press.

Döring, B. (1983). Systems ergonomics, an approach for developing well-balanced, cost-effective man-machine systems. In: The Human as a Limiting Element in Military Systems, Vol. 1 (NATO DRG DS/A/DR (83)170). Brussels: NATO Defense Research Group.

Drager, K.H. (1981). Cause relationships of collisions and groundings. Report no. 81-0097. Oslo: Det Norske Veritas.

Fawcett, G., Smeaton, G.P. and Dineley, W.O. (1992). The electronic chart in the integrated bridge: an emerging technology at sea. The Institute of Marine Engineers, Marine Management (Holdings) Ltd.

Koelega, H.S. (1992). Extraversion and vigilance performance: 30 years of inconsistencies. Psychological Bulletin, 112, 239-258.

Kristiansen, S. (1980). Analysis of ship casualties and its application in design. Proceedings of the International Symposium on Advances of Marine Technology, Oslo.

Margetts, B.D. (1976). Human error in merchant marine safety. The National Research Council, Washington D.C.

Moray, N. (1986). Monitoring behaviour and supervisory control. In: K.R. Boff, L. Kaufman and J.P. Thomas (eds.), Handbook of Perception and Performance, Vol. II. New York: Wiley.

Passenier, P.O. (1989). An adaptive track predictor for ships. Ph.D. thesis, Delft University of Technology, The Netherlands.

Price, H.E. (1985). The allocation of functions in systems. Human Factors, 27, 33-45.

Rasmussen, J. (1987). Cognitive control and human error mechanisms. In: J. Rasmussen, K. Duncan and J. Leplat (eds.), New Technology and Human Error. Chichester: Wiley.

Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.

Schuffel, H. (1987). The automated ship's bridge: human-error resistant? Report IZF 1987 C-32. TNO Institute for Perception, Soesterberg, The Netherlands.

Schuffel, H. (1992). Designing and testing a ship's bridge layout. In: H. Kragt (ed.), Enhancing Industrial Performance: Experiences of Integrating the Human Factor. London/Washington D.C.: Taylor & Francis.

Sheridan, T.B. (1988). Task allocation and supervisory control. In: M. Helander (ed.), Handbook of Human-Computer Interaction. Amsterdam: Elsevier.

Sterman, J.D. (1989). Misperception of feedback in dynamic decision-making. Organizational Behaviour and Human Decision Processes, 43, 301-335.

Thornton, C., Braun, C., Bowers, C. and Morgan, B.B. (1992). Automation effects in the cockpit: a low-fidelity investigation. Proceedings of the Human Factors Society 36th Annual Meeting.

Tulga, M.K. and Sheridan, T.B. (1980). Dynamic decisions and workload in multitask supervisory control. IEEE Transactions on Systems, Man and Cybernetics, SMC-10.

Veltman, J.A. and Gaillard, A.W.K. (1992). Mental workload and stress as a factor in the occurrence of ship accidents. Report IZF 1992 C-10. TNO Institute for Perception, Soesterberg, The Netherlands.

Vijlbrief, T. and Oosterom, P. van (1992). The GEO system: an extensible GIS. Proceedings of the 5th International Symposium on Spatial Data Handling, Charleston, South Carolina, USA.

Wagenaar, W.A. and Groeneweg, J. (1987). Accidents at sea: multiple causes and impossible consequences. International Journal of Man-Machine Studies, 27, 587-598.

Wiener, E.L. (1985). Beyond the sterile cockpit. Human Factors, 27, 75-90.

Wiener, E.L. and Curry, R.E. (1980). Flight-deck automation: promises and problems. Ergonomics, 23, 955-1011.
