The reasons for the current widespread arguments between designers of advanced technological systems, such as nuclear power plants, and opponents from the general public concerning acceptable levels of risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, and so on. Also important, however, may be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. The application of modern information technology to the design of control systems and human-machine interfaces, together with the trend towards large centralised industrial installations, has made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public.

A satisfactory definition of "human error" is becoming increasingly difficult as the human role in systems changes from well-trained routines towards decision making during system malfunctions. Recent research on the cognitive control of human behaviour indicates that errors are intimately related to features of learning and adaptation, and neither can nor should be avoided. There is, therefore, a need to design more error-tolerant systems. Such systems depend on immediate recovery from errors, which in turn depends not only on access to factual information about the actual state of affairs, but also on access to information about the goals and intentions of planners and cooperators. This information is needed as a reference for judgements, but it is difficult to formalise and is not at present included in interface and communication systems to any great degree.
As information systems become more "intelligent" and systems for cooperative decision making are designed, the users' understanding and acceptance of advice from a computer will be critical for the overall risk of large-scale system operation.
Title of host publication: Risk and Decisions
Editors: W.T. Singleton, J. Hovden
Number of pages: 12
Place of publication: Chichester
Publication status: Published - 1987