Risk and Information Processing

J. Rasmussen

    Research output: Book/Report › Report › Research


    Abstract

    The reasons for the current widespread arguments between designers of advanced technological systems, such as nuclear power plants, and opponents from the general public concerning levels of acceptable risk may be found in incompatible definitions of risk, in differences in risk perception and criteria for acceptance, etc. Also of importance, however, may be the difficulties met in presenting the basis for risk analysis, such as the conceptual system models applied, in an explicit and credible form. The application of modern information technology to the design of control systems and human-machine interfaces, together with the trend towards large centralised industrial installations, has made it increasingly difficult to establish an acceptable model framework, in particular considering the role of human errors in major system failures and accidents. Different aspects of this problem are discussed in the paper, and areas are identified where research is needed in order to improve not only the safety of advanced systems, but also the basis for their acceptance by the general public. A satisfactory definition of "human error" is becoming increasingly difficult as the human role in systems changes from well-trained routines towards decision making during system malfunctions. Recent research on the cognitive control of human behaviour indicates that errors are intimately related to features of learning and adaptation, and neither can nor should be avoided. There is, therefore, a need for the design of more error-tolerant systems. Such systems depend on immediate recovery from errors, which, in turn, depends not only on access to factual information about the actual state of affairs, but also on access to information about the goals and intentions of planners and cooperators. This information is needed as a reference for judgements, but it is difficult to formalise and is not at present included in interface and communication systems to any large degree. As information systems become more "intelligent" and systems for cooperative decision making are designed, the users' understanding and acceptance of advice from a computer will be critical for the overall risk of large-scale system operation.
    Original language: English
    Place of publication: Roskilde
    Publisher: Risø National Laboratory
    Number of pages: 20
    ISBN (Print): 87-550-1138-1
    Publication status: Published - 1985
    Series: Risø-M
    Number: 2518
    ISSN: 0418-6435

    Keywords

    • Risø-M-2518
    • Decision making
    • Functional analysis
    • Human factors
    • Industrial plants
    • Information needs
    • Man-machine systems
    • Nuclear power plants
    • Planning
    • Risk analysis
    • System failure analysis
