Risk Assessment. Marvin Rausand
made important contributions to the quantification of reliability. Their best‐known result was the formula for calculating the reliability of a series system.
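The series‐system formula mentioned here rests on a simple observation: a series system works only if every one of its components works, so for independent components the system reliability is the product of the component reliabilities. A minimal sketch (the component values are illustrative, not from the text):

```python
from math import prod

def series_reliability(component_reliabilities):
    """Reliability of a series system of independent components.

    The system survives only if all components survive, so
    R_system = R_1 * R_2 * ... * R_n.
    """
    return prod(component_reliabilities)

# Three hypothetical components with reliabilities 0.95, 0.99, and 0.90:
r = series_reliability([0.95, 0.99, 0.90])
print(f"System reliability: {r:.5f}")
```

Note that the product is always smaller than the smallest component reliability, which is why long series systems were found to be surprisingly unreliable even when each component was highly reliable.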
The first draft of a standard for risk and reliability emerged in 1949, with the guideline on failure modes and effects analysis (FMEA) published by the US military as MIL‐P‐1629. This guideline was later converted into the military standard MIL‐STD‐1629A. Another important method, fault tree analysis, was introduced in 1962 by Bell Telephone Laboratories during a reliability study of the launch control system of the Minuteman intercontinental ballistic missile.
The military standard MIL‐STD‐1574A “System safety program for space and missile systems” appeared in 1979 and was transformed to MIL‐STD‐882 “System safety” in 1987.
Human error was recognized early as an important cause of accidents, and the technique for human error rate prediction (THERP) was introduced in 1962, mainly by Alan Swain. THERP was primarily directed toward the identification and prevention of human errors in nuclear power plants.
Until 1970, risk assessments were mainly qualitative. Quantitative aspects entered the scene in parallel with the development of reliability theory that started in the early 1960s. An impressive early work was the book “Reliability Theory and Practice” (Bazovsky 1961). Several new books on reliability theory appeared during the 1960s and set the scene for the introduction of quantitative risk assessment from approximately 1970.
The first attempts to use a HAZOP‐like approach to identify deviations and hazards in a chemical plant were made by ICI in 1963, but HAZOP, as we know it today, was not developed until around 1974.
Preliminary hazard analysis was introduced in 1966 as a tool to fulfill the US Department of Defense's requirement for safety studies in all stages of system development.
Perhaps the most important achievement of the 1970s was the “Reactor Safety Study” (NUREG‐75/014 1975). A wide range of new methods and approaches were developed, either as part of, or inspired by, this study. Important contributions include the “kinetic tree theory” (KITT) by William Vesely and models for the treatment of common‐cause failures (Fleming 1975). The Reactor Safety Study was heavily criticized, but this criticism does not diminish its importance. The risk of nuclear energy was debated in most Western countries, and new education programs in risk and reliability emerged in several countries.
The US Nuclear Regulatory Commission (NRC) has played a very important role in the development of risk assessment. Two major landmarks are the publication of the “Fault Tree Handbook” (NUREG‐0492) in 1981 and the “PRA Procedures Guide: A Guide to the Performance of Probabilistic Risk Assessment for Nuclear Power Plants” (NUREG/CR‐2300).
Another US report that spurred risk assessments in many countries was “Critical Foundations: Protecting America's Infrastructures,” published by the President's Commission on Critical Infrastructure Protection in 1997. Infrastructures are exposed to natural hazards and technical failures, as well as deliberate hostile actions. The concepts of vulnerability, hazard, threat, and security suddenly became common ingredients in most discussions among risk analysts. In many countries, it became mandatory for all municipalities to carry out “risk and vulnerability analyses” of infrastructure and services.
Many of the developments of risk assessment have been made as a response to major accidents (see Section 1.3). In Europe, two major accidents occurred close to the publishing of the Reactor Safety Study. The first of these, the Flixborough accident occurred in 1974 in North Lincolnshire, UK. It killed 28 people and seriously injured 36 out of a total of 72 people on‐site at the time. The casualty figures could have been much higher if the explosion had occurred on a weekday, when the main office area would have been occupied.
The other important accident occurred in 1976 in Seveso approximately 20 km north of Milan in Italy, where an explosion led to the release of a significant amount of cancer‐causing dioxin. Together with the Flixborough accident, the Seveso accident triggered the development of the new EU directive on “the major‐accident hazards of certain activities,” which is known as the Seveso directive and was approved in 1982.
In the 1970s and 1980s, a range of laws and regulations on safety and risk emerged in many countries. Two well‐known examples are the US Consumer Product Safety Act of 1972 and the UK Health and Safety at Work Act of 1974.
Many new organizations were established to prevent accidents. The United Kingdom Atomic Energy Authority (UKAEA) was formed as early as 1954, and in 1971 it established its Safety and Reliability Directorate (SRD). The UKAEA SRD was an active organization that published a range of high‐quality reports. One of the central persons in SRD was Frank Reginald Farmer, who became famous for the Farmer curve (FN‐curve) used to illustrate the acceptability of risk. Farmer was also the first editor of the international journal Reliability Engineering, the forerunner of the journal Reliability Engineering and System Safety (RESS).
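The FN‐curve mentioned above plots the cumulative frequency F of accidents causing N or more fatalities against N, usually on log–log axes, so that a risk acceptance line can be drawn across it. A minimal sketch of how such a curve can be tabulated; the scenario list and frequencies below are hypothetical, chosen only to illustrate the cumulation step:

```python
# Hypothetical accident scenarios: (fatalities N, annual frequency f).
# These numbers are illustrative only.
scenarios = [(1, 1e-2), (5, 1e-3), (20, 1e-4), (100, 1e-5)]

def fn_curve(scenarios):
    """Return (N, F) points for an FN-curve.

    For each consequence level N, F(N) is the cumulative annual
    frequency of accidents with N *or more* fatalities.
    """
    points = []
    for n, _ in sorted(scenarios):
        f_cum = sum(f for m, f in scenarios if m >= n)
        points.append((n, f_cum))
    return points

for n, f in fn_curve(scenarios):
    print(f"N >= {n:3d}: F = {f:.2e} per year")
```

Because F is cumulative, the curve is necessarily non‐increasing in N; acceptability criteria of the Farmer type require the whole curve to stay below a specified limit line.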
Another early organization was the IEEE Reliability Society, established as early as 1951. This society is responsible for the journal IEEE Transactions on Reliability. A forerunner to this journal appeared in 1952 under a different name; the journal changed its name three times before receiving its current name in 1962.
The first scientific society dedicated to risk analysis, the Society for Risk Analysis (SRA), was established in 1980, and its associated journal, Risk Analysis: An International Journal, appeared in 1981.
1.4.1 Norway
In Norway, developments of risk assessment have been made in parallel with the offshore oil and gas activities. The first major oil and gas accident, the Bravo blowout on the Ekofisk field in the North Sea, occurred in 1977. There were no fatalities, but a significant amount of oil was released to the sea. First and foremost, this accident was an eye‐opener for the authorities and the oil companies, who suddenly realized that the oil and gas activities were associated with very high risk. As a consequence of this accident, the Norwegian Research Council initiated a large research program called Safety Offshore, and the authorities required the oil companies to support Norwegian research projects and universities. This requirement was strengthened after the second major accident, the capsizing of the semi‐submersible accommodation platform Alexander L. Kielland in 1980, with 123 fatalities.
The support from the Safety Offshore research program and the oil companies produced a number of new academic positions and a comprehensive education program at the Norwegian University of Science and Technology (NTNU) in Trondheim. Both authors of the current book participated in this development at NTNU. The knowledge gained through this period is an important part of the basis for the book.
1.5 Applications of Risk Assessment
The use of risk assessment has increased vastly over the years. A steadily increasing number of laws, regulations, and standards require or mention risk assessment, and new methods are continuously being developed. This growth therefore seems likely to continue into the future.
The prime objective of any risk assessment is to provide decision support. Whenever a decision that affects risk is made, a risk assessment helps us understand the sources of risk. To illustrate this, some typical process industry decisions that can be supported by information from risk assessment are listed below.
1 Location of a process plant. Chemical process plants often handle toxic, flammable, and/or explosive materials (commonly called hazardous materials). Release of these materials may affect people living or working outside the plant. Understanding the risk these people are exposed to is important before making a decision about where to locate a plant.
2 Layout of a process plant. Release of flammable material may cause fire, and this may spread to other equipment, leading to a far more severe event than the initial fire. Understanding the sources of risk may help us locate the equipment at safe distances from each other.
3 Need