KIVI's Risk Management & Engineering Department organised the Working Symposium Irrational Decisions in Engineering on 28 June. The attendance list included not only members of the department, but also construction experts, safety experts, knowledge managers, contract managers, guests and, as usual at RBT, two professors. In short, a mixed group with a great deal of knowledge at hand. Recent practical cases were discussed.

Previous working symposia

John van der Puil, member of the Programme Committee that developed this event, kicked off with a brief review of two previous working symposia. These included an exchange of ideas in April 2016 on how to work with the 4K model, in which participants had to indicate in which quadrant they would place an unwanted event. This yielded unexpected lessons at the time. The 4K model does not appear to have direct scientific value, but in risk management practice it turns out to be a very useful tool for placing unwanted events, with the aim of managing the relevant risk better in the future. The insight that risk appetite changes over time also became clear here. Culture is a determining element, as are prestige projects, and the development of technology demands extra alertness from the engineer. The engineer's responsibility to clients and society, the raison d'être of the Risk Management and Engineering Department, goes further than any technical standard prescribes.

Last April's working symposium From Thinking to Pre-thinking taught us other lessons. Insufficient further development leads to accidents; too many new developments in the same object end badly; lack of communication between partners in a project leads to disasters; and a risk missed at first identification is usually followed by later opportunities to rectify that omission. Decisions are often made on irrational grounds. It was obvious that June's working symposium should deal with this last topic.

Irrational decisions

And now for Irrational Decisions in Engineering. The participants had worked through a snappy syllabus of that title beforehand. It contained many practical cases that gave an advance picture of irrationality in various situations.

There was a broad exchange of knowledge, lively discussion, many original thoughts and even new knowledge generated from the insights participants had brought from home.

Working out the practical cases

Exciting practical cases had to be worked through once more. Participants tackled the problems presented with verve. The discussions were sometimes slow going, especially in groups that quickly agreed on the desired views and on mitigating actions against the risks at stake. Other groups became hotbeds of more subjective disagreement. All of it was allowed, and insights could not help but be sharpened.

The Multiplexer

The first real-life case involved a failure of a measuring and control installation at a chemical plant. There was a fire hazard, yet on irrational grounds it was decided to continue at full power nevertheless. Such a decision involves risks, places technicians in dilemmas and forces uncomfortable choices.

This case, called the Multiplexer, involved pinpointing what kind of unwanted event it was. The Multiplexer malfunctioned, causing a loss of revenue of Hfl 10,000 per hour, cause unknown. Opinions were divided, with a clear preference for Black Hole (60%), followed by White Spot (30%). The CEO's decision to keep producing at full power, despite the danger of fire, was driven mainly by the pursuit of financial results. It was considered predominantly incorrect for a subordinate to refuse a CEO's order to run at full power, thereby accepting an unknown risk. RBT was unanimous in its view that it is not appropriate for a staff or line officer to take significant additional safety measures without informing management or the authorities. Click here for the column analysing the case.

Emergency to please a customer

The second real-life case was about a contractor who had put himself in an impossible position by taking on a contract that left no time to properly identify, analyse and evaluate the risks. It was bound to go wrong. It did.

This case, about taking on an emergency job to please a client, also offered plenty to discuss. The management of an internationally operating contracting company accepts the almost impossible task of submitting a tender within far too short a timeframe, thereby failing to gain a clear view of the key risks. Whether such a decision and the subsequent acceptance of the contract is rational or emotional was answered 50/50; the answer depends on the perception of the person judging. Again, I refer to the relevant column by Ms Drs Natasja Dodonova. Exceeding a major budget item by 200% (€3 million instead of €1 million) was overwhelmingly seen as a White Spot: people were aware of that risk, but were insufficiently familiar with decisive information on that point. The fact that a ship with equipment arrived two weeks late at the construction site was seen as a normal Basic Risk.
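The labels used in these two cases follow a recognisable pattern: a Basic Risk was identified and well understood, a White Spot was identified but decisive information was missing, and a Black Hole was never identified at all. As a minimal sketch, assuming exactly those distinctions (the function name and inputs are illustrative and not part of the 4K model's formal definition, and any further quadrants in the syllabus are not covered here):

```python
def classify_risk(aware_of_risk: bool, has_decisive_information: bool) -> str:
    """Assign a quadrant label to an unwanted event after the fact,
    using the distinctions described in the case discussions."""
    if not aware_of_risk:
        return "Black Hole"   # the risk was never identified at all
    if not has_decisive_information:
        return "White Spot"   # known risk, but decisive information was missing
    return "Basic Risk"       # known, well-informed, routine risk

# The budget overrun: the risk was known, decisive information was missing.
print(classify_risk(aware_of_risk=True, has_decisive_information=False))  # White Spot
# The late ship: a known, routine project risk.
print(classify_risk(aware_of_risk=True, has_decisive_information=True))   # Basic Risk
```

The point of such a sketch is that the label depends only on what was known at the time, which is exactly why participants' answers diverged: they judged the decision-makers' state of knowledge differently.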

Pity the project manager who has to operate that way. It is virtually impossible to come out of this unscathed, and afterwards there is always something in his actions to criticise.

The issue management system

Guest speaker Ms Drs Natasja Dodonova guided us through irrational decisions in a management team. Click here for the case analysis.

Chernobyl

Fred van Iddekinge gave a fascinating talk on Chernobyl, 26 April 1986: how the accident occurred and what went wrong. It was an engaging talk with a great deal of expertise on nuclear reactors, presented at a fast pace. Wrong decisions, analysis of exactly what went wrong, and failed risk management: high-quality knowledge of nuclear power was imparted here. This presentation served as an introduction to the remaining cases by our guest speaker of the evening, Ms Natasja Dodonova. Her talk and her comments on the participants' performance can be found elsewhere.

Festivities involving alleged radioactive radiation

In the case on the Chernobyl nuclear disaster, the considerations were varied. The question was whether Scherbitsky, secretary of the Communist Party in Kiev and aware of the realistic radiation risk to the crowd, should cancel the 1 May celebrations in Kiev or not. He calls Gorbachev about this, suggesting he cancel the celebration. Gorbachev clearly instructs him to let the celebration go ahead and, moreover, to be on the stand himself. Of the RBT participants, 24% would have cancelled the celebration, while 72%, empathising with Scherbitsky's situation, would have let it go ahead. Beyond the quantified responses, the considerations were particularly important here. People mentioned not spreading panic, uncertainty about the size of the risk, consequences for their own position, damage to the Soviet Union's international image, the fact that contamination had already taken place anyway, not disappointing citizens by depriving them of a celebration, and so on. There are plenty of arguments, realistic and irrational. The name of the working symposium was well chosen. Click here for the analysis of the case: 'Chernobyl nuclear disaster, Scherbitsky's dilemma'.

Closing

There were after-dinner drinks. In the corridors the idea arose that such working symposia should really be set up for other KIVI departments as well: this is knowledge that is not tied to a single discipline. In this way, every department could take a non-traditional approach to the typical risks in its own field of engineering.

Description

Scope of this working symposium
Risk management looks like a rational process. In practice, however, things are often quite different. Very many decisions of a technical nature are made on informal grounds, based on perception and intuition born of experience. The aims of this working symposium are:
- to deepen understanding of how irrational arguments arise in a technical environment
- to realise that they are unavoidable in certain cases
- to learn how to deal with them in engineering practice.

Programme

Part 1
16.00 - 16.10 h Introduction by John van der Puil
Lessons from the 2 previous working symposia
Common thread between those symposia
The added value of the xRM model
Deciding under uncertainty - irrational decisions

16.10 - 17.00 h Case The Multiplexer, to be handled by participants
Practice case - judgement - knowledge exchange - decision making with missing information

17.00 - 18.00 h Case Emergency to please the customer - irrational decision-making under uncertainty
Judgement - knowledge exchange - decision making with missing information - how to avoid it?

Case in reserve - in English
The black list and the subcontractor
Decisions under uncertainty - emotional versus rational thinking
Fake news and rumours

Case in reserve
The Planning Schedule
A schedule we don't believe in ourselves?
Should we hand it over to the client?

Intermezzo
18.00 - 18.30 h Sandwiches and drinks

Part 2
18.30 - 19.00 h Introduction by Fred van Iddekinge
Chernobyl IV nuclear reactor accident sequence 26 April - 10 May 1986
What exactly happened at Chernobyl?
What exactly went wrong?

19.00 - 19.30 h Introduction by Drs Natasja Dodonova, knowledge manager
Effects of risky decisions, illustrated with real-life cases
Case The Issue Management System
Rejecting a digital knowledge system on irrational grounds

19.30 - 20.30 h Case Chernobyl nuclear disaster PART 1 - Scherbitsky's dilemma
What would you have done?

Case Chernobyl nuclear disaster PART 2
Psychological laws in technical decisions
How do we deal with those inevitable choices?

Consequences of decisions are:
- Positive or negative
- Immediate or future
- Certain or uncertain.
Also in technical environments.

After registering, participants receive the Syllabus Non-Rational Decisions in Engineering and the practical cases.
Participants study the syllabus and the cases in advance. Each practice case contains a few questions. After discussing them among themselves, participants each submit their personal answers. The Programme Committee then tallies the answers, and the result is published.

More information via link below.

Speaker(s)

Mr. ing. John van der Puil was Academic Director of TIAS, School of Management and Society of the Universities of Tilburg and Eindhoven. He is a member of the RBT departmental board and a member of the Programme Committee. John writes columns for the KIVI website. These are always about notable events in risk management.

Fred van Iddekinge has worked both nationally and internationally as a process design and business engineer in the petrochemical industry and as a nuclear safety expert for the Dutch government.

Natasja Dodonova is a clinical psychologist. She also studied computer science in St Petersburg and Integrated Document Management at EU Rotterdam. She worked as Head of Office Knowledge Management and Knowledge Coordinator at technical organisations.

Location

KIVI building

Prinsessegracht 23, 2514 AP The Hague

Organiser

Risk Management and Technology

Name and contact details for information

Fred van Iddekinge (fredvaniddekinge@gmail.com), Harry Spaas (hmspaaskok@hotmail.com) and John van der Puil

johnvanderpuil@gmail.com

Additional information working symposium 28 June 2017