How many times a day do you make decisions? Scientists have calculated that the average is around 35,000. That sounds like a lot, but many of those decisions are so automatic that we are not even aware of them. Only 5-10% of decisions are in fact conscious ones. Routine actions, such as the morning commute or everyday operations at work, trigger decisions taken below the awareness threshold. Such decisions feel easy. In the majority of cases, you don’t pause to reflect on their consequences. Yet if only 1% of those 35,000 decisions are risky ones, you put yourself and your colleagues in danger 350 times a day. Take the drive to work, for instance. If it has gone without incident for years, the temptation is to underestimate the danger and operate on auto-pilot, with lower risk awareness than if you were driving that route for the first time.
Clearly, when people make repeated choices that involve at-risk behaviour and experience first-hand benefits that match the outcomes they expect, they tend to underestimate the actual risk.
This is borne out by a 1998 study by T. Dell and J. Berkhout, which found that injuries were 88% more likely to occur in jobs perceived as ‘safe’ than in those regarded as the most dangerous. Labelling a behaviour as ‘unsafe’ when it has been performed hundreds or even thousands of times without negative consequence is more than a challenge. If the behaviour was associated with a forecast benefit that was realised, the ‘unsafe’ label is now at odds with actual experience; a hurdle against which logic and reason alone will have limited success.
If there is a conflict between intuition and our rational system, our experience-based, intuitive response appears to have the upper hand and exert the strongest influence on decisions and subsequent actions. The notion of a two-track mind, one part logical and rational, the other intuitive and automatic, is not new. While established safety practices are rooted in logic, facts and guidelines, most human behaviour is not. Applying that knowledge to safety, however, is a fresh approach that could provide the answers to some of our biggest challenges.
As Nobel laureate Daniel Kahneman and co-authors Andrew M. Rosenfield, Linnea Gandhi and Tom Blaser wrote in the Harvard Business Review, “people’s judgements are strongly influenced by irrelevant factors, such as their current mood, the time since their last meal and the weather. We call the chance variability of judgements ‘noise’. It is an invisible tax on the bottom line of many companies.”
The effect of cognitive biases on decision-making
So how can companies manage this ‘noise’, which can affect not only the safety of their operations and employees, but also the wider community and even their profitability? What can operations and safety managers do to ensure that two people will make the same, safe judgement call when faced with a critical, complex situation that requires a split-second decision? After all, it is only in an ideal world that people work in a constant, never-changing, safe environment performing predictable tasks.
To understand what drives people to take risks or take steps to prevent them, it is necessary to understand how our judgement is influenced by social and cognitive biases, as well as noise.
These biases, formed by experiences, perceptions and emotions, tempt us to take mental shortcuts: to choose the lazy, easy option rather than pause and think. In industrial safety, the cognitive biases we most often come up against are:
● The availability bias: overestimating the value of information that is readily available. If you witness a colleague trapping their hand in a machine, you are more likely to overestimate the risk of that happening to you than another person who hasn’t witnessed the incident. The same also applies in reverse. If you have seen someone take a risk and get away with it, you are likely to assume you can take the same risk and stay safe.
● The outcome bias: a focus on the outcome rather than on the decisions that led to it. For example, if you can quickly clean equipment without locking out and have not had an accident in the past when doing so, the perceived risk of that action is low.
● The bandwagon effect: if an influential person in a group expresses an opinion about the risk involved in a certain activity, they are likely to persuade the rest of the group to adopt the same belief.
When faced with these biases, technical precautions, regulations, guidelines and safety behaviour training are not enough. They are only effective if the people using them take the right decisions when installing, operating or applying them. This is perhaps the reason why HSE statistics show the safety performance of many UK companies to be stagnating. As the HSE’s November 2017 report (https://bit.ly/2hL1SfK) says, “There has been a long-term downward trend in the rate of fatal injury, with indications showing some signs of levelling off in recent years.”
At the same time, “the rate of self-reported non-fatal injury to workers showed a downward trend up to 2010/11; since then the rate has been broadly flat.” As a result, the annual cost to industry has also been broadly level, costing employers roughly £2.9 billion, individuals £8.6 billion and government £3.4 billion a year.
Psychological tools that managers can use to influence safety decisions
One such tool is what DuPont calls the ‘Pause, Process and Proceed’ approach, a technique that ensures people take a moment to reflect on what they are about to do, such as moving material from one section of the workplace to another, and then assess the route and other elements (the level of risk, obstacles, doorways, stairwells, confined hallways, lighting) before deciding how to proceed.
Another helpful practice is based on so-called ‘nudge theory’. A nudge is designed to make the right decision the easy decision. A simple example of a nudge in an industrial setting is a colour-coded lock-out system to prompt people to connect the lock to the correct lock-out point.
A further tactic makes use of the risk-reward balance. Most people instinctively weigh risk against reward when they make decisions, tending to underestimate the risk and overestimate the reward, because past experience has taught them to do so. Examples from industry include:
● Production pressure reward: the perceived risk of making a repair without locking out may be weighed against the benefit of continuing production. If past experience has not led to an accident, the reward of saving time, continuing with a task or going home early is perceived to be very high and the risk is underestimated.
● Personal passion as an emotional reward: maintenance mechanics enjoy repairing machines. Making them wait for a work permit to be signed, delay until the equipment is locked out, or postpone the repair until they have the right parts is the opposite of a reward.
● Social pressure: letting down your team, disappointing your shift leader or being embarrassed because you were unable to complete a task on time are all negative outcomes. The reward of satisfying these perceived social obligations may be higher than the perceived risk.
● Social connectivity: staying in touch and responding to texts or messages is a powerful emotional reward. Why else are people regularly caught on the phone while driving?
This has led to the development of lean thinking for risk and safety, a concept for improving workflow. If a designated walkway, for example, costs workers 10 minutes of extra time every time they use it, they will seek a shortcut. An improved workflow eliminates detours, time-consuming steps and overly complex practices, so people are not tempted to take shortcuts.
While experience may be the driving factor behind most at-risk behaviours, it is also the key to overcoming them. Logic and reason are influenced by words, data and analytical comparisons; our intuitive system is not. To effectively influence behaviours, companies need to use methods of communication that employ images, emotions, personal stories and experiential techniques that connect with their employees.
Sustainable safety improvements
American basketball coach John Wooden once said that “the true test of a man’s character is what he does when no one is watching”.
It is a truth that firms come up against daily. They cannot monitor their employees around the clock, although many may try. Even if they could, supervision is not as good a driver of performance – whether it is safety or operational performance – as changing habitual and instinctive behaviour from the outset.