Cyber Security: Is ‘human error’ really to blame?
90% of data breaches reported to the UK Information Commissioner’s Office (ICO) in 2019 could be attributed to ‘human error’, according to user awareness company CybSafe. The reports were dominated by phishing (45%) followed by ‘unauthorised access’, malware, ransomware, hardware/software misconfiguration and brute force password attacks. This conclusion won’t be a surprise to risk and compliance professionals. Root cause analysis of cyber failures, compliance breaches, risk events and complaints frequently cites humans as the primary cause of the problem.
This diagnosis assumes that automated systems are safe. If it wasn’t for those pesky humans, everything would be fine. Furthermore, once the culprit has been found, remedial actions are often taken to reprimand, re-train or re-allocate the individuals concerned. The cure demands rectifying the weaknesses of inattentive, clumsy or incompetent employees.
But what if ‘human error’ isn’t a diagnosis, but a symptom of system weaknesses?
In order to understand ‘human error’ we need to look at human competence at work. Many of the activities we perform in the workplace, particularly in first-line processes, typically fall into two broad categories:
- Skills-based – Some tasks are so dull and repetitive that we can readily reduce them to automatic responses. Take driving, for example. As learners, we need to pay conscious attention to every action – depressing the clutch, changing gear, mirror-signal-manoeuvre. Once we have mastered those skills, however, it is perfectly possible for us to drive many miles thinking about something else only to discover, somewhat to our surprise, that we have nearly arrived at our destination. We have, over time, converted those attention-absorbing activities into autonomic actions.
- Rule-based – For many expert activities we develop our own ‘rules of thumb’ that can be applied in most circumstances. This is the most common technique we see in our businesses. Procedures may be formalised, but what distinguishes an expert from a novice is experience, knowing workarounds and ‘tricks of the trade’. An expert can apply learnt ‘heuristics’ to optimise procedures and deliver efficient processes.
Most organisations rely on people operating standard processes on a repetitive basis: taking customer orders, registering an insurance claim, assessing a mortgage application or responding to an email. Each requires a sequence of actions to be performed, in the correct order and to set rules.
Errors arise in these tasks in several ways. Interrupting a compiled skill can lead us to pick up from the wrong place. An agent may be taking a client through a script on the telephone when one of her colleagues drops his paperwork on the floor. Returning to the customer, the agent picks up the script in the wrong place and completes the process in a non-compliant way. Similarly, a change in the environment can make your usual rules inappropriate. You are required to send a special one-off email to all of your clients, but place all of the recipients in the ‘CC’ field instead of the anonymous ‘BCC’ section. People are much more prone to such slips and ‘absent-minded’ errors when tired, stressed, working in environments full of distractions or put under productivity pressures.
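The CC/BCC slip is a good example of an error that a systemic control can design out entirely, rather than leaving to the vigilance of a tired employee. As a minimal sketch (the addresses and helper name are illustrative, not any particular firm’s tooling), a bulk-mail helper can keep recipients out of the message headers altogether and pass them only as the delivery envelope, so there is no ‘CC’ field to get wrong:

```python
from email.message import EmailMessage

def build_bulk_email(sender, recipients, subject, body):
    """Build a one-to-many email that never exposes the recipient list.

    Recipients are kept out of the headers entirely and returned
    separately for use as the SMTP envelope, so a 'CC instead of BCC'
    slip is impossible by construction.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # addressed to ourselves; real recipients travel in the envelope
    msg["Subject"] = subject
    msg.set_content(body)
    # No "Cc" or "Bcc" header is ever set, so no client can leak the list.
    return msg, list(recipients)

msg, envelope = build_bulk_email(
    "firm@example.com",
    ["alice@example.com", "bob@example.com"],
    "Service update",
    "Dear client, ...",
)
# smtplib.SMTP(...).send_message(msg, to_addrs=envelope) would then deliver it.
```

The point is not the code itself but the design choice: the safe behaviour is the only behaviour the process allows.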
Email has become an essential tool in most of our work. Reading emails, clicking on links and replying to colleagues is a fundamental requirement of most of our jobs. Our human response is to perform these tasks as instinctively as possible – compiling the sequence of actions as much as we can, using our experience of ‘quick fire’ rules and getting the job done. Seen in this light, ‘human error’ isn’t erroneous at all – it’s the natural consequence of us being skilled experts.
It is too glib, and fundamentally inhuman, to diagnose ‘human error’ as the root cause of cyber issues. Take the phishing attacks identified as ‘human error’ by CybSafe. The UK National Cyber Security Centre (NCSC) has issued guidelines on how to prevent or minimise the threat of phishing attacks on organisations. In one case study, a firm received 1,800 emails containing malware. Because the right technological measures were in place – email filtering, adequate reporting mechanisms, up-to-date patching, ‘call home’ blocking and adequate virus protection – the attack had no malicious impact on the organisation at all.
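Most of those layers are technical, not behavioural: they stop the malicious email before a human ever has the chance to click. As a hedged illustration only (not the NCSC’s actual tooling, and far simpler than a production filter), one tiny piece of an email filter might flag links whose visible text advertises a different domain from the real destination – a classic phishing pattern:

```python
import re
from urllib.parse import urlparse

# Matches HTML anchors, capturing the real href and the visible link text.
ANCHOR = re.compile(r'<a\s+[^>]*href="([^"]+)"[^>]*>([^<]+)</a>', re.I)

def suspicious_links(html):
    """Return (visible text, real destination) pairs where the link text
    looks like a domain but the href points somewhere else."""
    flagged = []
    for href, text in ANCHOR.findall(html):
        href_host = urlparse(href).hostname or ""
        shown = text.strip().lower()
        # Only compare when the visible text itself looks like a domain/URL.
        if "." in shown and href_host and href_host not in shown:
            flagged.append((text.strip(), href))
    return flagged
```

A message containing `<a href="https://evil.example.net/login">mybank.com</a>` would be flagged, while a link whose text and destination agree would pass. Filtering like this removes the error opportunity at the system level, instead of asking every employee to hover over every link.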
As risk and compliance professionals, we should challenge the common perception that ‘human error’ is an adequate diagnosis. In the majority of cases, it is the processes, systems and environment within which we ask human experts to work that have generated the issue. These systemic issues are often caused by management failure to implement appropriate mitigations over time. Given the right circumstances, the holes in multiple layers of control will align to cause issues.
Diagnosing ‘human error’ can be a convenient alternative to lack of rigour in systemic controls, and can be unjust to the skilled individuals we ask to work for us.
The safety specialist James Reason calls this the ‘Swiss cheese’ model: each layer of defence is a slice of cheese with holes in it, and under the right circumstances, as fates align, the holes in successive slices line up to leave a clear path through every layer.
I always use quotes around ‘human error’. If you have read this far, I hope that you will have some empathy with why I do that.
Written by Paul Eccleson
International Compliance Association
The International Compliance Association (ICA) is a professional membership and awarding body. We are the leading global provider of professional, certificated qualifications in anti-money laundering; governance, risk and compliance; and financial crime prevention.
Being a member of the ICA, a global community, is a mark of prestige and shows that you have reached a standard of excellence in your professional career.