25 November 2021 (updated 12 October 2023)

How the Bank of England educates employees about cyber threats

At the UK's central bank, good cyber practice starts with people, says John Scott.

By Zoë Grünewald

As head of security education at the Bank of England, John Scott – and, you would think, his family – ought to be the most security-conscious people in the room. But even the most security-conscious people can make mistakes, Scott told the New Statesman’s Cyber Security in Financial Services Conference yesterday morning. Just a couple of months ago, he explained, he and his wife realised that she had left her house keys in the front door all night long.

Rather than revoke her key privilege or chastise her for what could have been a monumental security breach, Scott acknowledged that mistakes happen. In fact, Scott admitted, he had done the same thing just a few weeks before.

This anecdote was the opening analogy of his talk on cyber security culture and his role at the Bank of England. His theory: it is time companies put the reality of human nature at the heart of their security policies.

Scott argued that cyber security policy should be built around “championing” people to make good decisions, and to do this, we need to understand people – what they need and why they make mistakes. As Scott explained, most cyber security approaches across institutions are based on imposing boundaries – such as “three clicks and you’re out” policies for phishing emails – leading too many IT departments to become known as the “Department of No”. But, as Scott pointed out, effective cyber security policy doesn’t work like this.

Instead, it should take a human-based approach. Scott pointed to what he called “cognitive biases” – the mental shortcuts humans use to process information – to explain why people make mistakes: “We do the things that are like the things that we’ve done before as much as we can, because working out from the core principles every time is tiring.”


These biases trip us up because, of course, not all tasks are the same. They surface as subconscious reactions – “I have to think fast”, “there’s too much (or not enough) information”, “what should I remember?” – which in turn lead to mistakes, slips or lapses in judgement.

Scott also explored the human approach to risk assessment as a contributing factor. Just as people are generally more afraid of sharks than mosquitoes even though, globally, the risk of death is much higher when encountering a mosquito, people often underestimate the risk of cyber threats. Sharks look and feel scarier. In the world of cyber, the bad thing that might happen when clicking a phishing link may not look or feel scary. In fact, you may never see the impact of it or realise it’s happened, as cyber attacks are often sophisticated enough to make it impossible for companies to trace the point of infection.

Scott also pointed out that we forget there are people out there deliberately trying to exploit employees, relying on our propensity for error. It isn’t a level playing field: seasoned, expert hackers are going up against people for whom “cyber security” is just a corporate buzz phrase. Despite this, companies still focus far more of their time and resources on securing technology and servers than on educating and empowering their employees.

So, what does a human-based approach look like? First, Scott spoke of the need for safeguarding processes, such as requiring sign-off, to limit the likelihood of human error. Second, companies should look at their current procedures: employees should have checklists they adhere to, but those checklists should also be iterative and appropriate for different scenarios, as in the sketch below.
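To picture what that could mean in practice, here is a minimal sketch of a scenario-aware checklist with a second-person sign-off gate. It is entirely hypothetical – the Check, Checklist and approve names are invented for illustration, and nothing here reflects the Bank of England’s actual tooling:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical names throughout -- a sketch, not any institution's real tooling.
@dataclass
class Check:
    name: str
    passes: Callable[[dict], bool]  # predicate over a request

@dataclass
class Checklist:
    scenario: str  # e.g. "overseas withdrawal"
    checks: list[Check] = field(default_factory=list)

    def review(self, request: dict) -> list[str]:
        """Return the names of any failed checks; an empty list means all clear."""
        return [c.name for c in self.checks if not c.passes(request)]

def approve(request: dict, checklist: Checklist, signed_off_by: str | None) -> bool:
    """Release a request only if every check passes AND a second person signs off."""
    failures = checklist.review(request)
    if failures:
        print(f"Blocked ({checklist.scenario}): failed {failures}")
        return False
    if signed_off_by is None:  # sign-off limits single-person error
        print("Blocked: second-person sign-off required")
        return False
    return True

# Illustrative use: a large withdrawal must pass its checks and be countersigned.
under_limit = Check("amount under limit", lambda r: r["amount"] <= 1_000_000)
withdrawals = Checklist("overseas withdrawal", [under_limit])
approve({"amount": 2_500_000}, withdrawals, signed_off_by="second officer")  # blocked
```

Because each scenario carries its own list of checks, the checklist can be revised as new failure modes are discovered – the “iterative” quality Scott described – rather than being a fixed form filled in the same way every time.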

In this instance, Scott pointed to the 2016 hacking of Bangladesh Bank’s account with the US Federal Reserve. When the hackers submitted their first withdrawal request, it contained errors, and the Federal Reserve followed the procedure set out in its checklists: it initially bounced the request, gave the bank an hour to confirm it was correct, and then released the funds after confirmation. Unfortunately, the checklist hadn’t considered that it was evening in Bangladesh, an unlikely time for such a withdrawal. Had that consideration been built in, the transaction might have been prevented.
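A hedged sketch of what that missing checklist item might look like: a simple local-business-hours check on the requesting institution’s timezone, so an out-of-hours request is escalated for human review rather than released automatically after confirmation. The function name and threshold hours are invented for illustration; the timezone handling uses Python’s standard zoneinfo module.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

def within_business_hours(tz_name: str, start_hour: int = 8, end_hour: int = 18) -> bool:
    """True if it is currently within (illustrative) business hours in the given timezone."""
    local_now = datetime.now(ZoneInfo(tz_name))
    return start_hour <= local_now.hour < end_hour

# An evening request from Dhaka fails the check and is escalated for
# manual review instead of being released automatically after confirmation.
if not within_business_hours("Asia/Dhaka"):
    print("Out-of-hours withdrawal request: escalate for manual review")
```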

Finally, and perhaps most importantly, Scott advocated “treating people like adults”. Just as he and his wife had made an honest mistake, there is no need to chastise employees when they do the same. As Scott said, quoting security author Lance Spitzner: “humans are not the weakest link, they’re the primary attack vector”. Errors are part of human nature and threat actors know it, so by educating and empowering employees, we limit the propensity for serious breaches. Ultimately, after all this, do you really think they’ll leave the key in the door again?
