Security experts agree that humans are the weakest link in the security chain, and virtually all of them agree that security awareness training is one of the most effective ways to strengthen it. The key is to make employees skeptical without paralyzing them with paranoia.
But how best to do that can generate some debate.
Lysa Myers, a security researcher at ESET, summarized in a recent post what she said was a collective message from several presentations at the recent Black Hat conference: While it is possible to train employees to be "hyper-vigilant," it can create more problems than it solves.
“It is not beneficial for the individual or for harmonious group dynamics to be in a constant state of distrust,” she wrote.
The presenters, who included Zinaida Benenson, of the IT Security Infrastructures Lab at the University of Erlangen-Nuremberg; Jelle Niemantsverdriet of Deloitte; and Judith Tabron of Hofstra University, emphasized the need for security trainers to listen to users and adapt education and security defenses to “how people actually do their jobs.” While that would not eliminate security failures – indeed, nothing will – it would “make it easier for people to make better security decisions more often,” Myers wrote.
Benenson, in a presentation titled, "Exploiting Curiosity and Context," reported on the results of two user studies where more than 1,600 university students received spear phishing messages from non-existent people. The percentage of those who clicked on what would have been a malicious link ranged from about a third to more than half – 56 percent.
The students’ reasons for clicking included curiosity, being addressed by their first names, receiving a message that fit their lifestyle and thinking they knew the sender.
“Therefore, it should be possible to make virtually any person click on a link,” Benenson wrote in a summary of her presentation, adding that expecting, “error-free decision making under these circumstances seems to be highly unrealistic.”
Sending regular spear phishing messages to employees to test their awareness, she argued, could be counterproductive. “People's work effectiveness may decrease, as they will have to be suspicious of practically every message they receive,” she wrote.
That argument gets mixed reviews from a number of experts, although in many cases it comes down to how one defines hyper-vigilant. Nobody thinks security training should leave workers feeling paralyzed or paranoid, but given the variety, sophistication and level of threats, most say a bit of paranoia is a good thing.
Lance Spitzner, director, SANS Securing the Human, said human security requires compromise. “We need to have a certain level of suspicion in people, but how much depends on the organization,” he said, noting that the level would be different at a university than the Department of Defense.
The bottom line on suspicion, he said, is “not enough and bad guys get through. Too much and trust and the ability to work together break apart.”
Joseph Loomis, founder and CEO of CyberSponse, agreed that, “awareness is good but unreasonable and unrealistic is another. Without balance, nothing will work in the enterprise.”
Still, he said even if someone he knows sends him a link, he checks on it, “because I do not take anything for granted. Compromised accounts happen all the time.”
In the view of Rohyt Belani, CEO and cofounder of PhishMe, security training should not encourage, “a state of paranoia per se, but the right level of prudence or vigilance when recognizing a potential attack.”
He noted that the Department of Homeland Security (DHS) and the New York Police Department both have “See Something, Say Something” campaigns, which don’t encourage people to become vigilantes, but simply to report anything suspicious to authorities.
Trevor Hawthorn, CTO of Wombat Security, said the goal shouldn’t be to create paranoia, but “smart skeptics.”
He likened it to a child learning to cross the street – it requires constant, and perhaps intense, parental involvement at the start. But eventually the child learns how to do it – with constant awareness of the danger that is not disabling. “The child will be able to cross on his own without feeling so fearful that he can’t cross a street,” he said.
Given the level of online threats, “awareness training needs to be constant,” he said. “Not only does it persist the message but it also makes the training and simulations the users’ ‘new normal.’”
Stacy Shelley, vice president and chief evangelist at PhishLabs, said while a constant state of distrust would be destructive, workers do need to have “elevated levels of skepticism during circumstances when more scrutiny and distrust is essential.” Those could include everything from a link or attachment in an email to a request from the help desk for one’s password to perform a remote system update.
“Users need to be hyper-vigilant when the situation calls for it. Effective training should focus on helping users recognize those risky situations,” he said.
And Kevin Mitnick, once known as the “world’s most wanted hacker” and now head of Mitnick Security Consulting, said regular, even intense, awareness training shouldn’t have a negative effect on morale or productivity.
“That would be like saying wearing a seat belt takes away the enjoyment of driving. Or locking your car makes people drive poorly,” he said. “You wouldn't blame the manufacturer if someone left his keys in the car and a thief drove off with the vehicle. The driver would be responsible.
“In the world we live in, security precautions become second nature and people adapt.”
That said, there is general agreement that security training does need to take into account how people do their jobs, and can’t be so stringent that it stifles their productivity.
Hawthorn calls it, “being realistic. User security policies are like diets. If they aren’t sustainable and you have no way of enforcing them, either using technical controls or firing the person, you end up with something that fades away or people cheat on,” he said.
“So yes, giving users realistic guidance is powerful because it’s both sustainable and relatable, which makes the training stick with the user.”
Spitzner also said it is a mistake to, “focus on perfect security, and forget that real people are involved.
“Passwords are a great example. Security researchers talk all the time about what the ‘perfect’ password is, only to come up with a solution that no one can remember or follow," he said. "Human security is all about behavior – the more difficult the behavior the less likely it can be done."
Belani said that for training to be effective, it has to go beyond awareness to “behavioral conditioning.” He cited the work of Nobel Prize winner Daniel Kahneman, author of “Thinking Fast and Slow,” who said when people are doing repetitive tasks, the brain tends to operate in a version of autopilot that does not require deep thought – what he called “System 1.” With more complex tasks, it uses a more deliberate process that requires more effort – or “System 2.”
The key, Belani said, is to train employees, “proactively to use the System 2 deliberate-thinking process to recognize when something is out of the norm, by paying attention to certain details that normally our System 1 set of thinking would ignore.”
This kind of conditioning, he said, actually helps workers to sort the legitimate from the malicious and makes them more productive.
And Shelley said while it would not be practical to offer every employee customized training, it is possible to tailor training to various employee groups, based on things like their department and technology profile.
“What technologies do they use? What threats are they likely to encounter? The more relevant the training is to how the user operates day-to-day, the better it will resonate and be retained,” he said.
There is general agreement that any good thing – physical fitness, diets, work – can be overdone. Still, these experts say regular security training does not cross that line.
Regular fake spear phishing tests, rather than sowing distrust, should, “help the organization know who are the biggest offenders and how to better train them,” Loomis said.
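Identifying the “biggest offenders” from simulated phishing campaigns comes down to simple aggregation of test results. A minimal sketch of that bookkeeping (the event format, user names and 25 percent threshold are illustrative assumptions, not any vendor’s actual data model):

```python
# Hypothetical sketch: aggregating simulated phishing-test results to find
# which employees click most often, so training can be targeted at them.
# The data format and threshold below are assumptions for illustration.

from collections import defaultdict

def click_rates(events):
    """events: list of (user, clicked) tuples from simulated campaigns.
    Returns {user: fraction of simulated phishing emails they clicked}."""
    sent = defaultdict(int)
    clicked = defaultdict(int)
    for user, did_click in events:
        sent[user] += 1
        if did_click:
            clicked[user] += 1
    return {u: clicked[u] / sent[u] for u in sent}

def biggest_offenders(events, threshold=0.25):
    """Users whose click rate exceeds the threshold, worst first."""
    rates = click_rates(events)
    return sorted((u for u, r in rates.items() if r > threshold),
                  key=lambda u: rates[u], reverse=True)

# Example: bob clicked 2 of 2 tests, alice 1 of 2, carol 0 of 2.
events = [("alice", True), ("alice", False), ("bob", True),
          ("bob", True), ("carol", False), ("carol", False)]
print(biggest_offenders(events))  # → ['bob', 'alice']
```

In practice the same per-user rates, tracked across campaigns, also show whether follow-up training is working: the rate for a given group should fall over time.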
Shelley suggested thinking about awareness training, “as conditioning, in which an individual’s susceptibility to attack will increase over time without frequent training to keep them sharp.”
But, it is also important to be realistic about what can be accomplished.
“Training can absolutely reduce the chance and percentage of those who fall victim,” Spitzner said. “Most organizations can reduce failure rate to less than 5 percent. Can they make it 0 percent? Absolutely not. Can any control reduce risk to 0 percent? Absolutely not.”