Steve Nichols has seen just how far training can go in safeguarding government from cyberattacks — and he’s seen the limits of training, too.
“We run internal phishing campaigns against our employees. We have been doing that for over four years and it doesn’t get any better than an 80 percent compliance rate,” said Nichols, chief technology officer of Georgia. “In any given campaign, 20 percent of employees will click the bad link or do the wrong thing. I talk to my colleagues in other agencies, in other states, and no one gets into the single digits.”
Statistics bear him out. While training, awareness and assessments are all routine parts of government cyberdefense strategy, workers still open malicious attachments and click on toxic links. Despite years of aggressive effort, 35 percent of data breaches are still attributed to human error or negligence, according to the Federal Information Systems Security Educators’ Association.
This is not to say that training is futile — not at all. Former Missouri Chief Information Security Officer (CISO) Michael Roling recently took a cybervulnerable local government entity into the state’s cyberawareness program and saw stellar results. “That entity had a 20 percent higher victim rate than our worst state agency at the start,” he said. “Now that they are a part of our program, they are comparable to other state agencies.”
Illinois CISO Chris Hill says he would like 70 to 80 percent of employees to report potential phishing when they see it, and he’d like a fail rate of no more than 5 percent on attack assessments. Training hasn’t gotten him anywhere near those numbers, so while Hill is passionate about training, he’s also realistic about its limits. “Campaigns work. Awareness works. But training alone is not enough,” he said.
Why isn’t even our best training sufficient? And if that’s the case, what’s next for government cyberstrategy? Here we’ll consider the shortcomings of the current training paradigm, explore how awareness efforts could be improved, and look at technological fixes that promise to fill the gap when training alone falls short.
If training doesn’t cut the mustard, some fault may lie with the trainers. Too often, the awareness effort is “very rote, with no worker involvement in its development,” said Charlie Gerhards, executive director of the Government Technology Institute at Harrisburg (Pa.) University of Science and Technology. Often, training is “not very relevant to an employee’s role.”
The problem may also have to do with volume. An employee who gets five emails a day could easily stay compliant, “but we have created a virtual assembly line where workers spend all day clicking on links, reading and responding,” Nichols said. “That’s what they do all day long. Now if a bad thing just slips past the malware filters, it’s asking them a lot to pick out that one thing among all those hundreds of emails.”
Maybe we also need to ask ourselves who is running the cybertraining. Are the trainers skilled in the subtle arts of education and influence?
“Quite often the people in charge of human risk are really, really technical geeks. The depth of their expertise actually makes it hard for them to communicate what they know,” said Lance Spitzner, director of SANS Security Awareness. “Security is easy for them, so they think it must be easy for everyone else, when in fact most people find it confusing, scary, intimidating.”
Roling, meanwhile, says the shortcomings of training may track back to the executive suite, where IT leaders lay the plans for cyberawareness. They may be aiming for the wrong mark. “When I look at how we used to do it, we viewed it as just a compliance requirement — some external entity requires this awareness program,” he said. “If you look at it like that, you will do the bare minimum, you won’t refresh the content. Of course that method doesn’t work well.”
Given all these concerns, it seems reasonable to conclude that if training isn’t getting IT all the way to the goal line, maybe the first thing to do is to rethink training. Could government be doing cyberawareness more effectively? Yes. Here’s how.
Illinois CISO Hill is looking to improve cybertraining by making it more specific. Rather than heighten the general level of awareness, he wants to drill down, developing materials that speak directly to particular jobs.
“Right now, for instance, we are looking at programs that target law enforcement and fire officials, campaigns that speak more on their terms,” he said. “Rather than say, ‘A worker gets an email,’ you say: ‘I am a police officer and I am getting an email.’ You want it to be down to that level of detail.”
Tying cybertraining to specific roles and responsibilities is part and parcel of a larger effort to convince workers that their clicks have real consequences. “We have to explain to them how important this is to them personally, to their specific piece of work,” said Mark Testoni, president and CEO of SAP National Security Services. “It’s not just about somebody getting into our system. How does this affect your job, your pocketbook, your ability to succeed?”
Nichols, meanwhile, is pursuing a three-pronged course of attack to enhance the impact of training, including an imperative to keep the material fresh and new.
This imperative to keep training fresh and new resonated with Roling. He’d been using the training firm Security Mentor to deliver interactive cyberawareness, but recently switched to Habitu8, largely in order to give employees a new take.
“I have nothing bad to say about their program, they did a good job for us, but I want to provide a fresh experience for our end users. I feel it’s important to keep the content fresh and its delivery fresh, or else you get that glassy-eyed look from people,” he said.
The previous vendor offered quizzes and problems to solve, an approach that Roling found effective. He’d roll out monthly iterations and employees would engage readily. The new supplier takes a different approach. “They create short videos using professional actors and actresses that are hilarious and engaging in a very different way. They’re about people having fun while learning at the same time, and we are hoping that will bring our awareness program to a different level,” he said.
Experts in the field put a lot of weight on this notion of entertainment: They encourage government IT leaders to think hard about the form of training, and whether it is geared for maximum impact.
Where past presentations may have been somewhat static, today’s best offerings “leverage a short, ‘bite-sized’ security lesson in the form of cartoons or short funny sketches or parodies. These tend to have much better acceptance from a broad employee base,” said Gerald Beuchelt, CISO at LogMeIn.
A routine part of most government agency cyberefforts these days is the phishing expedition: sending employees fake emails loaded with traps to see who takes the bait. In considering potential improvements to training, it’s worth taking a deeper look specifically at how the phishing expedition ought to be handled.
“If we send 5,000 messages, do we know that all 5,000 were received? Do we have good logs in place? Do we track them all the way through?” Hill said. “Proactive phishing is a good way to see if your training is working, but you have to really push the reporting piece, and then you have to follow through. Once we do get a report, do we have the correct procedures in place to respond to that?”
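Hill’s checklist — delivery, logging, end-to-end tracking — boils down to computing a few rates over a campaign’s records. Here is a minimal sketch in Python; the record fields and function names are illustrative assumptions, not the schema of any real phishing platform.

```python
from dataclasses import dataclass

# Hypothetical log entry for one simulated phishing message.
# Field names are illustrative, not taken from any real product.
@dataclass
class PhishRecord:
    delivered: bool = False   # did the message reach the inbox?
    clicked: bool = False     # did the recipient take the bait?
    reported: bool = False    # did the recipient report it?

def campaign_metrics(records):
    """Track a simulated phishing campaign all the way through:
    delivery rate, click (fail) rate, and report (success) rate."""
    sent = len(records)
    delivered = sum(r.delivered for r in records)
    clicked = sum(r.clicked for r in records)
    reported = sum(r.reported for r in records)
    return {
        "sent": sent,
        "delivery_rate": delivered / sent if sent else 0.0,
        "click_rate": clicked / delivered if delivered else 0.0,
        "report_rate": reported / delivered if delivered else 0.0,
    }
```

Measuring the report rate alongside the click rate matters: as Hill notes, the reporting piece — not just who failed — is what shows whether training is working.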
Many organizations fall short on follow-through. If training is less than fully effective, some say, it may be because the IT team isn’t sufficiently aggressive in how it closes the loop when employees fall for a trap in testing.
Some suggest radical surgery.
Train to Win
Phishing campaigns: An internal “red team” launches a bogus attack, training employees to spot and report suspicious emails.
Desktop/tabletop exercises: Make cyber tangible by putting employees through the paces, showing them how to handle incidents such as a DDoS attack or website defacement.
USB drops: These exercises give employees firsthand experience in handling a mysteriously found USB drive.
At the cyber trade association (ISC)², Director of Cybersecurity Advocacy for North America John McCumber talks about the Passover Principle. “Slaughter the lamb and spread the blood on the doorposts and the lintels,” he said.
McCumber proposes publicly broadcasting the names of those who fail the phishing test, letting everyone know it was Bob or Sally who allowed the faux invaders to breach the walls. For IT chiefs who balk, he proposes the Pontius Pilate corollary: Wash your hands of the deed. “You have to have human resource management step into the breach, so that it is not a security problem, it’s not a CISO problem,” he said. “You want this to be the shared responsibility of organization leadership.”
Many will cringe at the proposed public shaming; others counsel a softer touch.
“If you get it wrong, you should get feedback: Why did you click on this? What could you do differently next time? You use it to get people thinking about things,” said Jason O’Neill, head of global training services at consulting firm Kepner-Tregoe. “One-on-one personal feedback is key. If you want them to really think about this, you can’t just send an email.”
Positive reinforcement could also help. Rather than call out those who miss a cue, “you create an award or a commendation or a scoreboard to track who is the best at reporting these kinds of things,” he said. “You create an environment where people are recognized and rewarded for doing the right thing.”
Better training could raise the cyberbar, as would improved efforts around phishing follow-up. Ultimately, though, people are fallible and all the training in the world won’t fix cyber. Enhanced efforts on the back end — new technologies and cyberpractices — will need to close the gap.
“Most organizations are still struggling with the basics,” Spitzner said. “We need to patch. We need to manage access. This is well-known, there are entire frameworks like the Center for Internet Security’s Top 20 Critical Security Controls. How many devices do we have? What software do we have? Who is using it?”
For resource-strapped government tech leaders, automating where possible can help to move the needle. Even as we wait for machine learning and artificial intelligence to save the world, there’s a lot IT can do to build up security routines. “It might just be a warning, a little trigger that says: ‘Hey, do you really want to click on that?’ A reminder like that could definitely be helpful, once people have had the training,” O’Neill said.
In Illinois, Hill has pursued a couple of structural changes that he said will help to offset the human propensity to goof. All email that comes from outside of government gets flagged “EXTERNAL” in the subject line, as a quick and easy way to put recipients on their guard. The system also strips embedded links from emails: You can still copy and paste a link, but it’s harder to get to, and there’s a built-in moment for reflection. “That’s a simple trick, but it can dramatically reduce clicking,” he said.
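Both of Hill’s measures are simple to sketch. The fragment below is illustrative only — the internal domain list and the link-defanging convention are assumptions, not details of the Illinois system.

```python
import re

# Placeholder: a real deployment would configure its own domain list.
INTERNAL_DOMAINS = {"illinois.gov"}

def tag_external(subject, sender):
    """Prefix the subject with [EXTERNAL] when the sender's domain
    is not on the internal list, putting recipients on their guard."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in INTERNAL_DOMAINS and not subject.startswith("[EXTERNAL]"):
        return "[EXTERNAL] " + subject
    return subject

def neutralize_links(body):
    """Defang URLs so they are no longer clickable. The reader can
    still copy, paste and repair a link deliberately -- a built-in
    moment for reflection."""
    return re.sub(r"https?://(\S+)", r"hxxp://\1", body)
```

The design point is friction, not prohibition: a legitimate link survives the round trip, but the extra step interrupts the reflexive click Nichols describes.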
In Georgia, Nichols is looking at emerging techniques like two-factor authentication to better safeguard systems, but he said it can be a struggle to balance security against usability. He said AI could help in the long term, potentially building up user profiles that can be leveraged to automatically flag suspicious activity.
“We can know every single thing that your computer is going to do, that your phone is going to do. We can watch where you go and what you follow,” he said. “What is the typical day in the life of this individual? What do they open? Where do they visit? There are all sorts of algorithms to leverage against that.”
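The profiling Nichols describes can be sketched, in a deliberately simplified form, as a per-user frequency baseline with a threshold test. Real systems would use far richer behavioral models; every name and threshold here is an illustrative assumption.

```python
from collections import Counter

def build_profile(history):
    """Baseline for 'a typical day in the life of this individual':
    the relative frequency of each site or app in their history."""
    counts = Counter(history)
    total = len(history)
    return {event: n / total for event, n in counts.items()}

def is_suspicious(profile, event, threshold=0.01):
    """Flag activity rarely or never seen in the user's baseline."""
    return profile.get(event, 0.0) < threshold
```

A production system would add time-of-day patterns, decay old observations, and feed the flag into a review queue rather than block outright, but the core idea — compare today’s behavior to the learned baseline — is the same.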
Such efforts could help reduce the impact of human error. Until this comes to fruition, though, government IT leaders say they will continue to look to workers as the foot soldiers in the cyberwars, and they’ll continue to lean on training and awareness as the most potent weapons in the fight.
“Until computers can think like a human and identify human-based threats, technology will not be able to put a complete stop to these attacks,” Roling said. “Humans are still the No. 1 attack vector, the No. 1 target, and they have to be the first line of defense.”