Cybersecurity: Don’t blame employees—make them feel like part of the solution
Scientists find that blaming employees is counterproductive and suggest creating a safe environment for people to admit their mistakes and learn from them. One company already puts that into practice.
Human error is not going away anytime soon, so we need to get past the blame game and figure out how to stop cyber bad guys. Thankfully, several behavioral scientists are working hard to accomplish this, including Amy C. Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School.
SEE: Security incident response policy (TechRepublic Premium)
Edmondson, who studies leadership, teaming and organizational learning, said in the article Psychological Safety and Information Security by Tom Geraghty that she believes a lack of psychological safety results in a “blame culture.” Edmondson coined the term psychological safety and defines it as: “Where blame is not apportioned, but instead every mistake is treated as a learning opportunity, mistakes ultimately improve performance by providing opportunities to find the systemic causes of failure and implement measures for improvement.”
Edmondson explained how she came to that conclusion in her paper Psychological Safety and Learning Behavior in Work Teams. During her research, Edmondson noticed that even though some teams made more mistakes, they ended up in a better place. She concluded the reason was that those teams were willing to own their mistakes, whereas the other teams were not.
That insight led Edmondson to understand the importance of making the workplace a safe haven, psychologically and physically, where employees are not belittled or punished for admitting mistakes.
SEE: How to manage passwords: Best practices and security tips (free PDF) (TechRepublic)
“Organizational culture and psychological safety are critical not only to prevent information security breaches but to ensure we deal with failure in such a way that we can learn from it,” Geraghty explained. “If we want to learn from mistakes, and if we want people to voice their mistakes or concerns, we must facilitate a psychologically safe culture in which people feel empowered and safe to raise concerns, questions and mistakes.”
Geraghty suggested the following core behaviors should be nurtured:
- Treat everything as a learning opportunity: Every incident, every task, should generate learning
- Admit your fallibility: If you admit when you don’t know something or make a mistake, you make it easier for others to do the same
- Model curiosity and ask questions: By asking questions of others, you create the space for people to speak up
One company is using this concept in the real world
It’s not often that we encounter concepts presented in an academic paper in the real world. Mimecast, a company providing cloud cybersecurity services for email, data and web, appears to have incorporated Edmondson’s concept of psychological safety into its message to customers—in particular, how security awareness can reduce human error and the need to blame anyone.
Before we get to the steps Mimecast employs, it is worth taking a few minutes to watch the short, and genuinely funny, video the company created about human error.
In a press release, Mimecast offered suggestions for organizations looking to get their employees invested in security awareness.
Focus on specific areas of risk: Training should never be generic but tailored to the needs of each customer with a focus on particular concerns related to the customer’s industry. “Some industries’ top cybersecurity concern may be legal or HIPAA compliance—for another company, they may have dealt with targeted malware attacks. Keep training connected to the most relevant concerns employees are dealing with.”
Give real-world examples: Experts at Mimecast found that many users think security-awareness training isn’t relevant to them because they don’t work in IT or handle legally sensitive materials. “Giving practical, relatable examples of how common cyberattacks, such as phishing scams, can impact people at any level of an organization will help keep employees aware that their role does make a difference. It may help to give examples of situations where a cyberattack can have an impact on a personal level, such as defrauding an employee out of money directly.”
Keep it short and simple: Humor and condescension are out—a nod to psychological safety. Employees likely understand the importance of cybersecurity; make sure that respect for what they do is also understood. “Give employees the most important information on each subject in an easily digestible format that feels relevant to their work. By keeping the information as short and simple as possible, it will be much easier for employees to give it their full attention.”
Be transparent: This step is also about psychological safety. Employees will push back if there is any indication of Big Brother tracking or snooping. “It can help to communicate openly about the purpose of any new security software or tools, and to explain that software is being used to keep company and client information secure and not to monitor productivity.”
Let employees test out of training: Once more, psychological safety is in play. “The best security-awareness training tools will give employees the ability to test out of (or into) training.” Tracking which employees fail and which respond appropriately makes it apparent who needs more training and who can skip material they already know.
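To make the test-out idea concrete, here is a minimal sketch, in Python, of how assessment results might be sorted into "tests out" versus "still needs the module." The data shape, the AssessmentResult name and the 80% passing threshold are illustrative assumptions for this example, not Mimecast's actual product logic.

```python
# Minimal sketch: decide per employee which training modules can be skipped.
# The passing threshold and data fields are assumptions for illustration only.

from dataclasses import dataclass

PASS_THRESHOLD = 0.8  # assumed score needed to "test out" of a module


@dataclass
class AssessmentResult:
    employee: str
    topic: str    # e.g., "phishing" or "password hygiene"
    score: float  # fraction of correct responses, 0.0 to 1.0


def assign_training(results: list[AssessmentResult]) -> dict[str, list[str]]:
    """Return, for each employee, the topics where they did not test out."""
    needs_training: dict[str, list[str]] = {}
    for result in results:
        if result.score < PASS_THRESHOLD:
            needs_training.setdefault(result.employee, []).append(result.topic)
    return needs_training


if __name__ == "__main__":
    sample = [
        AssessmentResult("alice", "phishing", 0.95),          # tests out
        AssessmentResult("alice", "password hygiene", 0.60),  # assigned training
        AssessmentResult("bob", "phishing", 0.70),            # assigned training
    ]
    for employee, topics in assign_training(sample).items():
        print(f"{employee} needs refresher modules on: {', '.join(topics)}")
```

The point of the sketch is the policy, not the code: employees who demonstrate knowledge skip the module, and only the gaps that the assessment actually surfaces get assigned, so no one sits through material they have already mastered.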
If you are questioning the importance of psychological safety, look no further than Google. The company's two-year Project Aristotle study of its own teams came to the same conclusion as Edmondson, ranking psychological safety as even more critical than dependability.