Edward Snowden may have the reputation of being the most infamous insider threat in recent history, but he’s not the only one who used his job and company resources to commit a crime. There’s also Lennon Ray Brown, Ricky Joe Mitchell, Shanshan Du and her husband, Yu Qin, and countless others.
Brown worked at Citibank Regents Campus in Irving, Texas, and was responsible for the bank’s IT systems. According to documents filed in his case, after having a discussion with his supervisor one day about his work performance, Brown erased the configuration files for nine routers in Citibank’s global network operations center, taking down connectivity to approximately 90 percent of all Citibank networks in North America.
At Brown’s sentencing hearing, the government read a text that Brown sent to a coworker shortly after he shut down Citibank’s system that read, “They was [sic] firing me. I just beat them to it. Nothing personal, the upper management need to see what they [sic] guys on the floor is capable of doing when they keep getting mistreated. I took one for the team. Sorry if I made my peers look bad, but sometimes it take something like what I did to wake the upper management up.”
After learning he would be fired, Mitchell, of Charleston, West Virginia, sabotaged the servers of his employer, EnerVest, badly enough to disrupt its business operations for a month.
And Shanshan Du and Yu Qin were convicted by a federal jury in Detroit for conspiring to steal hybrid technology trade secrets from GM with the intent to use them in a joint venture with an automotive competitor in China.
Insiders can do serious harm to an enterprise, including suspension of operations, loss of intellectual property, reputational harm, plummeting investor and customer confidence, and leaks of sensitive information to third parties, including the media. According to various estimates, at least 80 million insider attacks occur in the United States each year, and the number may be much higher because they often go unreported. A 2016 Ponemon study found that organizations in its survey spent an average of $4.3 million on insider-related incidents over the preceding 12 months. The costs tended to vary by organization size: large organizations with more than 75,000 employees spent more than $7 million annually, while smaller organizations with between 1,000 and 5,000 employees spent around $2 million.
The costs encompass monitoring and surveillance, investigation, response, containment, incident analysis and remediation.
Dr. Michael Gelles is a forensic psychologist with more than 20 years of military and federal law enforcement experience and an expert on insider threats. During his tenure at the Naval Criminal Investigative Service, Gelles served as a participant in “Project Slammer,” a U.S. government study of convicted American spies. He has supported the Office of the National Counterintelligence Executive in a number of espionage and other insider investigations, directly participating in more than 10 debriefs of convicted spies. He is currently a Managing Director at Deloitte Consulting LLP, where he focuses on helping clients mitigate insider threats with solutions that integrate behavior and technology to holistically and proactively prevent, detect and respond to anomalous activities by individuals who have access to an organization’s information, people, material and facilities within the global and virtual world of business, in both the private and public sectors.
“Insider threat is a people problem. It’s not a technology problem.”
“Insider threat behavior itself hasn’t really changed over centuries,” Gelles explains. “But what has changed is the context, as we’ve moved from a world of bricks and mortar to a world of bits and bytes, where enormous amounts of information can be instantaneously downloaded, transferred and exfiltrated. So the risk is much greater today than it was when we were in a paper and pencil world.”
While federal government agencies are required by law to have an insider threat program, Gelles finds that many Fortune 50, and even Fortune 1000, companies have initiated the development of insider threat programs without any government mandate, driven by concern for their brand and reputation.
“Organizations still really trust their employees, and they don’t believe that employees may do any damage to the organization,” he says. “But at the same time, we’re seeing a significant increase in insider threat activity, and thus increased development of insider threat programs across industries.”
Defining Insider Threat
Most definitions of insider threat focus on an employee stealing or destroying an enterprise’s data and computer systems. Gelles dispels that notion, adding that the threat could be any individual – employee, contractor or vendor – or a company with access to an organization’s systems who could potentially do harm to information, material, facilities and people. “It’s more than data,” he says. “There’s also sabotage. And fraud. And even workplace violence is an insider threat problem.”
In fact, Gelles says that studying workplace violence can provide a security executive with insight into insider threat, as both have in common the fact that they are not impulsive acts. “With both, an individual moves along a continuum from idea to action,” he says. “And there’s a discernible pattern of behavior. Today, because we’re able to capture that behavior in data, we can capture what a person does in the virtual space and in the non-virtual space. What websites are they looking at, what are they downloading? All of that is data, but it’s also reflective of behavior. And it might reflect an individual who’s starting to download lots of information or trying to access specific places on the system that aren’t relevant to their job, showing undue interest in areas outside what they should be doing. On the non-virtual side, we can capture whether someone’s performance is declining. Have they been compliant with certain company policies? Have they been compliant with expenses? Have they been compliant with training? What does their physical access – badge in, badge out – look like? And we’re able to then correlate using analytic tools. So we can proactively see that an individual’s baseline behavior is starting to change, based on a number of behaviors that are being monitored and correlated and that carry a certain risk rating. And that provides an alert.”
Gelles adds, “And so the question now is, does an individual’s baseline behavior change, or does his or her behavior show differences to their peer groups? One, it may be because they’re actually assigned and detailed to a different part of the company and doing a different job. Or maybe they’re thinking about leaving and they’re starting to collect information or take IP and R&D for the next job. Or, they’re trying to learn about certain aspects of the business that could include sabotage. Or, they’re engaging in insider trading.”
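To make that baselining idea concrete, the sketch below shows one way such a correlation might be scored in Python with pandas. It is a minimal illustration, not a description of any tool Gelles or Deloitte uses; the column names (employee_id, date, downloads, peer_group), the 30-day window and the z-score thresholds are all assumptions made up for the example.

```python
import pandas as pd

def deviation_scores(df: pd.DataFrame) -> pd.DataFrame:
    """Score daily download counts against an employee's own baseline and peers.

    Expects columns: employee_id, date, downloads, peer_group (all assumed).
    """
    df = df.sort_values(["employee_id", "date"]).copy()

    # Individual baseline: trailing 30-day mean and standard deviation.
    own = df.groupby("employee_id")["downloads"]
    df["own_mean"] = own.transform(lambda s: s.rolling(30, min_periods=5).mean())
    df["own_std"] = own.transform(lambda s: s.rolling(30, min_periods=5).std())
    df["own_z"] = (df["downloads"] - df["own_mean"]) / df["own_std"]

    # Peer comparison: how far from others in the same peer group on the same day.
    peers = df.groupby(["peer_group", "date"])["downloads"]
    df["peer_z"] = (df["downloads"] - peers.transform("mean")) / peers.transform("std")

    # Alert only when both views agree the behavior is unusual; an analyst
    # reviews the alert, the tool does not decide on its own.
    df["alert"] = (df["own_z"] > 3) & (df["peer_z"] > 3)
    return df
```

An alert fires only when activity departs sharply from both the individual’s own history and the peer group, mirroring the point that a single signal in isolation means little and that an analyst, not the tool, makes the final judgment.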
To complicate the issue, not all insider threat actors are malicious. Complacent insiders range from an employee who simply clicks on a phishing scam to someone who practices very poor cyber hygiene. “That individual is simply engaging in behavior in the execution of their job that violates policy and security controls, and is thus opening all sorts of potential windows to external attacks. Insider threat is a people problem. It’s not a technology problem; technology is part of the solution but only half the equation, so we must maintain a holistic view of what a person does,” Gelles says.
The Insider Threat Program
An insider threat program should start with a risk analysis that asks what needs protection, what a CSO is willing to do, and what he or she won’t do in terms of policies, procedures and technology. Also, which groups in the organization have access to what is deemed most vulnerable?
“Some CSOs will tell a sales team that they can’t use their laptops to access personal email, or use a cloud application, or browse a website outside the system,” Gelles says. “And that’s up to them. The point is that it has to be balanced. Too much security is going to impact business growth and revenue. And too little security puts business at risk. And your insider threat program should follow that. What are you willing to do, or not, with regards to what you need to protect?”
The second step is creating defined policies that are clear and easily assimilated by your workforce. And then train, train and re-train your managers and staff.
And as you build your insider threat program, Gelles advocates a holistic approach: one that includes alignment and support from all key executive stakeholders, policies and defined business processes that establish baseline job behaviors for personnel, linkage to the enterprise’s cybersecurity strategy, and technology.
The stakeholder aspect should be a collaborative effort across the C-suite and key executive stakeholders. Those stakeholders, Gelles says, often include executives from risk, IT, HR, legal, ethics and physical security who can impact change in policy, processes, employee lifecycle events and physical and IT security controls. “These key stakeholder executives are owners in protecting, preserving and enhancing an organization’s reputation,” he notes.
With regard to technology, Gelles advocates analytics tools to analyze items such as expense compliance, downloads, access control logs, time and attendance and more. And that’s where the key stakeholders come in, he says. “You want to be partners with HR because they will have access to some of the data that you need. Creating an insider threat program is a very key opportunity for security and HR to build bridges with each other.”
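As a rough illustration of what that stakeholder data-sharing might look like in practice, the sketch below joins hypothetical badge, data-loss-prevention and HR feeds into a single per-employee view and counts how many weak signals co-occur. Every file name, column and threshold here is an assumption invented for the example, not a reference to any real product or methodology.

```python
import pandas as pd

# Hypothetical feeds owned by different stakeholders; file names and
# columns are invented for illustration.
badge = pd.read_csv("badge_logs.csv")    # employee_id, date, after_hours_entries
dlp = pd.read_csv("dlp_events.csv")      # employee_id, date, files_downloaded
hr = pd.read_csv("hr_snapshot.csv")      # employee_id, performance_rating, expense_flags

# One row per employee per day, with HR context attached.
daily = badge.merge(dlp, on=["employee_id", "date"], how="outer").fillna(0)
view = daily.merge(hr, on="employee_id", how="left")

# A transparent, rule-based indicator: several weak signals occurring together
# carry more weight than any single one on its own.
view["risk_points"] = (
    (view["after_hours_entries"] > 3).astype(int)
    + (view["files_downloaded"] > 100).astype(int)
    + (view["performance_rating"] <= 2).astype(int)
    + (view["expense_flags"] > 0).astype(int)
)
review_queue = view[view["risk_points"] >= 3]  # handed to an analyst, not auto-actioned
```

The simplicity is deliberate: a rule that HR, legal and security stakeholders can all read and sign off on is easier to govern than an opaque score.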
Email monitoring software can also be employed to proactively identify anomalous behavior and communications within the workforce. While privacy advocates may disagree, Gelles stresses that most companies monitoring their workforce do so in an anonymized fashion, meaning there are no names associated with the alerts that come up; only an alphanumeric identifier is generated. So how you position and communicate the monitoring is key. Be transparent about it, Gelles suggests. “By the way, lawyers are very comfortable with what’s monitored internally. If it’s at work and it’s in the confines of the systems of the organization, there’s no real issue about monitoring. But be transparent about it.”
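The anonymized alerting Gelles describes can be as simple as replacing the employee identifier with a keyed, non-reversible alias before an alert ever reaches an analyst. The sketch below shows one possible way to do that in Python; the key name, its custodian and the alias length are assumptions for illustration, not a statement of how any particular product works.

```python
import hmac
import hashlib

# Hypothetical re-identification key, held by HR or legal rather than the
# analysts who see the alerts.
ALIAS_KEY = b"held-by-hr-or-legal"

def alias(employee_id: str) -> str:
    """Return a stable, non-reversible alphanumeric alias for an employee ID."""
    digest = hmac.new(ALIAS_KEY, employee_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:12].upper()

# An alert record carries only the alias; re-identification means recomputing
# aliases against the employee roster under a controlled, documented process.
alert = {"subject": alias("jdoe@example.com"), "signal": "bulk_download", "score": 0.92}
print(alert)
```

Because the alias is deterministic, the same individual’s alerts can still be correlated over time without anyone seeing a name until an investigation is formally opened.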
“People generally don’t get a job to become an insider,” Gelles adds. “It’s a result of something happening, such as a crisis in their personal life. But the idea is that you’re actively protecting your enterprise.”
Once an employee is showing signs of “going rogue,” output from advanced analytic tools can be collected and shared with an analyst to determine what’s driving his or her behavior. “We see an employee downloading a lot of information, and we also notice that their performance is poor, and they’re not coming into work that often. Those irregularities need to be investigated.”
“This is a people-centric problem that really starts with behavior or a problem at work,” Gelles adds. “And unlike 15 years ago, we can capture data and help understand a problem sooner than we could before. We can proactively identify it, where in the past we’ve always reacted to it.”
Three Faces of Insider Threats
Negligent Employees
Employees who may accidentally delete or modify critical information or unwittingly share sensitive information. Unintended disclosure comes in the form of posting information on public-facing websites or social media, sending information to the wrong party, or posting proprietary data to unapproved cloud providers and applications.
Exploited Employees
These insider threats are the least frequent, but they have the potential to cause the most damage due to their insider access. Employees are exploited when an external adversary finds their way into the network with compromised user credentials. User credentials can be stolen in many ways, including phishing, malware and web-based attacks.
Malicious Insiders
These are employees who willfully and deliberately steal critical company information, typically to sell it or otherwise profit from it. Cases also include sabotage of facilities, equipment and IT systems. These cases are the most challenging to identify and can cause some of the greatest harm to an organization.