Working for a company that makes behavioral analytics software for insider threat detection, along with user activity monitoring tools for response and investigation, I naturally read a lot about specific insider attacks and about insider risk and threats in general. Rarely does anything surprise me anymore.
Today I was surprised. A co-worker forwarded me a link to an article based on some recent survey data: 35% of employees surveyed said they would sell corporate data. Thirty-five percent! A full 25% would sell information on “company patents, financial records and customer credit card details” for US$8,000. That’s shocking regardless of your industry.
The fact that 35% said they would sell sensitive company data does not mean they are actively out auctioning off data. It is, though, a pretty shocking and clear indication of the insider risk that resides, well, everywhere.
I agree fully with the article’s point about making sure that employees have access to the information and systems they need to do their jobs, and no more. I think, however, that trusting access controls alone does not fully address the underlying threat.
The insider threat involves people with authorized access using that access inappropriately – whether accidentally or maliciously. We can’t lock things down so tightly that productivity suffers. There needs to be a level – an appropriate level – of trust within an organization for it to function. I am reminded again of the Russian proverb translated and made famous during the Reagan–Gorbachev era: trust, but verify.
Insider risk is very real. If someone approached you and said, “we are going to hire 100 new people, 35 of whom have put an $8,000 price on our data – but we can’t tell which 35,” what would you do? Would you hire anyone? The robots haven’t taken over yet. We still need people.
Insider threats are very real. No, 35 of your new hires are not going to sell your data. But some smaller number, given the right conditions, might. Financial hardship or perceived injustice at work, for example, may be the catalyst to change “could” to “will.” When that change in calculus occurs, do you have the systems and processes in place to detect it?
Behavior changes when an insider attacks, and there are technical indicators when it does. Deviations from a user’s established pattern of activity – particularly activity with data exfiltration potential – are detectable and actionable, if you have the systems in place to baseline behavior and flag anomalies.
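To make the baseline-and-flag idea concrete, here is a minimal sketch. The metric (daily megabytes transferred per user), the `flag_anomalies` name, and the z-score threshold are illustrative assumptions on my part – real analytics products use far richer features and models.

```python
# A minimal sketch of baseline-and-flag anomaly detection, assuming a
# hypothetical per-user metric (daily MB transferred). Illustration only.
from statistics import mean, stdev

def flag_anomalies(baseline, recent, threshold=3.0):
    """Flag indices in `recent` that deviate from the user's baseline
    by more than `threshold` standard deviations (a simple z-score)."""
    m = mean(baseline)
    sd = stdev(baseline) or 1e-9  # guard against a perfectly flat baseline
    return [i for i, value in enumerate(recent)
            if abs(value - m) / sd > threshold]

# A user who normally moves ~50 MB/day suddenly moves 5 GB:
history = [48, 52, 50, 47, 51, 49, 53]
print(flag_anomalies(history, [49, 5000]))  # → [1]
```

Note the design choice: the baseline is computed from a prior window, not from the window being scored – otherwise a large spike inflates the standard deviation and can mask itself.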
There are also psycholinguistic indicators that the calculus is shifting. As an example, insiders engaged in threat activity tend to be more invested in their own success than in that of the team or organization, and this is reflected in increased use of first-person singular pronouns and decreased use of first-person plural pronouns. These changes, detectable from communications data, can present earlier than the technical indicators, providing advance warning that risk may be transitioning to threat.
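The pronoun signal above can be illustrated with a toy counter. The word lists and the `pronoun_counts` function are my own simplification – production psycholinguistic models are far more sophisticated than word counting.

```python
# A hedged illustration of the pronoun-shift idea: count first-person
# singular vs. plural pronouns in a message. Toy example only.
import re

FIRST_SINGULAR = {"i", "me", "my", "mine", "myself"}
FIRST_PLURAL = {"we", "us", "our", "ours", "ourselves"}

def pronoun_counts(text):
    """Return (singular, plural) first-person pronoun counts."""
    words = re.findall(r"[a-z']+", text.lower())
    return (sum(w in FIRST_SINGULAR for w in words),
            sum(w in FIRST_PLURAL for w in words))

print(pronoun_counts("We finished our sprint and we shipped it"))            # → (0, 3)
print(pronoun_counts("I did the work and I deserve credit for my results"))  # → (3, 0)
```

Comparing a person’s current singular-to-plural ratio against their own historical ratio – rather than against other people – is what turns a raw count into the kind of shift the text describes.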
Detecting shifts in behavior and communications patterns does not impact productivity, but it does enable the type of verification required to make sure you don’t see your company’s IP on eBay any time soon.
Insider Risk – How Prepared Are You?
Not every company is equally prepared to deal with insider risk. This report outlines the four stages of insider risk maturity and explores how to improve your insider risk preparedness.