User Behavior Analytics

Can AI Predict Workplace Violence?

By Dr. Christine Izuakor

In June 2020, a knife attack at a kindergarten in China injured 39 people, many of them children. The perpetrator was a security guard at the school. It was an insider attack, a horrific act of the kind that happens far too often around the world. While most of the cybersecurity industry is focused on securing data, the growing convergence of digital and physical security remains largely unaddressed.

According to FBI studies into active shooter incidents, 884 people were killed and 1,546 injured in the U.S. between 2000 and 2018. Last year, 12 out of 58 incidents occurred in commercial spaces. Within these workplaces, 51 employees were killed and 48 were injured. Violent insider threats in the workplace affect the individual behind the act as well as the wider staff. Additional figures on violent and harmful behavior make for concerning reading. For example, in 2018, there were 1.4 million suicide attempts in the U.S.

Human beings clearly do amazing things, but they can also have a darker side, marked by self-harm, harm inflicted on others, and more. However, there is a way to spot the patterns of behavior that harm employees and the workplace. The solution lies at the intersection of technology and human behavior, in the form of User and Entity Behavior Analytics (UEBA).

What is User and Entity Behavior Analytics (UEBA)?

Human beings are creatures of habit. They tend to act in particular ways, with universal drivers like hunger, procreation, and socialization shaping our decision-making and giving rise to behavioral norms. However, human beings are not automatons. Teasing out patterns that are repeatable and recognizable is no easy feat. Further, being able to predict possible future events from these patterns seems like something from a sci-fi movie. Yet as technology progresses, the ability to do just that has become a reality.

The result is a crop of tools designed to recognize patterns of behavior as humans, devices, and networks interact. These tools set a baseline of behavioral patterns as a reference and use it to spot anomalies and changes in behavior that can point to a threat. For example, in the cybersecurity industry, UEBA is used to locate unusual behavior that could indicate an insider threat. Once a potential threat is detected, the UEBA platform creates an alert that can be acted upon appropriately.
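
As a rough illustration of the baseline-and-deviation idea, the short Python sketch below builds a per-user baseline from hypothetical daily file-access counts and flags any day that strays far from it. It is a minimal, assumed example, not a description of how Veriato Cerebral or any other UEBA product is implemented.

    import numpy as np

    # Hypothetical history: files accessed per day by one user over 30 working days.
    history = np.array([22, 25, 19, 30, 27, 24, 21, 26, 23, 28,
                        20, 25, 29, 22, 24, 26, 23, 27, 25, 21,
                        24, 22, 28, 26, 23, 25, 27, 20, 24, 26])

    baseline_mean = history.mean()
    baseline_std = history.std()

    def is_anomalous(todays_count, threshold=3.0):
        """Flag the day if it deviates from this user's own baseline
        by more than `threshold` standard deviations."""
        z = abs(todays_count - baseline_mean) / baseline_std
        return z > threshold

    print(is_anomalous(24))   # False: within this user's normal range
    print(is_anomalous(180))  # True: e.g., a sudden mass download of files

Real platforms apply the same principle across many signals at once and keep updating the baseline as behavior evolves.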

Its importance is reflected in the global market value of UEBA, which is predicted to be worth 2.5 billion USD by 2024, growing at a CAGR of 44.3%. UEBA is a powerful tool applicable to organizational and national security.

However, where does the smart functionality in UEBA tools come from?

How Does User and Entity Behavior Analytics (UEBA) Work?

First-generation anomaly analysis solutions were effectively hard-coded, relying on static rules-based engines. This led to usability issues: a heavy burden of false-positive alerts resulted in ‘alert fatigue’ among the security analysts interpreting the results. The latest entrants to the space have overcome this issue by using smart technologies. Next-generation UEBA solutions are based on artificial intelligence (AI) and its subset, machine learning (ML).
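
The difference can be seen in a deliberately simplified, hypothetical comparison: a static rule that flags every after-hours login versus a threshold learned from each user’s own login history. The numbers and the three-sigma cut-off below are assumptions for illustration only.

    from statistics import mean, stdev

    def static_rule_alert(login_hour):
        # First-generation style: any login outside 08:00-18:00 raises an alert,
        # even for staff who routinely work evenings -- a source of false positives.
        return login_hour < 8 or login_hour > 18

    def learned_alert(login_hour, user_history):
        # Next-generation style: alert only when the hour is unusual
        # relative to this particular user's own history.
        mu, sigma = mean(user_history), stdev(user_history)
        return abs(login_hour - mu) > 3 * max(sigma, 1)

    night_shift_history = [21, 22, 22, 23, 21, 22, 20, 23, 22, 21]
    print(static_rule_alert(22))                   # True  - a noisy false positive
    print(learned_alert(22, night_shift_history))  # False - normal for this user
    print(learned_alert(9, night_shift_history))   # True  - unusual for this user

The static rule fires every night for a legitimate night-shift worker, while the per-user baseline stays quiet until something genuinely out of character happens.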

UEBA is the technology behind platforms, like Veriato Cerebral, that offer an intelligent way to hunt threats. The data consumed and analyzed by the UEBA solution is based on an employee’s attitude, behavioral patterns, and device interactions.

The use of machine learning in UEBA has taken the technology to new heights. Machine learning and artificial intelligence interrogate the ‘big data’ generated when human beings interact with technology. The more data an ML-based UEBA tool consumes, the more accurate it becomes. The types of data and indicators that UEBA tools can learn from include:

  • Keywords used when searching the internet
  • Use of dark websites
  • Social media activity
  • Changes in login behavior and application use
  • Network entity behavior such as traffic volumes
  • IP address
  • Psycholinguistics, e.g., the tone and language used in emails and online chats

Note that ‘entities’ as well as humans serve as data points to train the machine learning algorithms. Individually, those data points may not say much. Collated and intelligently analyzed as a holistic system, however, they provide the insight needed to spot potential threats. Machine learning-based UEBA solutions overlay indicators with context to accurately pinpoint threats, reducing false positives.
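
To make the holistic-analysis point concrete, the sketch below combines several behavioral indicators for each user-day into one feature vector and lets an off-the-shelf unsupervised model (scikit-learn's IsolationForest) score how unusual the combination is. The indicator columns and values are hypothetical; commercial UEBA platforms use far richer features and models.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Hypothetical per-user-day feature vectors. Columns (all assumed):
    # [after-hours logins, MB uploaded, failed logins, negative-sentiment messages]
    rng = np.random.default_rng(0)
    normal_days = rng.poisson(lam=[1, 50, 2, 3], size=(500, 4))

    # Learn what "normal" looks like across the whole feature vector at once.
    model = IsolationForest(contamination=0.01, random_state=0).fit(normal_days)

    # A day where several indicators spike together, none conclusive on its own.
    suspicious_day = np.array([[9, 4000, 25, 40]])
    print(model.predict(suspicious_day))            # [-1] -> flagged as anomalous
    print(model.decision_function(suspicious_day))  # lower score = more anomalous

No single column in the suspicious vector is damning on its own, but taken together they fall well outside the learned region of normal behavior, which is exactly the kind of contextual signal described above.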

User and entity behavioral data, analyzed with artificial intelligence, allows for more accurate monitoring of users on an endpoint-by-endpoint basis. The result is deep visibility across a map of behavioral events that establishes the baseline of regular behavior, from which unusual actions and events can be more easily spotted. Stray outside that reference line and an alert is generated, which a human analyst can interpret. You can see a deeper dive into UEBA here, “What is User and Entity Behavior Analytics and why does it matter?”.

Using UEBA to Prevent Corporate and Personal Harm

Machine learning-based UEBA tools are powerful and increasingly used in cybersecurity and insider threat detection. However, UEBA has a much broader scope. Behavioral pattern analysis can be applied to personal harms such as suicide, self-harm, and attacks on citizens. It can also be applied to harms against corporations, such as the active shooter incidents mentioned previously. An enterprise UEBA solution used to shine a light on employee behavior can help prevent all manner of unwanted insider threats, from data leaks to active shooter incidents.

A few example use cases best illustrate the range of applications of UEBA outside of the more traditional cybercrime insider threat categories.

Use Case One: Self-harm, Suicide, and UEBA

A study found that 17% of adolescents perform acts of self-harm or non-suicidal self-injury (NSSI). In one very sad case involving the suicide of a 14-year-old British girl, social media platforms came under scrutiny. The girl was a regular Instagram user. When her parents checked the account timeline after their daughter’s death, they found graphic images promoting self-harm, and the platform was criticized for allowing such content. Instagram responded by promising to remove material containing images that incite self-harm or suicide. Removing an image is one thing, but the intelligent analysis of behavioral signals offers a more powerful way to prevent self-harm and suicide.

The Durkheim Project, which focused on veterans, was designed to collect and analyze social media and mobile text data to find potentially damaging behavior. The project research was based on the application of artificial intelligence to human behavior prediction. The results of the project were very promising, with an average performance of 65% accuracy.

The Durkheim Project offered evidence that UEBA, in the context of human behavior, can be a powerful prediction tool.

Use Case Two: Malicious Insider Threats (including active shooter incidents) and UEBA

Returning to active shooter incidents in a workplace context, machine learning-based UEBA offers important potential for reducing the risk of workplace violence. Whereas physical security can help lock doors against outside intruders, insider threats are much more challenging to deal with. Sometimes these threats come from seemingly ordinary work colleagues: folks who offer to get you a coffee or chat with you at the water cooler. One wrong turn and these employees can turn out to be unsuspected violent threats. An automated, self-learning system such as machine learning-powered UEBA offers a reliable mechanism to spot the patterns of behavior that hint at violent tendencies.

Below is a snapshot of the kind of violent incidents happening in workplaces across the world that could theoretically be detected more proactively with the help of UEBA.

  • Murder-Suicide in Adams County
    A former employee shot his supervisor and then killed himself. In interviews, ex-colleagues stated that “… think the whole thing was triggered by him losing his job.”
  • Paris Stabbing
    Mickaël H, an IT department employee, had worked in police intelligence in Paris for 20 years. He carried out a knife attack on colleagues, resulting in the death of four city employees. After the attack, behavioral patterns were noted that pointed to existing conflict with his superiors.
  • Facebook Stalker
    A Facebook security engineer used his employee privileges to access personal information of women whom he then stalked online.
  • Employee Shooting Incident
    James Cameau, an employee of Jacksonville Granite, attempted to shoot co-workers. Fortunately, on this occasion, the gun jammed, but not before Cameau injured one person and shot himself. Cameau was a newly appointed employee, but other employees had noticed a change in his behavior leading up to the shooting incident.
  • Excel Shooting
    A seemingly mild-mannered employee at a lawn care company carried out a shooting at Excel Industries, killing ten people and injuring 14. During interviews with former colleagues, it was noted that “There was some things that triggered this particular individual.”

In all of the above examples, clues to the resulting violent behavior had been noticed by other staff members. Profiling employees gives vital clues to potential insider threats. Disgruntlement, termination, financial problems, drug addiction, and more can turn seemingly good employees into malicious insiders. UEBA provides a way to automate, collate, and analyze behavioral clues with a view to action. Machine learning algorithms designed to learn from behavioral patterns can be applied to these scenarios with great success. A paper by researchers at MIT, “Deep Feature Synthesis: Towards Automating Data Science Endeavors”, demonstrated that machine learning can be more accurate at finding patterns in data than human beings are. In a series of competitions, the ML algorithm designed by the team was more accurate and significantly faster than human teams when put to the test.
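
The core idea behind that paper, mechanically generating candidate features by applying aggregation primitives across related tables, can be hinted at in a few lines. This is a toy sketch with made-up data; the published Deep Feature Synthesis work generalizes the approach across arbitrary relational schemas.

    import pandas as pd

    # Toy event log (assumed data): one row per user action.
    events = pd.DataFrame({
        "user":  ["alice", "alice", "alice", "bob", "bob", "bob"],
        "bytes": [120, 95, 140, 4000, 3800, 4200],
        "hour":  [10, 14, 11, 2, 3, 1],
    })

    # Mechanically stack aggregation primitives over the child table (events)
    # to generate candidate features for the parent entity (users).
    primitives = ["mean", "max", "std", "count"]
    features = events.groupby("user").agg({"bytes": primitives, "hour": primitives})
    features.columns = ["_".join(col) for col in features.columns]
    print(features)

Each generated column, such as bytes_max or hour_std per user, becomes a candidate feature that a downstream model can evaluate for predictive value, without an analyst having to dream up every feature by hand.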

Human behavioral patterns may be challenging to predict, but UEBA offers a way forward. The automated analysis of big data using intelligent algorithms is what sets AI-driven UEBA apart. Big data collected across multiple platforms and devices is the fuel on which the ML algorithm learns and develops accuracy. This data is a mix of variables and, as mentioned, can include social media comments, company email exchanges, instant messages, web searches, and more. Cloud-based UEBA solutions can reach remote workers as well, helping to build a picture of unusual patterns that could point to an insider threat, whether that is theft of data or a more violent incident.

With artificial intelligence, the patterns of behavior that lead to awful acts of violence could be nipped in the bud before disaster strikes.

Human Behavior as an Indicator of Compromise (IoC)

Much work has been done, and continues to be carried out, to find patterns in employees’ behavior that can be linked to insider threats and violent behavior in the workplace. A 2018 reference guide from the FBI on “Pre-Attack Behaviors of Active Shooters” delivers some critical data that can help predict violent behavior in the workplace (and beyond). The FBI research was based on 63 cases over 13 years. Researchers found that 35% of active shooters over 18 years of age targeted their workplace. The report identified “observable behaviors” that were strong indicators of a propensity to violence; each attacker experienced around 3.6 of these “stressors” on average, and 35% of attackers were noted to have specifically job-related stressors.

Human behavior is an Indicator of Compromise for both cybersecurity breaches and violent attack scenarios at work. As such, intelligent behavioral analysis needs to become part of our threat detection dynamics. Threat detection using machine learning-driven User and Entity Behavior Analytics (UEBA) has this capability.

Conclusion

Shooter incidents and other harmful events are deeply shocking, and it is instinctive to want to prevent them from happening. Spotting a pattern of behavior that could become a serious incident isn’t just about extremely violent acts like stabbings or shootings. Other, less severe, employee harms can also be identified using behavioral threat detection. Within the work environment, we now have smart tools in the form of machine learning-based UEBA to help prevent workplace-related violence.

Insider Risk – How Prepared Are You?

Not every company is equally prepared to deal with insider risk. This report outlines the four stages of insider risk maturity and explores how to improve your insider risk preparedness.

About the author

Dr. Christine Izuakor
Dr. Izuakor is the Senior Manager of Global Security Strategy and Awareness at United Airlines where she plays a critical part in embedding cyber security in United’s culture. She is an adjunct professor of cyber security at Robert Morris University, and independently helps corporations solve a diverse range of strategic cybersecurity challenges.
