Oh the Humanity! Top Three Root Causes of Compliance Violations
Drifting Out of Compliance, Part 4
This is the fourth and final installment in my “Drifting Out of Compliance” series, taking a closer look at organizational approaches indicative of a point-in-time compliance mentality and the challenges of shifting to a continuous compliance mentality. Although a security-first, compliance-second approach is best, many organizations still struggle to attain the baseline level of security outlined in compliance requirements.
So, are you sick of compliance already? You’re not alone. We’ve hated the concept of “complying” since the time we were kids. If our moms told us “No,” we did it anyway. Same when we were teens. I saw a great quote the other day:
If at first you don’t succeed, do what your mother told you to begin with. (unknown)
Now fast forward to adulthood. While driving, how many of us roll through stop signs? Or who sticks to the speed limit?
It’s useless to lecture a human. (Rick Riordan, The Lightning Thief)
And in the business world, it just doesn’t sit well with us when an outside entity tells our business what to do. “Okay, so we’ll do what we have to do, but then we’re going to get back to the business of running our business.” As one retailer stated after a data breach, “We sell hammers.” Sure, this is their core business model but shouldn’t they still take ownership of securing card data? Part of the process of selling hammers is accepting payments ... securely.
In an attempt to gain a better understanding of the root causes of compliance violations, I recently discovered that with NERC CIP violations, we humans are most often to blame, not technology. No surprise there. Although the list below outlines three separate and distinct root causes, humans are at the core of all three:
- Human neglect
- Lack of processes (ultimately comes down to human neglect)
- Lack of documentation (ultimately comes down to human neglect)
Humanity in healthcare
According to Verizon’s 2015 Data Breach Investigations Report (DBIR), healthcare data security incidents are notoriously “human,” as evidenced by the following top three “incident patterns” shared in the report:
- Miscellaneous errors – Examples within this category include sending sensitive data to the wrong recipients, publishing non-public data to public web servers, and insecure disposal of personal and medical data
- Insider misuse – The top action in this category is privilege abuse: misusing the elevated access an insider has been entrusted with
- Physical theft/loss – This covers the loss or theft of sensitive data, most commonly from an employee’s work area or their vehicle
Humanity in the critical infrastructure industries
The North American Electric Reliability Corporation (NERC) calls out “human error” as an official risk factor in its 2016 ERO Enterprise Compliance Monitoring and Enforcement Plan. This follows from recorded NERC compliance violations in which “human error or human performance failure” was the root cause in many cases. Examples include:
- Ports and services enabled that were not required for normal operation
- Failure to associate a security upgrade with the relevant cyber assets
- Manual assessment of results from less sophisticated scanning software, which introduced a greater propensity for human error
- No action taken in response to vulnerability identification
- Personnel focused on getting the system up and running and failed to disable ports and services not required for normal operations
- Incorrect assessment that the security patch was not applicable
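Several of the violations above come down to ports and services that should have been disabled but weren’t. As a rough illustration (the approved-port baseline and sample input below are assumptions for the example, not taken from any NERC CIP guidance), a short script can compare what a host is actually listening on against an approved baseline:

```python
# Sketch: flag listening ports that are not on an approved baseline.
# The baseline and sample input are illustrative assumptions, not drawn
# from any NERC CIP standard.

APPROVED_PORTS = {22, 443}  # ports required for normal operation (assumed)

def find_unapproved(observed_ports, approved=frozenset(APPROVED_PORTS)):
    """Return the sorted list of observed ports missing from the baseline."""
    return sorted(set(observed_ports) - set(approved))

# Example: observed ports might be parsed from `ss -tln` or a scanner report.
drift = find_unapproved([22, 443, 23, 8080])
print("Unapproved listening ports:", drift)  # [23, 8080]
```

Running a check like this on a schedule, rather than only at audit time, is exactly the kind of habit that moves an organization from point-in-time to continuous compliance.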
Humanity in PCI DSS
The #1 hardest-to-sustain PCI DSS requirement is not using vendor-supplied defaults for system passwords. Oftentimes, this is simply because defaults are easier for those signing in to such systems, or no one thinks about changing them.
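To make that requirement concrete, here is a minimal, hypothetical check; the defaults list and account records are invented for illustration, and a real check would compare against password hashes rather than plaintext:

```python
# Sketch: flag accounts whose credentials match well-known vendor defaults.
# The defaults list and account records are illustrative assumptions; a real
# check would compare against password hashes, not plaintext.

KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "password"), ("root", "toor")}

def find_default_accounts(accounts, defaults=KNOWN_DEFAULTS):
    """accounts: iterable of (username, password) pairs."""
    return [user for user, pw in accounts if (user, pw) in defaults]

flagged = find_default_accounts([("admin", "admin"), ("alice", "s3cr3t!")])
print("Accounts still on vendor defaults:", flagged)  # ['admin']
```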
According to SecurityMetrics in 2014, unencrypted card data was found in unsuspected places for 61% of businesses researched. Given how difficult it can be to find cardholder data across the enterprise, it makes you wonder how many companies define their card data environment based on where they think the card data should be versus taking the extra time to determine where it actually is.
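Determining where card data actually is usually means pattern matching plus a checksum to cut false positives. The sketch below uses the standard Luhn mod-10 check; the regex and scope are simplified assumptions, since real PAN discovery tools also handle separators, issuer ranges, and many file formats:

```python
import re

def luhn_valid(number: str) -> bool:
    """Standard Luhn mod-10 check used to validate card number candidates."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Simplified pattern: 13-16 contiguous digits (real tools also match
# digits separated by spaces or dashes).
PAN_RE = re.compile(r"\b\d{13,16}\b")

def find_candidate_pans(text: str):
    """Return digit runs that look like PANs and pass the Luhn check."""
    return [m for m in PAN_RE.findall(text) if luhn_valid(m)]

hits = find_candidate_pans("order 1234 paid with 4111111111111111")
print(hits)  # ['4111111111111111'] (a standard test number)
```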
Phishing, a form of social engineering that relies on the email recipient clicking a link or opening an attachment, continues to rank highest among human vulnerabilities. In this case, our curiosity gets the better of us. The Verizon 2015 DBIR references a 2013 statistic:
In the 2013 DBIR, phishing was associated with over 95% of incidents attributed to state-sponsored actors, and for two years running, more than two-thirds of incidents that comprise the Cyber-Espionage pattern have featured phishing.
Other forms of social engineering? At work, how many of us have allowed people to follow us onto a secure floor without knowing who they were, or whether they should be granted access to begin with? In the vast majority of cases there’s no negative impact, yet in a healthcare context similar access could let an intruder rifle through improperly disposed-of medical records or plug directly into the enterprise network. For critical infrastructure, such access opens the door to the sabotage of critical systems.
We’re all in this together
There’s no doubt that ultimately, we want to trust each other, and we choose compassion over compliance every time. We want to trust the quality of our own work. We want to get more stuff done faster. We want to trust the person in the hallway who has forgotten a badge. We want to trust the people who send us emails, and that what they’re sending is worthy of our time and attention. So what can be done about “us”?
- Awareness of risks and consequences
- Automation of manual tasks to reduce chances of error
- Monitoring host activities on the network
- Double-checking the security effectiveness of human-administered devices
- Cross-checking human decisions
Automation can be helpful for double-checking our work
Although there’s no substitute for good old human know-how and due diligence, automation can be helpful for double-checking our work and for helping us see things we couldn’t see otherwise: a second set of eyes, so to speak. Not only can this help with decision reliability, but it can also help unearth hidden risks and introduce considerable efficiencies.
Automated processes, including the use of automated support tools (e.g., vulnerability scanning tools, network scanning devices), can make the process of continuous monitoring more cost-effective, consistent, and efficient. (NIST 800-137)
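In the spirit of the NIST quote, an automated audit check can be as simple as diffing a device’s current settings against a compliance baseline. This is a minimal sketch; the setting names and values are invented for illustration and not drawn from any specific standard or product:

```python
# Sketch: report drift between a device's current settings and a
# compliance baseline. Setting names and values are illustrative
# assumptions, not drawn from any specific standard or product.

BASELINE = {
    "telnet_enabled": False,
    "default_admin_password": False,
    "tls_min_version": "1.2",
}

def audit(current, baseline=BASELINE):
    """Return (setting, expected, actual) for every setting out of spec."""
    return [(key, want, current.get(key))
            for key, want in baseline.items()
            if current.get(key) != want]

findings = audit({"telnet_enabled": True,
                  "default_admin_password": False,
                  "tls_min_version": "1.2"})
print(findings)  # [('telnet_enabled', False, True)]
```

Because a check like this is cheap to run, it can execute on every change or on a schedule, which is what makes continuous monitoring more consistent than periodic manual review.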
Adopting advanced continuous network monitoring technologies, such as Tenable’s SecurityCenter Continuous View™, will go a long way towards reducing the human risk factor. By automating compliance processes and conducting automated “audit checks,” Tenable solutions can close the door on risks such as rogue hosts, default user accounts and passwords, unencrypted sensitive data, vulnerability remediation lapses, and misconfigured security devices to name a few.
Thanks for tuning into this blog series and Happy Holidays everyone!