Security Metrics - Common Mistakes in Vulnerability and Compliance Reporting
I speak with many different types of customers and potential customers, and I am particularly interested in how they want to monitor and report on their network activity. I am frequently asked what types of metrics can be tracked for upper management. Trending charts are very popular, but what goes into them can be deceiving. Let's consider some examples.
A very common requirement in enterprise networks is to patch all systems every 30 days. That is, no system should ever have a major vulnerability for more than 30 days. With this in mind, consider this trend graph of my laptop’s vulnerabilities over time:
Even though I had some vulnerabilities back in October and November, my system was always patched within a few days, well within our 30-day patch policy. To a security analyst, it might be interesting to note that my laptop was known to have vulnerabilities on a certain date, but to a compliance officer, this data can be misleading.
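The distinction is between counting open vulnerabilities on a given day and measuring how long each one stayed open. A minimal sketch of the latter, using hypothetical CVE IDs and dates (not real findings), might look like this:

```python
from datetime import date

# Hypothetical detection/remediation dates for one host's findings.
# The compliance question is time-to-patch, not how many were ever seen.
findings = [
    ("CVE-2023-0001", date(2023, 10, 3), date(2023, 10, 6)),
    ("CVE-2023-0002", date(2023, 11, 14), date(2023, 11, 16)),
    ("CVE-2023-0003", date(2023, 11, 20), date(2023, 12, 27)),
]

POLICY_DAYS = 30

def policy_violations(findings, policy_days=POLICY_DAYS):
    """Return (cve, days_open) for findings that exceeded the policy window."""
    return [(cve, (fixed - seen).days)
            for cve, seen, fixed in findings
            if (fixed - seen).days > policy_days]

print(policy_violations(findings))  # only CVE-2023-0003, open 37 days
```

A trend chart of daily counts would show all three findings; a compliance report against the 30-day policy should only flag the one that stayed open too long.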
In addition, keep in mind that reporting on raw vulnerability counts ends up graphically demonstrating how effective your scanner is, not how effective your IT organization is. Consider this fictitious vulnerability scan of a host that is scanned every day but only patched once a week:
I purposely chose a graph that is "trending downward". A senior manager might look at it and conclude that things were getting better, when in actuality things are great: all of the host's vulnerabilities were patched every 7 days.

If your organizational policy is to patch within 30 days, showing a graph like the one above can be very confusing.
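The sawtooth shape of that graph is easy to reproduce. In this sketch (illustrative numbers, not scan data), new vulnerabilities accumulate daily and a weekly patch window clears them, so the daily counts swing up and down even though nothing ever lives longer than 7 days:

```python
# Simulate a host scanned daily but patched in a weekly batch. The daily
# counts look noisy and even alarming, yet no vulnerability is ever open
# longer than 7 days -- comfortably inside a 30-day patch policy.

def daily_counts(new_per_day, patch_cycle=7):
    counts, open_vulns = [], 0
    for day, new in enumerate(new_per_day, start=1):
        open_vulns += new
        counts.append(open_vulns)
        if day % patch_cycle == 0:
            open_vulns = 0  # weekly patch window fixes everything found so far
    return counts

# Two weeks of made-up disclosure counts:
print(daily_counts([3, 1, 2, 0, 4, 1, 2, 3, 0, 1, 2, 2, 1, 0]))
```

Whether the peaks trend up or down from week to week says more about disclosure volume than about the IT organization's responsiveness.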
Another error in vulnerability reporting is related to scanning frequency. I've spoken with organizations that scan their network once every 30 days, and some only once every 90 days. Over a period that long, a tremendous number of new vulnerabilities can be disclosed. In the graph below, if you sampled on April 8th, you would have a very negative impression of the patching activities, and on April 25th, you might think your scanner was not working because it didn't report anything. Likewise, if you happened to sample when a smaller number of vulnerabilities were reported, you might think things were great all of the time.
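An infrequent scan is effectively a point sample of a fast-moving series. This sketch (purely illustrative numbers) shows how the same 30 days of activity look completely different depending on which day the monthly scan happens to land:

```python
# 30 days of "true" open-vulnerability counts: two short spikes that are
# patched within a few days, zero the rest of the time. (Made-up data.)
true_open = [0] * 7 + [25] * 3 + [0] * 10 + [40] * 2 + [0] * 8

def monthly_sample(series, offset):
    """What a once-per-30-days scan would report, given its scan day."""
    return series[offset]

print(monthly_sample(true_open, 7))   # lands mid-spike -> looks terrible
print(monthly_sample(true_open, 24))  # lands after patching -> looks spotless
```

Neither sample is wrong; both are simply too sparse to say anything about how quickly the spikes were remediated.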
Last, consider this report on the trend of vulnerable web browsers for a major university:
This graph comes from the Passive Vulnerability Scanner (PVS) and shows just how chaotic tracking vulnerabilities can be. This particular graph shows every vulnerable web browser that can be observed by the PVS. Since the PVS works in real time with continuous monitoring, the graph has many trend lines related to vulnerabilities in various web browsers.
When reporting on vulnerabilities within a compliance framework, be sure to provide the information asked for. Otherwise, the information you do provide may be misleading or even alarming.
We've blogged several times in the past about reporting on vulnerabilities. The following links cover several popular topics.