Additionally, single metrics tend to be misleading by virtue of being too abstract. For example, if someone told you that today's risk rating was 79, you might think that is good or bad, depending on how much you know about the score. If yesterday's score was 65, you might conclude that today's score is better. On the other hand, if you learned that the score was out of 250, perhaps you would not be so positive. At the end of the day, the number may be quite meaningless unless you know exactly how it was calculated and what it represents. Worse, because it's a number and not a label (e.g., high, medium or low), it may feel more authoritative, even though its basis is no more credible than a label's.
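
To make that concrete, here is a minimal sketch of why a raw score means little without its scale: the same 79 leads to very different conclusions on a 100-point versus a 250-point scale. The thresholds and labels below are hypothetical, chosen only for illustration.

```python
# A minimal sketch of why a raw score is meaningless without its scale.
# All scores, scales and label thresholds here are hypothetical.

def describe_score(score: float, scale_max: float) -> str:
    """Normalize a raw risk score to its scale and map it to a label."""
    pct = score / scale_max
    if pct >= 0.60:
        label = "high"
    elif pct >= 0.30:
        label = "medium"
    else:
        label = "low"
    return f"{score} out of {scale_max} = {pct:.0%} ({label})"

print(describe_score(79, 100))  # 79 out of 100 = 79% (high)
print(describe_score(79, 250))  # 79 out of 250 = 32% (medium)
```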

Finally, it’s important to consider whether aggregated metrics derived from big data put decision-makers in a better or worse position to make a decision. Over-aggregating data sets into reduced metrics can distort reality, leading to worse decisions than if decision-makers were exposed to the larger data sets. We know how to perform analysis on individual data sets; big data means we must now evolve that approach into a reasonable secondary tier of analysis, one that balances aggregation and reduction against the value of the resulting metric or metrics.
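
As a toy illustration of the point, the sketch below uses invented per-system risk scores: two fleets share exactly the same average, yet one hides a critical outlier that only a secondary tier of analysis would surface.

```python
# Hypothetical data showing how over-aggregation can distort reality:
# two sets of per-system risk scores share the same mean, yet describe
# very different situations.
from statistics import mean

fleet_a = [40, 42, 38, 41, 39]    # uniformly moderate risk
fleet_b = [10, 12, 11, 9, 158]    # mostly low risk, one critical outlier

print(mean(fleet_a), mean(fleet_b))  # both averages are 40

# A secondary tier of analysis preserves the detail the mean erases.
worst = max(fleet_b)
critical = sum(1 for s in fleet_b if s > 100)
print(worst, critical)  # worst score 158; 1 critical system
```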

From Data to Decisions

When all is said and done, the value of big data is in how well it informs the business and leads to better decisions. If this sounds a lot like decision sciences, then you are right. How leaders make decisions is increasingly influenced by the data available and how it can most effectively be used. Being able to present disparate data sets in a meaningful, consumable manner without losing a reasonable degree of detail is a key challenge.

In the short term, one of the key areas of focus is cybersecurity and related operational risk management concerns. The reality today is that IT operations and cybersecurity exert a disproportionate influence on overall operational risk. That is to say, if your IT systems go down or are compromised, the effect goes well beyond a minor operational inconvenience, potentially disrupting many, if not all, business functions. Addressing concerns in these areas today will help stabilize the environment and allow for advances in other key performance areas as well.

There are three considerations for achieving these objectives of better performance within a cost-efficient framework, all of which flow from putting analytical islands into a more complete context:

  • Know Yourself: What does the organization do, and how does it measure mission performance? How is improvement gauged and managed? Understanding the core functions of the business is a vital first step in being positioned to effectively handle big data. It is also important to understand the asset profile (i.e., people, process, technology) of the organization in order to properly factor key business metrics and values into functional roles and processes.

  • Know Your Data: What data sets are available? Are there areas where sufficient data is lacking? Are there common touchpoints in multiple data sets that can be leveraged in correlation and aggregation? Enumerating available (or desired) data sets is a key next step. Doing so provides the bottom-up view that needs to be aligned with the overall top-down view.

  • Connect the Dots: Being careful not to over-aggregate big data into super-metrics that undermine decision quality, the last step is connecting the top-down and bottom-up views through a secondary level of data analytics. The biggest challenge is finding the right balance of aggregation versus detail, communicating enough information without overwhelming the decision-maker or obscuring the vital details needed for making good decisions; a sketch of this step follows the list.
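
As a rough sketch of that final step, the example below (with invented asset names, fields and scoring) correlates a bottom-up data set with a top-down one on a common touchpoint, then rolls the result up into a summary metric while keeping the per-asset detail available for drill-down.

```python
# A hedged sketch of "connecting the dots": join two hypothetical data
# sets on a shared touchpoint (the asset ID), then aggregate without
# discarding the underlying detail. Field names and scoring are invented.

vulns = [  # bottom-up: per-asset vulnerability counts
    {"asset": "web-01", "open_vulns": 12},
    {"asset": "db-01", "open_vulns": 3},
]
business = [  # top-down: business criticality of each asset
    {"asset": "web-01", "criticality": 2},
    {"asset": "db-01", "criticality": 5},
]

# Correlate on the common touchpoint.
crit = {row["asset"]: row["criticality"] for row in business}
detail = {v["asset"]: v["open_vulns"] * crit[v["asset"]] for v in vulns}

# Aggregate for the decision-maker, but keep the detail alongside it.
summary = {"total_weighted_risk": sum(detail.values()), "by_asset": detail}
print(summary)
# {'total_weighted_risk': 39, 'by_asset': {'web-01': 24, 'db-01': 15}}
```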

Quality decisions will flow naturally from having better data and following better decision-making processes. However, it’s important not to over-aggregate data sets, which can obscure the very details needed for reasonably well-informed decisions. The example set by the U.S. State Department’s Information Assurance program demonstrates the value of analytical methods and the success that a continuous monitoring approach can achieve. It also offers an early glimpse of the emerging challenge posed by big data, a challenge that can be met through a multitiered analytical approach that charts the sea of data, connecting analytical islands into a super-set of KPIs and metrics that in turn improve security and performance.

Ben Tomhave is principal consultant for LockPath, which provides governance, risk and compliance software.
