Aligning Data to Value

What are the key metrics that are tracked at the top? What measurements demonstrate successful mission fulfillment? How are these metrics related to the operational picture? These are all key questions that must be considered in the face of big data. At the very least, there is an emerging imperative to have two tiers of analytics: one at the silo level and one at the overall enterprise level. Connecting these two tiers will provide the value-mapping that’s often missing today.

The first question to consider is: How is overall organization performance measured? Put aside operational concerns for a moment (including cybersecurity) and consider what the mission is and how it’s being fulfilled. What are the key attributes of these daily responsibilities? Identifying what your organization does, and how it performs these duties, is a good first step. Once these questions are understood, it is then possible to start developing and refining the key metrics that go into measuring and demonstrating delivery on the mission. In doing so, it becomes easier to start equating operational analytics with overall mission analytics, which in turn provides a much-needed mapping of value from the top of the business to the daily operations that underpin it.

Another key component of aligning data to business value is understanding the asset picture. In this context, “asset” is interpreted broadly to include people, process and technology. Understanding how organizational performance (and success) is measured is a great starting point, but it also needs to be considered together with the assets that comprise the organization. Factoring in business performance metrics and assets leads to a deeper understanding of operational performance, which can in turn be correlated directly to operational KPIs. In essence, the entire process charts the analytical islands throughout the organization, with the net result of connecting data to business value.

In context, the overall hierarchy starts with the top-level business performance metrics, which map to assets, which in turn map to various operational areas. Rolling up these analytical islands creates the second analytical tier, which helps address some of the challenges inherent in big data. Moreover, it allows organizations to move beyond disparate analytical islands to a cartographic perspective that views each island in a useful and meaningful context.
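As a rough illustration of that hierarchy, the sketch below models business metrics mapping to assets, which map to operational areas and their KPIs. The structure and names are hypothetical, meant only to show how silo-level measurements can be rolled up into an enterprise tier without discarding the underlying detail.

    from dataclasses import dataclass, field

    @dataclass
    class OperationalArea:
        # First tier: silo-level analytics (e.g., security operations)
        name: str
        kpis: dict            # e.g., {"mean_time_to_detect_hours": 6.0}

    @dataclass
    class Asset:
        # People, process or technology the mission depends on
        name: str
        operational_areas: list = field(default_factory=list)

    @dataclass
    class BusinessMetric:
        # Second tier: a top-level measure of mission fulfillment
        name: str
        assets: list = field(default_factory=list)

        def rollup(self) -> dict:
            # Gather silo KPIs under this business metric, keeping each one attributable
            rolled = {}
            for asset in self.assets:
                for area in asset.operational_areas:
                    for kpi, value in area.kpis.items():
                        rolled[f"{asset.name}/{area.name}/{kpi}"] = value
            return rolled

Walking the structure this way keeps every operational KPI traceable to the asset and business metric it supports, rather than collapsing everything into a single number.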

Beware the Super-Metric

Aggregating various metrics into roll-up analyses can be very beneficial, but there is a point of diminishing returns. In fact, over-aggregating data can have a detrimental result, as the financial services industry has demonstrated over the past decade. It may be tempting to create a single super-metric that amounts to a temperature gauge or a thumbs up/down picture, but doing so comes at the expense of the detail needed to make a well-informed, defensible decision.

Reducing multiple analyses into a single super-metric obscures the truth. Consider, for example, a single metric composed of five equally weighted components. Four of those components could have high scores, whereas one could have a moderate to low score. The aggregate score will still appear healthy, even though one component area may in fact be failing and represent material risk to the organization.
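A minimal, hypothetical calculation makes the point; the component names and scores below are invented purely to show how equal weighting hides a failing area.

    # Hypothetical component scores on a 0-100 scale, equally weighted
    components = {
        "network_security": 92,
        "endpoint_security": 88,
        "identity_management": 90,
        "incident_response": 85,
        "vendor_risk": 35,   # failing area representing material risk
    }

    super_metric = sum(components.values()) / len(components)
    print(f"Super-metric: {super_metric:.0f}")        # 78 -- looks reasonably healthy

    # The detail the super-metric hides:
    worst = min(components, key=components.get)
    print(f"Weakest component: {worst} = {components[worst]}")

The aggregate of 78 reads as acceptable on a dashboard, yet the weakest component sits at 35; only the disaggregated view exposes the risk.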

Ben Tomhave is principal consultant for LockPath, which provides governance, risk and compliance software.
