How do your technology services compare with others? Have you compared your service offerings, rates and customer satisfaction with those of similarly sized organizations in the public and private sectors? Should you be offering more for less? Elected leaders, CxOs and business managers worldwide constantly want answers to these questions. But benchmarking government technology services is easier said than done.

First, every organization is unique, with its own culture, local politics, history, budget, expectations and governance. Second, government policies and requirements differ from the private sector’s: Freedom of Information Act laws, diverse priorities from elected leaders, pay, benefit and cost-of-living disparities nationwide, and varying perspectives on outsourcing. Third, government service offerings must serve all citizens equally, not “cherry-pick” the most profitable services or segments of society. Value-for-money efficiency can, and should, be measured in government operations, but the private sector operates under different constraints, so direct comparisons are never exact.

All of this means benchmarking isn’t an exact science. Developing actionable measures requires a sound methodology, a large database of comparable entities, good judgment and lots of experience. During one benchmarking effort a few years ago, a participant quipped, “If this is apples to apples, one is a Golden Delicious and the other is a Granny Smith.”

Nevertheless, comparing costs and service levels is a pragmatic strategy and an essential element of effective leadership. Most government technology managers I’ve met believe their teams provide good value for the money spent, given their circumstances. Still, they recognize the need for external validation and want to benchmark. Managers who resist benchmarking are sometimes pressured by new business leaders to bring in “experts from out of town” to determine whether changes can save dollars or deliver better services.

Michigan recently completed a benchmarking effort covering its technology infrastructure services. It was a bumpy, months-long process, and we almost drove off the road a few times. The journey was painful but worth it. I’d like to share several things we learned and offer a few (vendor-agnostic) tips to help in your own benchmarking.

  • How we got started — Michigan’s been working on a potential new data center project for more than a year. After the Request for Information (RFI) phase, we learned that we needed much greater detail before moving to the Request for Qualifications (RFQ) phase. Our benchmarking effort was critical in preparing for the RFQ and Request for Proposals (RFP) phases. Tip: Consider linking benchmarking efforts to a wider initiative with executive support.
  • Initial team and plans — After awarding the contract competitively to a company with impressive benchmarking expertise, we built an internal team with staff from many different areas. Tip: Ensure all stakeholders are included in the process.
  • Different definitions — After a successful kickoff, our joint teams met many times to fill out spreadsheets and describe our rates and services to the consulting experts. We were compared to public- and private-sector organizations of similar size and complexity. Tip: Have clear definitions, especially around the scope of what’s included in all metrics.
  • Troubles emerge — At one point, we had a major disagreement with our vendor partner. Early results showed that our costs were well above others’. But further analysis revealed that we weren’t comparing apples to apples; for example, our rates included far more consulting, research, development, testing and evaluation than our peers’ did. Tip: Do a comprehensive “FTE mapping” of how each person spends his or her time (see the sketch after this list). Stick with the program, even if you initially disagree with the numbers.
  • Back on course — Later, a top executive from the consulting company said, “This often happens. Customers either think they are too low or too high in some area. It’s a part of the benchmarking process.” In the end, we found the results helpful: our technology services and rates ranged from best practice in some areas to others requiring improvement.
  • Final thought — Benchmarking is a must. As Lord Kelvin is often quoted as saying, “If you cannot measure it, you cannot improve it.”
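
To make the “clear definitions” and “FTE mapping” tips concrete, below is a minimal sketch in Python. The categories, staff and dollar figures are all hypothetical, not Michigan’s actual data; it simply shows how mapping each person’s time across service categories lets you strip out work (such as consulting and R&D) that peer organizations don’t bundle into their rates.

# A hypothetical "FTE mapping" sketch: allocate each person's time across
# service categories, then compute a comparable cost and FTE count that
# exclude activities the peer benchmark doesn't include in its rates.
# All categories, staff and dollar figures below are illustrative only.

from dataclasses import dataclass

@dataclass
class StaffMember:
    name: str
    annual_cost: float            # salary + benefits + overhead
    allocation: dict[str, float]  # category -> fraction of time (sums to 1.0)

# Categories the (hypothetical) peer benchmark counts in its infrastructure rate.
BENCHMARKED_CATEGORIES = {"operations", "maintenance", "support"}

def comparable_cost_and_ftes(staff: list[StaffMember]) -> tuple[float, float]:
    """Cost and FTEs limited to benchmarked categories, for apples-to-apples rates."""
    cost = ftes = 0.0
    for person in staff:
        in_scope = sum(frac for category, frac in person.allocation.items()
                       if category in BENCHMARKED_CATEGORIES)
        cost += person.annual_cost * in_scope
        ftes += in_scope
    return cost, ftes

staff = [
    StaffMember("Analyst A", 120_000, {"operations": 0.6, "consulting": 0.4}),
    StaffMember("Analyst B", 100_000, {"support": 0.5, "r_and_d": 0.3,
                                       "maintenance": 0.2}),
]

cost, ftes = comparable_cost_and_ftes(staff)
print(f"Comparable cost: ${cost:,.0f} across {ftes:.1f} FTEs"
      f" (${cost / ftes:,.0f} per FTE)")

The arithmetic is trivial; the value is in agreeing on the category definitions up front with whoever runs the benchmark, because a rate comparison is only as good as the scope mapping behind it.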

Dan Lohrmann is Michigan’s CTO and was the state’s first chief information security officer. He has 25 years of worldwide security experience and has won numerous awards for his leadership in the information security field.
