Transparency and Bad Data: How to Avoid the Pitfalls (Industry Perspective)

Government entities have a duty to provide frequent and accurate transaction-level reports to citizens — and the smallest slip-up can spell big trouble once erroneous information reaches the public’s eyes. Consider these four methods for helping to keep your data accurate.

by Erin Latham / September 24, 2015

Governments across the country are striving to meet a call to transparency by upping their data collection and analysis efforts. But with this trend toward big data comes an influx of “bad data” — inaccurate, outdated or misused information that leads to ill-informed policies, misled initiatives and an ensuing lack of citizen trust.

In a 2015 study, 70 percent of government officials reported that they “frequently” or “often” encounter bad data that prevents them from doing their jobs properly. The agencies being hit the hardest include social services, economic development, health and public safety — all areas in which the general public takes great interest.

While bad data isn’t any more prevalent in government than it is in the private sector, government bodies have a duty to provide frequent and accurate transaction-level reports to citizens — and the smallest slip-up can spell big trouble once erroneous information reaches the public’s eyes.

Typically, bad data isn’t detected until it’s too late. Nothing is more frustrating, time-consuming and labor-intensive than having to start from scratch after realizing you can’t glean any useful insights from the information you’ve spent countless months (or years) gathering.

What Causes Bad Data?

Bad data tends to arise from the manual nature of traditional spreadsheet-based collection methods. What begins as a clean, well-organized and accurate spreadsheet undergoes so many tweaks and updates over time that, sometimes just months later, it’s no longer usable for its initial purpose. Any number of things can go wrong: inconsistent names or fields, outdated or inaccurate information, or simple typos.

But bad data can also occur when what was once good data is neglected for too long. For example, accurate reporting that brings to light a mistake in the accounts payable process — like the miscoding of an invoice — can turn bad if it’s not quickly addressed. If the good data that pinpointed the problem is ignored, it effectively goes bad: decisions end up being made on the basis of an erroneous transaction.

All of these factors can produce some form of bad data and trigger a negative ripple effect. When exposed to bad data and the initiatives that spawn from it — even a single, seemingly innocuous misstep — constituents and examiners will begin to question the validity of all data-driven governmental endeavors. This issue of perception affects more than just taxpayer confidence; it’s also detrimental to the receipt of grant awards, bond election approvals and even government credit ratings — ultimately costing taxpayers money as the government pays higher interest rates.

Government bodies must not only become more aware of the impact of bad data, but also begin enforcing strict policies on data accuracy and upgrading their data collection protocols. When a private business realizes it has bad data, the company can usually fix the issue internally before outside influences such as investors find out. However, due to the transparency requirements of the public sector, this is usually not an option for governments.

How to Create Good Data

Creating and sustaining good data requires ongoing diligence. It’s crucial to continuously monitor data over time, addressing and correcting high-risk areas before they become bad. This method eliminates the need for expensive and time-consuming reconciliations, all while ensuring taxpayer confidence remains at optimal levels.

One option is to wait until after financial close and report only audited data, but that only allows you to be as transparent as the most recent fiscal year. In today’s world of instant communication, that’s not good enough.

With the proper tools and preparation, however, you can ensure good data isn’t late data. Setting up reports, visualizations and analyzers helps you discover problems and mistakes quickly, allowing time for corrections to be made before publication. In the right circumstances, organizations devoted to the diligent monitoring of data can easily turn bad data back into meaningful results before any external entity sees it.

Consider these four methods when searching for ways to keep your data accurate:

1. Budget Analysis Reports: These reports identify unbudgeted or miscoded expenditures, giving you the opportunity to correct information internally before someone outside of your organization points out the mistake.

Recently one of my company’s customers noticed that one of her line items had too much money remaining in available funds. She initially chalked up the discrepancy to a reporting error, but after running the data through an analyzer report, we discovered that a vendor credit had been erroneously applied to her department instead of the correct one.

Regular analysis reports help identify and correct misinformation before it goes out to a transparency center.
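The core logic of such a report can be sketched in a few lines of code. The following Python sketch is purely illustrative — the account codes, amounts and the `flag_budget_exceptions` helper are invented for this example, not features of any particular system:

```python
# Illustrative budget-analysis check: flag account codes that have no
# budget line at all, or whose total spending exceeds the budgeted
# amount. All codes and figures are hypothetical.

def flag_budget_exceptions(budget, expenditures):
    """Return account codes that are unbudgeted or over budget.

    budget: dict mapping account code -> budgeted amount
    expenditures: list of (account_code, amount) transactions
    """
    # Total the spending per account code.
    spent = {}
    for code, amount in expenditures:
        spent[code] = spent.get(code, 0.0) + amount

    # Compare each total against the budget.
    exceptions = {}
    for code, total in spent.items():
        if code not in budget:
            exceptions[code] = "unbudgeted"
        elif total > budget[code]:
            exceptions[code] = "over budget"
    return exceptions


budget = {"4100-travel": 5000.0, "4200-supplies": 2000.0}
expenditures = [
    ("4100-travel", 3000.0),
    ("4100-travel", 2200.0),      # pushes travel over its budget
    ("4300-consulting", 750.0),   # no budget line exists for this code
    ("4200-supplies", 300.0),
]
print(flag_budget_exceptions(budget, expenditures))
# {'4100-travel': 'over budget', '4300-consulting': 'unbudgeted'}
```

Run against live transactions on a schedule, a check like this surfaces miscoded or unbudgeted spending while it can still be corrected internally.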

2. Aligned Ledgers: When subledger systems track related programs, such as projects and grants, their balances should match the general ledger. Reports that reconcile the two modules ensure both locations book revenues and expenses correctly, preventing the output of conflicting data.
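A minimal sketch of such a reconciliation, assuming each ledger can be summarized as account-to-balance totals (the account names and the `reconcile_ledgers` helper are hypothetical):

```python
# Illustrative ledger reconciliation: compare subledger totals against
# the general ledger and report any account whose balances disagree.

def reconcile_ledgers(general_ledger, subledger):
    """Return {account: GL minus subledger difference} for mismatches.

    general_ledger, subledger: dicts mapping account -> balance
    """
    mismatches = {}
    for account in set(general_ledger) | set(subledger):
        gl = general_ledger.get(account, 0.0)
        sl = subledger.get(account, 0.0)
        if abs(gl - sl) > 0.005:  # tolerance for rounding to the cent
            mismatches[account] = round(gl - sl, 2)
    return mismatches


gl = {"grants": 10000.0, "projects": 4500.0}
sub = {"grants": 10000.0, "projects": 4625.0}
print(reconcile_ledgers(gl, sub))  # {'projects': -125.0}
```

An empty result means the two modules agree; anything else names the accounts to investigate before conflicting figures reach a published report.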

3. Daily Reviews: A little work today can save a lot of work down the road. Run daily reports, and compare them to that day’s banking activity to ensure revenues are recorded correctly. When you conduct these checks every day, the monthly bank reconciliation process becomes much simpler, with all of your accurate data organized and ready to use.
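The daily comparison described above can be sketched as a simple matching exercise, assuming each receipt carries a reference ID that also appears on the bank statement (the IDs and the `daily_revenue_check` helper are invented for illustration):

```python
# Illustrative daily review: match a day's recorded receipts against
# that day's bank activity by reference ID and amount.

def daily_revenue_check(recorded, bank):
    """Return a sorted list of (reference_id, issue) discrepancies.

    recorded, bank: lists of (reference_id, amount)
    """
    rec = dict(recorded)
    bk = dict(bank)
    issues = []
    for ref in set(rec) | set(bk):
        if ref not in bk:
            issues.append((ref, "not in bank activity"))
        elif ref not in rec:
            issues.append((ref, "not recorded"))
        elif abs(rec[ref] - bk[ref]) > 0.005:
            issues.append((ref, "amount mismatch"))
    return sorted(issues)


recorded = [("R-101", 250.0), ("R-102", 80.0)]
bank = [("R-101", 250.0), ("R-103", 40.0)]
print(daily_revenue_check(recorded, bank))
# [('R-102', 'not in bank activity'), ('R-103', 'not recorded')]
```

Clearing this list every day is what makes the month-end bank reconciliation largely a formality.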

4. Account Analysis Reports: These reports show which balances in accounts payable and accounts receivable have not been cleared. This helps you make sure all expenses have been appropriately paid, and all receipts have been recorded and booked correctly. Not only does this process protect against coding errors, but it also helps you pinpoint fraud and other internal control issues.
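One way to frame an uncleared-balance report is as an aging check: any open item older than a chosen threshold gets flagged for review. This sketch assumes each payable or receivable item carries an ID, a date and a cleared flag; the `uncleared_items` helper and the 30-day threshold are illustrative choices, not a standard:

```python
# Illustrative account-analysis check: list open AP/AR items that have
# gone uncleared longer than a chosen number of days.

from datetime import date

def uncleared_items(items, as_of, max_age_days=30):
    """Return IDs of uncleared items older than max_age_days.

    items: list of dicts with "id", "date" (datetime.date), "cleared"
    as_of: the report date
    """
    stale = []
    for item in items:
        age = (as_of - item["date"]).days
        if not item["cleared"] and age > max_age_days:
            stale.append(item["id"])
    return stale


items = [
    {"id": "INV-1", "date": date(2015, 8, 1), "cleared": False},
    {"id": "INV-2", "date": date(2015, 9, 20), "cleared": False},
    {"id": "INV-3", "date": date(2015, 7, 15), "cleared": True},
]
print(uncleared_items(items, as_of=date(2015, 9, 24)))  # ['INV-1']
```

A long-uncleared item is exactly the kind of anomaly that can point to a coding error — or, occasionally, to a deeper internal control problem.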

Preventing the proliferation of bad data does more than keep your organization out of hot water; it ensures that public policy decision-makers have everything they need to take appropriate and well-educated action. 

With diligence, the right tools, and a few regular checks and balances, you can confidently present your data to transparency programs without worry. 

Erin Latham founded Mo’mix Solutions with the goal of delivering software and services that drive better outcomes for the public sector and education. She has served as a technology government adviser focused on ERP, budget, business intelligence and open data/transparency solutions to local and state governments and higher education organizations for more than 15 years.