Improving Performance

State human services agencies implement a reporting dashboard that gauges case workload performance.

by Leslie Friesen / June 2, 2006
New York state provides human service programs to hundreds of thousands of clients across 58 districts. Typically each county has one human services district that provides services for food stamps, housing assistance, child-care subsidies and health care, to name a few.

Although the state's human services agencies have historically analyzed their data to gauge performance, the 1997 State Welfare Reform Act forced these agencies to evaluate their programs' success in an entirely new way.

Human services commissioners and their executive team managers in all 58 districts went looking for a way to analyze data with an easily accessible application that could merge data from multiple sources.

"We needed to track such things as receipt of benefits by clients over time and to determine when they reached their 60 months of federal benefits, and when they would need to be switched over to state programs," said Bob Mastro, CIO of the New York State Office of Temporary and Disability Assistance (OTDA).

The OTDA joined with the Office of Children and Family Services (OCFS) and the Office of Medicaid Management (OMM) to find a way to better report and analyze the volumes of information available.

The agencies eventually settled on a "dashboard" tool -- an application that translates complex information into easily read and understood gauges -- powered by business intelligence.

Business intelligence tools form a broad category of application programs and technologies for gathering, storing, analyzing and providing access to data, helping users make better business decisions. They usually include query and reporting capabilities, statistical analysis and scorecarding, and often use a "dashboard" concept for an easy-to-use interface.

"One of the most staggeringly primitive problems facing organizations is the difficulty they have accurately answering even the most basic questions," said Ian Charlesworth, senior research analyst for the Butler Group, a Europe-based IT consultancy and research firm.


Going Dashboard
Officials want to know what's happening in their districts, why it is happening, and what should happen in the future. The agencies hired performance management software vendor Cognos to develop a pilot project to address this need.

The result was a dashboard tool that generates standardized reports from information pulled from multiple databases. The two-month pilot project encompassed three districts, and its success led to full implementation in February 2006.

At press time, 48 local social services districts were on board, with the remaining 10 expected to go live before summer 2006. Before switching to the dashboard, each district employed a number of staff members who manually queried multiple data sources to answer statistical questions. The dashboard streamlines the process by providing that same information electronically, freeing those staff to focus on other tasks.

"It does allow us to analyze data better, in that you can get a view across multiple data sources and look at it all in one place," Mastro noted.

The Web-based dashboard is password protected, and accessible on the district level only by commissioners and designated staff. In New York state, human services are state-supervised but county-administered. Local counties appoint each district commissioner, and each social services district is limited to accessing its own data.

Unlike individual districts, the supervising state human services agencies -- the OTDA, OCFS and OMM -- can look at each county's data, as well as statewide data.

Although the state can't require districts to use the dashboard, commissioners have embraced it wholeheartedly: it puts data on the desktop, and its reports are tested to ensure accurate and consistent information.


Finding the Data
The dashboard utilizes Cognos ReportNet (now called Cognos 8) to pull raw data from a Department of Health Medicaid data warehouse, a CONNECTIONS data warehouse that hosts child welfare data, and a welfare reporting and tracking system (WRTS).

That raw data comes from a variety of health and human services programs administered by state and county agencies, and front-line staff frequently identify additional data sources, which will be added to the system as time progresses.

The dashboard project team's requests to add data primarily concern the OTDA Welfare Management System, Mastro said, such as food stamps and Temporary Assistance for Needy Families budgeting data, Medicaid budgeting data and additional child support data.

The new data will augment what is already stored in the WRTS database, he continued, which was built to track receipt of federal and state benefits and for mandated reporting related to welfare reform.

"The users have found it very beneficial to add data not related to the original primary requirements so they can get a clearer picture of case and client information," he said. "WRTS adds data in 'time boxed' deliverables and aims to provide new data in six- to eight-month projects. They do as much as they can in a project of that duration."

The dashboards offer limited reporting options to the districts, but these will also increase as the project continues. Existing data goes back approximately 10 years, so some trending analysis is already available.

The dashboard contains only management-level data. District employees must still access the original data source to get case-specific information.

For example, if one identified goal is to reduce the number of children in foster care, then the district needs to know how many children enter, exit and re-enter foster care during any given time interval, along with the reasons behind those numbers. Those statistics help the district evaluate whether its current programs are effectively reducing the number of children in foster care.

In addition to the hard numbers, the reports provide graphs and charts to illustrate the data, and commissioners can toggle back and forth between the actual numbers, pie charts or bar charts. Red, green and yellow indicate trending against identified performance measures. The trending colors correspond to traffic signals: green means the department is meeting the performance measure, yellow warns that an issue may be developing and red clearly flags an existing issue.
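The traffic-signal logic described above can be sketched as a simple threshold check. The 10 percent warning margin below is an illustrative assumption, not the agencies' actual rule, and the example treats the metric as "lower is better" (such as a backlog count):

```python
def trend_color(value, target, warning_margin=0.10):
    """Map a performance metric against its target to a traffic-signal
    color, for a lower-is-better metric. The warning margin is a
    hypothetical threshold chosen for illustration."""
    if value <= target:
        return "green"   # meeting the performance measure
    if value <= target * (1 + warning_margin):
        return "yellow"  # an issue may be developing
    return "red"         # clearly flags an existing issue

print(trend_color(95, 100))   # green
print(trend_color(108, 100))  # yellow
print(trend_color(130, 100))  # red
```

A higher-is-better metric (such as a placement rate) would simply flip the comparisons.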


Using the Dashboard
The browser-based application looks like a typical Web site, and is only accessible from an intranet.

Microsoft SharePoint powers the portal view and navigation, while the reports are Cognos products. Each commissioner and designated staff member can securely log on to the dashboard designed specifically for their county using lightweight directory access protocol (LDAP) security. LDAP is a networking protocol for querying and modifying directory services over transmission control protocol/Internet protocol.

The welcome window contains tabs at the top of the page allowing users to select three categories of reports -- statistics, performance measures and financial/budget trending information.

Clicking one tab opens a drop-down list of reports available in that particular category. One more click on a report name, and that item displays in a new window.

The initial screens for the reports contain introductory information, such as an explanation of what's covered in the report; data sources; time period covered; how often the report is generated; how current the information is; and agency contacts.

In some cases, the reports give the commissioner an opportunity to query by specific parameters, such as dates. In other cases, standardized reports are run weekly, monthly or annually.
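A date-parameterized report of the kind described above boils down to binding the commissioner's chosen dates into a query. The table and column names below are hypothetical stand-ins (the real reports pull from the WRTS and Medicaid warehouses, not SQLite):

```python
import sqlite3
from datetime import date

# Miniature stand-in for a case table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER, opened TEXT)")
conn.executemany("INSERT INTO cases VALUES (?, ?)", [
    (1, "2006-01-15"), (2, "2006-02-03"), (3, "2006-03-20"),
])

def cases_opened_between(start, end):
    """Run a report filtered by user-supplied date parameters."""
    cur = conn.execute(
        "SELECT COUNT(*) FROM cases WHERE opened BETWEEN ? AND ?",
        (start.isoformat(), end.isoformat()),
    )
    return cur.fetchone()[0]

print(cases_opened_between(date(2006, 1, 1), date(2006, 2, 28)))  # 2
```

The standardized weekly, monthly or annual reports would run the same query on a schedule with fixed date ranges instead of user input.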


Future Plans
In the future, the dashboard will continuously expand to contain additional data sources along with two or three new reports monthly. The agencies will also use the dashboard to capture baseline measures that will become trackable performance measures.

"In our cost benefit analysis for the project, we have identified performance measures for the state and county perspectives and will be taking baseline measures and tracking performance against those measures during the life of the project," Mastro explained. "We will use the trend data to tell if project deliverables are moving us toward the desired measure or outcome."

Mastro said an example of trend data is the average number of days from client application for assistance to eligibility determination and disposition.

"If the baseline data is that the current state is taking too long on average for applications to be processed, we will establish a goal, and track that measure as we deliver products that we believe should impact the measure," he said.

Over the course of a multiyear project, the agencies will be able to see if they are moving toward their goals. The dashboard should prove to be a strong tool in helping agencies meet the mandates of the 1997 State Welfare Reform Act.
Leslie Friesen, Contributing Writer