Does Your City Measure Up?

How does your city stack up against others in terms of service to the public? A new consortium is helping to find out.

by David Aden / August 31, 1996
The year starts with a city council investigation into the fire department's performance -- several constituent complaints about slow response times kicked the whole thing off. The investigation ate up Thursday evenings for a month, pulled the fire chief off a training project he was supposed to have completed two months earlier and generated several negative articles in the local paper highlighting problems in a fire department that actually does a great job.

The year ends with a mandate from the state for two brand new programs in neighborhood services. The programs sound nice, but what's the best way to implement them? Other states have similar programs, but which ones were successful? How much did they cost? What were the pitfalls?

Both types of situations have a common denominator -- each kicks up a flurry of rumors, opinions, "expert" pronouncements and complaints, and relatively few verifiable facts. Missing in all the talk are reliable, well-thought-out statistics that can identify successful and unsuccessful programs; chronic problems and one-time aberrations; well-managed and poorly managed projects. Statistics don't solve every problem, but used correctly they give a clean view of how an agency is doing. More important, by stripping away the rumor, statistics can make it possible for managers to manage.

The International City/County Management Association (ICMA) first published a performance measurement book in 1943 and has continued to publish in the area over the years. The 1993 passage of the federal Government Performance and Results Act established performance measurement as a requirement for federal agencies and bolstered an already growing state and local government interest in the field. In response to this interest, ICMA helped establish the Comparative Performance Measurement Consortium in 1993. The consortium is made up of 40 cities and counties with populations of at least 200,000. Its first target has been to establish standard measurements that can be uniformly implemented and used to measure and compare local government performance.

"Lots of cities have done performance measurement but have done it internally," said Gerard Hoetmer, director of Research and Development for ICMA. "When they start to compare it to other cities, they've run into problems. They hope that by banding together and rolling up their sleeves and defining how they are going to measure things, they will define these measurements in a similar fashion and start to collect data they can use to compare apples to apples, get best practices, etc."

Working out a correct statistical measurement is not always an easy undertaking. It requires careful definitions and agreement on basic terms and procedures. At its first meeting, the consortium decided to focus on four areas -- fire, police, neighborhood services (such as garbage collection) and support services (such as fleet maintenance) -- to keep its efforts concentrated. It established a Technical Advisory Committee for each area, composed of active practitioners. At one of the early meetings, Harry Hatry of the Urban Institute, a leading expert in measuring government performance, did some training on performance measurement. Although this helped clarify the process and the goal, working out effective performance measurements has still been a tricky matter.

"It hasn't been very difficult to get the information," said Scott Bryant, director of strategic management for the city of Long Beach, Calif. "The difficulty has been in terms of how we define it and how some other city defines it. For example, you would think things like the number of square miles would be simple, yet when we tried to collect that information we had a difference. Do you count inland water? Some areas have a lot of it and should count it."

Bryant found the issue of cost the most difficult because each city seemed to track costs differently and to use different definitions. But similar problems surfaced when trying to measure police and fire department efficiency.

"We track response time in terms of priority of call and our priority might be different than someone else's," said Bryant. "We knew this would be an evolutionary process, we knew we wouldn't accomplish it in a year. Everything is in draft right now. We've done several data collection efforts but it is in a test mode and we have identified the problems."
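Bryant's point can be made concrete with a small, purely hypothetical sketch (invented data and priority codes, not drawn from any consortium city): two cities' average response times only compare apples to apples after their local priority codes are mapped onto one shared definition -- exactly the kind of agreement the consortium is working out.

```python
# Hypothetical illustration: two cities log response times under their own
# priority codes. Averages are only comparable after both are mapped onto a
# shared priority scale (the categories here are assumptions).

from statistics import mean

CITY_A_MAP = {1: "emergency", 2: "urgent", 3: "routine"}      # City A uses 1-3
CITY_B_MAP = {"E": "emergency", "U": "urgent", "R": "routine"}  # City B uses letters

def normalize(calls, priority_map):
    """Re-key (priority, minutes) records onto the shared priority scale."""
    shared = {}
    for priority, minutes in calls:
        shared.setdefault(priority_map[priority], []).append(minutes)
    return shared

# Invented sample records: (local priority code, response time in minutes)
city_a = [(1, 4.2), (1, 5.1), (2, 9.0), (3, 21.5)]
city_b = [("E", 3.8), ("E", 4.9), ("U", 8.2), ("R", 19.0)]

a = normalize(city_a, CITY_A_MAP)
b = normalize(city_b, CITY_B_MAP)

for category in ("emergency", "urgent", "routine"):
    print(f"{category}: City A {mean(a[category]):.1f} min, "
          f"City B {mean(b[category]):.1f} min")
```

The mapping step is the whole point: without it, City A's "priority 1" average and City B's "E" average may describe different kinds of calls.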

Long Beach's interest in the project was the "big-picture perspective." Since cities are all in a similar business, the consortium gives them an opportunity to identify best practices, to learn from other cities. According to Bryant, Long Beach's involvement is "really an effort to improve our operation."

Frank Fairbanks, the city manager for Phoenix, Ariz., came to the consortium for similar reasons.

"We were already doing this kind of stuff [performance measurements]," said Fairbanks, "but the reason we were so enthusiastic about the comparative study is that it takes it in two new directions for us. It helps us find other ways of measuring. By participating in this study we could improve our own technique and knowledge. Also, through the comparative study we'll find some cities are doing better than others. We can find out what kinds of things they are doing. Phoenix has a good reputation for being innovative, but a lot of what we are doing is borrowed from other cities."

Fairbanks knows from experience that there tends to be a lot of information sharing within metropolitan areas, but the information flow outside the region is not as good. He sees the consortium as a way for people in one part of the country to learn about good approaches used in other parts of the country.

Fairbanks also thinks the program will create an incentive for cities to progress -- after all, no one likes to be last. And, used correctly, he has seen it help employees.

"We think it can be a major tool for improving our government and local government across the country," said Fairbanks. "If you use measurement to beat up and get angry with employees, it absolutely doesn't have a positive result. It can hurt you. But if you can use it to help employees understand better what their job is about, they are happy. It is like a sport. You have to know how to score. It simplifies their world [to know what defines success], especially if you get their ideas. Employees know what we want and they enjoy delivering what we want."

As part of its efforts at performance measurement, Phoenix does a citizen attitude survey every two years to get feedback on how the government is doing. Recent surveys indicate that citizens appreciate the city's efforts, that they have noticed a difference. This isn't to say everything is rosy -- Fairbanks knows there is a lot of cynicism about government -- but he noted that the polls have shown that citizens know the government is focused on service delivery, which translates to citizen support. Fairbanks credits this generally positive attitude with helping the city pass bond issues when needed -- citizens tend to support governments they feel are providing real service.

Although its work is a long way from complete, the consortium has pulled together 1995 data and Hoetmer feels they will be in good shape to do valuable comparisons when the 1996 data comes in. Still, the whole effort is evolving. The Steering Committee, which consists of ICMA staff, Urban Institute staff, representatives from Deloitte & Touche and the chairs for each of the technical advisory committees, continues to work at refining the performance measurements. At each stage, though, cities have exchanged successful approaches and techniques. In addition to improving service and isolating best practices, Fairbanks hopes the consortium's efforts may also help overcome what he sees as a continuing problem in American politics.

"We talk a lot about what new laws we should pass," noted Fairbanks, "yet we do almost nothing to evaluate after the fact to see how it is implemented and what impact it had. We have almost no discussion of what it has done for the public."

Perhaps someday, with good performance measurements in use, the truly great programs will get the recognition and funding they need.


Sample ICMA Performance Measurements

Library Services

Cost per capita

Full-time equivalent staff per capita

Total costs per library user

Circulation per capita

Program attendance per capita

Reference transaction per capita

Total registered borrowers as a percentage of population served

Survey on usage and satisfaction

Parks and Recreation

Cost per capita

Total earned revenue per capita

Full-time equivalent staff per capita

Total maintenance cost

Percentage of citizens indicating they have used a park or jurisdiction-sponsored recreation program within the past 12 months

Percentage of citizens who rate parks and recreation facilities as satisfactory


Purchasing

Total amount recovered from sale of surplus property during last fiscal year

Total purchase dollar per full-time equivalent purchasing employee

Average number of days from receipt of purchase requisition from user dept. to date purchase order issued for purchases under the informal bid amount (i.e., purchases not requiring a formal or advertised bid)

Average number of days from receipt of purchase requisition from user dept. to date purchase order issued for purchases exceeding formal bid amount

Recent protests filed that were sustained

Percent of vendors rating the procurement process as fair and equitable

Percent of customers rating their purchasing experiences as "good" or "excellent" (rather than "fair" or "poor")

Human Resources

Percent of employees reporting satisfaction with human resources services

Percent of management reporting satisfaction with human resources services

Percent of other customers reporting satisfaction with human resources services, by department

Employee turnover rate

Number of grievances per 100 full-time equivalent employees

Percent of grievances resolved before passing from management control

Average number of calendar days to complete an external competitive recruitment and selection process

Average number of calendar days to complete an internal competitive recruitment and selection process

Sick leave utilization rate

Are personnel services significantly decentralized?

Ratio of employees in the human resources department/section/division to total work force of jurisdiction

Police Services/Deterrence Patrol

Number of police calls per patrol officer

Calls handled by means other than dispatch

Total calls to 911 police

Percent of commissioned personnel dedicated to patrol services (actual)

Average patrol time utilization per officer

Response time to emergency calls

Number of crimes per 1000 population

UCR Part 1 crime rate

Property crime rate

Juvenile arrest rate

Dept. cost per dispatched call

Dept. cost times percent of time spent on reactive calls
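
Most of the sample measures above reduce to simple ratios of a service statistic to population or workload. As a minimal sketch with invented numbers (illustrative only, not ICMA or consortium data), computing a few of the library measures might look like this:

```python
# Illustrative only -- all figures are invented, not ICMA or consortium data.
# Most measures in the sample list are ratios of a service statistic to
# population (or to a workload count such as library users).

population = 250_000          # jurisdiction population (assumed)
library = {
    "total_cost": 5_600_000,  # annual operating dollars (assumed)
    "fte_staff": 85,          # full-time equivalent staff (assumed)
    "circulation": 1_900_000, # items checked out per year (assumed)
    "users": 140_000,         # registered library users (assumed)
}

cost_per_capita = library["total_cost"] / population
fte_per_capita = library["fte_staff"] / population
circulation_per_capita = library["circulation"] / population
cost_per_user = library["total_cost"] / library["users"]

print(f"Cost per capita:        ${cost_per_capita:.2f}")
print(f"FTE staff per 1,000:    {fte_per_capita * 1000:.2f}")
print(f"Circulation per capita: {circulation_per_capita:.2f}")
print(f"Cost per library user:  ${cost_per_user:.2f}")
```

The arithmetic is trivial; as the consortium members found, the hard part is agreeing on what counts as "total cost" or a "library user" before the division happens.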

David Aden is a writer from Washington, D.C.