The year starts with a city council investigation into the fire department's performance -- several constituent complaints about slow response kicked the whole thing off. The investigation ate up Thursday evenings for a month, pulled the fire chief off a training project he was supposed to have completed two months earlier and generated several negative articles in the local paper highlighting problems in a fire department that actually does a great job.
The year ends with a mandate from the state for two brand new programs in neighborhood services. The programs sound nice, but what's the best way to implement them? Other states have similar programs, but which ones were successful? How much did they cost? What were the pitfalls?
Both of these situations have a common denominator -- each kicks up a flurry of rumors, opinions, "expert" pronouncements and complaints -- and relatively few verifiable facts. Missing in all the talk are reliable, well-thought-out statistics that can distinguish successful programs from unsuccessful ones; chronic problems from one-time aberrations; well-managed projects from poorly managed ones. Statistics don't solve every problem, but used correctly they give a clear view of how an agency is doing. More important, by stripping away the rumor, statistics can make it possible for managers to act on facts rather than perceptions.
The International City/County Management Association (ICMA) first published a performance measurement book in 1943 and has continued to publish in the area over the years. The 1993 passage of the federal Government Performance and Results Act established performance measurement as a requirement for federal agencies and bolstered an already growing state and local government interest in performance measurement. In response to this interest, ICMA helped establish the Comparative Performance Measurement Consortium in 1993. The consortium is made up of 40 cities and counties with populations of at least 200,000. Its first goal has been to establish standard measurements that can be uniformly implemented and used to measure and compare local government performance.
"Lots of cities have done performance measurement but have done it internally," said Gerard Hoetmer, director of Research and Development for ICMA. "When they start to compare it to other cities, they've run into problems. They hope that by banding together and rolling up their sleeves and defining how they are going to measure things, they will define these measurements in a similar fashion and start to collect data they can use to compare apples to apples, get best practices, etc."
Working out a sound statistical measurement is not always an easy undertaking. It requires careful definitions and agreement on basic terms and procedures. At its first meeting, the consortium decided to focus on four areas -- fire, police, neighborhood services (such as garbage collection) and support services (such as fleet maintenance) -- to keep its efforts concentrated. It established a Technical Advisory Committee for each area, composed of active practitioners. At one of the early meetings, Harry Hatry of the Urban Institute, a leading expert in measuring government performance, conducted training on performance measurement. Although this helped clarify the process and the goal, working out effective performance measurements has remained a tricky matter.
"It hasn't been very difficult to get the information," said Scott Bryant, director of strategic management for the city of Long Beach, Calif. "The difficulty has been in terms of how we define it and how some other city defines it. For example, you would think things like the number of square miles would be simple, yet when we tried to collect that information we had a difference. Do you count inland water? Some areas have a lot of it and should count it."
Bryant found the issue of cost the most difficult, because each city seemed to track costs differently and use different definitions. But similar problems surfaced when trying to measure