New Orleans’ Fight with Blight Started with Data

With more than 40,000 dilapidated properties throughout the city in 2010, officials turned to data science to remedy the problem.

by William D. Eggers, Governing / October 25, 2017

Back in 2010, five years after Hurricane Katrina, blight was still a seemingly intractable problem in New Orleans. With an estimated 43,755 dilapidated properties and overgrown lots, the city was suffering from one of the worst rates of blight in America.

This prompted Mayor Mitch Landrieu to set an audacious goal of cutting blight by 10,000 units by 2014. The city achieved that goal a full year early and now is down to fewer than 28,000 blighted properties, according to city officials. One crucial tool in this effort was BlightStat, an analytics program that uses data from the Department of Code Enforcement and other agencies, presented and discussed in monthly public meetings, to identify solutions, set priorities and evaluate performance in the city's campaign to get troubled properties under control.

New Orleans is one of the leaders in the use of data analytics to transform how city hall works. The city demonstrates how a government can use data to achieve striking results, even with tight resources.

When New Orleans started using BlightStat, for example, the rate of property inspections increased fivefold in just 10 weeks, thanks to the knowledge the city extracted from the data, says Oliver Wise, director of the city's Office of Performance and Accountability (OPA). "We saw eye-popping returns from simply shining some light on a service area where there previously had not been light."

Wise's office runs the city's data analytics initiatives. Along with BlightStat, they include ResultsNOLA, which evaluates the performance of city departments, and NOLAlytics, which helps those departments conduct their own analytics projects to support their missions. NOLAlytics functions almost like an in-house analytics consulting unit for city agencies. "You're there to provide those departments with some edge so that they can work smarter, not necessarily harder," explains Wise.

OPA operates with a surprisingly small staff, and some staffers come to the office without any prior training in data science. "We've never been bigger than six," Wise says. "We spend a lot of time here learning." A staff member might spend the morning reading up on machine learning and then spend the afternoon applying that new knowledge -- for example, to help Code Enforcement make faster, better decisions on what to do with blighted properties.

One of the big challenges New Orleans faces in its war on blight is how to set priorities for the dozen or so inspectors and eight researchers who identify rundown properties, contact owners about code violations, help the city win judgments against uncooperative owners, and determine how to proceed from there -- whether to levy fines, order a demolition, force a sale or take some other action.

BlightStat helps Code Enforcement identify which properties, out of the 4,000 new cases that come into its pipeline each year, are most likely to make their way back to compliance. It considers variables such as the condition of the roof and foundation, the owner's history of tax payment, and the market for real estate in that neighborhood. "We utilize everything we can to try to predict the cases that will have the best outcomes in the real world," explains Chad Dyer, the city's director of code enforcement. Armed with this knowledge, Code Enforcement decides how best to deploy its resources.
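The article does not disclose how the city's model actually works; the sketch below is purely illustrative, showing how the kinds of variables mentioned above (roof and foundation condition, tax history, neighborhood market strength) could feed a simple compliance-likelihood score used to rank the yearly intake. All field names, weights, and data are hypothetical.

```python
# Hypothetical prioritization sketch -- not the city's actual model.

def compliance_score(case):
    """Higher score = owner judged more likely to return the property to compliance."""
    score = 0.0
    score += 0.4 if case["roof_sound"] else 0.0
    score += 0.3 if case["foundation_sound"] else 0.0
    score += 0.2 if case["taxes_current"] else 0.0
    score += 0.1 * case["neighborhood_market_index"]  # 0.0 (weak market) to 1.0 (strong)
    return score

def prioritize(cases):
    """Sort incoming cases so the most promising ones are worked first."""
    return sorted(cases, key=compliance_score, reverse=True)

cases = [
    {"id": "A", "roof_sound": False, "foundation_sound": False,
     "taxes_current": False, "neighborhood_market_index": 0.2},
    {"id": "B", "roof_sound": True, "foundation_sound": True,
     "taxes_current": True, "neighborhood_market_index": 0.8},
    {"id": "C", "roof_sound": True, "foundation_sound": False,
     "taxes_current": True, "neighborhood_market_index": 0.5},
]

ranked = prioritize(cases)
print([c["id"] for c in ranked])  # best-prospect case first
```

The mechanism, not the weights, is the point: a transparent score lets a small inspection staff spend its limited visits where they are most likely to pay off.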

Getting to the point of having this power of prediction required coordination between various enforcement departments and the redevelopment and housing agencies so that they could work strategically. Critically, Wise's team had to consolidate data from more than 10 systems into one. "It was a big driver," says Dyer.
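At its core, that consolidation job is a merge keyed by property. A minimal sketch, with invented system names and fields, might look like this:

```python
# Illustrative only: fold records from several departmental systems into one
# view per property. Real municipal integration involves messier keys and
# data cleaning; this shows just the merge step.

code_enforcement = {"123-main-st": {"open_violations": 3}}
tax_records      = {"123-main-st": {"taxes_current": False}}
permits          = {"123-main-st": {"demolition_permit": False}}

def consolidate(*systems):
    """Merge any number of {property_id: fields} mappings into one."""
    merged = {}
    for system in systems:
        for prop_id, fields in system.items():
            merged.setdefault(prop_id, {}).update(fields)
    return merged

combined = consolidate(code_enforcement, tax_records, permits)
print(combined["123-main-st"])
```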

To encourage all the city's departments to apply data analytics, OPA has developed a framework that outlines six types of analytics projects: "finding the needle in a haystack"; prioritizing work for impact; early warning tools; better, quicker decisions; optimizing resource allocation; and experimenting for what works.

An initial call for proposals drew 20 prospective projects. "Then a lot of other opportunities came up that we would have had no idea about, but for those people raising their hands and identifying opportunity," Wise says. From this long list, OPA chooses projects that align best with the mayor's strategic priorities. "We can also pick those projects that will create some spillover effect in better developing the data capacity of the city or providing us with some new capacity that we can use elsewhere."

In one successful project in the "needle in a haystack" category, OPA developed a predictive model that identified which parts of the city were most at risk for fires and fire fatalities. The city used that information to target its campaign to distribute smoke alarms to vulnerable households. Using analytics, it identified twice as many households in need of smoke alarms as it had when the city chose households at random.
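The targeting idea can be sketched in a few lines. The risk scores and block data below are invented; the mechanism is what matters: visit the highest-predicted-risk areas first instead of sampling at random.

```python
# Hypothetical comparison of risk-based targeting vs. random selection
# for a smoke-alarm campaign. All data is made up for illustration.
import random

# block_id -> (predicted_fire_risk, household_actually_needs_alarm)
blocks = {
    "b1": (0.9, True), "b2": (0.8, True), "b3": (0.7, False),
    "b4": (0.3, False), "b5": (0.2, True), "b6": (0.1, False),
}

def targeted(k):
    """Visit the k blocks with the highest predicted risk; count alarms needed."""
    top = sorted(blocks, key=lambda b: blocks[b][0], reverse=True)[:k]
    return sum(blocks[b][1] for b in top)

def random_pick(k, seed=0):
    """Baseline: visit k blocks chosen at random."""
    rng = random.Random(seed)
    return sum(blocks[b][1] for b in rng.sample(list(blocks), k))

print("households found, targeted:", targeted(3))
print("households found, random:  ", random_pick(3))
```

With the same number of visits, the targeted strategy concentrates effort where the model predicts need, which is how a fixed outreach budget can reach roughly twice as many at-risk households.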

Popular literature has created a mystique around analytics, Wise says. "People think of it as hyper-futuristic, with rooms full of giant monitors that predict what you're going to eat for breakfast the next morning." The reality is much simpler, he says: "Curious, smart, motivated people with a strong sense of civic obligation can pick up skills along the way and get going quickly to provide value in local government."

This article was originally published by Governing.