Nextdoor Uses Algorithms, Outreach to Curb Racial Profiling

In a beta test, the algorithmic changes reduced racial profiling incidents by more than 75 percent.

Imagine inviting a few friends over only to learn afterward that a neighbor has reported them as possible car thieves, drug dealers or gang members. How would you feel? How would your friends feel? How would people just passing through the neighborhood and being mistaken for them feel? Now imagine if the accusations weren’t grounded in fact but were instead assumptions based on race.

For the last year and a half, neighborhood social media platform Nextdoor has wrestled with these racial profiling issues and has been working to rein in such incidents.

Nextdoor’s co-founders — Nirav Tolia, Sarah Leary, Prakash Janakiraman and David Wiesen — launched the platform in 2010 as a way for neighbors to privately socialize and offer advice to one another. Today, the platform has spread to more than 110,000 neighborhoods and evolved into a hub for events, business recommendations and classifieds, and an app that many use as a neighborhood watch tool.

Tolia said the four shared a sense of shock and dismay when they first learned that their tools, intended to bring people together, were actually pulling people apart through racial profiling. Community groups in Oakland, Calif., brought the issue to their attention after a series of posts surfaced that identified potential perpetrators only by ethnicity and skin color.
 
“These comments typically took the path of, ‘Dark-skinned man breaking into car,’ or, ‘Hispanic male walking around the neighborhood suspiciously.’ Things like that,” Tolia said. “It's not what we'd like to refer to as explicit racism, but as a kind of implicit bias — or unconscious racism — that ultimately can be extremely harmful and divisive since it's profiling an entire class of people.”

To combat the trend, the company launched a major update on Aug. 24 that changes Nextdoor’s algorithm so that neighbors who use race as an identifier must provide contextual information. An online form prompts for these details, such as a suspect’s age, clothing, body type and other physical and behavioral attributes. Before the update, users wrote their alerts in a blank text box.
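Nextdoor has not published its implementation, but the gating logic Tolia describes might look something like the following sketch. The keyword list, field names and two-descriptor threshold are all illustrative assumptions, not Nextdoor’s actual rules:

```python
# Hypothetical sketch of the validation the article describes: if a
# crime-and-safety report invokes race, require additional descriptors
# before the post can be published. All terms, fields and thresholds
# below are assumptions for illustration.

RACE_TERMS = {"black", "white", "hispanic", "asian", "dark-skinned", "light-skinned"}
CONTEXT_FIELDS = ("age", "clothing", "body_type", "hair", "behavior")
MIN_DESCRIPTORS = 2  # assumed threshold

def mentions_race(text: str) -> bool:
    """Naive keyword check, standing in for a real classifier."""
    words = text.lower().split()
    return any(term in words for term in RACE_TERMS)

def validate_report(description: str, form: dict) -> tuple[bool, str]:
    """Return (ok, message) for a proposed crime-and-safety post."""
    if not mentions_race(description):
        return True, "Post accepted."
    provided = [f for f in CONTEXT_FIELDS if form.get(f)]
    if len(provided) < MIN_DESCRIPTORS:
        missing = [f for f in CONTEXT_FIELDS if not form.get(f)]
        return False, (
            "Race alone is not a useful description. Please add at least "
            f"{MIN_DESCRIPTORS - len(provided)} more detail(s): {', '.join(missing)}."
        )
    return True, "Post accepted."

if __name__ == "__main__":
    ok, msg = validate_report("Dark-skinned man breaking into car", {"clothing": ""})
    print(ok, msg)  # False -> prompts the user for age, clothing, etc.
```

In a flow like this, the blank text box is replaced by structured fields, so the prompt for more detail happens before the post is ever published.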

In a beta test, Tolia said the alterations have delivered impressive results, reducing racial profiling incidents by more than 75 percent.

“If you choose to invoke race, then we are going to require additional information, we're going to reduce the affordance if you will," he said, "and we feel that not only will we cut down on racial profiling, but that this will result in better content.”

These richer details can also be shared with police departments, many of which have partnered with Nextdoor, and the update adds accountability through a system for flagging racially charged wording. Nextdoor and the company’s neighborhood leaders now receive notifications about flagged posts, and if a post is especially disparaging, Nextdoor can cancel the user’s account. Further efforts are in the works to conduct educational outreach in Oakland and to distill that experience into an online guide of best practices for national publication.
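The article doesn’t detail how the flagging system works internally. A minimal sketch of the escalation flow it describes, with every name, severity level and rule assumed purely for illustration, might look like this:

```python
# Illustrative escalation flow: a flagged post notifies both neighborhood
# leaders and company staff, and an especially disparaging post can lead
# to account cancellation. Nothing here reflects Nextdoor's actual code.

from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    MILD = 1
    DISPARAGING = 2
    EGREGIOUS = 3

@dataclass
class FlaggedPost:
    author: str
    text: str
    severity: Severity

@dataclass
class ModerationQueue:
    notifications: list = field(default_factory=list)
    cancelled_accounts: set = field(default_factory=set)

    def handle(self, post: FlaggedPost) -> None:
        # Both neighborhood leaders and company staff are notified.
        self.notifications.append(("neighborhood_leaders", post.author))
        self.notifications.append(("nextdoor_staff", post.author))
        # Especially disparaging posts can cost the author their account.
        if post.severity is Severity.EGREGIOUS:
            self.cancelled_accounts.add(post.author)

queue = ModerationQueue()
queue.handle(FlaggedPost("user123", "...", Severity.EGREGIOUS))
print(queue.cancelled_accounts)  # {'user123'}
```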

The measures appear reasonable and, in a functional sense, highly pragmatic. Yet Tolia stressed that arriving at them was anything but simple. The final decision demanded copious contemplation and input from multiple groups: the affected Oakland community groups, the city of Oakland, the Oakland Police Department, outside consultants, the Justice Department and the American Civil Liberties Union.

Reflecting on the journey, Tolia said the problem was, and still is, incredibly poignant. Unlike typical startup hurdles, this wasn’t about unlocking venture capital or a question of usability and engineering. It wasn’t even tangentially related to Nextdoor’s value proposition or business plan. This, Tolia said, was a purely social issue, one that is centuries old and painfully destructive, and one that more than a few private-sector companies would likely have deferred back to their user base.

Tolia said it’s one thing for a company to be ideologically opposed to a harmful societal problem, but it’s another thing entirely to fight it within a company’s services and products.

“No tech company is supportive of things like profiling, discrimination or harassment, but taking the next step beyond and creating change in the product, that's more expensive, it takes more time, it's more difficult, and as a result is more unusual,” he said. “For us it wasn't a business issue, it was more of a moral obligation.”

Nextdoor isn’t the only tech company to struggle with racism. Studies of home-sharing platform Airbnb have shown evidence of racial discrimination: hosts, who rent out their homes for a fee, are less likely to rent rooms to minority customers, and conversely, Airbnb customers are less likely to reserve rooms from minority hosts. Along similar lines, hate speech and racism on Twitter and Facebook have become so common that the two companies have had to create their own guidelines and policies to strike down damaging posts.

Might this shared challenge prompt a shared solution? Tolia said he is optimistic yet skeptical when it comes to technical remedies.

“We are supportive of all companies that have these kinds of challenges, because again, we think this is a societal ill and everyone needs to come together to make a difference,” he said. “But every platform is so different that the kind of racial profiling we see on Nextdoor is, I think, very different than what you see on Airbnb.”

Despite the social media company’s strong beta test results and a successful official launch, Nextdoor is not celebrating the update as any sort of victory. For Tolia, it’s more of a milestone, something to iterate upon, and definitely just a beginning when considering the cultural complexities that prompt bigotry and discrimination.

“The bad news is that unconscious bias is incredibly difficult to change, because of course it’s unconscious,” he said. “The good news is that people are not intrinsically trying to inflict harm on their neighbors, they just don't know better. And when it's a question of education versus ideology, I think that gives us hope.”

Jason Shueh is a former staff writer for Government Technology magazine.