The social network is removing 5,000 options that regulators say enable advertisers to discriminate.
Facebook’s move to eliminate 5,000 options that enable advertisers on its platform to limit their audiences is unrelated to lawsuits accusing it of fostering housing and employment discrimination, the company said Wednesday.
“We’ve been building these tools for a long time and collecting input from different outside groups,” Facebook spokesman Joe Osborne told ProPublica.
Tuesday’s blog post announcing the elimination of categories that the company has described as “sensitive personal attributes” came four days after the Department of Justice joined a lawsuit brought by fair housing groups against Facebook in federal court in New York City. The suit contends that advertisers could use Facebook’s options to prevent racial and religious minorities and other protected groups from seeing housing ads.
Raising the prospect of tighter regulation, the Justice Department said that the Communications Decency Act of 1996, which gives immunity to internet companies from liability for content on their platforms, did not apply to Facebook’s advertising portal. Facebook has repeatedly cited the act in legal proceedings in claiming immunity from anti-discrimination law. Congress restricted the law’s scope in March by making internet companies more liable for ads and posts related to child sex-trafficking.
Around the same time the Justice Department intervened in the lawsuit, the Department of Housing and Urban Development filed a formal complaint against Facebook, signaling that it had found enough evidence during an initial investigation to raise the possibility of legal action against the social media giant for housing discrimination. Facebook has said that its policies strictly prohibit discrimination, that over the past year it has strengthened its systems to protect against misuse, and that it will work with HUD to address the concerns.
“The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse,” Anna María Farías, HUD’s assistant secretary for fair housing and equal opportunity, said in a statement accompanying the complaint. “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”
Regulators in at least one state are also scrutinizing Facebook. Last month, the state of Washington imposed legally binding compliance requirements on the company, barring it from offering advertisers the option of excluding protected groups from seeing ads about housing, credit, employment, insurance or “public accommodations of any kind.”
Advertising is the primary source of revenue for the social media giant, which is under siege on several fronts. A recent study and media coverage have highlighted how hate speech and false rumors on Facebook have spurred anti-refugee discrimination in Germany and violence against minority ethnic groups such as the Rohingya in Myanmar. This week, Facebook said it had found evidence of Russian and Iranian efforts to influence elections in the U.S. and around the world through fake accounts and targeted advertising. It also said it had suspended more than 400 apps “due to concerns around the developers who built them or how the information people chose to share with the app may have been used.”
Facebook declined to identify most of the 5,000 options being removed, saying that the information might help bad actors game the system. It did say that the categories could enable advertisers to exclude racial and religious minorities, and it provided four examples of deleted categories: “Native American culture,” “Passover,” “Evangelicalism” and “Buddhism.” It said the changes will be completed next month.
According to Facebook, these categories have not been widely used by advertisers to discriminate, and their removal is intended to be proactive. In some cases, advertisers legitimately use these categories to reach key audiences. According to targeting data from ads submitted to ProPublica’s Political Ad Collector project, Jewish groups used the “Passover” category to promote Jewish cultural events, and the Michael J. Fox Foundation used it to find people of Ashkenazi Jewish ancestry for medical research on Parkinson’s disease.
Facebook is not limiting advertisers’ options for narrowing audiences by age or sex. The company has defended age-based targeting in employment ads as beneficial for employers and job seekers. Advertisers may also still target or exclude audiences by ZIP code, a practice that critics have described as “digital red-lining” but that Facebook says is standard in the industry.
A pending suit in federal court in San Francisco alleges that, by allowing employers to target audiences by age, Facebook is enabling employment discrimination against older job applicants. Peter Romer-Friedman, a lawyer representing the plaintiffs in that case, said that Facebook’s removal of the 5,000 options “is a modest step in the right direction.” But allowing employers to sift job seekers by age, he added, “shows what Facebook cares about: its bottom line. There is real money in age-restricted discrimination.”
Senators Bob Casey of Pennsylvania and Susan Collins of Maine have asked Facebook for more information on what steps it is taking to prevent age discrimination on the site.
The issue of discriminatory advertising on Facebook arose in October 2016 when ProPublica revealed that advertisers on the platform could narrow their audiences by excluding so-called “ethnic affinity” categories such as African-Americans and Spanish-speaking Hispanics. At the time, Facebook promised to build a system to flag and reject such ads. However, a year later, we bought dozens of rental housing ads that excluded protected categories. They were approved within seconds. So were ads that excluded older job seekers, as well as ads aimed at anti-Semitic categories such as “Jew hater.”
The removal of the 5,000 options isn’t Facebook’s first change to its advertising portal in response to such criticism. Last November, it added a self-certification option, which asks housing advertisers to check a box agreeing that their advertisement is not discriminatory. The company also plans to require advertisers to read educational material on the site about ethical practices.
This story was originally published by ProPublica.