Amid Protests, IBM Stops Selling Facial Recognition Software

Following in Axon’s footsteps, the computer giant has vowed to drop facial recognition development and offered to work with Congress on technology policies to reduce racial bias in law enforcement.

One of the world’s largest computer companies is taking a stand against facial recognition.

IBM sent a letter to Congress June 8 announcing that it no longer sells general-purpose facial recognition or analysis software, and offering to work with Congress in three policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities.

The letter — specifically addressed to Sens. Cory Booker and Kamala Harris and Reps. Karen Bass, Hakeem Jeffries and Jerry Nadler — is a statement of solidarity with the causes of justice and racial equity. IBM was one of many multinational corporations to issue such a statement following nationwide public protests in recent weeks, but it didn’t stop with words of support. The letter warns about facial recognition’s potential for abuse and recommends a national conversation about it.

“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and principles of trust and transparency,” wrote Arvind Krishna, IBM's first CEO of color. “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”

Expounding upon the category of “responsible use of technology” as it relates to public policy, IBM’s letter mentions two other potential issues. On the positive side, it says body cameras and modern data analytics techniques could improve transparency and accountability in police work. On the negative side, it says that artificial intelligence, while useful, can reinforce biases if it’s not properly tested, audited and reported. The idea is that algorithms and machine learning could produce biased conclusions if they’re based on data accumulated through decades of racial profiling and unfair policing. A task force at the University of Pittsburgh's Institute for Cyber Law, Policy, and Security is studying this now.

IBM’s hard line on facial recognition appears to be a change of tune since November, when the company argued in a blog post for what it called a “precision regulation” approach.

“Instead of simply banning an entire category of technologies with so many possible applications, including many that are helpful and benign, policymakers should employ precision regulation that applies restrictions and oversight to particular use-cases and end-users where there is greater risk of societal harm,” read a post on IBM’s Policy Lab blog last year. “For example, recent municipal bans on the use of facial recognition technology by government may cut consumers off from a convenience that could make one aspect of air travel a little less frustrating or aid first responders in rapidly identifying victims of a natural disaster. It simply does not make sense to subject a smartphone and a police body camera to the same regulatory treatment.”

Whether IBM’s new disavowal of facial recognition software includes all possible uses, meaning it’s walking back last year’s statement, is unclear. IBM did not answer requests for comment before deadline, but a spokesperson told CNN Business that the company will limit its visual technology to "visual object detection," for example to help manufacturing facilities or assist farmers with crop care. The Verge also reported being told by IBM that it would no longer develop or research facial recognition or analysis technology.

Hard line or not, IBM’s concern about facial recognition is not unique. IBM follows in the footsteps of Axon, the nation’s top body camera provider, which announced a year ago that it would no longer use face recognition technology on its body cameras or make face-matching technology for the foreseeable future. Anticipating potential problems with privacy and surveillance, California passed a law in October forbidding the state’s law enforcement agencies from using facial recognition software in body cameras until at least 2023.

Besides weighing in on these specific technologies, IBM’s letter expresses support for several non-technical reforms. These include the creation of a federal registry of police misconduct, a requirement for better state reporting on deadly use of force, and anti-profiling measures.

The letter ends by asking Congress to consider training and education programs to help communities of color. Specifically, it suggests expanding eligibility for Pell Grants and growing IBM’s P-TECH program, which helps students earn their high school diploma and an associate degree without incurring debt.