A Macy’s customer has filed a lawsuit alleging the retailer used facial recognition technology without first seeking customers’ consent, a potential violation of Illinois’ Biometric Information Privacy Act.
(TNS) — A Chicago woman has filed a lawsuit alleging Macy’s violated Illinois’ biometric privacy law by using video surveillance cameras and facial recognition technology on its customers.
The lawsuit, which is seeking class action status, was filed last week in Chicago federal court on behalf of Isela Carmean, a regular Macy’s customer whose image was allegedly identified, without her consent, through a facial recognition database, the complaint states.
The state’s Biometric Information Privacy Act requires companies to get permission before using technologies such as facial recognition to identify customers.
The suit alleges Macy’s is a client of technology startup Clearview AI, which has created facial recognition software that scrapes social media images to build a massive database capable of identifying people through photos.
The department store chain “sends or has sent” pictures of customers captured by store video surveillance to Clearview to identify them and obtain their personal information, the suit alleges.
The suit alleges Macy’s is “actively profiting” off information gleaned from the biometric data through improved security and marketing.
Macy’s said in an email Monday it does not comment on pending litigation.
On its website, Clearview AI promotes its platform as “a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.”
In February, BuzzFeed News reported it obtained leaked internal documents from Clearview that showed Macy’s, Best Buy, Kohl’s and Walmart were among more than 200 corporate clients under contract for facial recognition services.
Clearview, which is not named as a defendant in the lawsuit, did not respond to a request for comment.
The lawsuit alleges Macy’s has run the identities of more than 6,000 customers through the Clearview database. Carmean has “such a widespread and active social media presence” that her biometric identifiers and personal information are contained in that database, the lawsuit alleges.
Mike Drew, a Chicago attorney who represents Carmean, declined to comment on the lawsuit Monday beyond correcting his client’s name, which was misspelled in the complaint.
Biometric privacy has been a growing concern in the age of artificial intelligence and social media. In January, Facebook agreed to pay $550 million to Illinois users to settle a class-action lawsuit alleging its facial tagging feature violated their privacy rights. A federal judge upped the total settlement to $650 million in June.
In May, the American Civil Liberties Union filed a lawsuit against Clearview in Cook County Circuit Court alleging the facial recognition startup violated the state’s biometric privacy law and embodied a privacy “nightmare” by capturing “untold quantities” of user photos and data from the Internet without consent.
Adopted in 2008, the Illinois biometric privacy law requires companies to get written permission to collect and use biometric information, and to publish a written policy establishing a retention schedule for that information.
The lawsuit against Macy’s is seeking $1,000 for each member of the proposed class for every negligent violation of the biometric privacy act and $5,000 for every intentional violation by the department store chain, as well as punitive damages and other costs.
It is also asking the court to order Macy’s to delete the personal information of class members from its database, and to stop its alleged practice of using surveillance photos to get a positive identification of customers through Clearview.
On July 22, Macy’s made it mandatory that every customer wear a mask while shopping in its stores to help prevent the spread of COVID-19. The requirement may provide some inadvertent relief to those with privacy concerns.
A preliminary study published July 27 by the National Institute of Standards and Technology, a government agency, found that commercial facial recognition algorithms failed to accurately identify people up to 50% of the time when they were wearing masks.
“None of these algorithms were designed to handle face masks,” said Mei Ngan, an agency computer scientist and an author of the report.
©2020 the Chicago Tribune, Distributed by Tribune Content Agency, LLC.