Rite Aid has been banned from using facial recognition software for five years, after the Federal Trade Commission (FTC) found that the US pharmacy giant’s “reckless use of facial surveillance systems” humiliated customers and put their “sensitive information at risk”.
The FTC’s order, which is subject to U.S. Bankruptcy Court approval after Rite Aid filed for Chapter 11 bankruptcy protection in October, also instructs Rite Aid to delete any photos it collected as part of the rollout of its facial recognition system, as well as any products created from those photos. The company must also implement a robust data security program to protect any personal data it collects.
A Reuters report from 2020 detailed how the drugstore chain secretly introduced facial recognition systems across about 200 US stores over an eight-year period starting in 2012, with “largely low-income, non-white neighborhoods” serving as a testbed for the technology.
With the FTC’s growing focus on the misuse of biometric surveillance, Rite Aid fell squarely into the agency’s crosshairs. Among the FTC’s allegations is that Rite Aid, in partnership with two contracted companies, created a “watchlist database” containing images of customers who the company said had engaged in criminal activity at one of its stores. These images, which were often of poor quality, were captured from surveillance cameras or employees’ mobile phone cameras.
When a customer who supposedly matched an image in the database entered a store, employees received an automatic alert instructing them to take action — most often an instruction to “approach and identify,” meaning verifying the customer’s identity and asking them to leave. Often, these “matches” were false positives that led employees to incorrectly accuse customers of wrongdoing, resulting in “embarrassment, harassment, and other harm,” according to the FTC.
“Employees, acting on false positive alerts, followed consumers around its stores, searched them, ordered them to leave, called police to confront or remove consumers, and publicly accused them, sometimes in front of friends or family, of shoplifting or other violations,” the complaint reads.
Additionally, the FTC said Rite Aid failed to inform customers that facial recognition technology was in use, while specifically instructing employees not to disclose this information to customers.
Face-off
Facial recognition software has emerged as one of the most controversial aspects of the age of AI-powered surveillance. In the past few years, we’ve seen cities issue expanded bans on this technology, while politicians have struggled to regulate how police use it. Meanwhile, companies like Clearview AI have been hit with lawsuits and fines around the world over major data privacy breaches around facial recognition technology.
The FTC’s latest findings regarding Rite Aid also highlight the biases inherent in AI systems. For example, the FTC says Rite Aid failed to mitigate risks to certain consumers because of their race — its technology was “more likely to generate false positives in stores located in plurality-Black and Asian communities than in plurality-White communities,” the findings note.
Additionally, the FTC said that Rite Aid failed to test or measure the accuracy of its facial recognition system before or after deployment.
In a press release, Rite Aid said it was “pleased to reach an agreement with the FTC,” but disagreed with the substance of the allegations.
“The allegations relate to a facial recognition technology pilot program that the company deployed in a limited number of stores,” Rite Aid said in its statement. “Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC investigation began regarding the company’s use of the technology.”