Why do so many companies that rely on monetizing their users’ data seem so interested in AI? If you ask Signal president Meredith Whittaker (and I did), she’ll tell you it’s simply because “AI is a surveillance technology.”
Onstage at TechCrunch Disrupt 2023, Whittaker made the point that AI is largely inseparable from the big data and targeting industry perpetuated by companies like Google and Meta, as well as less prominent but equally consequential enterprise and defense companies. (Her remarks have been lightly edited for clarity.)
“It requires the surveillance business model; it’s an exacerbation of what we’ve seen since the late 1990s and the development of surveillance advertising,” she said. “AI is a way, I think, to entrench and expand the surveillance business model. The Venn diagram is a circle.”
“And the use of AI is also surveillant, right?” she continued. “You know, you walk past a facial recognition camera that’s equipped with pseudoscientific emotion recognition, and it produces data about you, right or wrong, that says ‘you’re happy, you’re sad, you have a bad character, you’re a liar, whatever.’ These are ultimately surveillance systems being marketed to those who have power over us generally, our employers, governments, border control and so on, to make determinations and predictions that will shape our access to resources and opportunities.”
Ironically, she noted, the data underpinning these systems is often organized and annotated (a necessary step in the process of assembling an AI data set) by the workers who could be targeted.
“There’s no way to make these systems without human labor at the level of informing the ground truth of the data, via reinforcement learning with human feedback, which again is just tech-washing precarious human labor,” she explained. “It’s thousands and thousands of workers paid very little, though en masse it’s very expensive, and there’s no other way to create these systems. In some sense what we’re seeing is a kind of Wizard of Oz phenomenon, where when you pull back the curtain there’s not that much intelligence there.”
Not every AI or ML system is equally exploitative, though. When I asked whether Signal uses any AI tools or processes in its app or development work, she confirmed that the app has “a small on-device model that we didn’t develop; we use it off the shelf, as part of the face blur feature in our media editing toolset. It’s not actually very good… but it helps detect faces in crowd photos and blur them, so that when you share them on social media you’re not revealing people’s intimate biometric data to, say, Clearview.”
“But here’s the thing. Like… sure, that’s a great use of AI, and doesn’t that just absolve us of all the negativity I was throwing around onstage,” she added. “Sure, if that were the only market for facial recognition… but let’s be clear. The economic incentives that drive the expensive process of developing and deploying facial recognition technology would never allow that to be its only use.”