103 Meredith Whittaker: Technology acts political and hides behind objectivity
Description
With us today is Meredith Whittaker, president of the Signal Foundation who serves on its board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute.
She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google’s Open Research group and co-founded the M-Lab. In 2018, she was a core organizer of the Google Walkouts and resigned from the company in July 2019. She now runs Signal, the leading global privacy-oriented NGO.
Transcript of the episode:
00:00:06 Domen Savič / Citizen D
Welcome everybody, it’s the 10th of September 2024, but we are releasing this episode of Citizen D podcast on the 15th of October 2024. With us today is Meredith Whittaker, president of the Signal Foundation who serves on its board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute. She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google’s Open Research group and co-founded the M-Lab. In 2018, she was a core organizer of the Google Walkouts and resigned from the company in July 2019. Welcome to the show, Meredith!
00:00:55 Meredith Whittaker / Signal
Thank you so much for having me, it’s great to be here.
00:00:58 Domen Savič / Citizen D
Let’s start at the beginning. I’m curious to hear your thoughts on the consequences of the whole Google walkout situation. Back then, it seems to me that it was your personal convictions that started everything, and I want to hear your thoughts on the balance between personal involvement, personal responsibility and the push for systemic change in a particular area. Does one happen without the other?
00:01:32 Meredith Whittaker / Signal
That’s a big question and I’m not sure I have an easy one-size-fits-all answer. Ultimately, we’re all individuals and when we act, we need to act on our own volition. We need to recognize what we believe and take it seriously and be accountable to our analysis.
But I wouldn’t actually say that the walkout was the very beginning for me, the walkout was a culmination of a lot of work, a lot of thinking, a lot of conversations that I’d had over more than a decade. And the walkout also wasn’t just me. It was thousands and thousands of people. It was apparently the biggest labor action that has happened in tech, with 20,000 people leaving work in protest, you know, against the unethical business conduct at Google and against a culture that persistently valued some people more than others and developed products that often caused serious risk for those who were devalued, so to speak, due to that culture and those design decisions.
I think the walkout was one way in which, throughout my career, in many, many ways, I have endeavored to be accountable to my analysis. I have endeavored to do what I can to change things when I saw them going in a bad direction, but I had worked for many years and in many different ways from the inside, trying to influence, trying to shape policy, and many of these things I still do… So again, I think the walkout wasn’t the beginning. It was one manifestation of a theory of change that looked to collective action from below to remedy some of the dangers and harms of the concentrated tech business model.
00:03:52 Domen Savič / Citizen D
I was wondering because it’s an example of this personal engagement that is usually pitted against “please don’t regulate us” and “the industry is capable of regulating itself”.
So, with these types of personal engagements and campaigns from people who are working from within the industry… do you think that’s enough to resolve these issues revolving around privacy, security, surveillance and other related topics?
00:04:42 Meredith Whittaker / Signal
I don’t think there is one trick we’re going to find that solves these issues at once. For all that, labor organizing and the structural leverage that employees/workers have relative to the institutions and people who employ them is a well-known lever. Before the 1980s, before the hegemonic ascent of neoliberal ideology, it was very well accepted, on the right and in the center.
Conservatives, liberals, leftists… recognized labor power as a simple structural check on toxic capitalism: workers having some say in what they work on and how. I don’t know that this is individual so much as going back to some of the basics and recognizing that we have an industry that is making some decisions and putting revenue and growth above the common good in ways that could be really, really dangerous given the power and information possessed by this industry.
00:06:11 Domen Savič / Citizen D
And speaking of power and controlling power, you’re now running Signal, which is, for privacy activists and journalists and many others across the globe, the app or the service to use if somebody wants to protect their privacy and have a decent level of security. I would like to know: how do you generate trust in the app, in the system, in the symbol that Signal currently represents in terms of privacy, security, and other protections of human rights, especially in a time when distrust in institutions, in each other, is ultimately growing?
00:07:15 Meredith Whittaker / Signal
Well, I think… look, trust is earned. It’s slow to earn. It is fast to erode, and it reflects a pattern of behavior and trustworthiness over time. So in fact, we don’t ask people to trust us. We endeavor to be as trustworthy as possible, to be as open as possible, to develop our code in the open, open source, to make our encryption protocols open, to allow people to vet and scrutinize the math and the implementation, and to ensure that, insofar as possible, we are never asking someone simply to blindly trust us because they like what I say or they like the way Signal looks or operates.
We are asking people to validate our claims, and we are making, insofar as possible, everything available for them to do that. I think that is why Signal is so trusted: because in fact we are going above and beyond to be trustworthy in a way that most actors in the ecosystem can’t or are unwilling to, for a number of reasons.
00:08:36 Domen Savič / Citizen D
An easy follow-up question: how hard is it? I mean, how hard is it to go above and beyond and do all the things you just said Signal is doing to be trustworthy?
00:08:55 Meredith Whittaker / Signal
Well, it is difficult, and it’s difficult for a couple of reasons. First, the tech ecosystem, the tech industry as it exists now, as it’s been built since the 1990s and before, is structured around making money off data collection. It’s structured around monetizing surveillance. That’s the business model. So, you collect data, you use that data to target advertisements, or you use that data to train an AI model… and the assumption is built into all of the tools we use, the narratives we think with, the libraries that we can access: that we would want to collect as much data as possible.
And so, it is difficult to do the opposite. We actually end up having to rewrite parts of the stack, so to speak, in order to enable privacy, in order to reject data collection as a norm. So that is difficult because we are swimming upstream against a massive current in a trillion-dollar industry, where privacy has not been something that was prioritized and trust around privacy has certainly not been part of the business model. Related to that, it’s also difficult because there isn’t a business model for privacy at this point in the tech industry, and this is one of the huge harms that we are grappling with.
The profit motive is oppositional to privacy; data collection is oppositional to privacy. So it’s difficult from that perspective in that we have to really think about our structure and protect ourselves from the imperatives of profit and growth, not necessarily because they’re bad in and of themselves, but because following those imperatives would at this point lead us down a path toward surveillance, toward data collection.
So, this is why Signal is structured as a non-profit. This is why we really go out of our way to take the incentives for surveillance off the table when it comes to Signal. We’re structured for success in the long term, so we stay laser-focused on our mission.
00:11:29 Domen Savič / Citizen D
And speaking of the economy, or surveillance capitalism… that’s one part of the equation, right? The industry is focusing on gathering information, repackaging it and reselling it, or selling it to the highest bidder. On the other hand, you have politicians who are worried about privacy eroding security.
So, is it hard for you, for Signal to argue for privacy when faced with a fake dilemma of choice between privacy and security?
00:12:17 Meredith Whittaker / Signal