Computer Says Maybe


Author: Alix Dunn


Description

Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.
77 Episodes
Thought we were at peak scam? Well, ScamGPT just entered the chat.

More like this: Gotcha! The Crypto Grift w/ Mark Hays

This is part three of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week Lana Swartz and Alice Marwick join Alix to discuss their primer on how generative AI is automating fraud.

We dig into the very human, very dark world of the scam industry, where the scammers are often being exploited in highly sophisticated human trafficking operations — and are now using generative AI to scale up and speed up.

We talk about how you probably aren’t going to get a deepfake call from a family member demanding a ransom, but the threats are still evolving in ways that are scary and, until now, largely unregulated. And as ever, even though the problems are made worse by technology, we explore the limitations of technology and laws to stem the tide.

Further reading & resources:
Read the primer here!
More about Lana Swartz
More about Alice Marwick
New Money by Lana Swartz
Scam: Inside Southeast Asia's Cybercrime Compounds by Mark Bo, Ivan Franceschini, and Ling Li
Revealed: the huge growth of Myanmar scam centres that may hold 100,000 trafficked people
Al Jazeera True Crime Report on scamming farms in South East Asia
Scam Empire project by the Organised Crime and Corruption Reporting Project

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
This just in with data centers: energy grids are strained, water is scarce, utility costs are through the roof — ah well, let them eat compute, I guess!

More like this: AI Thirst in a Water-Scarce World w/ Julie McCarthy

It was just climate week in NYC and we did a live show on data centers with four amazing guests from around the US.

Thank you to the Luminate Foundation for sponsoring this live show, and to all of our NY-based friends and network from around the world who made it to Brooklyn for a magical evening. You can also watch the live recording on Youtube.

KeShaun Pearson (Memphis Community Against Pollution) will break down how Elon Musk’s xAI supercomputer is polluting the air of historically Black neighborhoods in Memphis, and how organizers are fighting back against yet another chapter of corporate extraction in their communities.

KD Minor (Alliance for Affordable Energy) will demystify the energy impacts of data centers in Louisiana and share organizing strategies to mobilize community opposition to Big Tech and Big Oil infrastructure.

Marisol (No Desert Data Center) will talk about their grassroots coalition’s recent win in Tucson to stop Amazon’s Project Blue data center proposal, which threatened the city’s scarce water supply, and how they’re organizing for future protections.

Amba Kak (AI Now Institute) will talk us through the bigger picture: what’s behind Big Tech’s AI data center expansion, who stands to benefit from this boom, and what we sacrifice in return.

Further reading & resources:
Amazon Web Services is the company behind Tucson’s Project Blue, according to a 2023 county memo — from Luminaria
Tucson to create new policies around NDAs following the council’s regret around not knowing more about Project Blue — from Luminaria
How Marana, also in the Tucson area, employed an ordinance to regulate water usage after learning about data center interest in the area
xAI has requested an additional 150MW of power for Colossus in Memphis, bringing it to a total of 300MW
Time reports on the increase in nitrogen dioxide pollution around Memphis due to xAI turbines
KeShaun and Justin Pearson on Democracy Now discussing xAI’s human rights violations
Meta’s Mega Data Center Could Strain Louisiana’s Grid — and Entergy Isn’t Prepared — report by the Alliance for Affordable Energy
'A Black Hole of Energy Use': Meta's Massive AI Data Center Is Stressing Out a Louisiana Community — 404 Media

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
OpenAI just secured a bizarre financial deal with Nvidia — but the math is not mathing. Is the AI sector an actual market, or a series of high-profile announcements of circular relationships between a tiny number of companies?

More like this: Making Myths to Make Money w/ AI Now

Alix sat down with Sarah Myers-West to go through the particulars of this deal, and other similar deals that are propping up AI’s industry of vapour. This is not your traditional bubble that’s about to burst — there is no bubble, it’s just that The New Normal is to pour debt into an industry that cannot promise any returns.

Further reading & resources:
More on the Nvidia OpenAI deal — CNBC
Analysts refer to the deal as ‘vendor financing’ — Insider Monkey
Spending on AI is at Epic Levels. Will it Ever Pay Off? — WSJ
OpenAI, Softbank, and Oracle spending $500bn on data centre expansion in Abilene — Reuters
How Larry Ellison used the AI boom and the Tony Blair Institute to bolster his wealth
Oracle is funding OpenAI data centers with heaps of debt and will have to borrow at least $25bn a year — The Register

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
Multi-level marketing schemes have built an empire by enticing people with promises of self-realisation and economic freedom. The cost is simple: exploit and be exploited.

More like this: Worker Power & Big Tech Bossmen w/ David Seligman

This is part two of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week Bridget Read came to Alix with a very exciting business opportunity. Bridget authored Little Bosses Everywhere — a book on the history of MLM.

We explore how door-to-door sales in the mid-20th-century US took on the business model of a Ponzi scheme, and transformed the sweaty salesman into an entrepreneurial recruiter with a downline.

MLM originators were part of a coordinated plan to challenge the New Deal in favour of radical free enterprise, where the only thing holding you back is yourself, and the economy consists solely of consumers selling to each other in a market of speculation. The secret is, no one is selling a product — they’re selling a way of life.

Further reading & resources:
Buy Bridget’s book: Little Bosses Everywhere: How the Pyramid Scheme Shaped America
Family Values by Melinda Cooper
The Missing Crypto Queen: a podcast by BBC Sounds about a large-scale crypto scam where there wasn’t even any crypto
LuLaRoe — the pyramid scheme that tricked American mums into selling cheap clothes to their friends and family with the promise of financial independence
My Experience of Being in a Pyramid Scheme (Amway) — a personal account by Darren Mudd on LinkedIn
Watch our recent live show at NYC Climate Week

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
Hey you! Do you want some free internet money? If this sounds too good to be true, that’s because it is!

More like this: Making Myths to Make Money w/ AI Now

This is Gotcha! — a four-part series on scams, how they work, and how technology is supercharging them. We start with Mark Hays from Americans for Financial Reform (AFR), and get into one of the biggest tech-fuelled financial scams out there: cryptocurrencies.

Like many things that require mass buy-in, crypto started with an ideology (libertarianism, people hating on Wall Street post-2008). But where does that leave us now? What has crypto morphed into since then, and how does it deceive both consumers and regulators into thinking it’s something that it’s not?

Further reading & resources:
Seeing Like a State by James C. Scott
Capital Without Borders by Brooke Harrington
The Politics of Bitcoin by David Golumbia
Learn more about Americans for Financial Reform
Check out Web3 Is Going Great by Molly White
Line Goes Up by Folding Ideas — an excellent survey of all the tactics and rug-pulls during the height of the NFT boom
The Missing Crypto Queen: a podcast by BBC Sounds about a large-scale crypto scam where there wasn’t even any crypto

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
Gotcha!


2025-09-18 01:28

Gotcha! is a four-part series on scams, how they work, and how technology is supercharging them — running through to October 10.

In the series we look at:

Crypto: Mark Hays on how a thing touting financial freedom ended up being a kind of fin-cult, rife with scamming
Multi-Level Marketing: Bridget Read on the history of the biggest and most successful type of scam that still plagues us today
Generative AI: Data & Society’s primer on how generative AI is juicing the scam industrial complex
Enshittification: Cory Doctorow on his upcoming book, and how the process of enshittification represents user-hostile practices that scam people into paying more, and ecosystem lock-in
What if you could listen to multiple people at once, and actually understand them?

More like this: The Age of Noise w/ Eryk Salvaggio

In our final instalment (for now!) of Nodestar, Andrew Trask shares his vision for a world where we can assemble understanding from data everywhere — but not in a way that requires corporate control of our world.

If broadcasting is the act of talking to multiple people at once — what about broad listening? Where you listen to multiple sources of information, and actually learn something, without trampling over the control that individuals have over who sees what, when.

Andrew says that broad listening is difficult to achieve because of three huge problems: information overload, privacy, and veracity — and we are outsourcing these problems to central authorities, who abuse their power in deciding how to relay information to the public. What is Andrew doing at OpenMined to remedy this? Building protocols that decentralise access to training data for model development, obviously.

Further reading & resources:
The Computer as a Communication Device by JCR Licklider and Robert W Taylor, 1968
World Brain by HG Wells
Learn more about OpenMined
We’re gonna be streaming LIVE at Climate Week — subscribe to our Youtube

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
Social media isn’t really social anymore. But that might be changing. Rudy Fraser over at Blacksky Algorithms has built something new: the infrastructure to provide a safe online space for the black community, and in the process he challenges the ideas of hierarchical, centralised networks. His work — even outside the very cool development of Blacksky — is an amazing, concrete example of how the abstract ambitions of decentralisation can provide real value for people, and sets us up for a new kind of tech politics.

More like this: How to (actually) Keep Kids Safe Online w/ Kate Sim

This is part two of Nodestar, our three-part series on decentralisation. Blacksky is a community built using the AT Protocol by Rudy Fraser. Rudy built this both out of a creative drive to make something new using protocol thinking, and out of frustration over a lack of safe community spaces for black folks where they could be themselves, and not have to experience anti-black racism or misogynoir as a price of entry.

Rudy and Alix discuss curation as moderation, the future of community stewardship, freeing ourselves from centralised content decision-making, how technology might connect with mutual aid, and the beauty of what he refers to as ‘dotted-line communities’.

Further reading:
Blacksky Algorithms
Blacksky the app — if you want an alternative to Bluesky
More about Rudy Fraser
Open Collective — a fiscal host for communities and non-profits
Paper Tree — community food bank
The Implicit Feudalism of Online Communities by Nathan Schneider
Flashes — a third-party Bluesky app for viewing photos
The Tyranny of Structurelessness by Joreen

Rudy is a technologist, community organizer, and founder of Blacksky Algorithms, where he builds decentralized social media infrastructure that prioritizes community-driven safety, data ownership, and interoperability. As a Fellow at the Applied Social Media Lab at Harvard’s Berkman Klein Center for Internet & Society, he advances research and development on technology that empowers marginalized communities, particularly Black users.
How did the internet become three companies in a trenchcoat? It wasn’t always that way! It used to be fun, and weird, and full of opportunity. To set the scene for the series, we spoke to a stalwart advocate of decentralisation, Mike Masnick.

More like this: Big Tech’s Bogus Vision for the Future w/ Paris Marx

This is part one of Nodestar, a three-part series on decentralisation: how the internet started as a wild west of decentralised exploration, got centralised into the hands of a small number of companies, and how the pendulum has begun its swing in the other direction.

In this episode Mike Masnick gives us a history of the early internet — starting with what was called the Eternal September, when millions of AOL users flooded the scene, creating a messy, unpredictable, exciting ecosystem of open protocols and terrible UIs.

Further reading & resources:
Protocols, Not Platforms by Mike Masnick
List of apps being built on AT Protocol
Graze — a service to help you make custom feeds with ads on AT Proto
Otherwise Objectionable — an eight-part podcast series on the history of Section 230
Techdirt podcast
CTRL-ALT-SPEECH podcast

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
Foxglove and Global Action Plan have just sued the UK government over its YOLO hyperscale data center plans.

More like this: Net0++: Data Centre Sprawl

Local government rejected the data center. But Starmer’s administration overruled them. They want to force the development of a water-guzzling, energy-draining data center on a local community that has said no. And all of this is on the green belt. The lawsuit filed this week might put a stop to those plans.

Alix sat down with Ollie Hayes from Global Action Plan and Martha Dark from Foxglove to discuss the legal challenge filed this week. Why now? Isn’t the UK aiming for Net 0? And how does this relate to the UK government’s wider approach to AI?

Further reading & resources:
Read the Guardian article about the suit
Read the Telegraph piece about the suit
Donate to the campaign
Data Centre Finder on Global Action Plan

Computer Says Maybe Shorts bring in experts to give their ten-minute take on recent news. If there’s ever a news story you think we should bring in expertise on for the show, please email pod@themaybe.org
What’s the deal with Silicon Valley selling imagined futures and never delivering on them? What are the consequences of an industry all-in on AI? What if we thought more deeply than just ‘more compute’?

More like this: Big Dirty Data Centres with Boxi Wu and Jenna Ruddock

This week, Paris Marx (host of Tech Won’t Save Us) joined Alix to chat about his recent work on hyperscale data centres, and his upcoming book on the subject.

We discuss everything from the US shooting itself in the foot with its lack of meaningful industrial policy, to how decades of lackluster political vision from governments created a vacuum that has now been filled with Silicon Valley's garbage ideas. And of course, how the US’s outsourcing of manufacturing to China has catalysed China’s domestic technological progress.

Further reading & resources:
Buy Road To Nowhere: What Silicon Valley Gets Wrong About the Future of Transportation by Paris Marx
Data Vampires — limited series on data centres by Tech Won’t Save Us
Apple in China by Patrick McGee

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
How do we yank power out of tech oligarch hands without handing it over to someone else?

More like this: Is Digitisation Killing Democracy? w/ Marietje Schaake

Cori Crider is a fearless litigator turned market-shaping advocate. She started litigating during many years at leading human rights organisation Reprieve, then moved on to co-founding Foxglove so she could sue big tech. Now she’s set her sights on market concentration.

Cori’s analysis concludes with a hopeful message: we are not stuck in place with eight dudes running the show. In fact, we’ve been here before. The computer age never would have happened the way it did if thousands of patents weren’t liberated from Bell Labs in 1956. How can we use similar tactics to dethrone monopolies, and think about how Europe and other large jurisdictions can decouple themselves from Silicon Valley infrastructure?

Further reading & resources:
Antitrust Policy for the Conservative by Mark Meador of the FTC
The Open Markets Institute
The Future of Tech Institute

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Do you have an idea for the show? Email pod@themaybe.org
Did you miss FAccT? We interviewed some of our favourite session organisers!

More like this: Part One of our FAccT roundup: Materiality and Militarisation

Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

In part two we look into how AI is used to misrepresent people through things like image generation, and even care labour. These are conversations about AI misrepresenting hidden identities, care work becoming data work, how pride and identity are tied to labour — and how labour organisers are building solidarity and movement around this.

Who features in this episode:

Priya Goswami brought a multimedia exhibition to FAccT: Digital Bharat. This explores the invisibilised care work and manual labour by women in India, and how their day-to-day has become mediated by digital public infrastructures.

Kimi Wenzel organised Invisible by Design? Generative AI and Mirrors of Misrepresentation, which invited users to confront generated images of themselves and discuss issues of representation within these systems.

Alex Hanna and Clarissa Redwine ran the AI Workers Inquiry, which brought people together to share how AI has transformed their work, identify common ground, and potentially begin building resistance.

Further reading & resources:
Circuit Breakers — tech worker conference organised by Clarissa Redwine
Kimi Wenzel’s research
Buy The AI Con by Alex Hanna and Emily Bender

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
In May, Grok couldn’t stop talking about white genocide. This injection of right-wing South African politics triggered a conversation with a Musk contemporary, Nic Dawes.

In this short, Nic shares his perspective on how post-apartheid white communities have dealt with apartheid’s end — and how Musk is basically seeking out an information environment that can recreate the apartheid information system: Grok is just an extension of a media ecosystem designed to soothe guilt and stoke resentment.

Computer Says Maybe Shorts cover recent news with an expert in our network. If there is a news story you want us to cover, please email pod@themaybe.org

Nic is Executive Director at THE CITY, a news outlet serving the people of New York through independent journalism that holds the powerful to account, deepens democratic participation, and helps make sense of the greatest city in the world. He has led news and human rights organizations on three continents, and was previously Deputy Executive Director of Human Rights Watch, Chief Content Officer of Hindustan Times in Delhi, and Editor-in-Chief of South Africa's Mail & Guardian newspaper.
Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

In part one we explore the depth of AI’s hidden material impacts, including its use in military applications and to aid genocide. One of our interviewees talked about why they spoke up at the town hall — questioning why FAccT, the biggest AI ethics conference there is, accepts sponsorship from those same military contractors.

Who we interviewed for Part One:

Charis Papaevangelou, who co-organised a CRAFT session called The Hidden Costs of Digital Sovereignty. Greece is trying to position itself as a central digital hub by building data centres and participating in the ‘fourth industrial revolution’ — but what does this actually mean for the people and infrastructure of Greece?

Georgia Panagiotidou ran a session on The Tools and Tactics for Supporting Agency in AI Environmental Action — offering some ideas on how the community can get together and meaningfully resist extractive practices.

David Widder discussed his workshop on Silicon Valley and The Pentagon, and his research on the recent history of the DoD funding academic papers — is it ever worth taking military money, even for basic research?

Tania Duarte offered something very different: a demonstration of two workshops she runs for marginalised groups, to better explain the true materiality of AI, and build knowledge that gives people more agency over the dominant narratives and framings in the industry.

Further reading & resources:
Recording of Charis’s CRAFT session: The Hidden Cost of Digital Sovereignty
Cloud hiding undersea: Cables & Data Centers in the Mediterranean crossroads by Theodora Kostaka
Basic Research, Lethal Effects: Military AI Research Funding as Enlistment, and Why ‘open’ AI systems are actually closed and why this matters — both by David Widder
The video that David quoted the Carnegie Mellon professor from — David was paraphrasing in the episode!
We and AI & Better Images of AI
More on Georgia Panagiotidou’s work and resources from her session

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
AI Now have just released their 2025 AI Landscape report — Artificial Power. Alix sat down with two of its authors, Amba Kak and Sarah Myers-West, for a light unpacking of the themes within.

This report isn’t a boring survey of what AI Now have been doing this year; it’s a comprehensive view of the state of AI, and the concentrated powers that prop it up. What are the latest AI-shaped solutions that the hype guys are trying to convince us are real? And how can we reclaim a positive agenda for innovation — and unstick ourselves from a path towards pseudo-religious AGI?

Further reading & resources:
Read the AI Now 2025 Landscape Report: Artificial Power

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Amba Kak has spent the last fifteen years designing and advocating for technology policy in the public interest, across government, industry, and civil society roles – and in many parts of the world. Amba brings this experience to her current role co-directing AI Now, a New York-based research institute where she leads on advancing diagnosis and actionable policy to tackle concerns with artificial intelligence and concentrated power. She has served as Senior Advisor on AI to the Federal Trade Commission and was recognized as one of TIME’s 100 Most Influential People in AI in 2024.

Sarah Myers-West has spent the last fifteen years interrogating the role of technology companies and their emergence as powerful political actors on the front lines of international governance. Sarah brings this depth of expertise to policymaking in her current role co-directing AI Now, with a focus on addressing the market incentives and infrastructures that shape tech’s role in society at large and ensuring it serves the interests of the public. Her forthcoming book, Tracing Code (University of California Press), draws on years of historical and social science research to examine the origins of data capitalism and commercial surveillance.
Felienne Hermans calls herself an ‘involuntary ethnographer of computer science’. She studies the culture behind programming, and challenges the dominant idea that learning to program has to be painful.

Alix and Felienne chat about the history of programming and how it went from multidisciplinary and inclusive to masochistic and exclusive. They also dig into all the ways it excludes women and people who do not speak English.

Further reading & resources:
Scratch — a high-level programming language aimed at kids
Hedy — the programming language that Felienne designed
Join in and help out with Hedy!
GenderMag by Margaret Burnett — how to ensure more gender inclusiveness in your software
Elm — an easy and kind browser-based programming language
A Case for Feminism in Programming Language Design by Felienne Hermans & Ari Schlesinger
A Framework for the Localization of Programming Languages by Felienne Hermans & Alaaeddin Swidan

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Felienne is the creator of the Hedy programming language, a gradual and multi-lingual programming language designed for teaching. She is the author of “The Programmer’s Brain”, a book that helps programmers understand how their brain works and how to use it more effectively. In 2021, Felienne was awarded the Dutch Prize for ICT research. She also has a weekly column on BNR, a Dutch radio station.
Smart people focused on technology politics issues get it. We trade high-level helpful concepts like surveillance capitalism, automated inequality, and enshittification. But even as some of these ideas make it into the mainstream, normies aren’t getting the message. We need stories for that. But how? How do we take the technical jargon and high-level concepts that dominate tech narratives and instead create stories that are personal, relatable, and powerful? And how do we combat the amazing hero-god narratives of Silicon Valley without reinforcing them?

Alix went to storytelling festival ZEG Fest in Tbilisi to chat with three amazing storytellers about that challenge:

Armando Iannucci, creator of Veep and The Thick of It, discusses how to use humour and satire to keep things simple — and how stories are not ‘made up’, but rather a way to relay a series of facts and concepts that are complex and difficult to process.

Chris Wylie, Cambridge Analytica whistleblower, on how the promise of superintelligence and transhumanism is basically like a religious prophecy. His new show Captured explores the stories that tech elites are telling us about our utopian AI future.

Adam Pincus, producer of The Laundromat and Leave No Trace, shares his frustrations with the perceived inevitability of AI in his day to day, and tells us more about his podcast series ‘What Could Go Wrong?’, in which he explores writing a Contagion sequel with screenwriter Scott Burns.

Further reading & resources:
Captured: The Secret Behind Silicon Valley’s AI Takeover — limited podcast series featuring Chris Wylie
‘Contagion’ Screenwriter Scott Z. Burns Asks AI to Write a Sequel to Pandemic Film in Audible Original Series ‘What Could Go Wrong?’ — Variety article
What Could Go Wrong? — limited podcast series by Scott Burns

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!
There has been an intentional and systematic narrative push that tells governments they are not good enough to provide their own public infrastructure, or to regulate the tech companies that provide it for them.

Shocking: these narratives stem from large tech companies, and this represents what Marietje Schaake refers to as a Tech Coup — which is the title of her book (which you should buy!).

The Tech Coup refers to the inability of democratic policymakers to provide oversight, regulation, and even visibility into the structural systems that big tech is building, managing, and selling. Marietje and Alix discuss what happens when you have a system of states whose knowledge and confidence have been gutted over decades — hindering them from providing good services, and from understanding how to meaningfully regulate the tech space.

Further reading & resources:
Buy The Tech Coup by Marietje Schaake

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Marietje Schaake is a non-resident Fellow at Stanford’s Cyber Policy Center and at the Institute for Human-Centered AI. She is a columnist for the Financial Times and serves on a number of not-for-profit boards, as well as the UN's High-Level Advisory Body on AI. Between 2009 and 2019 she served as a Member of the European Parliament, where she worked on trade, foreign, and tech policy. She is the author of The Tech Coup.
This episode contains some descriptions of torture methods, automated human targeting by machines, and psychological warfare throughout.

Last week Alix hosted a live show in Mexico City right after REAL ML. Four panellists discussed a hugely important topic, which has been wrongfully deemed taboo by other conferences: the use of AI and other technologies to support the ongoing genocide in Palestine.

Here’s a preview of what the four speakers shared:

Karen Palacio AKA kardaver gave us an overview of Operation Condor — a program of psychological warfare that ran in the late 20th century in South America to suppress activist voices.

Marwa Fatafta explains how these methods are still used today against Palestinians; there are coordinated surveillance projects that make Palestinian citizens feel they are living in a panopticon, and the granular data storage and processing is facilitated by AWS, Google, and Azure.

Matt Mahmoudi goes on to describe how these surveillance projects have crystallised into sophisticated CCTV and facial recognition networks through which Palestinians are continuously dehumanised via face-scanning and arbitrary checks that restrict movements.

Wanda Muñoz discusses how fully autonomous weapons obviously violate human rights in all kinds of ways — but ‘AI ethics’ frameworks never make any considerations for machines that make life-or-death decisions.

Further reading & resources:
The Biometric State by Keith Breckenridge — where the phrase ‘automated apartheid’ was conceived
COGWAR Report by Karen Palacio, AKA kardaver

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Wanda Muñoz is an international consultant with twenty years of experience in the design, implementation and evaluation of programs and policies on human rights, gender equality, inclusion and the rights of people with disabilities. Wanda has worked for international NGOs and UN organizations in Asia, Africa, Europe and Latin America. She became involved in the field of artificial intelligence in 2017, initially through the analysis of its intersection with International Humanitarian Law on the issue of autonomous weapons systems, and later focusing on the intersection between human rights and AI. In 2020, she was nominated by the Ministry of Foreign Affairs of Mexico as an independent expert at the Global Partnership on Artificial Intelligence (GPAI), where she contributed to various publications and panels, and led the design of the research “Towards true gender equality and diversity in AI” that is currently being implemented. In 2020, Wanda Muñoz was recognized by the Nobel Women's Initiative as "a peacebuilder working for peace, justice and equality" and by UNLIREC as one of Latin America's "forces of change, working for humanitarian disarmament, non-proliferation and arms control". Wanda also recently won the DEI Champion of the Year Award from Women in AI.

Karen Palacio, aka kardaver, is an interdisciplinary digital artist, industrial programmer specialized in AI, and data scientist from Córdoba, Argentina. She researches and creates through iterative loops of implementation and reflection, aiming to understand what it means to articulate artistic-technological discourses from the Global South. Her performances, installations, and audiovisual works engage critically and rootedly with the depths of computation, the histories of computing and archives, freedom of knowledge, feminisms, and the pursuit of technological sovereignty. She develops and works with Free Software in her processes, resemanticizing technologies she knows from her background as an industrial programmer.

Dr Matt Mahmoudi is Assistant Professor in Digital Humanities at the University of Cambridge, and a Researcher/Advisor on Artificial Intelligence and Human Rights at Amnesty International. Matt’s work has looked at AI-driven surveillance from the NYPD’s surveillance machine to Automated Apartheid in the occupied Palestinian territory. Matt is the author of Migrants in the Digital Periphery: New Urban Frontiers of Control (University of California Press, February 2025), and co-editor of Resisting Borders & Technologies of Violence (Haymarket, 2024), together with Mizue Aizeki and Coline Schupfer.

Marwa Fatafta leads Access Now’s policy and advocacy work on digital rights in the Middle East and North Africa (MENA) region. Her work spans a number of issues at the nexus of human rights and technology, including content governance and platform accountability, online censorship, digital surveillance, and transnational repression. She has written extensively on the digital occupation in Palestine, and focuses on the role of new technologies in armed conflicts and humanitarian contexts and their impact on historically marginalized and oppressed communities. Marwa is a Policy Analyst at Al-Shabaka: The Palestinian Policy Network, an advisory board member of the Tahrir Institute for Middle East Policy, and an advisory committee member for Bread&Net. Marwa was a Fulbright scholar in the US and holds an MA in International Relations from the Maxwell School of Citizenship and Public Affairs, Syracuse University. She holds a second MA in Development and Governance from the University of Duisburg-Essen.