The Data Fix with Dr. Mél Hogan

Author: Mél Hogan

Description

Hi everyone, my name is Mél Hogan and I’m a critical media studies scholar based in Canada. I’m working on a project called The Data Fix through a series of conversations with scholars, thinkers, and feelers. Together we explore the significance of living in a world of data, and especially the growing trend of “digital humans” in the form of chatbots, holograms, deepfakes, AI images and videos, and even tech that revives the dead. The conversations are minimally edited, and serve as an archive of the collective thinking and feeling that is going into the Data Fix project. Please see thedatafix.net for more details and show notes. Thank you so much for listening.


Cover art by Oona Ode.


Hosted on Acast. See acast.com/privacy for more information.

78 Episodes
In this episode, Olivia Guest, Iris van Rooij and Andrea Reyes Elizondo discuss why resisting the uncritical adoption of AI in academia is vital to the overall purpose and significance of the university. The risk of AI adoption is that it’ll hollow out the institutions first, and then society at large. Recorded Feb 27, 2026. Released March 16, 2026.

Against the Uncritical Adoption of 'AI' Technologies in Academia
Olivia Guest, Marcela Suarez, Barbara Müller, Edwin van Meerkerk, Arnoud Oude Groote Beverborg, Ronald de Haan, Andrea Reyes Elizondo, Mark Blokpoel, Natalia Scharfenberg, Annelies Kleinherenbrink, Ileana Camerino, Marieke Woensdregt, Dagmar Monett, Jed Brown, Lucy Avraamidou, Juliette Alenda-Demoutiez, Felienne Hermans & Iris van Rooij
https://philarchive.org/rec/GUEATU

The second quote read on the episode:
Guest, O. (2025). What Does 'Human-Centred AI' Mean? arXiv. https://doi.org/10.48550/arXiv.2507.19960

We've been here before! What do you mean?
Olivia Guest, 18 February 2026
https://olivia.science/before/

Summer School: Critical AI Literacies for Resisting and Reclaiming
https://irisvanrooijcogsci.com/2026/02/18/summer-school-critical-ai-literacies-for-resisting-and-reclaiming/ and https://olivia.science/ai/

Academic Collaborations and Public Health: Lessons from Dutch Universities' Tobacco Industry Partnerships for Fossil Fuel Ties. Zenodo.
Knoester, L., Pereira, A., Vanheule, L., Reyes Elizondo, A., Littlejohn, A., & Urai, A. (2025).
https://doi.org/10.5281/zenodo.15274865

Why AI transparency is not enough
https://www.leidenmadtrics.nl/articles/why-ai-transparency-is-not-enough
In this episode, Daisy Maldonado, Annie Ersinghaus, Gilberto Manzanarez and I discuss on-the-ground opposition to two data centers being built against the will of local residents: Project Jupiter (Oracle/OpenAI) in New Mexico, and the largest data center in California, in Imperial Valley. This conversation was part of Powering AI from the Borderlands: Organizing Against Data Centers, facilitated by Dustin Edwards. Recorded Feb 24, 2026. Released Monday March 2, 2026.

Powering AI from the Borderlands: Organizing Against Data Centers
https://cal.sdsu.edu/humtech/events

Gilberto Manzanarez on Instagram
https://www.instagram.com/valleimperialresiste/

Resistance to data centers rises on the border
https://www.hcn.org/articles/resistance-to-data-centers-rises-on-the-border/

"Jupiter Watch" videos on Project Jupiter in New Mexico
https://www.youtube.com/watch?v=Pi_phIsUpRg&t=5s

The Water Is Coming ¡Ya Viene La Agua!
Annie Ersinghaus’s short doc on the politics of the Rio Grande (Project Jupiter is pulling from underground water along the Rio Grande)
https://www.youtube.com/watch?v=Kgab5ICoWJI&t=430s
Enough, with Adam Becker

2026-02-16 · 01:11:36

In this episode, Adam Becker and I talk about our AI overlords and their “philosophical” influences — mostly eugenics-based pseudoscience and bad readings of sci-fi that make tech billionaires feel like they’ve earned their billions by being the smartest people on the planet… while ruining the planet. Recorded Feb 13, 2026. Released Feb 16, 2026.

More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity (2025)
https://mitpressbookstore.mit.edu/book/9781541619593

BBC: https://www.bbc.co.uk/programmes/w172zsskmkss5ll (start at 39:15)

Rolling Stone Q&A: https://www.rollingstone.com/culture/culture-features/tech-billionaires-adam-becker-1235381649/

Dreaming Against the Machine
http://dreamingagainstthemachine.com/
Forthcoming: Adam Becker’s new podcast website (bookmark this now for later!)

Mentioned in our conversation:
I Am An AI Hater by Anthony Moser
https://anthonymoser.github.io/writing/ai/haterdom/2025/08/26/i-am-an-ai-hater.html
Great theorists make you see the world differently forever after. This is the case with my guests for this episode, Zoë Sofoulis (Zoe Sofia) and Ingrid Richardson — brilliant scholars helping us rethink containers and containment as a feminist strategy to rework worldly narratives about what holds, filters, leaks. This conversation was a special honour for me. Recorded January 23, 2026. Released February 2, 2026.

Containment: Technologies of Holding, Filtering, Leaking (2025)
https://meson.press/books/containment/

Zoë Sofia, Container Technologies (2000)
https://www.researchgate.net/publication/227700296_Container_Technologies

Shoutouts to:
Re-Understanding Media: Feminist Extensions of Marshall McLuhan (Duke University Press)
https://dukeupress.edu/re-understanding-media

Insufferable Tools: Feminism Against Big Tech
https://www.dukeupress.edu/insufferable-tools

Understanding Media: The Extensions of Man (1994 paperback)
https://mitpress.mit.edu/9780262631594/understanding-media/

And much more…
In this first episode of 2026, I speak with Am Johal and Matt Hern, authors of "O My Friends, There Is No Friend: The Politics of Friendship at the End of Ecology" (2025), to better understand what friendship means these days... at the "end of ecology". Is friendship political? Can an AI chatbot be a friend? Recorded Jan 16, 2026. Released Jan 19, 2026.

Matt Hern, Am Johal, O My Friends, There is No Friend (pdf)
https://library.oapen.org/bitstream/id/022e1f6f-f14a-4c11-a94c-c95610919f8c/9783839470268.pdf

Below the Radar podcast:
https://www.sfu.ca/vancity-office-community-engagement/below-the-radar-podcast.html

***New cover art by Oona Ode!
José Marichal is exceptionally good at making the case for why we should all become algorithmic problems, and for why we should be much more reflective about our engagement with social media. We are altered by recommendation algorithms -- and we should probably think now about how to renegotiate these terms. Happy holidays! Recorded Nov 27. Released Dec 22, 2025.

You Must Become an Algorithmic Problem: Renegotiating the Socio-Technical Contract
https://bristoluniversitypress.co.uk/you-must-become-an-algorithmic-problem
In this episode I speak with friends-neighbours-advocates Meg Rintoul and Kathryn Barnwell about current plans to build an AI data center in Nanaimo, B.C. -- and about allowing ourselves to write a new story about the future. Recorded Nov 27. Released Dec 8, 2025.

'Very scary': Nanaimo neighbours have water worries about new data centre
https://www.reddit.com/r/nanaimo/comments/1oo24s3/very_scary_nanaimo_neighbours_have_water_worries/

'An Island concern': Nanaimo water advocates want new data centre stopped
https://www.youtube.com/watch?v=0OQzc2bhtXw

CBC: AI-related data centres use vast amounts of water. But gauging how much is a murky business
https://www.cbc.ca/news/ai-data-centre-canada-water-use-9.6939684
In this episode, I got to walk through a recent report called “The High Stakes of Tracking Menstruation” with its author, Stefanie Felsberger, a sociologist of tech & gender. I cannot express enough how much there is to learn from this topic that can help us understand the bigger landscape of tech promises and harms. Recorded Oct 23, 2025. Released Nov 24, 2025.

The High Stakes of Tracking Menstruation - MCTD Cambridge
https://www.mctd.ac.uk/femtech-high-stakes-tracking-menstruation/
https://www.mctd.ac.uk/wp-content/uploads/2025/06/The-High-Stakes-of-Tracking-Menstruation_Accessible.html

Menstrual apps harvest data that ‘puts women’s safety at risk’
https://www.thetimes.com/uk/healthcare/article/menstrual-apps-harvest-data-that-puts-womens-safety-at-risk-bd0srb8mt?

Period: The Real Story of Menstruation (by Kate Clancy)
https://press.princeton.edu/books/hardcover/9780691191317/period?

Website
https://www.stefaniefelsberger.com
In this episode, I speak with Justin Hendrix, the CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. We talk about ICE (US Immigration and Customs Enforcement), surveillance, and AI. Recorded Oct 21, 2025. Released Nov 10, 2025.

Republican Budget Bill Signals New Era in Federal Surveillance
Dean Jackson, Justin Hendrix / Jul 2, 2025
https://www.techpolicy.press/republican-budget-bill-signals-new-era-in-federal-surveillance/

Amidst Violent Immigration Raids, DHS Turns to Big Tech to Silence Dissent
Jenna Ruddock / Oct 3, 2025
https://www.techpolicy.press/amidst-violent-immigration-raids-dhs-turns-to-big-tech-to-silence-dissent/

AI Surveillance on the Rise in US, but Tactics of Repression Not New
Dia Kayyali / Mar 26, 2025
https://www.techpolicy.press/ai-surveillance-on-the-rise-in-us-but-tactics-of-repression-not-new/
Host of Tech Won't Save Us and acclaimed tech critic, author, and international speaker Paris Marx joins me for this episode, where we discuss AI futures in a Canadian context: the idea of a "sovereign cloud", an "AI minister", and much more! Recorded Oct 15, 2025. Released Oct 20, 2025.

Website
https://parismarx.com/

Tech Won't Save Us
https://techwontsave.us/

Disconnect
https://disconnect.blog/im-writing-a-new-book/
In this episode, I get to chat with the brilliant Michael Richardson about the concept of "nonhuman witnessing", especially how it relates to algorithms and AI. In his book, "Nonhuman Witnessing" (Duke), he argues that a "radical rethinking of what counts as witnessing is central to building frameworks for justice in an era of endless war, ecological catastrophe, and technological capture". Recorded August 13, 2025. Released August 25, 2025.

Nonhuman Witnessing: War, Data, and Ecology after the End of the World (Duke, 2024)
https://read.dukeupress.edu/books/book/3310/Nonhuman-WitnessingWar-Data-and-Ecology-after-the

Website
https://www.unsw.edu.au/staff/michael-richardson

Bluesky
@richardsonma.bsky.social
Rohit Revi walks us through paranoia, care, conspiracy, capitalism, and catastrophe, in relation to technology and culture, to draw us into a deeper consideration of collective psychic resources and psychological commons. We talk about psychometry and linger on the dopaminergic. Recorded July 16, 2025. Released August 11, 2025.

Great Delirium: Culture, Technology, and Paranoia in the New Age of Catastrophe (2025-02-24)
Revi, Rohit; Cultural Studies; Murakami Wood, David; McBlane, Angus
Emily M. Bender and Alex Hanna have been leading the charge against "AI", helping us understand it for the con that it is, and how AI companies are turning to health, education, and other social realms to try to recover their costs. In this episode we discuss what LLMs are versus what the AI (and AGI) con is -- who benefits, and who loses -- and much more. Recorded July 15, 2025. Released July 28, 2025.

The AI Con
https://thecon.ai/

Mystery AI Hype Theater 3000
https://www.dair-institute.org/maiht3k/

The predatory fantasy of worker empowerment in AI marketing
Justine Zhang, Su Lin Blodgett, Nina Markl
AI x Crisis: Tracing New Directions Beyond Deployment and Use workshop, Aarhus 2025.

AI isn’t replacing student writing – but it is reshaping it
https://theconversation.com/ai-isnt-replacing-student-writing-but-it-is-reshaping-it-254878

Sparks of Artificial General Intelligence: Early experiments with GPT-4
https://arxiv.org/abs/2303.12712
In this episode, Paul Schütze and I pick apart the inherent contradictions of “sustainable AI” -- marketing language that aims to convince the public that one of the most extractive industries can be used to solve climate change. We delve into the layers of control embedded in the logics of AI, when technology becomes the fix that needs fixing. Recorded May 20, 2025. Released July 7, 2025.

The impacts of AI Futurism
https://link.springer.com/article/10.1007/s10676-024-09758-6

The Problem of Sustainable AI
https://doi.org/10.34669/WI.WJDS/4.1.4

Contact: paul.schuetze@uos.de
Website: paulschuetze.de
In this episode I speak with Rebecca Kilberg, Mary-Clare Bosco and Jonathan Gilmour, who together use policy approaches to address data center water usage and the planetary and health outcomes that emerge from water consumption and extraction. They talk about how such data is obtained and what to do with it, and about the importance of creating many local sites of resistance for a more sustainable future. Want in? Get in touch. Recorded May 19, 2025. Released June 16, 2025.

Voices: Data centers must be transparent about water usage — for the sake of the Great Salt Lake
https://www.sltrib.com/opinion/commentary/2024/12/31/voices-utah-data-centers-must-be

Reducing Data Centers’ Water Consumption
https://aspenpolicyacademy.org/project/reducing-data-centers-water-consumption
In this episode, Shannon Wait — Alphabet Workers Union-CWA organizer — speaks with me about the labour conditions for data center and AI workers. We talk about contracts, sub-contracts, sub-sub-contracts, NDAs, invisible labour -- and how all of this leads to unions, solidarity, and a fight for tech workers’ rights globally. Recorded May 8, 2025. Released June 2, 2025.

Interview with Shannon Wait, Alphabet Workers Union-CWA Organizer (2024)
https://poweratwork.us/shannon-wait-interview

A union of Alphabet workers in the U.S. and Canada
https://www.alphabetworkersunion.org/

Google Raters Participated in Historic Action at Google HQ to Demand Google End Poverty Wages for 5,000 Workers
https://code-cwa.org/news/google-raters-participated-historic-action

The woman who took on Google and won (2021)
https://www.bbc.com/news/technology-56659212
Allison Carruth and I talk about her new book, which gets at some of the material infrastructure and social systems that have made the US a settler state ever obsessed with new frontiers, including space. We talk about tech imaginaries, worlds remade, and better futures — a vision that invites confronting the state of things head-on and a slower redoing, and that is based on connection, love, and friendship (maybe with aliens, too). Recorded May 7, 2025. Released May 19, 2025.

Novel Ecologies: Nature Remade and the Illusions of Tech (2025)
https://press.uchicago.edu/ucp/books/book/chicago/N/bo239362741.html

Allison Carruth
https://allisoncarruth.com/
In this episode, Dani Shanley and Gemma Milne walk me through "hype" -- what it means in various technological contexts, how it works, what it is definitionally, how it feels in the body, who it serves, who it harms, and how we might need to nuance our relationship to it, especially as critical (tech) scholars. Recorded May 1, 2025. Released May 5, 2025.

https://www.hachette.co.uk/titles/gemma-milne/smoke-mirrors/9781472143655/
https://radicalsciencepodcast.com/
https://brainreel.substack.com/
https://hypestudies.org/
https://www.biss-institute.com/en/about
https://www.maastrichtuniversity.nl/news/new-technologies-heroes-or-villains
Dustin Edwards and I discuss the damage caused by digital infrastructure and its extractive requirements. We talk about data centers and copper mines, but more than this, we delve into what a decolonial, feminist, anti-racist approach can look like for white settler scholars grappling with their inheritances and obligations to the landscapes and to the stories they tell themselves, as we make (new) worlds. Recorded Apr 8, 2025. Released April 28, 2025.

Enduring Digital Damage: Rhetorical Reckonings for Planetary Survival
https://www.uapress.ua.edu/9780817322472/enduring-digital-damage/

The making of critical data center studies
Dustin Edwards, Zane Griffin Talley Cooper and Mél Hogan
https://journals.sagepub.com/doi/full/10.1177/13548565231224157
In this episode, I ask Jasmine McNealy about the role of consent online, from social media exchanges to the circulation of deepfakes. Who gets to define harm? Who is responsible for the damage? Does anyone have to take accountability? We also talk about surveillance, sonic privacy, and the many data trails the body leaves behind. Recorded Apr 4, 2025. Released April 14, 2025.

Sonic Privacy. Yale Journal of Law & Technology / Yale ISP-Knight Foundation Public Sphere Series.
https://law.yale.edu/sites/default/files/area/center/isp/documents/mcnealy.pdf

Consent (Still) Won’t Save Us
Chapter from: Feminist Cyberlaw
https://uplopen.com/chapters/e/10.1525/luminos.190.p