
How to Fix the Internet
Author: Electronic Frontier Foundation (EFF)
Description
The internet is broken—but it doesn’t have to be. If you’re concerned about how surveillance, online advertising, and automated content moderation are hurting us online and offline, the Electronic Frontier Foundation’s How to Fix the Internet podcast offers a better way forward. EFF has been defending your rights online for over thirty years and is behind many of the biggest digital rights protections since the invention of the internet. Through curious conversations with some of the leading minds in law and technology, this podcast explores creative solutions to some of today’s biggest tech challenges. Hosted by EFF Executive Director Cindy Cohn and EFF Associate Director of Digital Strategy Jason Kelley, How to Fix the Internet will help you become deeply informed on vital technology issues as we work to build a better technological future together.
65 Episodes
All this season, “How to Fix the Internet” has been focusing on the tools and technology of freedom – and one of the most important tools of freedom is a library. Access to knowledge not only creates the informed populace that democracy requires, but also gives people the tools they need to thrive. And the internet has radically expanded access to knowledge in ways that earlier generations could only have dreamed of – so long as that knowledge is allowed to flow freely. (You can also find this episode on the Internet Archive and on YouTube.)

A passionate advocate for public internet access and a successful entrepreneur, Brewster Kahle has spent his life intent on a singular focus: providing universal access to all knowledge. The Internet Archive, which he founded in 1996, now preserves more than 99 petabytes of data – the books, web pages, music, television, government information, and software of our cultural heritage – and works with more than 400 library and university partners to create a digital library that’s accessible to all. The Archive is known for the Wayback Machine, which lets users search the history of almost one trillion web pages, but it also archives images, software, video and audio recordings, and documents, and it contains dozens of resources and projects that fill a variety of gaps in cultural, political, and historical knowledge. Kahle joins EFF’s Cindy Cohn and Jason Kelley to discuss how the free flow of knowledge makes all of us more free.

In this episode you’ll learn about:
The role AI plays in digitizing, preserving, and easing access to all kinds of information
How EFF helped the Internet Archive fight off the government’s demand for information about library patrons
The importance of a decentralized, distributed web in finding and preserving information for all
Why building revolutionary, world-class libraries like the Internet Archive requires not only money and technology, but also people willing to dedicate their lives to the work
How nonprofits are crucial to filling societal gaps left by businesses, governments, and academia

Brewster Kahle is the founder and digital librarian of the Internet Archive, which serves millions of people each day and is among the world’s largest libraries. After studying artificial intelligence at the Massachusetts Institute of Technology, from which he graduated in 1982, Kahle helped launch Thinking Machines, a maker of parallel supercomputers. In 1989, he helped create the internet's first publishing system, the Wide Area Information Server (WAIS); WAIS Inc. was later sold to AOL. In 1996, Kahle co-founded Alexa Internet, which helps catalog the web, selling it to Amazon.com in 1999. He is a former member of EFF’s Board of Directors.
The human brain might be the grandest computer of all, but in this episode, we talk to two experts who confirm that technology’s ability to decipher thoughts, and perhaps even manipulate them, isn't just around the corner – it's already here. Rapidly advancing "neurotechnology" could offer new ways for people with brain trauma or degenerative diseases to communicate, as the New York Times reported this month, but it also could open the door to abusing the privacy of the most personal data of all: our thoughts. Worse yet, it could allow manipulation of how people perceive and process reality, as well as their responses to it – a Pandora’s box of epic proportions. (You can also find this episode on the Internet Archive and on YouTube.)

Neuroscientist Rafael Yuste and human rights lawyer Jared Genser are awestruck by both the possibilities and the dangers of neurotechnology. Together they established The Neurorights Foundation, and now they join EFF’s Cindy Cohn and Jason Kelley to discuss how technology is advancing our understanding of what it means to be human, and the solid legal guardrails they're building to protect the privacy of the mind.

In this episode you’ll learn about:
How to protect people’s mental privacy, agency, and identity while ensuring equal access to the positive aspects of brain augmentation
Why neurotechnology regulation needs to be grounded in international human rights
Navigating the complex differences between medical and consumer privacy laws
The risk that information collected by devices now on the market could be decoded into actual words within just a few years
Balancing beneficial innovation with the protection of people’s mental privacy

Rafael Yuste is a professor of biological sciences and neuroscience, co-director of the Kavli Institute for Brain Science, and director of the NeuroTechnology Center at Columbia University. He led the group of researchers that first proposed the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative launched in 2013 by the Obama Administration. Jared Genser is an international human rights lawyer who serves as managing director at Perseus Strategies, renowned for his successes in freeing political prisoners around the world. He’s also the Senior Tech Fellow at Harvard University’s Carr-Ryan Center for Human Rights, and he is outside general counsel to The Neurorights Foundation, an international advocacy group he co-founded with Yuste that works to enshrine human rights as a crucial part of the development of neurotechnology.
If you believe the hype, artificial intelligence will soon take all our jobs, or solve all our problems, or destroy all boundaries between reality and lies, or help us live forever, or take over the world and exterminate humanity. That’s a pretty wide spectrum, and it leaves a lot of people very confused about what exactly AI can and can’t do. In this episode, we’ll help you sort that out: for example, we’ll talk about why even superintelligent AI cannot simply replace humans for most of what we do, nor can it perfect or ruin our world unless we let it.

Arvind Narayanan studies the societal impact of digital technologies with a focus on how AI does and doesn’t work, and what it can and can’t do. He believes that if we set aside all the hype, and set the right guardrails around AI’s training and use, it has the potential to be a profoundly empowering and liberating technology. Narayanan joins EFF’s Cindy Cohn and Jason Kelley to discuss how we get to a world in which AI can improve aspects of our lives from education to transportation—if we make some system improvements first—and how AI will likely work in ways that we barely notice but that help us grow and thrive.

In this episode you’ll learn about:
What it means to be a “techno-optimist” (and NOT the venture capitalist kind)
Why we can’t rely on predictive algorithms to make decisions in criminal justice, hiring, lending, and other crucial aspects of people’s lives
How large-scale, long-term, controlled studies are needed to determine whether a specific AI application actually lives up to its accuracy promises
Why “cheapfakes” tend to be just as effective as (or more effective than) deepfakes in shoring up political support
How AI is and isn’t akin to the Industrial Revolution, the advent of electricity, and the development of the assembly line

Arvind Narayanan is a professor of computer science and director of the Center for Information Technology Policy at Princeton University. Along with Sayash Kapoor, he publishes the AI Snake Oil newsletter, followed by tens of thousands of researchers, policymakers, journalists, and AI enthusiasts; they also co-authored “AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference” (2024, Princeton University Press). He has studied algorithmic amplification on social media as a visiting senior researcher at Columbia University's Knight First Amendment Institute; co-authored an online textbook on fairness and machine learning; and led Princeton's Web Transparency and Accountability Project, uncovering how companies collect and use our personal information.
Many of the internet’s thorniest problems can be attributed to the concentration of power in a few corporate hands: the surveillance capitalism that makes it profitable to invade our privacy, the lack of algorithmic transparency that turns artificial intelligence and other tech into impenetrable black boxes, the rent-seeking behavior that seeks to monopolize and mega-monetize an existing market instead of creating new products or markets, and much more. Kara Swisher has been documenting the internet’s titans for almost 30 years through a variety of media outlets and podcasts. She believes that with adequate regulation we can keep people safe online without stifling innovation, and we can have an internet that’s transparent and beneficial for all, not just a collection of fiefdoms run by a handful of homogenous oligarchs.

In this episode you’ll learn about:
Why it’s so important that tech workers speak out about issues they want to improve and work to create companies that elevate best practices
Why completely unconstrained capitalism turns technology into weapons instead of tools
How antitrust legislation and enforcement can create a healthier online ecosystem
Why AI could either bring abundance for many or make the very rich even richer
The small online media outlets still doing groundbreaking independent reporting that challenges the tech oligarchy

Kara Swisher is one of the world's foremost tech journalists and critics, and she currently hosts two podcasts: On with Kara Swisher and Pivot, the latter co-hosted by New York University Professor Scott Galloway. She's been covering the tech industry since the 1990s for outlets including the Washington Post, the Wall Street Journal, and the New York Times; she is a New York Magazine editor-at-large, a CNN contributor, and co-founder of the tech news sites Recode and All Things Digital. She also has authored several books, including “Burn Book” (Simon & Schuster, 2024), in which she documents the history of Silicon Valley and the tech billionaires who run it.
Many people approach digital security training with furrowed brows, as an obstacle to overcome. But what if learning to keep your tech safe and secure was consistently playful and fun? People react better to learning, and retain more knowledge, when they're having a good time. That doesn’t mean the topic isn’t serious – it’s just about intentionally approaching a serious topic with joy.

That’s how Helen Andromedon approaches her work as a digital security trainer in East Africa. She teaches human rights defenders how to protect themselves online, creating open and welcoming spaces for activists, journalists, and others at risk to ask hard questions and learn how to protect themselves against online threats. She joins EFF’s Cindy Cohn and Jason Kelley to discuss making digital security less complicated, more relevant, and more joyful for real users, and encouraging all women and girls to take online safety into their own hands so that they can feel fully present and invested in the digital world.

In this episode you’ll learn about:
How the Trump Administration’s shuttering of the United States Agency for International Development (USAID) has led to funding cuts for digital security programs in Africa and around the world, and why Andromedon is still optimistic about the work
The importance of helping women feel safe and confident about using online platforms to create positive change in their communities and countries
Cultivating a mentorship model in digital security training and other training environments
Why diverse input creates training models that are accessible to a wider audience
How one size never fits all in digital security solutions, and how Dungeons and Dragons offers lessons to help people retain what they learn

Helen Andromedon – a moniker she uses to protect her own security – is a digital security trainer in East Africa who helps human rights defenders learn how to protect themselves and their data online and on their devices. She played a key role in developing the Safe Sisters project, a digital security training program for women. She’s also a UX researcher and educator who has worked as a consultant for many organizations across Africa, including the Association for Progressive Communications and the African Women’s Development Fund.
The cryptography that protects our privacy and security online relies on the fact that even the strongest computers will take essentially forever to do certain tasks, such as factoring the products of large prime numbers and computing the discrete logarithms that underpin RSA encryption, Diffie-Hellman key exchange, and elliptic curve encryption. But what happens when those problems – and the cryptography they underpin – are no longer infeasible for computers to solve? Will our online defenses collapse?

Not if Deirdre Connolly can help it. As a cutting-edge thinker in post-quantum cryptography, Connolly is making sure that the next giant leap forward in computing – quantum machines that use principles of subatomic mechanics to sidestep some constraints of classical mathematics and solve complex problems much faster – doesn’t reduce our digital walls to rubble. Connolly joins EFF’s Cindy Cohn and Jason Kelley to discuss not only how post-quantum cryptography can shore up those existing walls but also how it can help us find entirely new methods of protecting our information.

In this episode you’ll learn about:
Why we’re not yet sure exactly what quantum computing can do, and why that’s exactly why we need to think about post-quantum cryptography now
What a “Harvest Now, Decrypt Later” attack is, and what’s happening today to defend against it
How cryptographic collaboration, competition, and community are key to exploring a variety of paths to post-quantum resilience
Why preparing for post-quantum cryptography is and isn’t like fixing the Y2K bug
How the best impact that end users can hope for from post-quantum cryptography is no visible impact at all
Don’t worry – you won’t have to know, or learn, any math for this episode!

Deirdre Connolly is a research and applied cryptographer at SandboxAQ with particular expertise in post-quantum encryption. She also co-hosts the “Security Cryptography Whatever” podcast about modern computer security and cryptography, with a focus on engineering and real-world experiences. Earlier, she was an engineer at the Zcash Foundation – a nonprofit that builds financial privacy infrastructure for the public good – as well as at Brightcove, Akamai, and HubSpot.
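To make the stakes concrete, here is a minimal sketch – not taken from the episode – of the kind of key exchange that discrete-logarithm hardness protects. It is a toy Diffie-Hellman exchange in Python; all names and parameters are illustrative placeholders, and the numbers are deliberately far too small for real use. The point is only that an eavesdropper who sees every public value still has to solve a discrete logarithm to learn the shared secret.

```python
# Toy Diffie-Hellman key exchange: illustrative only, NOT secure.
# All parameters here are hypothetical stand-ins; real deployments
# use standardized groups of 2048+ bits, or post-quantum schemes.
import secrets

p = 4294967291   # the largest prime below 2**32; far too small for real use
g = 2            # public base

# Each party picks a private exponent and publishes g**x mod p.
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1
alice_pub = pow(g, alice_priv, p)   # visible to any eavesdropper
bob_pub = pow(g, bob_priv, p)       # visible to any eavesdropper

# Each side combines its own private value with the other's public value;
# (g**a)**b == (g**b)**a mod p, so both arrive at the same shared secret.
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)
assert alice_shared == bob_shared

# Recovering the shared secret from (p, g, alice_pub, bob_pub) alone
# requires solving a discrete logarithm -- infeasible classically at
# real key sizes, but tractable for a large quantum computer running
# Shor's algorithm. That gap is what post-quantum cryptography closes.
```

This is also why the “Harvest Now, Decrypt Later” attack discussed in the episode matters: an adversary can record today’s public values and ciphertext and wait for a quantum computer capable of solving the discrete logarithm, which is why migration to post-quantum schemes has to begin before such machines exist.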
Public-interest journalism speaks truth to power, so protecting press freedom is part of protecting democracy. But what does it take to digitally secure journalists’ work in an environment where critics, hackers, oppressive regimes, and others seem to have the free press in their crosshairs?

That’s what Harlo Holmes focuses on as Freedom of the Press Foundation’s digital security director. Her team provides training, consulting, security audits, and other support to newsrooms, independent journalists, freelancers, documentary filmmakers – anyone who is making independent journalism in the public interest – so that they can do their jobs more safely and securely. Holmes joins EFF’s Cindy Cohn and Jason Kelley to discuss the tools and techniques that help journalists protect themselves and their sources while keeping the world informed.

In this episode you’ll learn about:
The importance of protecting online anonymity on an increasingly “data-greedy” internet
How digital security nihilism in the United States compares with regions of the world where oppressive and repressive governance are more common
Why compartmentalization can be a simple, easy approach to digital security
The need for middleware to provide encryption and other protections that shield sources’ anonymity and journalists’ work product when using corporate data platforms
How podcasters, YouTubers, and TikTokers fit into the broad sweep of media history, and need digital protections as well

Harlo Holmes is the chief information security officer and director of digital security at Freedom of the Press Foundation. She strives to help individual journalists in various media organizations become confident and effective in securing their communications within their newsrooms, with their sources, and with the public at large. She is a media scholar, software programmer, and activist. Holmes was a regular contributor to the open-source mobile security collective Guardian Project, where she spearheaded the media metadata verification initiative currently empowering ProofMode, Save by OpenArchive, eyeWitness to Atrocities, and others.
Many in Silicon Valley, and in U.S. business at large, seem to believe innovation springs only from competition – a race to build the next big thing first, cheaper, better, best. But what if collaboration and community breed innovation just as well as adversarial competition?

Isabela Fernandes believes free, open-source software has helped build the internet, and will be key to improving it for all. As executive director of the Tor Project – the nonprofit behind the decentralized, onion-routing network providing crucial online anonymity to activists and dissidents around the world – she has fought tirelessly for everyone to have private access to an uncensored internet, and Tor has become one of the world's strongest tools for privacy and freedom online. Fernandes joins EFF’s Cindy Cohn and Jason Kelley to discuss the importance of not just accepting technology as it’s given to us, but collaboratively breaking it, tinkering with it, and rebuilding it together until it becomes the technology that we really need to make our world a better place.

In this episode you’ll learn about:
How the Tor network protects the anonymity of internet users around the world, and why that’s so important
Why online privacy is NOT only for “people who have something to hide”
The importance of making more websites friendly and accessible to Tor and similar systems
How Tor can actually benefit law enforcement
How free, open-source software can power economic booms

Isabela Fernandes has been executive director of the Tor Project since 2018; she had been a project manager there since 2015. She also has served since 2023 as a board member of both European Digital Rights – an association of civil and human rights organizations aimed at building a people-centered, democratic society – and The Engine Room, a nonprofit that supports social justice movements in using technology and data in safe, responsible, and strategic ways while actively mitigating the vulnerabilities created by digital systems. Earlier, Fernandes worked as a product manager for Twitter; as Latin America project manager for North by South, which offered open-source technology integration to companies using the expertise of Latin American free software specialists; as a project manager for the office of Brazil’s President, overseeing the IT department’s migration to free software; and as a technical advisor to Brazil’s Ministry of Communications, creating and implementing new features and free-software tools for the National Digital Inclusion Program serving 3,500 communities. She’s a former member of the board of the Calyx Institute, an education and research organization devoted to studying, testing, developing, and implementing privacy technology and tools to promote free speech, free expression, civic engagement, and privacy rights on the internet and in the mobile telephone industry. And she was a cofounder and longtime volunteer with Indymedia Brazil, an independent journalism collective.
There’s a weird belief out there that tech critics hate technology. But do movie critics hate movies? Do food critics hate food? No! The most effective, insightful critics do what they do because they love something so deeply that they want to see it made even better. The most effective tech critics have had transformative, positive online experiences, and now unflinchingly call out the surveilled, commodified, enshittified landscape that exists today because they know there has been – and still can be – something better.

That’s what drives Molly White’s work. Her criticism of the cryptocurrency and technology industries stems from her conviction that technology should serve human needs rather than mere profits. Whether it’s blockchain or artificial intelligence, she’s interested in making sure the “next big thing” lives up to its hype, and more importantly, to the ideals of participation and democratization that she experienced. She joins EFF’s Cindy Cohn and Jason Kelley to discuss working toward a human-centered internet that gives everyone a sense of control and interaction – open to all in the way that Wikipedia was (and still is) for her and so many others: not just as a static knowledge resource, but as something in which we can all participate.

In this episode you’ll learn about:
Why blockchain technology has built-in incentives for grift and speculation that overwhelm most of its positive uses
How protecting open-source developers from legal overreach, including in the blockchain world, remains critical
The vast difference between decentralization of power and decentralization of compute
How Neopets and Wikipedia represent core internet values of community, collaboration, and creativity
Why Wikipedia has been resilient against some of the rhetorical attacks that have bogged down media outlets, but remains vulnerable to certain economic and political pressures
How the Fediverse and other decentralization and interoperability mechanisms provide hope for the kind of creative independence, self-expression, and social interactivity that everyone deserves

Molly White is a researcher, software engineer, and writer who focuses on the cryptocurrency industry, blockchains, web3, and other tech in her independent publication, Citation Needed. She also runs the websites Web3 is Going Just Great, where she highlights examples of how cryptocurrencies, web3 projects, and the industry surrounding them are failing to live up to their promises, and Follow the Crypto, where she tracks cryptocurrency industry spending in U.S. elections. She has volunteered for more than 15 years with Wikipedia, where she serves as an administrator (under the name GorillaWarfare) and functionary, and previously served three terms on the Arbitration Committee. She’s regularly quoted or bylined in news media; speaks at major conferences including South by Southwest and Web Summit; guest lectures at universities including Harvard, MIT, and Stanford; and advises policymakers and regulators around the world.
We all leave digital trails as we navigate the internet – records of what we searched for, what we bought, who we talked to, where we went or want to go in the real world – and those trails usually are owned by the big corporations behind the platforms we use. But what if we valued our digital autonomy the way that we do our bodily autonomy? What if we reclaimed the right to go, read, see, do, and be what we wish online as we try to do offline? Moreover, what if we saw digital autonomy and bodily autonomy as two sides of the same coin – inseparable?

Kate Bertash wants that digital autonomy for all of us, and she pursues it in many different ways – from teaching abortion providers and activists how to protect themselves online, to helping people stymie the myriad surveillance technologies that watch and follow us in our communities. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how creativity and community can align to center people in the digital world and make us freer both online and offline.

In this episode you’ll learn about:
Why it’s important for local communities to collaboratively discuss and decide whether and how much they want to be surveilled
How the digital era has blurred the bright line between public and private spaces
Why we can’t surveil ourselves to safety
How DefCon – America's biggest hacker conference – embodies the ideal that we don’t have to simply accept technology as it’s given to us, but instead can break, tinker with, and rebuild it to meet our needs
Why building community helps us move beyond hopelessness to build and disseminate technology that helps protect everyone’s privacy

Kate Bertash works at the intersection of tech, privacy, art, and organizing. She directs the Digital Defense Fund, launched in 2017 to meet the abortion rights and bodily autonomy movements’ increased need for security and technology resources after the 2016 election. This multidisciplinary team of organizers, engineers, designers, and abortion fund and practical support volunteers provides digital security evaluations, conducts staff training, maintains a library of go-to resources on reproductive justice and digital privacy, and builds software for abortion access, bodily autonomy, and pro-democracy organizations. Bertash also engages in various multidisciplinary civic tech projects as a project manager, volunteer, activist, and artist; she’s especially interested in ways that artistic methods can interrogate the use of AI-driven computer vision, other analytical technologies in surveillance, and related intersections with our civil rights.
Now more than ever, we need to build, reinforce, and protect the tools and technology that support our freedom. EFF’s How to Fix the Internet returns with another season full of forward-looking and hopeful conversations with some of the smartest and most creative leaders, activists, technologists, policymakers, and thinkers around – people who are working to create a better internet, and world, for all of us.

Co-hosts Executive Director Cindy Cohn and Activism Director Jason Kelley will speak with people like journalist Molly White, reproductive rights activist Kate Bertash, press freedom advocate Harlo Holmes, the Tor Project’s Isabela Fernandes, and computer scientist and AI skeptic Arvind Narayanan, among many others.
EFF’s “How to Fix the Internet” podcast is a nominee in the Webby Awards’ 29th Annual People's Voice competition – and we need your support to bring the trophy home! Voting ends on April 17, so if you like what we do here – trying to envision a better digital future – please take a moment to go to eff.org/webby and cast your vote.
This episode was first released on May 2, 2023.

Dr. Seuss wrote a story about a Hawtch-Hawtcher Bee-Watcher whose job it is to watch his town’s one lazy bee, because “a bee that is watched will work harder, you see.” But that doesn’t seem to work, so another Hawtch-Hawtcher is assigned to watch the first, and then another to watch the second... until the whole town is watching each other watch a bee. To Federal Trade Commissioner Alvaro Bedoya, the story—which long predates the internet—is a great metaphor for why we must be wary of workplace surveillance, and why we need to strengthen our privacy laws. Bedoya has made a career of studying privacy, trust, and competition, and wishes for a world in which we can do, see, and read what we want, living our lives without being held back by our identity, income, faith, or any other attribute. In that world, all our interactions with technology—from social media to job or mortgage applications—are on a level playing field. Bedoya speaks with EFF’s Cindy Cohn and Jason Kelley about how fixing the internet should allow all people to live their lives with dignity, pride, and purpose.

In this episode, you’ll learn about:
The nuances of work that “bossware,” employee surveillance technology, can’t catch
Why the Health Insurance Portability and Accountability Act (HIPAA) isn’t the privacy panacea you might think it is
Making sure that one-size-fits-all privacy rules don’t backfire against new entrants and small competitors
How antitrust fundamentally is about small competitors and working people, like laborers and farmers, deserving fairness in our economy

Alvaro Bedoya was nominated by President Joe Biden, confirmed by the U.S. Senate, and sworn in on May 16, 2022 as a Commissioner of the Federal Trade Commission; his term expires in September 2026. Bedoya was the founding director of the Center on Privacy & Technology at Georgetown University Law Center, where he was also a visiting professor of law. He has been influential in research and policy at the intersection of privacy and civil rights, and co-authored a 2016 report on the use of facial recognition by law enforcement and the risks that it poses. He previously served as the first Chief Counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law after its founding in 2011, and as Chief Counsel to former U.S. Sen. Al Franken (D-MN); earlier, he was an associate at the law firm WilmerHale. A naturalized immigrant born in Peru and raised in upstate New York, Bedoya previously co-founded the Esperanza Education Fund, a college scholarship for immigrant students in the District of Columbia, Maryland, and Virginia. He also served on the Board of Directors of the Hispanic Bar Association of the District of Columbia. He graduated summa cum laude from Harvard College and holds a J.D. from Yale Law School, where he served on the Yale Law Journal and received the Paul & Daisy Soros Fellowship for New Americans.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
lostTrack by Airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: mwic. http://dig.ccmixter.org/files/airtone/64772
Probably Shouldn’t by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license. Ft: Mr_Yesterday. http://dig.ccmixter.org/files/djlang59/59729
CommonGround by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: simonlittlefield. http://dig.ccmixter.org/files/airtone/58703
This episode was first released on March 21, 2023.

The promise of the internet was that it would be a tool to melt barriers and aid truth-seekers everywhere. But it feels like polarization has worsened in recent years, and more internet users are being misled into embracing conspiracies and cults.

From QAnon to anti-vax screeds to talk of an Illuminati bunker beneath Denver International Airport, Alice Marwick has heard it all. She has spent years researching some dark corners of the online experience: the spread of conspiracy theories and disinformation. She says many people see conspiracy theories as participatory ways to be active in political and social systems from which they feel left out, building upon beliefs they already harbor to weave intricate and entirely false narratives. Marwick speaks with EFF’s Cindy Cohn and Jason Kelley about finding ways to identify and leverage people’s commonalities to stem this flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out.

In this episode you’ll learn about:
Why seemingly ludicrous conspiracy theories get so many views and followers
How disinformation is tied to personal identity and feelings of marginalization and disenfranchisement
When fact-checking does and doesn’t work
Thinking about online privacy as a political and structural issue rather than something that can be solved by individual action

Alice Marwick is director of research at Data & Society. Previously she was an associate professor in the Department of Communication and cofounder and principal researcher at the Center for Information, Technology and Public Life at the University of North Carolina, Chapel Hill. She researches the social, political, and cultural implications of popular social media technologies. In 2017, she co-authored Media Manipulation and Disinformation Online (Data & Society), a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of Foreign Policy magazine’s 2017 Global Thinkers. She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene that examines how people seek social status through online visibility, and co-editor of The Sage Handbook of Social Media (Sage 2017). Her forthcoming book, The Private is Political (Yale 2023), examines how the networked nature of online privacy disproportionately impacts marginalized individuals in terms of gender, race, and socio-economic status. She earned a bachelor's degree in political science and women's studies from Wellesley College, a Master of Arts in communication from the University of Washington, and a PhD in media, culture and communication from New York University.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
Probably Shouldn’t by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license. Ft: Mr_Yesterday. http://dig.ccmixter.org/files/djlang59/59729
CommonGround by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: simonlittlefield. http://dig.ccmixter.org/files/airtone/58703
Additional beds and alternate theme remixes by Gaëtan Harris
The early internet had a lot of “technological self-determination" — you could opt out of things, protect your privacy, control your experience. The problem was that it took a fair amount of technical skill to exercise that self-determination. But what if it didn’t? What if the benefits of online privacy, security, interoperability, and free speech were more evenly distributed among all internet users?

This is the future that award-winning author and EFF Special Advisor Cory Doctorow wants us to fight for. His term “enshittification” — a downward spiral in which online platforms trap users and business customers alike, treating them more and more like commodities while providing less and less value — was selected by the American Dialect Society as its 2023 Word of the Year. But, he tells EFF’s Cindy Cohn and Jason Kelley, enshittification analysis also identifies the forces that used to make companies treat us better, helping us find ways to break the cycle and climb toward a better future.

In this episode you’ll learn about:
Why “intellectual property” is a misnomer, and how the law has been abused to eliminate protections for society
How the tech sector’s consolidation into a single lobbying voice helped bulldoze the measures that used to check companies’ worst impulses
Why recent antitrust actions provide a glimmer of hope that megacompanies can still be forced to do better for users
Why tech workers’ labor rights are important to the fight for a better internet
How legislative and legal losses can still be opportunities for future change

Cory Doctorow is an award-winning science fiction author, activist, journalist and blogger, and a Special Advisor to EFF. He is the editor of Pluralistic and the author of novels including “The Bezzle” (2024), “The Lost Cause” (2023), “Attack Surface” (2020), and “Walkaway” (2017); young adult novels including “Homeland” (2013) and “Little Brother” (2008); and nonfiction books including “The Internet Con: How to Seize the Means of Computation” (2023) and “How to Destroy Surveillance Capitalism” (2021). He is EFF's former European director and co-founded the UK Open Rights Group. Born in Toronto, Canada, he now lives in Los Angeles.
Artificial intelligence will neither solve all our problems nor likely destroy the world, but it could help make our lives better if it’s both transparent enough for everyone to understand and available for everyone to use in ways that augment us and advance our goals — not for corporations or government to extract something from us and exert power over us. Imagine a future, for example, in which AI is a readily available tool for helping people communicate across language barriers, or for helping vision- or hearing-impaired people connect better with the world.

This is the future that Kit Walsh, EFF’s Director of Artificial Intelligence & Access to Knowledge Legal Projects, and EFF Senior Staff Technologist Jacob Hoffman-Andrews are working to bring about. They join EFF’s Cindy Cohn and Jason Kelley to discuss how AI shouldn’t be a tool to cash in, or to classify people for favor or disfavor, but instead a way to engage with technology and information that advances us all.

In this episode you’ll learn about:
The dangers in using AI to determine who law enforcement investigates, who gets housing or mortgages, who gets jobs, and other decisions that affect people’s lives and freedoms
How "moral crumple zones” in technological systems can divert responsibility and accountability from those deploying the tech
Why transparency and openness of AI systems — including training AI on consensually obtained, publicly visible data — is so important to ensure systems are developed without bias and to everyone’s benefit
Why “watermarking” probably isn’t a solution to AI-generated disinformation

Kit Walsh is a senior staff attorney at EFF, serving as Director of Artificial Intelligence & Access to Knowledge Legal Projects. She has worked for years on issues of free speech, net neutrality, copyright, coders' rights, and other issues that relate to freedom of expression and access to knowledge, supporting the rights of political protesters, journalists, remix artists, and technologists to agitate for social change and to express themselves through their stories and ideas. Before joining EFF, Kit led the civil liberties and patent practice areas at the Cyberlaw Clinic, part of Harvard University's Berkman Klein Center for Internet and Society; earlier, she worked at the law firm of Wolf, Greenfield & Sacks, litigating patent, trademark, and copyright cases in courts across the country. Kit holds a J.D. from Harvard Law School and a B.S. in neuroscience from MIT, where she studied brain-computer interfaces and designed cyborgs and artificial bacteria. Jacob Hoffman-Andrews is a senior staff technologist at EFF, where he is lead developer on Let's Encrypt, the free and automated Certificate Authority; he also works on EFF's Encrypt the Web initiative and helps maintain the HTTPS Everywhere browser extension. Before working at EFF, Jacob was on Twitter's anti-spam and security teams. On the security team, he implemented HTTPS-by-default with forward secrecy, key pinning, HSTS, and CSP; on the anti-spam team, he deployed new machine-learned models to detect and block spam in real-time. Earlier, he worked on Google’s maps, transit, and shopping teams.
Collaging, remixing, sampling—art always has been more than the sum of its parts, a synthesis of elements and ideas that produces something new and thought-provoking. Technology has enabled and advanced this enormously, letting us access and manipulate information and images in ways that would’ve been unimaginable just a few decades ago.

For Nettrice Gaskins, this is an essential part of the African American experience: the ability to take whatever is at hand—from food to clothes to music to visual art—and combine it with life experience to adapt it into something new and original. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how she takes this approach in applying artificial intelligence to her own artwork, expanding the boundaries of Black artistic thought.

In this episode you’ll learn about:
Why making art with AI is about much more than just typing a prompt and hitting a button
How hip-hop music and culture was an early example of technology changing the state of Black art
Why the concept of fair use in intellectual property law is crucial to the artistic process
How biases in machine learning training data can affect art
Why new tools can never replace the mind of a live, experienced artist

Dr. Nettrice R. Gaskins is a digital artist, academic, cultural critic, and advocate of STEAM (science, technology, engineering, arts, and math) fields whose work explores "techno-vernacular creativity" and Afrofuturism. She teaches, writes, "fabs,” and makes art using algorithms and machine learning. She has taught multimedia, visual art, and computer science to high school students, and now is assistant director of the Lesley STEAM Learning Lab at Lesley University. She was a 2021 Ford Global Fellow, serves as an advisory board member for the School of Literature, Media, and Communication at Georgia Tech, and is the author of “Techno-Vernacular Creativity and Innovation” (2021). She earned a BFA in Computer Graphics with honors from Pratt Institute in 1992; an MFA in Art and Technology from the School of the Art Institute of Chicago in 1994; and a doctorate in Digital Media from Georgia Tech in 2014.

Music credits:
Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018, licensed under a Creative Commons Attribution (3.0) license.
lostTrack by Airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: mwic
From Napster to YouTube, some of the most important and controversial uses of the internet have been about building community: connecting people all over the world who share similar interests, tastes, views, and concerns. Big corporations try to co-opt and control these communities, and politicians often promote scary narratives about technology’s dangerous influences, but users have pushed back against monopoly and rhetoric to find new ways to connect with each other.

Alex Winter is a leading documentarian of the evolution of internet communities. He joins EFF’s Cindy Cohn and Jason Kelley to discuss the harms of behavioral advertising, what algorithms can and can’t be blamed for, and promoting the kind of digital literacy that can bring about a better internet—and a better world—for all of us.

In this episode you’ll learn about:
Debunking the monopolistic myth that communicating and sharing data is theft
Demystifying artificial intelligence so that it’s no longer a “black box” impervious to improvement
Decentralizing and democratizing the internet so more, diverse people can push technology, online communities, and our world forward
Finding a nuanced balance between free speech and harm mitigation in social media
Breaking corporations’ addiction to advertising revenue derived from promoting disinformation

Alex Winter is a director, writer, and actor who has worked across film, television, and theater. Perhaps best known on screen for “Bill & Ted’s Excellent Adventure” (1989) and its sequels as well as “The Lost Boys” (1987), “Destroy All Neighbors” (2024), and other films, he has directed documentaries including “Downloaded” (2013) about the Napster revolution; “Deep Web” (2015) about the online black market Silk Road and the trial of its creator Ross Ulbricht; “Trust Machine” (2018) about the rise of bitcoin and the blockchain; and “The YouTube Effect” (2022). He also has directed critically acclaimed documentaries about musician Frank Zappa and about the Panama Papers, the biggest global corruption scandal in history and the journalists who worked in secret and at great risk to break the story.

Music credits:
Perspectives *** by J.Lang (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: Sackjo22 and Admiral Bob
Blind and low-vision people have experienced remarkable gains in information literacy because of digital technologies, like being able to access an online library offering more than 1.2 million books that can be translated into text-to-speech or digital Braille. But it can be a lot harder to come by an accessible map of a neighborhood they want to visit, or any simple diagram, due to limited availability of tactile graphics equipment, design inaccessibility, and publishing practices.

Chancey Fleet wants a technological future that’s more organically attuned to people’s needs, which requires including people with disabilities in every step of the development and deployment process. She speaks with EFF’s Cindy Cohn and Jason Kelley about building an internet that’s just and useful for all, and why this must include giving blind and low-vision people the discretion to decide when and how to engage artificial intelligence tools to solve accessibility problems and surmount barriers.

In this episode you’ll learn about:
The importance of creating an internet that’s not text-only, but that incorporates tactile images and other technology to give everyone a richer, more fulfilling experience
Why AI-powered visual description apps still need human auditing
How inclusiveness in tech development is always a work in progress
Why we must prepare people with the self-confidence, literacy, and low-tech skills they need to get everything they can out of even the most optimally designed technology
Making it easier for everyone to travel the two-way street between enjoyment and productivity online

Chancey Fleet’s writing, organizing, and advocacy explores how cloud-connected accessibility tools benefit and harm, empower and expose communities of disability. She is the Assistive Technology Coordinator at the New York Public Library’s Andrew Heiskell Braille and Talking Book Library, where she founded and maintains the Dimensions Project, a free open lab for the exploration and creation of accessible images, models, and data representations through tactile graphics, 3D models, and nonvisual approaches to coding, CAD, and “visual” arts. She is a former fellow and current affiliate-in-residence at Data & Society; she is president of the National Federation of the Blind’s Assistive Technology Trainers Division; and she was recognized as a 2017 Library Journal Mover and Shaker.

Music credits:
Probably Shouldn't by J.Lang (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: Mr_Yesterday
commonGround by airtone (c) copyright 2018, licensed under a Creative Commons Attribution Noncommercial (3.0) license.
Klaus by Skill_Borrower (c) copyright 2013, licensed under a Creative Commons Attribution (3.0) license. Ft: Klaus_Neumaier
Chrome Cactus by Martijn de Boer (NiGiD) (c) copyright 2020, licensed under a Creative Commons Attribution Noncommercial (3.0) license. Ft: Javolenus
If you buy something—a refrigerator, a car, a tractor, a wheelchair, or a phone—but you can't have the information or parts to fix or modify it, is it really yours? The right to repair movement is based on the belief that you should have the right to use and fix your stuff as you see fit, a philosophy that resonates especially in economically trying times, when people can’t afford to just throw away and replace things.

Companies for decades have been tightening their stranglehold on the information and the parts that let owners or independent repair shops fix things, but the pendulum is starting to swing back: New York, Minnesota, California, and Colorado have passed right to repair laws, and it’s on the legislative agenda in dozens of other states. Gay Gordon-Byrne is executive director of The Repair Association, one of the major forces pushing for more and stronger state laws, and for federal reforms as well. She joins EFF’s Cindy Cohn and Jason Kelley to discuss this pivotal moment in the fight for consumers to have the right to products that are repairable and reusable.

In this episode you’ll learn about:
Why our “planned obsolescence” throwaway culture doesn’t have to be, and shouldn’t be, a technology status quo
The harm done by “parts pairing”: software barriers used by manufacturers to keep people from installing replacement parts
Why one major manufacturer put out a user manual in France, but not in other countries including the United States
How expanded right to repair protections could bring a flood of new local small-business jobs while reducing waste
The power of uniting disparate voices—farmers, drivers, consumers, hackers, and tinkerers—into a single chorus that can’t be ignored

Gay Gordon-Byrne has been executive director of The Repair Association—formerly known as The Digital Right to Repair Coalition—since its founding in 2013, helping lead the fight for the right to repair in Congress and state legislatures. Their credo: if you bought it, you should own it and have the right to use it, modify it, and repair it whenever, wherever, and however you want. Earlier, she had a 40-year career as a vendor, lessor, and used equipment dealer for large commercial IT users; she is the author of "Buying, Supporting and Maintaining Software and Equipment – an IT Manager's Guide to Controlling the Product Lifecycle” (2014), and a Colgate University alumna.

Music credits:
Come Inside by Zep Hurme (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license. Ft: snowflake
Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license. Ft: Airtone
Fantastic episode on a subject that is very dear to me.
I think this tool is destructive, and what we think it can and will do is only the surface level – not ultimately how it'll eventually be used against us, both personally and by government and law enforcement. I think it should be banned as a whole. I also think CA law is kind of doing itself a disservice, because in such an age of 'information' many people are very misinformed and not educated enough about tech and its dangers, even if something seems like a great idea. We need to do a better job of educating people about basic rights, privacy, and identity protection. I feel we've become very lazy and think that if the government or law enforcement says it's great, we should all be on board. Tech has had more cons than positives in the past 15 years.
These people are communists.
And yet some conspiracies have turned out to be true.