How to Fix the Internet

Author: Electronic Frontier Foundation (EFF)

Subscribed: 26,537 · Played: 54,263

Description

The internet is broken—but it doesn’t have to be. If you’re concerned about how surveillance, online advertising, and automated content moderation are hurting us online and offline, the Electronic Frontier Foundation’s How to Fix the Internet podcast offers a better way forward. EFF has been defending your rights online for over thirty years and is behind many of the biggest digital rights protections since the invention of the internet. Through curious conversations with some of the leading minds in law and technology, this podcast explores creative solutions to some of today’s biggest tech challenges. Hosted by EFF Executive Director Cindy Cohn and EFF Associate Director of Digital Strategy Jason Kelley, How to Fix the Internet will help you become deeply informed on vital technology issues as we work to build a better technological future together.
52 Episodes
This episode was first released on March 21, 2023.

The promise of the internet was that it would be a tool to melt barriers and aid truth-seekers everywhere. But it feels like polarization has worsened in recent years, and more internet users are being misled into embracing conspiracies and cults. From QAnon to anti-vax screeds to talk of an Illuminati bunker beneath Denver International Airport, Alice Marwick has heard it all. She has spent years researching some dark corners of the online experience: the spread of conspiracy theories and disinformation. She says many people see conspiracy theories as participatory ways to be active in political and social systems from which they feel left out, building upon beliefs they already harbor to weave intricate and entirely false narratives. Marwick speaks with EFF’s Cindy Cohn and Jason Kelley about finding ways to identify and leverage people’s commonalities to stem this flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out.

In this episode you’ll learn about:
- Why seemingly ludicrous conspiracy theories get so many views and followers
- How disinformation is tied to personal identity and feelings of marginalization and disenfranchisement
- When fact-checking does and doesn’t work
- Thinking about online privacy as a political and structural issue rather than something that can be solved by individual action

Alice Marwick is director of research at Data & Society. Previously she was an Associate Professor in the Department of Communication and cofounder and Principal Researcher at the Center for Information, Technology and Public Life at the University of North Carolina, Chapel Hill. She researches the social, political, and cultural implications of popular social media technologies.
In 2017, she co-authored Media Manipulation and Disinformation Online (Data & Society), a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of Foreign Policy magazine’s 2017 Global Thinkers. She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene which examines how people seek social status through online visibility, and co-editor of The Sage Handbook of Social Media (Sage 2017). Her forthcoming book, The Private is Political (Yale 2023), examines how the networked nature of online privacy disproportionately impacts marginalized individuals in terms of gender, race, and socio-economic status. She earned a political science and women's studies bachelor's degree from Wellesley College, a Master of Arts in communication from the University of Washington, and a PhD in media, culture and communication from New York University.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
Probably Shouldn’t by J.Lang (c) copyright 2012. Ft: Mr_Yesterday (http://dig.ccmixter.org/files/djlang59/59729)
CommonGround by airtone (c) copyright 2019. Ft: simonlittlefield (http://dig.ccmixter.org/files/airtone/58703)
Additional beds and alternate theme remixes by Gaëtan Harris
The early internet had a lot of “technological self-determination” — you could opt out of things, protect your privacy, control your experience. The problem was that it took a fair amount of technical skill to exercise that self-determination. But what if it didn’t? What if the benefits of online privacy, security, interoperability, and free speech were more evenly distributed among all internet users?

This is the future that award-winning author and EFF Special Advisor Cory Doctorow wants us to fight for. His term “enshittification” — a downward spiral in which online platforms trap users and business customers alike, treating them more and more like commodities while providing less and less value — was selected by the American Dialect Society as its 2023 Word of the Year. But, he tells EFF’s Cindy Cohn and Jason Kelley, enshittification analysis also identifies the forces that used to make companies treat us better, helping us find ways to break the cycle and climb toward a better future.

In this episode you’ll learn about:
- Why “intellectual property” is a misnomer, and how the law has been abused to eliminate protections for society
- How the tech sector’s consolidation into a single lobbying voice helped bulldoze the measures that used to check companies’ worst impulses
- Why recent antitrust actions provide a glimmer of hope that megacompanies can still be forced to do better for users
- Why tech workers’ labor rights are important to the fight for a better internet
- How legislative and legal losses can still be opportunities for future change

Cory Doctorow is an award-winning science fiction author, activist, journalist and blogger, and a Special Advisor to EFF.
He is the editor of Pluralistic and the author of novels including “The Bezzle” (2024), “The Lost Cause” (2023), “Attack Surface” (2020), and “Walkaway” (2017); young adult novels including “Homeland” (2013) and “Little Brother” (2008); and nonfiction books including “The Internet Con: How to Seize the Means of Computation” (2023) and “How to Destroy Surveillance Capitalism” (2021). He is EFF's former European director and co-founded the UK Open Rights Group. Born in Toronto, Canada, he now lives in Los Angeles.
AI in Kitopia

2024-06-18 · 38:19

Artificial intelligence will neither solve all our problems nor likely destroy the world, but it could help make our lives better if it’s both transparent enough for everyone to understand and available for everyone to use in ways that augment us and advance our goals — not for corporations or government to extract something from us and exert power over us. Imagine a future, for example, in which AI is a readily available tool for helping people communicate across language barriers, or for helping vision- or hearing-impaired people connect better with the world. This is the future that Kit Walsh, EFF’s Director of Artificial Intelligence & Access to Knowledge Legal Projects, and EFF Senior Staff Technologist Jacob Hoffman-Andrews are working to bring about. They join EFF’s Cindy Cohn and Jason Kelley to discuss how AI shouldn’t be a tool to cash in, or to classify people for favor or disfavor, but instead a way to engage with technology and information that advances us all.

In this episode you’ll learn about:
- The dangers in using AI to determine who law enforcement investigates, who gets housing or mortgages, who gets jobs, and other decisions that affect people’s lives and freedoms
- How “moral crumple zones” in technological systems can divert responsibility and accountability from those deploying the tech
- Why transparency and openness of AI systems — including training AI on consensually obtained, publicly visible data — are so important to ensure systems are developed without bias and to everyone’s benefit
- Why “watermarking” probably isn’t a solution to AI-generated disinformation

Kit Walsh is a senior staff attorney at EFF, serving as Director of Artificial Intelligence & Access to Knowledge Legal Projects.
She has worked for years on issues of free speech, net neutrality, copyright, coders' rights, and other issues that relate to freedom of expression and access to knowledge, supporting the rights of political protesters, journalists, remix artists, and technologists to agitate for social change and to express themselves through their stories and ideas. Before joining EFF, Kit led the civil liberties and patent practice areas at the Cyberlaw Clinic, part of Harvard University's Berkman Klein Center for Internet and Society; earlier, she worked at the law firm of Wolf, Greenfield & Sacks, litigating patent, trademark, and copyright cases in courts across the country. Kit holds a J.D. from Harvard Law School and a B.S. in neuroscience from MIT, where she studied brain-computer interfaces and designed cyborgs and artificial bacteria. Jacob Hoffman-Andrews is a senior staff technologist at EFF, where he is lead developer on Let's Encrypt, the free and automated Certificate Authority; he also works on EFF's Encrypt the Web initiative and helps maintain the HTTPS Everywhere browser extension. Before working at EFF, Jacob was on Twitter's anti-spam and security teams. On the security team, he implemented HTTPS-by-default with forward secrecy, key pinning, HSTS, and CSP; on the anti-spam team, he deployed new machine-learned models to detect and block spam in real-time. Earlier, he worked on Google’s maps, transit, and shopping teams. 
Collaging, remixing, sampling—art always has been more than the sum of its parts, a synthesis of elements and ideas that produces something new and thought-provoking. Technology has enabled and advanced this enormously, letting us access and manipulate information and images in ways that would’ve been unimaginable just a few decades ago. For Nettrice Gaskins, this is an essential part of the African American experience: the ability to take whatever is at hand—from food to clothes to music to visual art—and combine it with life experience to adapt it into something new and original. She joins EFF’s Cindy Cohn and Jason Kelley to discuss how she takes this approach in applying artificial intelligence to her own artwork, expanding the boundaries of Black artistic thought.

In this episode you’ll learn about:
- Why making art with AI is about much more than just typing a prompt and hitting a button
- How hip-hop music and culture was an early example of technology changing the state of Black art
- Why the concept of fair use in intellectual property law is crucial to the artistic process
- How biases in machine learning training data can affect art
- Why new tools can never replace the mind of a live, experienced artist

Dr. Nettrice R. Gaskins is a digital artist, academic, cultural critic, and advocate of STEAM (science, technology, engineering, arts, and math) fields whose work explores “techno-vernacular creativity” and Afrofuturism. She teaches, writes, “fabs,” and makes art using algorithms and machine learning. She has taught multimedia, visual art, and computer science with high school students, and now is assistant director of the Lesley STEAM Learning Lab at Lesley University. She was a 2021 Ford Global Fellow, serves as an advisory board member for the School of Literature, Media, and Communication at Georgia Tech, and is the author of “Techno-Vernacular Creativity and Innovation” (2021).
She earned a BFA in Computer Graphics with honors from Pratt Institute in 1992; an MFA in Art and Technology from the School of the Art Institute of Chicago in 1994; and a doctorate in Digital Media from Georgia Tech in 2014.

Music credits:
Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) license.
lostTrack by Airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: mwic
From Napster to YouTube, some of the most important and controversial uses of the internet have been about building community: connecting people all over the world who share similar interests, tastes, views, and concerns. Big corporations try to co-opt and control these communities, and politicians often promote scary narratives about technology’s dangerous influences, but users have pushed back against monopoly and rhetoric to find new ways to connect with each other. Alex Winter is a leading documentarian of the evolution of internet communities. He joins EFF’s Cindy Cohn and Jason Kelley to discuss the harms of behavioral advertising, what algorithms can and can’t be blamed for, and promoting the kind of digital literacy that can bring about a better internet—and a better world—for all of us. In this episode you’ll learn about: Debunking the monopolistic myth that communicating and sharing data is theft. Demystifying artificial intelligence so that it’s no longer a “black box” impervious to improvement. Decentralizing and democratizing the internet so more, diverse people can push technology, online communities, and our world forward. Finding a nuanced balance between free speech and harm mitigation in social media. Breaking corporations’ addiction to advertising revenue derived from promoting disinformation. Alex Winter is a director, writer and actor who has worked across film, television and theater. Perhaps best known on screen for “Bill & Ted’s Excellent Adventure” (1989) and its sequels as well as “The Lost Boys” (1987), “Destroy All Neighbors” (2024) and other films, he has directed documentaries including “Downloaded” (2013) about the Napster revolution; “Deep Web” (2015) about the online black market Silk Road and the trial of its creator Ross Ulbricht; “Trust Machine” (2018) about the rise of bitcoin and the blockchain; and “The YouTube Effect” (2022). 
He also has directed critically acclaimed documentaries about musician Frank Zappa and about the Panama Papers, the biggest global corruption scandal in history and the journalists who worked in secret and at great risk to break the story.

Music credits:
Perspectives *** by J.Lang (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: Sackjo22 and Admiral Bob
Blind and low-vision people have experienced remarkable gains in information literacy because of digital technologies, like being able to access an online library offering more than 1.2 million books that can be translated into text-to-speech or digital Braille. But it can be a lot harder to come by an accessible map of a neighborhood they want to visit, or any simple diagram, due to limited availability of tactile graphics equipment, design inaccessibility, and publishing practices. Chancey Fleet wants a technological future that’s more organically attuned to people’s needs, which requires including people with disabilities in every step of the development and deployment process. She speaks with EFF’s Cindy Cohn and Jason Kelley about building an internet that’s just and useful for all, and why this must include giving blind and low-vision people the discretion to decide when and how to engage artificial intelligence tools to solve accessibility problems and surmount barriers. In this episode you’ll learn about: The importance of creating an internet that’s not text-only, but that incorporates tactile images and other technology to give everyone a richer, more fulfilling experience. Why AI-powered visual description apps still need human auditing. How inclusiveness in tech development is always a work in progress. Why we must prepare people with the self-confidence, literacy, and low-tech skills they need to get everything they can out of even the most optimally designed technology. Making it easier for everyone to travel the two-way street between enjoyment and productivity online. Chancey Fleet’s writing, organizing and advocacy explores how cloud-connected accessibility tools benefit and harm, empower and expose communities of disability. 
She is the Assistive Technology Coordinator at the New York Public Library’s Andrew Heiskell Braille and Talking Book Library, where she founded and maintains the Dimensions Project, a free open lab for the exploration and creation of accessible images, models and data representations through tactile graphics, 3D models and nonvisual approaches to coding, CAD and “visual” arts. She is a former fellow and current affiliate-in-residence at Data & Society; she is president of the National Federation of the Blind’s Assistive Technology Trainers Division; and she was recognized as a 2017 Library Journal Mover and Shaker.

Music credits:
Probably Shouldn't by J.Lang (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: Mr_Yesterday
commonGround by airtone (c) copyright 2018 Licensed under a Creative Commons Attribution Noncommercial (3.0) license.
Klaus by Skill_Borrower (c) copyright 2013 Licensed under a Creative Commons Attribution (3.0) license. Ft: Klaus_Neumaier
Chrome Cactus by Martijn de Boer (NiGiD) (c) copyright 2020 Licensed under a Creative Commons Attribution Noncommercial (3.0) license. Ft: Javolenus
If you buy something—a refrigerator, a car, a tractor, a wheelchair, or a phone—but you can't have the information or parts to fix or modify it, is it really yours? The right to repair movement is based on the belief that you should have the right to use and fix your stuff as you see fit, a philosophy that resonates especially in economically trying times, when people can’t afford to just throw away and replace things. Companies for decades have been tightening their stranglehold on the information and the parts that let owners or independent repair shops fix things, but the pendulum is starting to swing back: New York, Minnesota, California, and Colorado have passed right to repair laws, and it’s on the legislative agenda in dozens of other states. Gay Gordon-Byrne is executive director of The Repair Association, one of the major forces pushing for more and stronger state laws, and for federal reforms as well. She joins EFF’s Cindy Cohn and Jason Kelley to discuss this pivotal moment in the fight for consumers to have the right to products that are repairable and reusable.

In this episode you’ll learn about:
- Why our “planned obsolescence” throwaway culture doesn’t have to be, and shouldn’t be, a technology status quo
- The harm done by “parts pairing”: software barriers used by manufacturers to keep people from installing replacement parts
- Why one major manufacturer put out a user manual in France, but not in other countries including the United States
- How expanded right to repair protections could bring a flood of new local small-business jobs while reducing waste
- The power of uniting disparate voices—farmers, drivers, consumers, hackers, and tinkerers—into a single chorus that can’t be ignored

Gay Gordon-Byrne has been executive director of The Repair Association—formerly known as The Digital Right to Repair Coalition—since its founding in 2013, helping lead the fight for the right to repair in Congress and state legislatures.
Their credo: If you bought it, you should own it and have the right to use it, modify it, and repair it whenever, wherever, and however you want. Earlier, she had a 40-year career as a vendor, lessor, and used equipment dealer for large commercial IT users; she is the author of "Buying, Supporting and Maintaining Software and Equipment - an IT Manager's Guide to Controlling the Product Lifecycle” (2014), and a Colgate University alumna.

Music credits:
Come Inside by Zep Hurme (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: snowflake
Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) license. Ft: Airtone
Anti-Trust/Pro-Internet

2024-04-09 · 38:37

Imagine an internet in which economic power is more broadly distributed, so that more people can build and maintain small businesses online to make good livings. In this world, the behavioral advertising that has made the internet into a giant surveillance tool would be banned, so people could share more equally in the riches without surrendering their privacy. That’s the world Tim Wu envisions as he teaches and shapes policy on the revitalization of American antitrust law and the growing power of big tech platforms. He joins EFF’s Cindy Cohn and Jason Kelley to discuss using the law to counterbalance the market’s worst instincts, in order to create an internet focused more on improving people’s lives than on meaningless revenue generation. In this episode you’ll learn about: Getting a better “deal” in trading some of your data for connectedness. Building corporate structures that do a better job of balancing the public good with private profits. Creating a healthier online ecosystem with corporate “quarantines” to prevent a handful of gigantic companies from dominating the entire internet. Nurturing actual innovation of products and services online, not just newer price models. Timothy Wu is the Julius Silver Professor of Law, Science and Technology at Columbia Law School, where he has served on the faculty since 2006. First known for coining the term “net neutrality” in 2002, he served in President Joe Biden’s White House as special assistant to the President for technology and competition policy from 2021 to 2023; he also had worked on competition policy for the National Economic Council during the last year of President Barack Obama’s administration. Earlier, he worked in antitrust enforcement at the Federal Trade Commission and served as enforcement counsel in the New York Attorney General’s Office. 
His books include “The Curse of Bigness: Antitrust in the New Gilded Age” (2018), “The Attention Merchants: The Epic Scramble to Get Inside Our Heads” (2016), “The Master Switch: The Rise and Fall of Information Empires” (2010), and “Who Controls the Internet? Illusions of a Borderless World” (2006).

Music credits:
Perspectives *** by J.Lang (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: Sackjo22 and Admiral Bob
Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: starfrosch
About Face (Recognition)

2024-03-26 · 36:28

Is your face truly your own, or is it a commodity to be sold, a weapon to be used against you? A company called Clearview AI has scraped the internet to gather (without consent) 30 billion images to support a tool that lets users identify people by picture alone. Though it’s primarily used by law enforcement, should we have to worry that the eavesdropper at the next restaurant table, or the creep who’s bothering you in the bar, or the protestor outside the abortion clinic can surreptitiously snap a pic of you, upload it, and use it to identify you, where you live and work, your social media accounts, and more? New York Times reporter Kashmir Hill has been writing about the intersection of privacy and technology for well over a decade; her book about Clearview AI’s rise and practices was published last fall. She speaks with EFF’s Cindy Cohn and Jason Kelley about how face recognition technology’s rapid evolution may have outpaced ethics and regulations, and where we might go from here. In this episode, you’ll learn about: The difficulty of anticipating how information that you freely share might be used against you as technology advances. How the all-consuming pursuit of “technical sweetness” — the alluring sensation of neatly and functionally solving a puzzle — can blind tech developers to the implications of that tech’s use. The racial biases that were built into many face recognition technologies.  How one state's 2008 law has effectively curbed how face recognition technology is used there, perhaps creating a model for other states or Congress to follow. Kashmir Hill is a New York Times tech reporter who writes about the unexpected and sometimes ominous ways technology is changing our lives, particularly when it comes to our privacy. Her book, “Your Face Belongs To Us” (2023), details how Clearview AI gave facial recognition to law enforcement, billionaires, and businesses, threatening to end privacy as we know it. 
She joined The Times in 2019 after having worked at Gizmodo Media Group, Fusion, Forbes Magazine and Above the Law. Her writing has appeared in The New Yorker and The Washington Post. She has degrees from Duke University and New York University, where she studied journalism.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. This episode features:
Kalte Ohren by Alex (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) license. Ft: starfrosch & Jerry Spoon
Drops of H2O (The Filtered Water Treatment) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) license. Ft: Airtone
"I-Squared" Governance

2024-03-12 · 36:39

Imagine a world in which the internet is first and foremost about empowering people, not big corporations and government. In that world, government does “after-action” analyses to make sure its tech regulations are working as intended, recruits experienced technologists as advisors, and enforces real accountability for intelligence and law enforcement programs. Ron Wyden has spent decades working toward that world, first as a congressman and now as Oregon’s senior U.S. Senator. Long among Congress’ most tech-savvy lawmakers, he helped write the law that shaped and protects the internet as we know it, and he has fought tirelessly against warrantless surveillance of Americans’ telecommunications data. Wyden speaks with EFF’s Cindy Cohn and Jason Kelley about his “I squared” — individuals and innovation — legislative approach to foster an internet that benefits everyone.

In this episode you’ll learn about:
- How a lot of the worrisome online content that critics blame on Section 230 is actually protected by the First Amendment
- Requiring intelligence and law enforcement agencies to get warrants before obtaining Americans’ private telecommunications data
- Why “foreign” is the most important word in “Foreign Intelligence Surveillance Act”
- Making government officials understand national security isn’t heightened by reducing privacy
- Protecting women from having their personal data weaponized against them

U.S. Sen. Ron Wyden, D-OR, has served in the Senate since 1996; he was elected to his current six-year term in 2022. He chairs the Senate Finance Committee, and serves on the Energy and Natural Resources Committee, the Budget Committee, and the Select Committee on Intelligence; he also is the lead Senate Democrat on the Joint Committee on Taxation.
His relentless defiance of the national security community's abuse of secrecy forced the declassification of the CIA Inspector General's 9/11 report, shut down the controversial Total Information Awareness program, and put a spotlight on both the Bush and Obama administrations’ reliance on "secret law." In 2006 he introduced the first Senate bill on net neutrality, and in 2011 he was the lone Senator to stand against the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), ultimately unsuccessful bills that purportedly were aimed at fighting online piracy but that actually would have caused significant harm to the internet. Earlier, he served from 1981 to 1996 in the House of Representatives, where he co-authored Section 230 of the Communications Decency Act of 1996 —the law that protects Americans’ freedom of expression online by protecting the intermediaries we all rely on.
What if we thought about democracy as a kind of open-source social technology, in which everyone can see the how and why of policy making, and everyone’s concerns and preferences are elicited in a way that respects each person’s community, dignity, and importance? This is what Audrey Tang has worked toward as Taiwan’s first Digital Minister, a position the free software programmer has held since 2016. She has taken the best of open source and open culture and successfully used them to help reform her country’s government. Tang speaks with EFF’s Cindy Cohn and Jason Kelley about how Taiwan has shown that openness not only works but can outshine more authoritarian competition, wherein governments often lock up data.

In this episode you’ll learn about:
- Using technology including artificial intelligence to help surface our areas of agreement, rather than to identify and exacerbate our differences
- The “radical transparency” of recording and making public every meeting in which a government official takes part, to shed light on the policy-making process
- How Taiwan worked with civil society to ensure that no privacy and human rights were traded away for public health and safety during the COVID-19 pandemic
- Why maintaining credible neutrality from partisan politics and developing strong public and civic digital infrastructure are key to advancing democracy

Audrey Tang has served as Taiwan's first Digital Minister since 2016, by which time she already was known for revitalizing the computer languages Perl and Haskell, as well as for building the online spreadsheet system EtherCalc in collaboration with Dan Bricklin. In the public sector, she served on the Taiwan National Development Council’s open data committee and basic education curriculum committee and led the country’s first e-Rulemaking project.
In the private sector, she worked as a consultant with Apple on computational linguistics, with Oxford University Press on crowd lexicography, and with Socialtext on social interaction design. In the social sector, she actively contributes to g0v (“gov zero”), a vibrant community focused on creating tools for civil society, with the call to “fork the government.”
We cannot build a better future unless we can envision it. EFF’s How to Fix the Internet returns with another season full of inspiring conversations with some of the smartest and most interesting people around who are thinking about how to make the internet – and the world – a better place for all of us. Co-hosts Executive Director Cindy Cohn and Activism Director Jason Kelley will speak with people like journalist Kashmir Hill, Taiwan’s minister of digital affairs Audrey Tang, former White House advisor Tim Wu, digital artist Dr. Nettrice Gaskins and actor and filmmaker Alex Winter, among others.

It seems like everywhere we turn we see dystopian stories about technology’s impact on our lives and our futures — from tracking-based surveillance capitalism to street-level government surveillance to the dominance of a few large platforms choking innovation to the growing pressure by authoritarian governments to control what we see and say — and the landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building a better future. That’s where our podcast comes in.

EFF's How to Fix the Internet podcast offers a better way forward. Through curious conversations with some of the leading minds in law and technology, we explore creative solutions to some of today’s biggest tech challenges.
This episode was first published on May 24, 2022.

Pam Smith has been working to secure US elections for years, and now as the CEO of Verified Voting, she has some important ideas about the role the internet plays in American democracy. Pam joins Cindy and Danny to explain how elections can be more transparent and more engaging for all.

U.S. democracy is at an inflection point, and how we administer and verify our elections is more important than ever. From hanging chads to glitchy touchscreens to partisan disinformation, too many Americans worry that their votes won’t count and that election results aren’t trustworthy. It’s crucial that citizens have well-justified confidence in this pillar of our republic. Technology can provide answers, but that doesn’t mean moving elections online. As president and CEO of the nonpartisan nonprofit Verified Voting, Pamela Smith helps lead the national fight to balance ballot accessibility with ballot security by advocating for paper trails, audits, and transparency wherever and however Americans cast votes. On this episode of How to Fix the Internet, Pamela Smith joins EFF’s Cindy Cohn and Danny O’Brien to discuss hope for the future of democracy and the technology and best practices that will get us there.

In this episode you’ll learn about:
- Why voting online can never be like banking or shopping online
- What a “risk-limiting audit” is, and why no election should lack it
- Whether open-source software could be part of securing our votes
- Where to find reliable information about how your elections are conducted

Pamela Smith, President & CEO of Verified Voting, plays a national leadership role in safeguarding elections and building working alliances between advocates, election officials, and other stakeholders. Pam joined Verified Voting in 2004, and previously served as President from 2007-2017.
She is a member of the National Task Force on Election Crises, a diverse cross-partisan group of more than 50 experts whose mission is to prevent and mitigate election crises by urging critical reforms. She provides information and public testimony on election security issues across the nation, including to Congress. Before her work in elections, she was a nonprofit executive for a Hispanic educational organization working on first-language literacy and adult learning, and a small business and marketing consultant.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“Klaus” by Skill_Borrower (c) copyright 2013, licensed under a Creative Commons Attribution (3.0) license — http://dig.ccmixter.org/files/Skill_Borrower/41751
“commonGround” by airtone (c) copyright 2018, licensed under a Creative Commons Attribution Noncommercial (3.0) license — http://dig.ccmixter.org/files/airtone/58703
“Chrome Cactus” by Martijn de Boer (NiGiD) (c) copyright 2020, licensed under a Creative Commons Attribution Noncommercial (3.0) license — http://dig.ccmixter.org/files/NiGiD/62475
Who Inserted the Creepy?
2023-05-30 · 34:36
Writers sit watching a stranger’s search engine terms being typed in real time, a voyeuristic peek into that person’s most private thoughts. A woman lands a dream job at a powerful tech company but uncovers an agenda affecting the lives of all of humanity. An app developer keeps pitching the craziest, most harmful ideas she can imagine, but the tech mega-monopoly she works for keeps adopting them, to worldwide delight.

The first instance of deep online creepiness actually happened to Dave Eggers almost 30 years ago. The latter two are the plots of two of Eggers’ many bestselling novels — “The Circle” and “The Every,” respectively — inspired by the author’s continuing rumination on how much is too much on the internet. He believes we should live intentionally, using technology when it makes sense but otherwise logging off and living an analog, grounded life.

Eggers — whose newest novel, “The Eyes and the Impossible,” was published this month — speaks with EFF’s Cindy Cohn and Jason Kelley about why he hates Zoom so much, how and why we get sucked into digital worlds despite our own best interests, and painting the darkest version of our future so that we can steer away from it.

In this episode, you’ll learn about:
How that three-digit credit score that you keep striving to improve symbolizes a big problem with modern tech.
The difficulties of distributing books without using Amazon.
Why round-the-clock surveillance by schools, parents, and others can be harmful to kids.
The vital importance of letting yourself be bored and unstructured sometimes.

Dave Eggers is the bestselling author of the memoir “A Heartbreaking Work of Staggering Genius” (2000) as well as novels including “What Is the What” (2006), “A Hologram for the King” (2012), “The Circle” (2013), and “The Every” (2021); his latest novel, “The Eyes and the Impossible,” was published May 9.
He founded the independent publishing company McSweeney’s as well as its namesake daily humor website, and he co-founded 826 Valencia, a nonprofit youth writing center that has inspired over 70 similar organizations worldwide. Eggers is the winner of the American Book Award, the Muhammad Ali Humanitarian Award for Education, the Dayton Literary Peace Prize, and the TED Prize, and has been a finalist for the National Book Award, the Pulitzer Prize, and the National Book Critics Circle Award. He is a member of the American Academy of Arts and Letters.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“CommonGround” by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: simonlittlefield — http://dig.ccmixter.org/files/airtone/58703
Additional beds and alternate theme remixes by Gaëtan Harris.
People with disabilities were the original hackers. The world can feel closed to them, so they often have had to be self-reliant in how they interact with society. And that creativity and ingenuity is an unappreciated resource.

Henry Claypool has been an observer and champion of that resource for decades, both in government and in the nonprofit sector. He’s a national policy expert and consultant specializing in both disability policy and technology policy, particularly where they intersect. He knows real harm can result from misuse of technology, intentionally or not, and that people with disabilities frequently end up at the bottom of the list on inclusion. Claypool joins EFF’s Cindy Cohn and Jason Kelley to talk about motivating tech developers to involve disabled people in creating a world where people who function differently have a smooth transition into any forum and can engage with a wide variety of audiences — a seamless inclusion in the full human experience.

In this episode, you’ll learn about:
How accessibility asks, “Can we knock on the door?” while inclusion says, ”Let’s build a house that already has all of us inside it.”
Why affordable broadband programs must include disability-related costs.
Why disability inclusion discussions must involve intersectional voices, such as people of color and the LGBTQI+ community.
How algorithms and artificial intelligence used in everything from hiring tools to social services platforms too often produce results skewed against people with disabilities.

Henry Claypool is a technology policy consultant and former executive vice president of the American Association of People with Disabilities, which promotes equal opportunity, economic power, independent living, and political participation for people with disabilities. He is the former director of the U.S. Health and Human Services Office on Disability and a founding principal deputy administrator of the Administration for Community Living.
He was appointed by President Barack Obama to the Federal Commission on Long-Term Care, advising Congress on how long-term care can be better provided and financed for the nation’s older adults and people with disabilities, now and in the future. He is a visiting scientist with the Lurie Institute for Disability Policy in the Heller School for Social Policy and Management at Brandeis University, and principal of Claypool Consulting.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“Come Inside” by Zep Hurme (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: snowflake — http://dig.ccmixter.org/files/zep_hurme/59681
“Drops of H2O (The Filtered Water Treatment)” by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license, Ft: Airtone — http://dig.ccmixter.org/files/djlang59/37792
Dr. Seuss Warned Us
2023-05-02 · 29:34
Dr. Seuss wrote a story about a Hawtch-Hawtcher Bee-Watcher whose job it is to watch his town’s one lazy bee, because “a bee that is watched will work harder, you see.” But that doesn’t seem to work, so another Hawtch-Hawtcher is assigned to watch the first, and then another to watch the second... until the whole town is watching each other watch a bee.

To Federal Trade Commissioner Alvaro Bedoya, the story — which long predates the internet — is a great metaphor for why we must be wary of workplace surveillance, and why we need to strengthen our privacy laws. Bedoya has made a career of studying privacy, trust, and competition, and wishes for a world in which we can do, see, and read what we want, living our lives without being held back by our identity, income, faith, or any other attribute. In that world, all our interactions with technology — from social media to job or mortgage applications — are on a level playing field.

Bedoya speaks with EFF’s Cindy Cohn and Jason Kelley about how fixing the internet should allow all people to live their lives with dignity, pride, and purpose.

In this episode, you’ll learn about:
The nuances of work that “bossware,” employee surveillance technology, can’t catch.
Why the Health Insurance Portability and Accountability Act (HIPAA) isn’t the privacy panacea you might think it is.
Making sure that one-size-fits-all privacy rules don’t backfire against new entrants and small competitors.
How antitrust fundamentally is about small competitors and working people, like laborers and farmers, deserving fairness in our economy.

Alvaro Bedoya was nominated by President Joe Biden, confirmed by the U.S. Senate, and sworn in on May 16, 2022, as a Commissioner of the Federal Trade Commission; his term expires in September 2026. Bedoya was the founding director of the Center on Privacy & Technology at Georgetown University Law Center, where he was also a visiting professor of law.
He has been influential in research and policy at the intersection of privacy and civil rights, and co-authored a 2016 report on the use of facial recognition by law enforcement and the risks that it poses. He previously served as the first Chief Counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law after its founding in 2011, and as Chief Counsel to former U.S. Sen. Al Franken (D-MN); earlier, he was an associate at the law firm WilmerHale. A naturalized immigrant born in Peru and raised in upstate New York, Bedoya previously co-founded the Esperanza Education Fund, a college scholarship for immigrant students in the District of Columbia, Maryland, and Virginia. He also served on the Board of Directors of the Hispanic Bar Association of the District of Columbia. He graduated summa cum laude from Harvard College and holds a J.D. from Yale Law School, where he served on the Yale Law Journal and received the Paul & Daisy Soros Fellowship for New Americans.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“lostTrack” by Airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: mwic — http://dig.ccmixter.org/files/airtone/64772
“Probably Shouldn’t” by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license, Ft: Mr_Yesterday — http://dig.ccmixter.org/files/djlang59/59729
“CommonGround” by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: simonlittlefield — http://dig.ccmixter.org/files/airtone/58703
An internet that is safe for sex workers is an internet that is safer for everyone. Though the effects of stigmatization and criminalization run deep, the sex worker community exemplifies how technology can help people reduce harm, share support, and offer experienced analysis to protect each other. But a 2018 federal law purportedly aimed at stopping sex trafficking, FOSTA-SESTA, led to shutdowns of online spaces where sex workers could talk, putting at increased risk some of the very people it was supposed to protect.

Public interest technology lawyer Kendra Albert and sex worker, activist, and researcher Danielle Blunt have been fighting for sex workers’ online rights for years. They say that this marginalized group’s experience can be a valuable model for protecting all of our free speech rights, and that holding online platforms legally responsible for user speech can lead to censorship that hurts us all. Albert and Blunt join EFF’s Cindy Cohn and Jason Kelley to talk about the failures of FOSTA-SESTA, the need for encryption to create a safe internet, and how to create cross-movement relationships with other activists for bodily autonomy so that all internet users can continue to build online communities that keep them safe and free.

In this episode, you’ll learn about:
How criminalization sometimes harms those whom it is meant to protect.
How end-to-end encryption goes hand-in-hand with shared community wisdom to protect speech about things that are — or might ever be — criminalized.
Viewing community building, mutual aid, and organizing as a kind of technology.
The importance of centering those likely to be impacted in conversations about policy solutions.

Kendra Albert is a public interest technology lawyer with a special interest in computer security law and freedom of expression.
They serve as a clinical instructor at the Cyberlaw Clinic at Harvard Law School, where they teach students to practice law by working with pro bono clients; they also founded and direct the Initiative for a Representative First Amendment. They serve on the boards of the ACLU of Massachusetts and the Tor Project, and provide support as a legal advisor for Hacking//Hustling. They earned a B.H.A. in History and Lighting Design from Carnegie Mellon University and a J.D., cum laude, from Harvard Law School.

Danielle Blunt is a sex worker, community organizer, public health researcher, and co-founder of Hacking//Hustling, a collective of sex workers and accomplices working at the intersection of tech and social justice to interrupt state surveillance and violence facilitated by technology. Blunt leads community-based participatory research on sex work and equitable access to technology from a public health perspective. She is on the advisory board of the Initiative for a Representative First Amendment; is a Senior Civic Media Fellow at the University of Southern California’s Annenberg Innovation Lab; and was a 2020 recipient of the Electronic Frontier Foundation’s Pioneer Award.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“Drops of H2O (The Filtered Water Treatment)” by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license, Ft: Airtone — http://dig.ccmixter.org/files/djlang59/37792
“Warm Vacuum Tube” by Admiral Bob (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: starfrosch — http://dig.ccmixter.org/files/admiralbob77/59533
“reCreation” by Airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: mwic — http://dig.ccmixter.org/files/airtone/59721
Beatmower - Theme and Extro
When a science-fiction villain is defeated, we often see the heroes take their victory lap, and then everyone lives happily ever after. But that’s not how real struggles work: in real life, victories are followed by repairs, rebuilding, and reparations; by analysis and introspection; and often, by new battles.

Science-fiction author and science journalist Annalee Newitz knows social change is a never-ending process, and revolutions are long and sometimes kind of boring. Their novels and nonfiction books, however, are anything but boring — they write dynamically about the future we actually want and can attain, not an idealized and unattainable daydream. They’re involved in a project called “We Will Rise Again,” an anthology pairing science fiction writers with activists to envision realistically how we can do things better as a neighborhood, a community, or a civilization.

Newitz speaks with EFF’s Cindy Cohn and Jason Kelley about depicting true progress as a long-haul endeavor, understanding that failure is part of the process, and creating good law as a form of world-building and improving our future.

In this episode, you’ll learn about:
Why the Star Wars series “Andor” is a good depiction of the brutal, draining nature of engaging in protracted action against a repressive regime.
The nature of the “hopepunk” genre, and how it acknowledges that things are tough and one small victory is not the end of oppression.
How alien, animal, and artificial characters in fiction can help us examine and improve upon human relationships and how we use our resources.
How re-thinking our allocation and protection of physical and intellectual property could bring about a more just future.

Annalee Newitz writes science fiction and nonfiction. Their new novel — “The Terraformers” (2023) — led Scientific American to comment, “It’s easy to imagine future generations studying this novel as a primer for how to embrace solutions to the challenges we all face.”
Their first novel — ”Autonomous” (2017) — won the Lambda Literary Award. As a science journalist, they are the author of “Four Lost Cities: A Secret History of the Urban Age” (2021) and “Scatter, Adapt and Remember: How Humans Will Survive a Mass Extinction” (2013), which was a finalist for the LA Times Book Prize in science. They are a writer for the New York Times; have a monthly column in New Scientist; and have been published in The Washington Post, Slate, Popular Science, Ars Technica, The New Yorker, and The Atlantic, among others. They are the co-host of the Hugo Award-winning podcast Our Opinions Are Correct. Previously, they founded io9 and served as editor-in-chief of Gizmodo.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“Come Inside” by Zep Hurme (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: snowflake — http://dig.ccmixter.org/files/zep_hurme/59681
“Smokey Eyes” by Stefan Kartenberg (c) copyright 2017, licensed under a Creative Commons Attribution (3.0) license, Ft: KidJazz — http://ccmixter.org/files/JeffSpeed68/56377
“Probably Shouldn’t” by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license, Ft: Mr_Yesterday — http://dig.ccmixter.org/files/djlang59/59729
“CommonGround” by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: simonlittlefield — http://dig.ccmixter.org/files/airtone/58703
Beatmower - Theme and Extro
Additional beds and alternate theme remixes by Gaëtan Harris
The promise of the internet was that it would be a tool to melt barriers and aid truth-seekers everywhere. But it feels like polarization has worsened in recent years, and more internet users are being misled into embracing conspiracies and cults.

From QAnon to anti-vax screeds to talk of an Illuminati bunker beneath Denver International Airport, Alice Marwick has heard it all. She has spent years researching some dark corners of the online experience: the spread of conspiracy theories and disinformation. She says many people see conspiracy theories as participatory ways to be active in political and social systems from which they feel left out, building upon beliefs they already harbor to weave intricate and entirely false narratives.

Marwick speaks with EFF’s Cindy Cohn and Jason Kelley about finding ways to identify and leverage people’s commonalities to stem this flood of disinformation while ensuring that the most marginalized and vulnerable internet users are still empowered to speak out.

In this episode you’ll learn about:
Why seemingly ludicrous conspiracy theories get so many views and followers
How disinformation is tied to personal identity and feelings of marginalization and disenfranchisement
When fact-checking does and doesn’t work
Thinking about online privacy as a political and structural issue rather than something that can be solved by individual action

Alice Marwick is an Associate Professor in the Department of Communication and cofounder and Principal Researcher at the Center for Information, Technology and Public Life at the University of North Carolina, Chapel Hill. She researches the social, political, and cultural implications of popular social media technologies. In 2017, she co-authored Media Manipulation and Disinformation Online (Data & Society), a flagship report examining far-right online subcultures’ use of social media to spread disinformation, for which she was named one of Foreign Policy magazine’s 2017 Global Thinkers.
She is the author of Status Update: Celebrity, Publicity and Branding in the Social Media Age (Yale 2013), an ethnographic study of the San Francisco tech scene that examines how people seek social status through online visibility, and co-editor of The Sage Handbook of Social Media (Sage 2017). Her forthcoming book, The Private is Political (Yale 2023), examines how the networked nature of online privacy disproportionately impacts marginalized individuals in terms of gender, race, and socio-economic status. She earned a bachelor’s degree in political science and women’s studies from Wellesley College, a Master of Arts in communication from the University of Washington, and a PhD in media, culture and communication from New York University.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“Probably Shouldn’t” by J.Lang (c) copyright 2012, licensed under a Creative Commons Attribution (3.0) license, Ft: Mr_Yesterday — http://dig.ccmixter.org/files/djlang59/59729
“CommonGround” by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: simonlittlefield — http://dig.ccmixter.org/files/airtone/58703
Additional beds and alternate theme remixes by Gaëtan Harris
What would the internet look like if it weren’t the greatest technology of mass surveillance in the history of mankind? Trevor Paglen wonders about this, and he makes art from it.

To Paglen, art is a conversation with the past and the future — artifacts of how the world looks at a certain time and place. Our time and place is a world dogged by digital privacy concerns, and so his art ranges from 19th-century-style photos of military drones circling like insects in the Nevada sky, to a museum installation that provides a free wifi hotspot offering anonymized browsing through a Tor network, to deep-sea diving photos of internet cables tapped by the National Security Agency.

Paglen speaks with EFF's Cindy Cohn and Jason Kelley about making the invisible visible: creating physical manifestations of the data collection and artificial intelligence that characterize today’s internet so that people can reflect on how to make tomorrow’s internet far better for us all.

In this episode you’ll learn about:
The blurred edges between art, law, and activism in creating spaces for people to think differently.
Exploring the contradictions of technology that is both beautiful and scary.
Creating an artistic vocabulary and culture that helps viewers grasp technical and political issues.
Changing the attitude that technology is neutral, and instead illuminating and mitigating its impacts on society.

Trevor Paglen is an artist whose work spans image-making, sculpture, investigative journalism, writing, engineering, and numerous other disciplines with a focus on mass surveillance, data collection, and artificial intelligence. He has had one-person exhibitions at the Smithsonian Museum of American Art in Washington, D.C.; the Carnegie Museum of Art in Pittsburgh; the Fondazione Prada in Milan; the Barbican Centre in London; the Vienna Secession in Vienna; and Protocinema in Istanbul.
He has launched an artwork into Earth orbit, contributed research and cinematography to the Academy Award-winning film “Citizenfour,” and created a radioactive public sculpture for the exclusion zone in Fukushima, Japan. The author of several books and numerous articles, he won a 2017 MacArthur Fellowship “genius grant” and holds a B.A. from the University of California at Berkeley, an MFA from the Art Institute of Chicago, and a Ph.D. in Geography from U.C. Berkeley.

This podcast is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
“CommonGround” by airtone (c) copyright 2019, licensed under a Creative Commons Attribution (3.0) license, Ft: simonlittlefield — http://dig.ccmixter.org/files/airtone/58703
Additional beds and alternate theme remixes by Gaëtan Harris
Photo: Ståle Grut (CC BY-SA)