 Episodes
If social media platforms don’t directly cause polarization, they do, at least, give oxygen to smoldering divisions that can erupt into tragedies like the Myanmar genocide, Brexit, and January 6th. Why is social media so effective at unleashing the worst in us, and how do we break its hold? This episode’s guest, Christopher Bail, pursues those questions as the director of Duke University’s Polarization Lab. He’s also the author of Breaking the Social Media Prism, which was named one of the top five non-fiction books of 2021. Chris and host Eric Schurenberg discuss the role of status-seeking on social media, the personality types most susceptible to online radicalization, and an intriguing experimental platform his team designed that actually encouraged civil discourse.
At some point in conversations about the media, somebody inevitably says, “I just want a single source of true, instantaneous, and uplifting information so that I don't have to think about it.” The longing is understandable—but let's get real. In this era of unlimited and ungoverned information, you have to construct your own trusted news environment and weed out what is unreliable. Helping people do that is the mission of today’s guest, Alan Miller, the founder of The News Literacy Project. This 14-year-old non-partisan organization trains students and adults on how to tell fact from fiction in media. Together, Eric and Alan talk about standing up for factuality in a world of alternate realities, remaining non-partisan while defending truth, and how to have constructive conversations with those who disagree with you.
How to Market Chaos
2022-11-03 · 41:38
A lie travels halfway around the world while the truth is still putting on its boots. Everyone has heard that chestnut, but Sinan Aral has actually proved it. He was one of the first to warn about the corrosive effects of social media with a celebrated Science Magazine cover story, a seminal book, The Hype Machine, and a vast study showing that fake news spreads faster and farther than the truth. Sinan is a marketing, IT, and data science professor at the Massachusetts Institute of Technology, so he sees disinformation through the lens of marketing. In this episode, we will also discuss what Sinan calls “the end of reality” in political discourse, the role of professional media in its own demise, and the strategies democracies need to take to defend the truth.
It’s a feature of our polarized world today that each side of the political spectrum refers to the extremists on the other side as members of a cult. Those are fightin’ words, sure, but you can understand the feeling. People are going down rabbit holes of bizarre, sometimes apocalyptic beliefs; they are alienating themselves from family and from every source of information but other true believers. Today’s guest, Steven Hassan, founder of the Freedom of Mind Institute, knows destructive cults when he sees them because he’s a cult survivor himself. He’s the author of several books on the topic, including his latest, The Cult of Trump—in case you doubted his relevance to today’s political scene. Dr. Hassan and Eric talk about how cults recruit; why he believes Trump is an instinctive cult leader; the many sub-cults that he believes make up today’s political landscape; whether there are cults on the left as well; and how you can tell whether you yourself are being subjected to undue influence.
If you are a Democrat, have you ever espoused the slogan “Defund the Police”? If you’re a Republican, do you agree with politicians who claim that the 2020 presidential election was stolen? If you said yes, you may well be operating under a “collective illusion,” a widespread mental phenomenon in which people take positions in public that they privately don’t actually believe, because they think that everyone else in their group does believe them. The implications for the spread of disinformation these days are obvious. In this episode of In Reality, host Eric Schurenberg talks with Todd Rose, co-founder of the think tank Populace and the author of a fascinating book called ‘Collective Illusions.’ The conversation covers a mind-boggling range of common public beliefs that almost no one privately believes (who knew?). Todd also explains why it’s so important for your own mental health and the health of democracy to speak your own authentic truth – and how to do that without getting yourself shunned by your in-group.
Check My Ads Institute is an organization that is taking aim at purveyors of conspiracy theories, hate speech and disinformation. The Institute describes itself as “an independent watchdog” whose goal is to prevent digital advertisers from inadvertently monetizing the spread of falsehoods. In this episode of In Reality, host Eric Schurenberg sits down with the co-founder of Check My Ads Institute, Claire Atkin, to unpack how the digital advertising industry works to support disinformation and perpetuate ad fraud despite its claims to do the opposite. Claire delves into programmatic advertising and explains how third-party ad-serving companies keep brands unaware of where their digital ads are being placed, allowing propagandists to earn revenue from advertisers who would never intentionally support them. Finally, she specifies the steps that Check My Ads Institute is taking to hold the digital ad industry to account, as well as who the organization is targeting next.
In this episode of In Reality, Eric Schurenberg hosts Brittany Kaiser, best known as one of the whistleblowers at Cambridge Analytica, the British political consulting firm that worked on the disinformation-laden 2016 campaigns behind Brexit and the election of Donald Trump. Having disavowed her former employer, she is now a much sought-after expert on data privacy, blockchain technology, and legislative reform meant to counter disinformation campaigns. Much of the conversation focuses on Brittany’s tenure as director of business development at Cambridge Analytica. Brittany describes the firm’s techniques of creating psychological profiles of voters and then micro-targeting false or misleading messages to them. She explains how her former employer’s voter suppression strategies were categorically different–morally, legally and tactically–from commercial targeted advertising campaigns. Finally, they delve into Brittany’s Own Your Data Foundation, a not-for-profit dedicated to raising the DQ (Digital Intelligence) of lawmakers, students, parents and voters and minimizing the existential risks of fake news, cyber attacks, disinformation and polarization–the demons that Cambridge Analytica helped unleash to the detriment of democracy in 2016. 
In this episode of In Reality, recorded at the Collision conference in Toronto, host Eric Schurenberg joins Melanie Smith, Head of the Digital Analysis Unit at the London-based Institute for Strategic Dialogue–an independent non-profit dedicated to reversing the tide of polarization, extremism, and disinformation worldwide. The topics in this episode: how the threat of radicalized violence has shifted from foreign actors to domestic ones; why (at least before January 6th) it was so difficult to convince policymakers that domestic extremism was the more serious threat; how domestic extremists prey on the same human insecurities as Islamic extremists to radicalize their targets; why Instagram is a favorite tool of disinformation promoters and Pinterest isn’t; and which demographic groups are most likely to spread harmful false information unwittingly. From Smith: “I am optimistic that we can contain disinformation over a 10-year time frame, but I am concerned that things will get worse in the next five years. Elections tend to inflame disinformation, and that, in some places, can easily lead to violence. You have to realize that there are interests that want to seize the opportunity to deepen the divisions in our society.”
In this episode of In Reality, host Eric Schurenberg sits down with Gillian Tett, Chair of the Editorial Board and Editor-at-Large for the Financial Times, US. Gillian is also trained as an anthropologist, which gives her a unique perspective on the tribal divides within American society. If you believe that your grasp of reality is the only legitimate one, prepare to be challenged. Anthropologists, Gillian explains, view sub-cultures as self-contained. The belief in conspiracies may seem incomprehensible to most In Reality listeners, but it makes sense to groups who feel abandoned and belittled by elites. All of us have trouble seeing our biases as anything other than ground truths. For example, elites in media, government, entertainment, academe, and so on, regard command of language as an indisputable sign of seriousness and status. For other tribes in America, articulateness is irrelevant. What matters instead is loyal adherence to the tribe’s fears and grievances. For members of those groups, the facts presented by institutions like the media and legal system are suspect on their face. The only information that is really trustworthy is what’s conveyed by other members of the tribe. Gillian and Eric take the anthropologist’s view of a wide range of contemporary news events: why the best way to understand Trump supporters is to attend professional wrestling; what Trump’s use of the neologism “bigly” reveals about professional media’s blind spots; and why whistleblowers are disproportionately women. Listen, and prepare to confront your own blind spots.
In the fight against disinformation, the last line of defense between audiences and malicious falsehoods is the “trust and safety” teams, also known as content moderators. Some are employed by social media platforms like Facebook and Spotify, but increasingly the platforms outsource the work of identifying and countering dangerous lies to fact-checking organizations like the fast-growing Irish company Kinzen. In this episode of In Reality, host Eric Schurenberg sits down with Áine Kerr, co-founder and COO of Kinzen. Áine is a serial risk-taker with extensive experience at the intersection of journalism and technology, most recently as the global head of journalism partnerships at Facebook. Kinzen helps platforms, policymakers, and other defenders “get ahead and stay ahead” of false and hateful content on video, podcast, and text platforms. The company uses artificial intelligence to sniff out objectionable content and then, when needed, invites human readers to judge for context and nuance. What Kinzen calls “human-in-the-loop” technology minimizes errors while still allowing for fact-checking at social media scale. In the recent Brazilian elections, for example, Áine explains, disinformation actors came to realize that phrases like “election fraud” and “rigged election” were alerting content moderators, who could take down their false claims. So the actors began substituting seemingly innocuous phrases like “we are campaigning for clean elections.” Kinzen’s human moderators spotted the changes and helped authorities intercept the false messages. Áine and Eric also dive into the many reasons someone may share harmful content online, ranging from sheer amoral greed to ideological commitment. She ends with a warning that the spreaders of disinformation currently have the upper hand. It is always easier to spread lies than to counteract them.
The allies of truth–researchers, social media platforms, entrepreneurs, and fact-checking organizations like hers–need to get better at coordinating their efforts to fight back, or disinformation will remain an existential risk to democracies around the world.
In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Eli Pariser, co-director of New Public and former president of MoveOn.org. Pariser is a long-time advocate for creating healthy communities online, and he now champions reimagining the Internet as a trustworthy public space analogous to local parks or public libraries. It’s an appealing analogy. Pariser notes that public spaces are critical for holding democratic societies together: spaces where people come together and work through conflict, raise concerns and demands, and share experiences. A key element of physical public spaces is that they are local in scale. Some digital spaces share some of that “local” flavor. Reddit, for example, fosters local discussions and debates through multiple domains and communities that have their own moderation. That stands in contrast to platforms like Facebook and Twitter, where there is no visible moderation and information is global in nature, making it hard to develop a sense of community. Moderation alone isn’t quite enough, though. Another key element of healthy public spaces is self-governance, which depends on collaboration. Wikipedia is an example of a digital space that offers contributors power checked by governing principles and steered by collaborative norms. Digital “parks” and “libraries” are a far cry from the barely controlled chaos that has characterized digital spaces to date. But as our civic lives increasingly move online, the need for them is clear.
The covid pandemic has created the kind of situation in which misinformation thrives. Public health authorities met surging demand for knowledge about how to protect against covid with inconsistent or inadequate guidance. Misinformation rushed in to fill the gap. In this episode of In Reality, Dr. Leana Wen, emergency physician and public health professor at George Washington University, joins co-hosts Eric Schurenberg and Joan Donovan to discuss how health misinformation spreads and how public health institutions can regain trust. Dr. Wen explains that much of the mistrust of public health agencies during the pandemic arose because the agencies continually changed guidance. This is a normal, even desirable reaction to new research and evolving risk assessments, but many in the public regarded the shifting guidance as a sign that authorities didn’t really know the truth or had a hidden agenda. Dr. Wen distributes blame for health misinformation liberally. She explains how the major news media covering the baby formula shortage encouraged frightened parents to hoard formula, depleting stocks of the product in stores and worsening the situation. As trust in public health authorities shrinks, Dr. Wen explains, people are more likely to absorb information from sources like their neighbors rather than from qualified sources such as pediatricians and public health organizations. This is understandable–but potentially dangerous. To combat mistrust, Dr. Wen says, public health authorities must not be afraid to give nuanced advice. Authorities should be willing to admit that they don’t always have the answers and that guidance will inevitably change as new information comes to light. It’s also essential to meet people “where they are,” meaning that authorities should default to the information platforms (including social media) that audiences consume and to the local (as opposed to national) authorities that they are more likely to trust.
Truth–and the institutions that defend it–are under attack. What can the rest of us do? In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Jonathan Rauch, a Senior Fellow at the Brookings Institution and author of ‘The Constitution of Knowledge: A Defense of Truth’. In this captivating discussion, Jonathan unpacks what is best described as a crisis of knowledge in Western culture, the result of a multi-front challenge to citizens’ ability to distinguish fact from fiction and elevate truth above falsehood. What has always bound Western societies together in a shared sense of reality, Rauch explains, is a commitment–not to a set of pre-ordained beliefs, but rather to a process of constantly testing claims against objective experience to determine which claims are true. Rauch calls this process ‘The Constitution of Knowledge’ because, like the US Constitution, it relies on a system of checks and balances to prevent the truth from being defined only by those in power. Up to this point, we have implicitly trusted institutions like science, medicine, government and media–what Rauch calls “the reality-based community”–to safeguard the process. Social media, however, has short-circuited all of this. Social media makes no attempt to test the claims that appear in its content, and instead revels in broadcasting claims to millions online at Internet speed, without regard to whether they are true or not. Social media exalts popularity over expertise, speed over reflection and division over consensus. It’s no surprise that trust in the reality-based community is crumbling, and many citizens are no longer sure where to turn for truth. By the interview’s end, though, Rauch expresses cautious optimism. At the moment, fake news, misinformation and extremist propaganda (from both sides) seem to have the upper hand. But truth has a singular advantage: it describes the world as it really is.
It works–while falsehoods inevitably collide with reality and fail. The reality-based community–and reasonable citizens outside those institutions–have their work cut out for them, Rauch says. But in the end, they will win. 
In this episode of In Reality, Kathleen Belew, University of Chicago historian and author of ‘Bring the War Home: The White Power Movement and Paramilitary America’, joins co-hosts Eric Schurenberg and Joan Donovan. In a fascinating conversation, Belew outlines how social media and the tactics of disinformation energized the white power movement that reached a watershed moment in the violent attack on the U.S. Capitol on January 6th. Belew traces the current white supremacist surge to a movement that took root among veterans returning from the Vietnam War. The movement is made up of a number of loosely affiliated groups whose ideology and goals have changed little over the past 45 years. Indeed, the storming of the U.S. Capitol eerily recalled a similar event in the 1978 neo-Nazi handbook ‘The Turner Diaries’. Belew explains how these groups opportunistically latched on to the economic and racial resentments that brought Donald Trump to power and then used social media to communicate, organize and radicalize members. White power movements, she argues, have no intention of “making America great again”; instead they agitate for the overthrow of democracy. To really make America great, she concludes, Americans need a better understanding of our government and our imperfect history. We can then address questions of what has made America great in the past and what remains to be done to make it great again.
In this episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan sit down with Joaquin Quiñonero Candela, technical fellow for AI at LinkedIn and a former distinguished technical lead for responsible AI at Facebook. Before that, Joaquin led the Applied Machine Learning team at Facebook, creating the algorithms that made Facebook advertising so effective. It’s safe to say Facebook would not be the profit behemoth it is today without the innovations he introduced. The year 2011 saw the broad public adoption of social media and the democratization of public voice that it enabled. The benefits for democracy were immediately apparent in movements like the Arab Spring, which held special meaning for Joaquin as a native of Morocco. After the 2016 election in the US and the 2018 Cambridge Analytica data scandal, however, Joaquin realized that the tools he helped create could be misused and began to devote himself to AI ethics and responsible use of the technology at Facebook, a mission that he carries on at LinkedIn. You could say that the arc of Joaquin’s career parallels that of society’s evolving relationship to social media. The optimism that defined social media’s early adoption has been replaced by an alarmed awareness that its obvious benefits come with consequences: a polluted information stream, political polarization and erosion of the institutions needed to uphold democracy. Joaquin is now deeply involved in leading efforts to minimize the harms that social media can unleash. “We’ve come to realize that anything open will be exploited,” sums up In Reality co-host Joan Donovan, “and it is time for us to take the measure of that.”
In the first episode of In Reality, co-hosts Eric Schurenberg and Joan Donovan are joined by Rob Reich, Professor of Political Science and Philosophy at Stanford University and author of ‘System Error: Where Big Tech Went Wrong and How We Can Reboot’. At its birth, social media promised to be a tool to promote democracy. Instead, it has become the accelerant to a firestorm of lies and, far from democratizing power, has concentrated it among a few social media giants. “Mark Zuckerberg is now the unelected mayor of three billion people,” says Rob Reich. “That is unacceptable.” How did things go so wrong? Reich blames what he calls the “engineering mindset” of social media’s inventors and the financial ecosystem that supports them. Along with co-authors Mehran Sahami and Jeremy M. Weinstein, Reich teaches a class on technology and ethics at Stanford University, the high temple of the engineering mindset. He knows what he is talking about! Engineers seek to “optimize” for a specific, measurable outcome without regard to social ramifications. Thus, for example, algorithms designed to give social media users engaging content wind up loading news feeds or search results with content that triggers outrage, hatred or fear. Engagement, measured by clicks or time spent on the site, climbs as a result, but at an enormous social cost. Reich believes that the solutions lie in tempering the optimization mindset with regulations that weigh a technology’s social costs against its effectiveness, much as stop signs moderate optimal traffic flow in the interests of safety. Listen and judge for yourself. His ideas require political resolve to execute, to be sure. But the need is urgent. Democracy is at stake.