Incels are members of an online subculture who define themselves as unable to find a romantic or sexual partner despite desiring one, and who are characterised by their hatred of women. Over the last 10 years, attacks carried out by individuals propagating an incel ideology have claimed the lives of almost 50 people, an average of 8 per attack. Whilst incel attacks often attract a great deal of attention, incels concentrate the majority of their activity online, where they interact with other misogynistic communities in the so-called online “manosphere”. In this episode, we discuss the roots of the incel movement and the contradictions baked into misogynistic incel theories and manifestos: from how the self-deprecation found in incel forums masks a male supremacist ideology, to how ‘Chads’ (the name given to men perceived as genetically attractive) are to be both admired and attacked. We also explore how the conceptualisation of incels (whether as hate speech, violent extremism or, in some cases, terrorism) affects tech companies’ online regulation as well as wider counter-terrorism policies. Anne Craanen and Jacob Berntsson discuss the nuances of misogynist incel ideology. They are joined by two of the foremost voices in this space: Dr. Debbie Ging, an associate professor in the School of Communications at Dublin City University, where her research is focussed on digital hate, online anti-feminist men’s rights organisations and the incel phenomenon; and Alex DiBranco, the co-founder and executive director of the Institute for Research on Male Supremacism, whose research is focussed on the development of right-wing and contemporary misogynist movements. Together, they consider what measures technology companies can take to deal with incel groups - such as partnering with entities that have expertise in countering these forms of extremism.
They argue that incels and wider misogyny are a problem both online and offline, and that countering these issues requires collective action from both spheres. They also highlight the importance of education, particularly progressive sex education alongside lessons in media literacy and digital ethics. Finally, both agree that some forms of incel violence should be seen as gender-based terrorism.
In this episode, we discuss why accelerationism has become a flagship doctrine of far-right violent extremism. To help us understand what accelerationism is and how it is reflected in the online sphere, Maygane Janin and Adam Hadley are joined by Professor Matthew Feldman, Director of the Centre for Analysis of the Radical Right (CARR) and an expert on fascist ideology, neo-Nazism and “lone actor” terrorism; and by Ashton Kingdon, a PhD student at the University of Southampton and a fellow at CARR, whose research focuses on how far-right extremists use technology for recruitment and radicalisation. In today’s podcast, we also welcome Ben Makuch, a national security reporter with Vice News who investigates far-right violent extremism, particularly neo-Nazism. Together, they consider how propaganda is repurposed on forums and mainstream platforms to coincide with particular events, distort narratives and stoke political tension. We also discuss the emergence of accelerationist subcultures, how they are using the pandemic to “initiate the collapse of society”, and the rise in media attention on accelerationism in the US.
Across the ideological spectrum, there are misconceptions and oversimplifications when it comes to discussing the role of women in terrorist organisations: from the perception that women are groomed into joining violent extremist groups and can therefore be presumed innocent, to the notion that a woman’s role in a terrorist organisation is secondary simply because she is less likely to be the one picking up a weapon to carry out an attack. In this episode, we debunk many of these myths and explain why this issue has far more depth to it than the media conveys. We explore the misleading ‘jihadi bride’ stories perpetuated by the media, examine women’s roles in online propaganda and recruitment, and discuss the nuances of the “push and pull” factors behind why women join terrorist groups - including far-right groups. Drawing upon all of this, we provide recommendations on how the tech sector should respond to women’s roles in online extremism and terrorism. Maygane Janin and Anne Craanen discuss the complexities at the intersection of gender and terrorism. They are joined by two of the foremost voices in this space: Dr. Joana Cook, an Assistant Professor of Terrorism and Political Violence at Leiden University and Senior Project Manager and Editor-in-Chief at the International Centre for Counter-Terrorism, who recently published a book on gender and counterterrorism titled “A Woman’s Place: U.S. counterterrorism since 9/11”; and Dr. Elizabeth Pearson, a lecturer at the Cyber Threats Research Centre at Swansea University who specialises in gender, extremism, and counter-extremism. Together, they consider the broader socio-cultural context of how gender is viewed in extremist ideologies - especially how understandings of gender identity, individuals’ experiences, age, and social class also shape the reasons someone might join an extremist group.
How best to regulate the online sphere will be amongst the most important topics of the coming decade. Until recently, laws mostly shielded digital intermediaries from liability for third-party illegal content on their platforms. Since 2016, however, in response to mounting concerns over the criminal misuse of the internet and a surge in noxious content online, the regulatory landscape has begun to change. Governments around the world have started to impose laws and regulatory frameworks that oblige online platforms to expediently and proactively address illegal or harmful content on their sites. Increasingly, however, platforms have also developed their own modes of self-regulation, endeavouring to incorporate new structures of responsibility and accountability into their business models. Join Flora Deverell and Jacob Berntsson as they discuss the ways in which online regulation is being pursued by companies, governments, and multilateral organisations, such as through the upcoming EU-wide law on the dissemination of terrorist content. They are joined by two of the foremost voices in this space: Evelyn Douek, a lecturer in law and SJD candidate at Harvard Law School and affiliate at the Berkman Klein Center for Internet & Society, who studies international and transnational regulation of online speech; and Daphne Keller, Director of Platform Regulation at Stanford’s Cyber Policy Center (formerly Assistant General Counsel at Google and Director of Intermediary Liability at Stanford’s Center for Internet and Society), who has worked on groundbreaking intermediary liability litigation and legislation around the world.
They also explore the implications of Facebook’s new Oversight Board and what this really means for governance and accountability processes, whether we should use international human rights law as a framework for governing the internet, and why terrorist content is such an important topic in regulatory discourse. Full list of resources on our website.
“You can sit at home and play Call of Duty or you can come and respond to the real Call of Duty…the choice is yours.” This was tweeted by a well-known ISIS hacker and propagandist. Gaming culture and popular video games, such as Call of Duty and World of Warcraft, have been exploited by terrorist and violent extremist actors for propaganda and radicalisation purposes. Join Maygane Janin and Flora Deverell as they discuss how terrorists and violent extremists exploit gaming culture for their own ends. They are joined by Linda Schlegel, a senior editor at The Counterterrorism Group and a regular contributor to the European Eye on Radicalization, where she recently published a number of articles on the exploitation of gaming culture; and Dr. Nick Robinson, an associate professor in politics and international studies at the University of Leeds, who has been researching the links between videogames, social media, militarism, and terrorism for over a decade. In particular, they address the “gamification of radicalisation”, the exploitation of gaming platforms, and why terrorist organisations now develop their own games, built to serve their own ideologies and purposes, less often than they used to. Full list of resources on our website. Linda Schlegel (@LiSchlegel); Dr. Nick Robinson
During the recent protests against the coronavirus lockdown in the US, a protester was spotted with a flyer referring to “Boogaloo”, a popular far-right violent extremist slang term calling for a new civil war that has turned into a meme culture of its own amongst violent extremists. One year earlier, before attacking two mosques and killing 51 people, the Christchurch shooter posted on the messaging board 8chan, encouraging readers to continue to make memes. Join Maygane Janin and Jacob Berntsson as they discuss how memes have become an unconventional strategy for violent extremists to easily spread their ideologies. They are joined by Maik Fielitz, a researcher at the Jena Institute for Democracy and Civil Society and a fellow at the Centre for Analysis of the Radical Right, specialising in far-right extremism in Germany; and Lisa Bogerts, an expert in visual communication. Both are contributors to the 2019 book ‘Post-Digital Cultures of the Far Right’. They discuss how far-right violent extremists take advantage of the intrinsic virality of seemingly harmless online jokes to reach new audiences and penetrate mainstream culture.

Resources:
- The visual culture of far-right terrorism (Bogerts & Fielitz, 2020)
- “Do you want meme war” (Bogerts & Fielitz, 2018)
- Digital fascism: challenges for the open society in times of social media (Fielitz & Marcks, 2019)
- How memes are becoming the new frontier of information warfare (Ascott, 2020)
- Cyber swarming, memetic warfare and viral insurgency (Goldenberg & Finkelstein, 2020)
- We Analyzed How the “Great Replacement” and Far Right Ideas Spread Online. The Trends Reveal Deep Concerns (Ebner & Davey, 2019)
- The far-right is weaponizing Instagram to recruit Gen Z (Bateman, 2019)
- How the radical right weaponizes memes (Liyanage, 2020)
- Meme warfare in the Swedish context (Davey, 2018)

Full list of resources on our website. Maik Fielitz (@maik_fielitz); Lisa Bogerts
The Nordic Resistance Movement (NRM) is a neo-Nazi organisation that was originally founded in Sweden. The movement, which is openly anti-semitic, anti-immigration and anti-gay, aims to dismantle Nordic parliamentary democracies and replace them with a unified Nordic fascist state. Join Flora Deverell and Jacob Berntsson as they discuss NRM’s growing influence with guest Jonathan Leman, a researcher at Stockholm-based Expo, which monitors violent extremist far-right activity in the Nordic countries. The podcast also takes a look at some of the most prominent individuals in the Nordic neo-Nazi scene with expert Dr. Louie Dean Valencia-García, an assistant professor of digital history at Texas State University. Together, they explore how Nordic neo-Nazis are exploiting online platforms as a “safe haven”, and mainstream trends such as memes, to spread their message and aid radicalisation on a global scale.

Resources:
- Right-wing terrorism and militancy in the Nordic countries: a comparative case study (Ravndal, 2018)
- Europe’s right wing: A nation-by-nation guide to political parties and extremist groups (van Gilder Cooke, 2011)
- Right-wing extremism in Norway – changes and challenges (Bjorgo, 2019)
- Anti-immigrant ‘Soldiers of Odin’ raise concern in Finland (Rosendahl & Forsell, 2016)
- The Nordic Resistance Movement (Dr Wiggen, 2020)
- How a small Budapest publishing house is quietly fuelling far-right extremism (Owen, 2019)
- Arktos vs Counter-Currents feud splits the alt-right (Lawrence, 2019)
- ‘Finspång’ – an execution meme of the Swedish radical right ignites the political discourse (Onnerfors, 2018)

Louie Dean Valencia-García (@BurntCitrus):
- Far-right revisionism and the end of history (2020)
- The rise of the European far-right in the internet age (Valencia-García, 2018)

Jonathan Leman (@JonathanLeman):
- Hate Beyond Borders: The internationalization of white supremacy (ADL)
- Expo
In the case of the recent terrorist attack in Christchurch, British and Australian tabloids were instrumental in making the gunman’s attack video and manifesto go viral. There have been a number of occasions where the work of the tech sector to take down extremist content from online platforms has been undermined by mainstream media outlets. Join Lorand Bodo as he speaks to Kyle Taylor, executive director of Hacked Off, a group which campaigns for a free and accountable press in the UK, and Abdirahim Saeed, a journalist for BBC Monitoring who tracks and analyses the propaganda output of salafi-jihadist groups like ISIS and Al-Qaeda. Focussing on the UK landscape, this episode explores how news media can provide some of the most effective PR for terrorists, promoting and giving tremendous reach to their messages of hate by spreading videos and images. It particularly focuses on the importance of imposing stringent and robust rules on UK newspapers, which currently lack independent regulation. It’s clear we need a solution, because right now we’re playing directly into the terrorists’ hands.
Although terrorist and extremist groups largely use traditional methods to fund their activities, the anonymity cryptocurrency affords is becoming an increasingly attractive alternative. Join Adam Hadley and Lorand Bodo as they speak to Nick Furneaux, author of ‘Investigating Cryptocurrencies’, Florence Keen, research fellow at the International Centre for the Study of Radicalisation at King’s College London, and retired police officer Andrew McDonald, who served as head of specialist investigations of the UK National Terrorist Financial Investigation Unit at New Scotland Yard. This episode explores the myriad means used by terrorists to sustain their lifestyles, fund operations, recruit individuals, and build capacity. It highlights the need for increased knowledge sharing between law enforcement officials, researchers, and the fintech community to combat the issue. Although we’re not yet in the danger zone with terrorist use of digital money, it’s coming. Terrorists are like any other criminals, and they will exploit any avenue necessary to achieve their goals.
Effective analysis of publicly available information is critical in countering terrorist use of the internet. But just as open source intelligence can be used as a tool for good, would-be terrorists can also exploit the data to plan attacks. Join Adam Hadley and Lorand Bodo as they speak to Benjamin Strick, an open-source investigator for BBC Africa Eye, Nico, a.k.a ‘DutchOSINTGuy’, a former police officer in the Netherlands, and Terry Pattar, who runs the intelligence unit at the security analysis firm Jane’s 360. This episode explores how intrigue and curiosity help these experts infiltrate online extremist networks, where messages of propaganda and hate are being spread. Be warned - tread carefully when entering the world of OSINT and the dark web. Because when you’re looking at terrorists, they could be looking right back at you.
Human rights experts around the world have warned that the rush to tackle terrorist activity online has had worrying implications for fundamental human rights and freedom of speech. How do we differentiate between ‘free speech’ and speech that aims to incite violence? In this episode we find out how governments and tech platforms are attempting to strike a balance between protecting our safety, and protecting our rights. Join Flora Deverell and Jacob Berntsson as they speak to Emma Llanso, director of the Free Expression Project at the Center for Democracy and Technology, and Dr. Krisztina Huszti-Orban, senior legal advisor to the UN Special Rapporteur on Counter-terrorism and Human Rights. As governments and tech platforms across the world work to combat online extremism, this conversation asks – where should we draw the line, and how can we ensure our fundamental freedoms are protected? 
Whether it’s a manifesto posted on 8chan or an attack video uploaded to Facebook, terrorists and right-wing extremists are increasingly using the internet as a way to spread their hate-filled messages. In this episode, we find out how these groups are exploiting an entire tech ecosystem, and what is being done to combat it. Join Adam Hadley and Lorand Bodo as they speak to Matthew Feldman, director of the Centre for Analysis of the Radical Right, and Audrey Alexander, researcher and instructor at West Point’s Combating Terrorism Center. This conversation uncovers how violent extremist groups like ISIS are adapting to an online world, as their physical power diminishes. But does this decentralization actually make them a greater risk?