Understanding Image-Based Sexual Abuse, Deepfakes & the Law

Updated: 2025-06-12

Description

In this episode, we speak with Professor Clare McGlynn, a legal scholar and expert on violence against women and girls, about image-based sexual abuse. 

 

In this episode, we discuss: 

  • What image-based sexual abuse is and why you should use this term rather than ‘revenge porn’ 
  • Who is impacted by image-based sexual abuse 
  • The role of Google and tech platforms in the rise of deepfake sexual abuse 
  • Where to seek help if you’re a victim, and what you can do if you’re worried someone might share your nudes 
  • Why deepfake detection tools work better on images of men 
  • And what you can do as a listener to support this work 

 

Clare McGlynn is a Professor of Law at Durham University in the UK, with expertise in the legal regulation of sexual violence, pornography and online abuse, particularly cyberflashing and image-based sexual abuse (taking, creating and sharing intimate images without consent). She works closely with policy-makers, victim-survivors and the voluntary sector to shape law and policy reforms, as well as with social media and other companies to improve their policies. In 2020, Clare was appointed an Honorary KC (King’s Counsel) in recognition of her work championing women’s equality in the legal profession and shaping new criminal laws on extreme pornography and image-based sexual abuse. 

 

Find more of Professor Clare McGlynn here: https://www.claremcglynn.com/ 

Find her on social: https://bsky.app/profile/claremcglynn.bsky.social

https://www.instagram.com/claremcglynn_/

https://www.linkedin.com/in/clare-mcglynn-32b898238/

Check out the ‘Stop Image Based Abuse’ campaign here: https://www.endviolenceagainstwomen.org.uk/new-campaign-experts-call-for-image-based-abuse-law/ 

Sign the petition to stop image-based abuse: https://chng.it/dgY7sbkqTP 

 

Timestamps: 

00:00 – Trailer  

00:27 – How Clare McGlynn got into this work 

01:50 – Revenge porn vs image-based sexual abuse 

03:10 – Who is most at risk? 

04:11 – The invisible threat of image-based sexual abuse 

06:01 – What laws are in place to protect against this?  

08:33 – How tech platforms have contributed to the rise of deepfake sexual abuse 

12:10 – The conversations we need to be having with young people 

13:55 – About the ‘Stop Image Based Abuse’ campaign 

16:33 – What to do if you’re a victim of image-based sexual abuse 

18:40 – Why have we been so slow to address this issue? 

21:12 – What Clare wants you to understand about image-based sexual abuse 

23:27 – Why we should stop calling it ‘deepfake porn’ - and what to say instead 

27:00 – Why deepfake detection tech works better on images of men 

30:18 – What you can do as a listener to help support this work 

31:31 – Positive accomplishments in this work 

33:09 – Do this if you’re worried someone might share your nudes 

34:59 – Conclusion 


Hosted on Acast. See acast.com/privacy for more information.
