Apple to begin reporting Child Sexual Abuse Material (CSAM)

Update: 2021-08-16

Description

Apple recently announced that, with the iOS 15 update, it will begin reporting Child Sexual Abuse Material (CSAM) to law enforcement. The new system identifies images using a process called hashing, which turns an image into a numerical fingerprint that can be compared against a database of known CSAM. On this episode, we discuss how Apple's new system will work and how this bold step in combating child sexual abuse is being received by privacy-sensitive users around the world.
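
As a rough illustration of the hashing step described above, here is a minimal sketch of a perceptual "average hash" in Python, assuming the Pillow imaging library is installed. Apple's actual system uses NeuralHash, a neural-network-based perceptual hash; the sketch below is only a toy version of the general idea of turning an image into a comparable number.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint (a perceptual "average hash")."""
    # Shrink to a small grayscale thumbnail, discarding fine detail so that
    # minor edits (resizing, recompression) barely change the result.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each pixel contributes one bit: 1 if it is brighter than the average.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two hashes; a small distance suggests similar images."""
    return bin(a ^ b).count("1")

A matching system of this general shape never needs to view the images themselves; it only compares fingerprints against a list of known hashes, which is the property Apple points to when describing its approach as privacy-conscious.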

Links:

Apple to combat Child Sexual Abuse Material: https://www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html

National Center for Missing & Exploited Children (NCMEC): https://www.missingkids.org

Internet Watch Foundation (IWF): https://www.iwf.org.uk

This podcast is for informational purposes only and is not intended to replace professional legal, financial or insurance advice. We are not responsible for any losses, damages, or liabilities that may arise from the use of this podcast. The content and views expressed are those of the host and guests.


Hosts: Ranson Burkette & Tony Anscombe