Facial Recognition: What if it hurts more than it helps?

Updated: 2021-06-10

Description

What if facial recognition causes more harm than help? We dive into the bias embedded in facial recognition technology itself and the bias in the use of these tools by law enforcement and the government through stories, conversations, and lots of research.

Subscribe for episodes every other Thursday!

— Guests: Darren Byler, Jameson Spivack, and Ella Jakubowska

Darren Byler is an anthropologist and incoming professor at Simon Fraser University who focuses on Uyghur dispossession, infrastructural power, and "terror capitalism."

Jameson Spivack is an Associate at the Georgetown Law Center on Privacy and Technology, focused on the use of algorithmic technologies like face recognition, predictive policing, and risk assessment in the criminal legal system.

Ella Jakubowska is a Policy Advisor at European Digital Rights with a strong focus on facial recognition, biometrics, and fundamental rights.

— Links —

A.I. For Anyone is a non-profit dedicated to helping you learn about AI.

Find us on Instagram, Twitter, Facebook, LinkedIn, and YouTube at @aiforanyone

& check us out at aiforanyone.org/

& email your friends at podcast@aiforanyone.org

Brought to you by Adam Lindquist, Aneekah Uddin, Mac McMahon, and the rest of the AI4A team.

