What if your phone could detect how you feel?
What if your computer had an "emotion chip" — AI that could read the expression on your face (or the tone in your voice) and know how you’re feeling? Could online courses teach you better if they knew when you were bored or confused? Could your car help you stay awake if you drift off when driving? These are the questions Rana El-Kaliouby asked when she built an AI tool that examines every micro-muscle in the human face to detect universal emotions — happiness, fear, grief, disgust.
Through her company Affectiva, Rana wants to make technology more human, which she believes will serve us better. But in the wrong hands, could this emotion-reading engine take advantage of us at our most vulnerable moments? Could our inner feelings be exposed publicly when we don’t want them to be? How might advertisers exploit us if they can read our facial expressions?
To help us see around corners, we’re joined by special guests including Esther Perel (relationship expert; host of the podcast “Where Should We Begin?”); Joy Buolamwini (founder, Algorithmic Justice League); Sam Altman (chairman, Y Combinator; cofounder, OpenAI); Greg Brockman (cofounder, OpenAI); and Joi Ito (director, MIT Media Lab).