Harms Modeling for Ethical Technology & AI with Microsoft's Mira Lane

Updated: 2021-01-05

Description

Welcome back to the PeaceX series! We're excited to kick off the new year with an extremely important discussion of tech ethics. In this episode, Margarita sits down with Mira Lane, Partner Director of Ethics & Society within AI at Microsoft. Mira started her Ethics & Society team in 2017, motivated by a deep sense of responsibility for the technology work she does at Microsoft. Throughout this riveting conversation, Mira and Margarita discuss designing the future we want to live in through ethical technology and business today. They explore how ethical auditing can be ingrained in product development from the start rather than treated as an afterthought. Margarita also shares her own view that ethics is shaped more by lived experience than by "hard" skills. Tune in for more on Microsoft's Harms Modeling, community juries, peace data, and the fine line between utopian and dystopian futures. You won't want to miss this first episode of 2021!

For more on Microsoft's Harms Modeling: https://docs.microsoft.com/en-us/azure/architecture/guide/responsible-innovation/harms-modeling/

For more on Mira Lane's professional work: https://www.linkedin.com/in/miralane/

Connect with Mira on Twitter: https://twitter.com/mira_lane

Connect with Mira on Instagram: https://instagram.com/miralane


PeaceX