The Algorithmic Futures Podcast


Author: Liz Williams and Zena Assaad


Description

We talk to technology creators, regulators and dreamers from around the world to learn how complex technologies may shape our environment and societies in the years to come.
28 Episodes
How is the booming AI industry linked to the world's growing interest in nuclear power? What might this mean for the future of both sectors (and the planet) moving forward? We talk about all this and far more in our chat with Cindy Vestergaard, Senior Fellow and Director of Converging Technologies and Global Security at the Stimson Center in Washington DC. If you have an interest in how complex (disruptive) technologies shape and are shaped by the complexities of our rapidly changing world, then this episode is for you. This is our last episode of season 3!    Credits Guest – Cindy Vestergaard (Stimson Center) Co-hosts – Liz Williams and Zena Assaad Producers – Robbie Slape, Martin Franklin (East Coast Studio)   The full transcript for this episode is available at https://algorithmicfutures.org/s03e08 
Engineers use STEM skills (and beyond) to solve problems – but what does a career in engineering look like? How do we attract students to careers in engineering? How do we foster and support diversity and inclusion in the engineering workforce? And finally, how do we train engineers to manage the increase in complexity that comes with emergent technologies like artificial intelligence – particularly for safety-critical settings? We explore these questions (and beyond) with Jane MacMaster, Global Engineering Integrity Director at Babcock International Group, and former Chief Engineer of Engineers Australia.   Credits Guest: Jane MacMaster Co-hosts: Zena Assaad and Liz Williams Producers: Robbie Slape, Zena Assaad, Liz Williams Audio / Video producer: Martin Franklin (East Coast Studio)   For the full episode transcript, visit https://algorithmicfutures.org/s03e07   We love feedback! If you enjoyed this episode, share with friends or leave us a 5-star review on Apple Podcasts. This helps us make more of the content you love!
What does it take to regulate artificial intelligence? We invited Professor Geoff Mulgan of University College London, author of When Science Meets Power (Polity Press) and many other titles, to help us unpack the possibilities. Listen in as he draws on his background in technology, governance, academia, and beyond to consider this multidimensional challenge and offer some thoughts on how to make progress. There are some calls to action, too, for those of you in the field.   Credits Guest: Geoff Mulgan Co-Hosts: Zena Assaad and Liz Williams Producers: Robbie Slape, Martin Franklin (East Coast Studio), Zena Assaad, Liz Williams For the full transcript or YouTube vodcast, visit: https://algorithmicfutures.org/s03e06
This month, we chat with Dr Vanessa Pirotta, a wildlife scientist and science communicator with a passion for creatively making use of technology for her work in wildlife conservation. We learn all about how she uses drones to survey the health of whales in transit and what research like this is telling us about these magnificent creatures. We also get into a wide-ranging discussion about life in science as a woman in STEM and in academia, the importance of sharing our work with society, the role citizen science can play in our understanding of the world, and so much more. Vanessa is a real inspiration, and we think our chat with her is a perfect way to get yourself in the mood for Australia's National Science Week. Guest: Vanessa Pirotta Hosts: Zena Assaad, Liz Williams Producers: Zena Assaad, Liz Williams, Robbie Slape, Martin Franklin A video of our chat and a full (edited!) transcript are available on our website: https://algorithmicfutures.org/s03e05 
What are AI standards – and why should we care? Our guest today, Dr Kobi Leins, has first-hand experience as both a contributor to the development of AI standards for the world and a professional working on supporting safe AI in real-world industry contexts. We talk about what AI standards are for and why the discussion and work feeding into standards – and AI development and deployment more broadly – matters for us all. It's the kind of tricky discussion that starts in industry and day-to-day applications of AI, and ends in military uses of AI. If you care about AI ethics, safety, responsibility, all those words – then you need to listen to this conversation. Credits Guest: Dr Kobi Leins Hosts: Zena Assaad and Liz Williams Producers: Robbie Slape, Zena Assaad, Liz Williams, Martin Franklin (East Coast Studio) Thanks to the Australian National Centre for the Public Awareness of Science for letting us use their podcast studio.  For episode links and the full transcript, visit https://algorithmicfutures.org/s03e04
The idea that artificial intelligence is taking our jobs can be scary – but in actuality, there are cases where this is a good thing. Dr Sara Webb (Swinburne University of Technology) shares one of these stories in today's episode, which begins with a TedX talk in Melbourne and ends with a discussion of some of the many ways techniques developed for astrophysics are transforming seemingly unrelated fields. Sara is also a published author with a talent for communicating complex ideas about our universe (and AI) for broad audiences. Listen in to hear more about the role AI is increasingly playing in astronomy, how she got into astrophysics in the first place, and more in this wide-ranging episode that paints a picture of what a career in STEM can look like.   Episode credits: Guest: Sara Webb Co-hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Robbie Slape, Liz Williams, Martin Franklin (East Coast Studio) Thanks to the Australian National Centre for the Public Awareness of Science for letting us use their podcast studio to record this episode. For the full episode transcript, visit https://algorithmicfutures.org
In the age of DALL-E and Stable Diffusion, what counts as art? And what can art tell us about AI? In this episode, we explore these questions and more with the help of Eryk Salvaggio, a US-based artist, designer and researcher whose work explores the fabric of artificial intelligence -- and often playfully defies its boundaries.  Credits Guest – Eryk Salvaggio Hosts – Zena Assaad and Liz Williams Producers – Robbie Slape, Zena Assaad, Liz Williams Audio Producer – Martin Franklin (East Coast Studio) Thank you to the Australian National Centre for the Public Awareness of Science for allowing us to use their podcast studio for this episode. For the full transcript or episode video, visit https://algorithmicfutures.org 
It is the launch of season 3 of this podcast, and we thought it was high time for a positionality statement – er, episode. Why not align it with the start of a new season and our debut on YouTube? Listen in for an episode featuring our co-hosts, Liz Williams and Zena Assaad, in which we explore everything from relics, reactions, reciprocity, risk, and the complexities involved in creating and regulating AI systems in the real world.  Credits: Co-hosts: Zena Assaad and Liz Williams Producers: Robbie Slape, Zena Assaad, Liz Williams, and Martin Franklin (East Coast Studio) Thanks to the Australian National Centre for the Public Awareness of Science for letting us use their podcast studio for recording. We would also like to pay our respects to the Traditional Owners of the lands on which we recorded and edited this episode.  For show notes, the full (edited!) transcript, and maybe even a picture of the big blue ball, visit https://algorithmicfutures.org
In our final episode of season 2, we are grateful to be joined by Damith Herath, Associate Professor of Robotics and Art at the University of Canberra. Damith is a multi-talented roboticist with a long history of working in the art world, and an interest in understanding how to shape human-robot collaboration in real-world environments. During our conversation, Damith talks to us about how his innate drive to experiment with electronics and robotics led him from an entrepreneurial childhood in Sri Lanka to the forefront of robotics and automation research in Australia.  Credits: Guest: Damith Herath (University of Canberra) Co-hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams, Robbie Slape, Martin Franklin (East Coast Studio) Acknowledgements: A special thanks to the ANU School of Cybernetics for lending us the use of their podcast studio for this recording. Transcript:  A full transcript of this episode is available on our website: https://algorithmicfutures.org/s02e09
What does responsibility look like in military contexts – and how do you think about encoding it in autonomous military technologies with the capacity to harm? In today's episode, we explore this topic from a legal perspective with the help of Lauren Sanders. Lauren is a senior research fellow at the University of Queensland with expertise in international criminal law, international humanitarian law, and domestic counter-terrorism law. She is also host and editor of the Law and the Future of War podcast.  Episode Credits: Guest: Lauren Sanders Co-Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams, and Martin Franklin (East Coast Studio) This episode is rated explicit because the topic of discussion may not be suitable for young listeners. For the full episode transcript, visit https://algorithmicfutures.org/s02e08 
In this episode, we explore the "nuclear mindset" – a term being thrown around in discussions about Australia's plans to acquire conventionally-armed, nuclear-powered submarines as part of the AUKUS trilateral partnership between the US, UK, and Australia. With the help of Veronica Taylor, Will Grant, and Ed Simpson, guest co-host AJ Mitchell and I explore what a nuclear mindset might look like, and discuss how we can help a new generation of nuclear technology creators, regulators and dreamers approach their work with the care needed to make use of nuclear technologies safely, responsibly, and securely in an Australian context.  Along the way, we talk about Australia's already lengthy history of working with nuclear technologies, tricky considerations like how to manage nuclear waste (even for widely accepted applications like nuclear medicine), and far more in this wide-ranging and transdisciplinary discussion. There will be lessons in this episode for anyone who designs, manages, or regulates technologies used in safety-critical applications – including those enabled by artificial intelligence.  Episode Credits: Host: Liz Williams Guest co-host: AJ Mitchell AJ Mitchell is a Senior Lecturer in the ANU Department of Nuclear Physics and Accelerator Applications. He is the convenor of the ANU Graduate Certificate of Nuclear Technology Regulation, does research in fundamental nuclear structure and applied nuclear science, and is a passionate educator and science communicator. He is also actively involved in teacher-training projects in Timor Leste and leads a program with the University of Yangon in Myanmar to build teaching and research capacity in physics. Guests: Veronica Taylor, the Professor of Law and Regulation in the School of Regulation and Global Governance (or RegNet) at ANU. 
She is former Dean of the ANU College of Asia and the Pacific, is a member of the ANU Steering Group on Nuclear Technology Stewardship, and is one of the chief investigators for the newly awarded Australian Research Council Industrial Transformation Training Centre for Radiation Innovation. Will Grant is Associate Professor in Science Communication at the Australian National Centre for the Public Awareness of Science, which is based at ANU, and is a prolific writer and contributor on the interaction between science, politics and technology. He is also a member of the ANU Working Group on Nuclear Technology Stewardship. Will has some fantastic podcasts of his own: The Wholesome Show, G'day Patriots and G'day Sausages. Ed Simpson is a Senior Lecturer at the ANU Department of Nuclear Physics and Accelerator Applications, Nuclear Science Lead for the ANU Research School of Physics, and is one of the few nuclear theorists I know who can hold his own in laboratory settings. He is heavily involved in nuclear science education here on campus, has experience in government through service as an Australian Science Policy Fellow, and is also a Chief Investigator on the new Australian Research Council Industrial Transformation Training Centre for Radiation Innovation. Producers: Liz Williams, Martin Franklin (East Coast Studio), Zena Assaad Acknowledgements: A special thanks to the Australian National Centre for the Public Awareness of Science (CPAS) for allowing us to use their recording studio for this episode. For the full episode transcript, visit: https://algorithmicfutures.org/s02e07/  Regarding the explicit rating: This episode mentions nuclear weapons and weapons testing, and also talks about the use of nuclear propulsion for Defence. If you don't wish to discuss these topics with small children, it may be worth saving this episode for another time.
Our episode today features Tracey Spicer, award-winning journalist, author, and social justice advocate, who begins this episode with a story from her own life: her son, after watching an episode of South Park, declared "Mum, I want a robot slave." This declaration prompted Tracey to begin a seven-year journey exploring how society shapes the technology we surround ourselves with, and how technology in turn shapes us. Her findings are documented in her latest book, Man-Made, which was published by Simon & Schuster earlier this year. Tune in to hear more about Tracey's latest book, her work as a journalist and social justice advocate, how technology is changing journalism, life as a working parent, and so much more.  Please note: We discuss some of the realities of work for women. This occasionally touches on topics that are not suitable for young listeners.   Credits Guest: Tracey Spicer Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams, and Martin Franklin (East Coast Studios) Theme music: Coma-Media
You have probably heard of ChatGPT – the generative AI language model that is already transforming work and education. In this episode, we explore the many potential benefits and challenges ChatGPT and models like it pose for education and law with the help of Simon Chesterman, author of We, the Robots? Regulating Artificial Intelligence and the Limits of the Law, David Marshall Professor and Vice Provost of Educational Innovation at the National University of Singapore, Senior Director of AI Governance at AI Singapore, and Editor of the Asian Journal of International Law. This episode has something for everyone who is interested in understanding how we can sensibly make the best use of generative AI models like ChatGPT while mitigating their potential for harm. Credits: Guest: Simon Chesterman Hosts: Zena Assaad and Liz Williams Guest co-hosts: Tom Chan, Matthew Phillipps Producers: Tom Chan, Matthew Phillipps, Robbie Slape, Zena Assaad, Liz Williams, Martin Franklin (East Coast Studios) Theme music: Coma-Media Thank you to the ANU School of Cybernetics for allowing us to record Tom and Matthew's audio in their studio. Transcript: For the full transcript of this episode, visit: https://algorithmicfutures.org/s02e05 
What does human flourishing have to do with human-machine teams? And how do we meaningfully engage stakeholders in consultations about some of the most challenging problems of our time? Listen in as we explore some of these questions with Kate Devitt, co-founder and CEO of BetterBeliefs – a platform for evidence-based stakeholder engagement and decision-making – who also happens to be an internationally recognized leader in ethical robotics, autonomous systems and AI.  Credits: Guest: Kate Devitt Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams, Martin Franklin (East Coast Studios) Theme music: Coma-Media We would like to acknowledge the Traditional Owners of the lands on which this episode was recorded, and pay our respects to Elders past and present.  Content notes: We have chosen to list this episode as explicit because of some discussion of warfare. For the full transcript, visit https://algorithmicfutures.org/s02e04
Most of us have a vested interest in what happens in space – whether we know it or not. Listen in as we talk to Cassandra Steer, Deputy Director of the Australian National University Institute for Space – or ANU InSpace, for short – about space law, diversity and inclusivity in the space sector, and why it matters for us all that diverse perspectives contribute to Australia's future in space. Credits: Guest: Cassandra Steer Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams and Martin Franklin (East Coast Studio) Theme music: Coma-Media *** If you enjoyed this episode, please remember to give us a 5-star review on Apple Podcasts and share the episode with friends and colleagues. We put a lot of time and effort into producing every episode and really appreciate your support. You can also access the full transcript of this episode on our website: https://algorithmicfutures.org *** Notes on the content: We chose to list this episode as explicit because it includes some discussion of sexism and racism, some mention of warfare, and a brief story about discussing terrorism in a classroom setting. Disclaimer: This episode is for your education and entertainment only. None of this is meant to be taken as advice specific to your situation.
Our second episode of Season 2 features Sue Keay. Sue is currently the robotics technology lead at OZ Minerals, Chair and Founder of Robotics Australia Group, and is a member of the Advisory Committee for the National Robotics Strategy (amongst many other accomplishments). She joined us for a chat shortly before the Department of Industry, Science and Resources released its National Robotics Strategy discussion paper – which Sue had a hand in shaping – and shared with us the many challenges and opportunities she sees for the future of robotics in Australia. Episode credits Guest: Sue Keay Hosts: Zena Assaad and Liz Williams Producers: Robbie Slape, Zena Assaad, Liz Williams, and Martin Franklin
Our first episode of Season 2 features Julie Carpenter, author of Culture and Human-Robot Interaction in Militarised Spaces: A War Story. Julie is a social scientist based in San Francisco, and her work explores how humans experience emerging technologies. Listen in as we delve into the relationship between humans and robots, exploring everything from love and intimacy to the bonds humans form with robots deployed in military settings. Content warning and disclaimer: We talk about adult themes in this episode, so it may not be one to share with minors. We also produce this podcast for your education and enjoyment only. Please don't take anything in this episode as advice specific to your situation. Love the episode? We are so glad! Please help others discover our podcast by sharing this episode with friends, family, or colleagues. Listening on Apple Podcasts? You can also help us game the algorithms by giving us a great review. Episode Credits Guest: Julie Carpenter Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad, Liz Williams, and Martin Franklin
Today, we're honoured to be joined by Jenny Zhang -- a software engineer and writer based in Canada. Her purpose-driven approach to technology development comes through clearly throughout our time with her, and (we think) offers up valuable lessons to anyone seeking to generate beneficial impact in the tech industry. Listen in as we talk to Jenny about her circuitous path to software development, what it means to be a full stack engineer, her considerations of privacy and safety in voice datasets, values and career trajectories, and more. This is the last episode for this season of the Algorithmic Futures podcast. Don't worry -- we'll be back next year with more episodes, so stay tuned (and subscribe!). *** Credits Guest: Jenny Zhang Hosts: Zena Assaad and Liz Williams Producers: Zena Assaad and Liz Williams Sound editors: Cyril Buchard (with final edits by Liz Williams) *** To learn more about the podcast and our guests you can visit our website algorithmicfutures.org. And if you've enjoyed this, please like the podcast on Apple Podcasts and share your favourite episodes with others. It really helps us get the word out. And now for a short disclaimer: This podcast is for your education and enjoyment only. It is not intended to provide advice specific to your situation.
In this episode we talk with Caitlin Bentley, a Lecturer in AI Education at King's College London. Caitlin's research has predominantly engaged with questions around how technology systems can be designed and implemented in ways that promote social inclusion, empowerment and democratic participation. Tune in to hear about a theme of fierce women in history, the ups and downs of experimenting with educational pedagogies, intersectionality and its applications in technology research, and critical Black feminists across history. Please note: Caitlin briefly mentions encountering evidence of violence against women as part of her experiences in Morocco. This portion of the episode may not be appropriate for young listeners. Credits: Hosts: Zena Assaad and Liz Williams Guest: Caitlin Bentley Producers: Zena Assaad and Liz Williams Sound editor: Cyril Buchart
In this episode, co-hosts Zena and Liz share some of their experiences on creating podcast episodes in support of the Social Responsibility of Algorithms workshop series and discuss the potential futures of the Algorithmic Futures podcast. Along the way, they have a wide-ranging discussion covering everything from how assumptions get embedded in technologies deployed at scale to what it's like being a woman working in a male-dominated STEM field. This episode was developed in support of the Algorithmic Futures Policy Lab, a collaboration between the Australian National University (ANU) Centre for European Studies, ANU School of Cybernetics, ANU Fenner School of Environment and Society, DIMACS at Rutgers University, and CNRS LAMSADE. The Algorithmic Futures Policy Lab is supported by an Erasmus+ Jean Monnet grant from the European Commission.