Observability, Agents & Humanity: How PwC's Nathan Reichardt Is Rewriting the Rules of Responsible AI
Description
In this episode of Careers and the Business of Law, David Cowen sits down with Nathan Reichardt, PwC's Lead Managed Services Director and AI Champion, for a conversation that bridges technology and humanity. They unpack why "observability" isn't just a technical concept; it's the foundation of trust in an age of autonomous agents. From building glass-box systems that make AI accountable to recognizing the invisible pressures on professionals, this discussion explores what it really takes to lead responsibly in the era of AI.
Key Topics Covered:
- Agents aren't magic; you must observe them. Why oversight is essential as AI agents act and learn autonomously.
- From black box to glass box. Transparency, explainability, and compliance as non-negotiable design principles.
- Responsible AI in practice. What observability really means for governance, risk, and trust.
- The rise of new roles. Why "AI Observer" and "Observability Lead" may soon become critical titles inside legal and business ops.
- The human dimension. How leaders can apply observability to people, spotting stress, isolation, and burnout before it's too late.
- From pilot to practice. PwC's approach to scaling agentic AI safely through iteration, measurement, and feedback.
🎧 Subscribe on Apple Podcasts, Spotify, or wherever you listen. Share this episode and take your career from now to next!
💡To learn more about the future of legal innovation, visit https://cowengroup.com/
Never eat alone!