AWS For AI
Author: AWS For AI
© Amazon Web Services, Inc. or its affiliates. All rights reserved.
Description
Decoding The Future of Artificial Intelligence with AWS:
Explore the frontiers of artificial intelligence with AWS For AI, your insider guide to the technologies reshaping our world.
Each episode brings you face-to-face with the brilliant minds behind groundbreaking AI innovations, from pioneering researchers to executives transforming businesses with generative AI.
10 Episodes
Join us for a fascinating deep dive with Ali Benfattoum, AWS Principal Product Manager and Physical AI expert, as we explore the revolutionary world of agentic twins and the foundations of physical AI. Ali brings over 15 years of expertise in IoT, Applied AI, and Digital Twins, including his groundbreaking work as creator of the open-source Garnet Framework.

Discover how AI is evolving beyond digital realms into physical applications, transforming everything from smart cities to industrial systems. Learn about the intersection of IoT and AI, the power of knowledge graphs in creating context-aware solutions, and how digital twins are enabling intelligent, responsive environments across major cities and global enterprises.

Get insider insights on semantic data modeling, context management strategies, and the practical implementation of living digital twins. Ali shares his unique perspective bridging mathematics, physics, and cutting-edge AI technology, offering a comprehensive view of where Physical AI is heading and how it's reshaping our interaction with the physical world.

Don't miss this technical exploration where we uncover real-world applications of agentic systems and the infrastructure powering the next generation of intelligent physical environments.

Garnet Framework: https://garnet-framework.tech/
Learn more: https://aws.amazon.com/iot/

Chapters:
0:00:00 : Episode Introduction
0:02:37 : Ali's Journey to Physical AI and IoT
0:05:35 : From Smart Territory Framework to Physical AI
0:08:00 : The Garnet Framework: From Data Silos to Insights
0:12:02 : Building with the Garnet Framework
0:14:58 : AI Agents as Data Consumers
0:18:02 : Knowledge Graphs for True Context-Aware AI
0:22:12 : NGSI-LD Specification
0:24:20 : Sustainable AI Foundations for the Future
0:26:57 : Physical AI: Perceive, Reason, Act in the Real World
0:29:35 : Use Cases with the Most Impact
0:32:48 : Maximizing the ROI of Physical AI
0:39:09 : Next Generation of Automated Industry
0:40:08 : Panama Success Stories
0:42:30 : City of Madrid Success Story
0:43:43 : From Digital Twins to Agentic Twins
0:47:06 : Training World Models, NVIDIA Omniverse
0:50:47 : Build Today for Tomorrow's Innovations on AWS
0:55:15 : Future Trends in Agentic Systems
0:58:46 : Final Thoughts
1:00:02 : Closing Remarks
Join us for an in-depth conversation with Dr. Zuhair Khayat, CTO and co-founder of Lucidya, a groundbreaking AI company revolutionizing customer experience in the Middle East. From his journey through academia at KAUST to pioneering Arabic language AI solutions, Dr. Khayat shares invaluable insights on building technology for Arabic markets. Discover how Lucidya is tackling unique challenges in Arabic language processing, managing customer experience across dialects, and leveraging AI for brand management in the MENA region.

Learn how Lucidya is transforming social listening and customer intelligence through advanced AI. From sophisticated sentiment analysis to real-time brand monitoring across multiple dialects, discover the future of customer experience management.

Learn more: http://go.aws/47yubYq
Lucidya Website: http://go.aws/4846zeH
Dr. Khayat's Research: https://scholar.google.com/citations?...

Chapters:
00:00 - Introduction
02:27 - Lucidya's Origin Story
05:20 - AI Startup Vision Before GenAI
07:30 - From Saudi Arabia to the World
08:11 - Lucidya's Long-Term Mission
08:59 - What Is Customer Experience?
10:55 - Lucidya's Value Proposition
12:18 - Use Cases: Brand Management
17:20 - Use Cases: Customer Support
19:14 - Use Cases: Market Research
20:43 - Use Cases: Marketing Optimization
21:52 - Building Products in the GenAI Era
25:58 - Keeping Up with AI Innovations
26:45 - Product Bundles from Research to Support
28:50 - Evolution of Challenges for Building AI Models
30:18 - ASAD - Arabic Sentiment Analysis Data
31:22 - Navigating Ambiguity in Human Communications
33:40 - Managing Biases in Data with Agents
38:02 - Arabic Dialects Support
40:41 - Split Learning: Collaborative Training on Private Data on the Cloud
47:10 - Opportunities for Split Learning in Regulated Industries
49:35 - U-Shaped Learning as a Collaborative Framework for Privacy
50:12 - A CTO's Advice for AI Startups
53:20 - Choosing the Right Architecture on AWS for Your Team
55:45 - The Future of AI
58:06 - Closing Remarks
Join us for an enlightening conversation with Anton Alexander, AWS's Senior Specialist for Worldwide Foundation Models, as we delve into the complexities of training and scaling large foundation models. Anton brings his unique expertise from working with the world's top model builders, along with his fascinating journey from Trinidad and Tobago to becoming a leading AI infrastructure expert.

Discover practical insights on managing massive GPU clusters, optimizing distributed training, and handling the critical challenges of model development at scale. Learn about cutting-edge solutions in GPU failure detection, checkpointing strategies, and the evolution of inference workloads. Get an insider's perspective on emerging trends like GRPO, visual LLMs, and the future of AI model development.

Don't miss this technical deep dive where we explore real-world solutions for building and deploying foundational AI models, featuring discussions on everything from low-level infrastructure optimization to high-level AI development strategies.

Learn more: http://go.aws/47yubYq
Amazon SageMaker HyperPod: https://aws.amazon.com/fr/sagemaker/ai/hyperpod/
The Llama 3 Herd of Models paper: https://arxiv.org/abs/2407.21783

Chapters:
00:00:00 : Introduction and Guest Background
00:01:18 : Anton's Journey from the Caribbean to AI
00:05:52 : Mathematics in AI
00:07:20 : Large Model Training Challenges
00:09:54 : GPU Failures: The Llama Herd of Models
00:13:40 : Grey Failures
00:15:05 : Model Training Trends
00:17:40 : Managing Mixture-of-Experts Models
00:21:50 : Estimating How Many GPUs You Need
00:25:12 : Monitoring the Loss Function
00:27:08 : Training Crashes
00:28:10 : The SageMaker HyperPod Story
00:32:15 : How We Automate Managing Grey Failures
00:37:28 : Which Metrics to Optimize For
00:40:23 : Checkpointing Strategies
00:44:48 : USE: Utilization, Saturation, Errors
00:50:11 : SageMaker HyperPod for Inference
00:54:58 : Resiliency in Training vs Inference Workloads
00:56:44 : NVIDIA NeMo Ecosystem and Agents
00:59:49 : Future Trends in AI
01:03:17 : Closing Thoughts
Join us as we sit down with AWS Solutions Architect Mirabela Dan for a journey into the world of generative AI for developers. Whether you're a seasoned developer or just getting started, this episode is your gateway to staying ahead in the AI revolution.

Discover the game-changing shift from "vibe coding" to spec-driven development and learn about the latest AWS AI tools that can supercharge your productivity. Get real-world insights on how AI is reshaping the developer landscape and plan essential strategies to future-proof your development career.

Don't miss this power-packed episode where we demystify the intersection of AI and modern development practices. Get ready to transform the way you build with AWS's innovative AI solutions!

Learn more: https://aws.amazon.com/ai/ and https://kiro.dev/
Connect with Mirabela: https://www.linkedin.com/in/carmenmirabeladan/

Chapters:
0:00:00 : Introduction and Guest Background
0:02:48 : Mirabela's Journey at AWS
0:05:38 : Working with Different Customer Types
0:08:52 : Shift from Infrastructure to AI
0:09:35 : Evolution of the Developer Role with AI
0:15:49 : Challenges in Developing with AI
0:20:06 : AI for Code Generation vs Maintenance
0:28:33 : Spec-Driven Development Approach
0:39:30 : Balancing Planning vs Rapid Delivery
0:42:23 : Kiro - AWS AI-Powered IDE
0:44:46 : MCP Integration
0:46:10 : Agent Steering and Hooks
0:54:43 : Code Transformation with AI
0:55:57 : Future of the Developer Role
0:59:55 : Consuming AI/Tech Updates
1:00:17 : Personal Learning Methods
1:01:30 : Optimism for the Future of Development
1:02:34 : Closing Remarks
In this episode of AWS for AI, we sit down with Akshat Prakash, CTO and co-founder of Camb.ai, to explore how this Dubai-based startup is revolutionizing content localization through AI. From making sports accessible in 140+ languages to preserving indigenous cultures with fewer than 600 speakers, discover how they're breaking down global language barriers.

Witness history as we discuss how Camb.ai partnered with NASCAR to become the first company to livestream a race with real-time AI dubbing. Deep dive into their groundbreaking technical solutions, from preserving context, sarcasm, and emotion in speech, to their innovative approach of separating voice identity from speech prosody, solving critical ethical challenges in voice AI.

Get exclusive insights into their MARS and BOLI models, their partnership with AWS, and learn why deep problem understanding trumps technical expertise in building successful AI solutions. Whether you're a technologist, content creator, or business leader, this episode offers valuable insights into the future of global communication.

Learn more about CAMB.AI: https://www.camb.ai/

Chapters:
00:00:00 : Dubbed Introduction
00:00:17 : Episode Introduction
00:02:14 : From Siri to Camb.ai: Akshat's AI Journey
00:03:52 : Breaking Language Barriers: The Camb.ai Family Story
00:05:47 : Beyond Translation: Understanding Cultural Context
00:07:00 : The Story Behind the Name 'CAMB'
00:08:01 : The Mamba Mentality
00:10:19 : Dubai to the World: Exporting AI Innovation
00:12:43 : Making History: First Multi-Language Live Race Stream
00:14:30 : A "Man on the Moon" Moment in AI
00:15:15 : Camb.ai Technology Suite and Offerings
00:19:56 : Creating Value Across All Layers
00:22:36 : Solving Last-Mile Problems in AI Localization
00:23:59 : Focus on Results: Getting the Job Done
00:24:38 : Tackling the Hardest Challenge First: Live Sports
00:28:00 : MARS Architecture: Balancing Prosody, Speed, and Performance
00:31:01 : Understanding Auto-Regression Tradeoffs
00:31:58 : Speaker Entanglement: Core Voice Identity Challenges
00:33:42 : Ethics in Voice Identity Usage
00:35:12 : Building with Resource Constraints
00:36:48 : The Case for Small Language Models
00:42:21 : Speech-to-Speech vs. Cascading Architecture Approach
00:46:30 : Preserving Context in Cascading Architecture
00:49:29 : BOLI: Enhanced Context Through Multi-Modality
00:50:59 : Inclusive AI: Supporting All Languages
00:53:12 : Managing Dialects vs Languages
00:53:49 : MBC Partnership: Advancing Arabic Understanding
00:54:56 : From Sports to Rap: Diverse Use Cases
00:59:53 : A CTO's AWS Journey
01:02:38 : Accelerating Innovation with SageMaker HyperPod
01:03:58 : The Future of AI: Final Thoughts
01:07:24 : Closing Remarks
Explore the cutting-edge world of causal AI with Professor Kun Zhang in this enlightening episode of the AWS for AI podcast. As a leading researcher from MBZUAI and Carnegie Mellon University, Professor Zhang delves into the fundamentals of causal discovery and inference, revealing how these techniques are reshaping the landscape of artificial intelligence. From education to finance, healthcare to climate science, discover how causal AI is revolutionizing diverse fields by answering the crucial "why" and "what if" questions that traditional machine learning often overlooks.

Professor Zhang shares his vision for a future where AI not only provides convenience and safety but also promotes human intelligence and societal harmony. He offers valuable insights on the ethical considerations of AI development and the role of cloud computing in facilitating large-scale AI research collaborations.

Whether you're an AI enthusiast, a researcher, or simply curious about the future of technology, this episode provides a fascinating glimpse into the transformative potential of causal AI. Join us for an in-depth discussion that bridges the gap between correlation and causation, paving the way for more interpretable, robust, and ethical AI systems.

Professor Kun Zhang: https://mbzuai.ac.ae/study/faculty/ku...
MBZUAI: https://mbzuai.ac.ae/
In this insightful episode, we're joined by Eduardo Ordax, Principal Specialist for Generative AI Go-to-Market at AWS, recognized as Spain's #1 most influential person in AI and #14 worldwide. Eduardo shares his deep expertise on the latest trends in generative AI, including the evolution of AI agents, the rise of small language models, and AWS's approach to responsible AI innovation. He offers valuable insights into AWS Nova models, the infrastructure powering enterprise AI, and practical perspectives on building AI solutions at scale.

Whether you're a developer, business leader, or AI enthusiast, this episode provides actionable insights into the current state and future direction of enterprise AI adoption. Eduardo's unique blend of technical knowledge and practical experience offers listeners a comprehensive view of how organizations can successfully implement AI while navigating key challenges around data foundations, model selection, and responsible deployment.

Learn more:
Eduardo's LinkedIn Profile: http://go.aws/4ldsLX6
Nova Models: http://go.aws/4n42nk4
Amazon Bedrock Agents: http://go.aws/3FY13P3
Amazon SageMaker HyperPod: http://go.aws/4kL0SWB
In this episode of AWS for AI, we explore the cutting-edge world of emotional AI with Moayad Rayan, Partner and Head of Technology at Virtue.

Discover how Virtue is bridging the gap between humans and machines through their innovative approach to understanding and responding to human emotions. Learn about their proprietary emotional data analysis system, real-world applications in safety and city planning, and how they're using AWS to scale their solution. From reducing human errors by 75% to creating more empathetic AI interactions, this episode reveals the future of human-centered artificial intelligence.

To learn more about Virtue: virtuedeep.tech
Ownverse: ownexperiences.com
Learn more at: https://aws.amazon.com/ai/
Join us for an enlightening conversation with Julien Simon, VP and Chief Evangelist at ARCEE.AI, as he shares deep insights on building practical and cost-efficient AI solutions. From his extensive experience at AWS, Hugging Face, and now ARCEE.AI, Julien discusses why "small is beautiful" when it comes to language models, revealing how 10B parameter models can now match the performance of much larger 72B models from just months ago. Learn about innovative techniques like model merging, the importance of proper infrastructure choices, and practical advice for organizations starting their AI journey.

This episode covers critical topics including:
- Why small language models are the future of enterprise AI
- How to optimize costs while maintaining performance
- The role of CPU vs GPU inference
- Essential architecture considerations for AI workloads
- Best practices for building production-ready AI systems

Whether you're a startup, enterprise, or public sector organization, this episode offers invaluable guidance on building scalable, efficient, and practical AI solutions in today's rapidly evolving landscape.

Julien Simon's YouTube channel: https://www.youtube.com/@juliensimonfr
To learn more about ARCEE.AI: https://www.arcee.ai/
Explore the groundbreaking development of Opus, the world's first large work model. With over two decades of experience across financial services, cybersecurity, and AI, Phillip Kingston, CTO of AppliedAI, shares fascinating insights into how Opus is revolutionizing enterprise workflow automation.

Learn how Opus differs from traditional language models by predicting actions rather than words, and discover how it leverages a massive knowledge graph containing millions of documented business processes to optimize workflows across organizations. Phillip discusses the platform's unique approach to blending human and AI capabilities, the future of enterprise AI adoption, and how AppliedAI is helping regulated industries embrace innovation while maintaining compliance.

The conversation covers everything from the technical architecture behind Opus to broader philosophical questions about the future of work. Whether you're a technology leader, developer, or business professional interested in enterprise AI transformation, this episode offers valuable perspectives on how AI is reshaping organizational processes and decision-making.

Join us for an illuminating discussion about the next frontier of AI-powered workflow automation and learn how AppliedAI is helping organizations navigate the transition to more intelligent, efficient operations.

To learn more about AppliedAI Opus: https://www.opus.com/