Life Is But A Stream
Author: Confluent
© Confluent
Description
There’s a new divide in the data revolution: the organizations harnessing real-time data streaming for immediate impact, and those left in the dust.
Life Is But A Stream is a web show for technology leaders who need to act the instant data is created. Every episode, technical leaders and industry pioneers share how they’re utilizing data streaming to enrich, analyze, and put real-time data to work across use cases that were once unimaginable. Host Joseph Morais and senior tech leaders break down winning strategies with real-world examples, teaching you how to accelerate decision-making, achieve measurable business impact, and turn your data into your biggest competitive advantage.
Ready to lead the data streaming revolution? Join us on Life Is But A Stream.
Powered by the data streaming experts at Confluent.
22 Episodes
How is it possible to order groceries to your doorstep in seconds, yet still have to fax a signed contract and deliver a physical check to book a bus? For Busie, the answer wasn’t just building a marketplace; it was building the underlying infrastructure from the ground up. By adopting a central nervous system approach to data, their lean engineering team has modernized a legacy industry, replacing manual workflows with real-time ecommerce and automated operations.

In this episode, our host, Joseph, sits down with Brady Perry, Co-Founder and CTO of Busie, to discuss the transformation from point-to-point REST APIs to a fully decoupled, event-driven architecture. Brady shares the pragmatic “buy vs. build” logic that allowed them to offload Kafka management to Confluent Cloud, freeing his team to focus on solving customer pain points rather than managing infrastructure.

You’ll Learn:
How event-driven architecture enables loose coupling and always-on systems
Why data streaming became the architectural centerpiece for a lean engineering team
Why a small startup team chose managed Kafka to accelerate delivery and shift focus from ops to value

About the Guest:
Brady Perry is the Co-Founder & CTO at Busie and a technology entrepreneur focused on solving outdated problems with modern software solutions. A self-taught software developer, Brady brings hands-on experience across TypeScript, JavaScript, React, Python, Django, HTML, and CSS, along with expertise in multiple PLC scripting languages. His background spans both software and operational systems, giving him a practical perspective on building scalable, real-world platforms.

Guest Highlight:
“Data streaming has actually become sort of the centerpiece of the architecture that we’ve built… the stream becomes the API. That’s where the definition is. That’s where the contract is.”

Episode Timestamps:
07:20 – Data Streaming Goodness
19:30 – Beyond the Stream
39:30 – Quick Bytes
45:15 – Joseph’s Top Takeaways

Dive Deeper into Data Streaming:
EP1—Stream On: Unleashing Innovation With Data Streaming | Life Is But A Stream
EP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A Stream
EP3—The Connective Tissue: Shift Left to Turn Data Chaos to Clarity | Life Is But A Stream

Links & Resources:
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Brady’s LinkedIn: linkedin.com/in/brady-perry
Building a Charter Platform From the Ground Up: https://www.confluent.io/blog/busie-building-a-charter-platform/?utm_source=simplecast&utm_medium=podcast&utm_id=tm.content_life-is-but-a-stream-busie
Learn more at Confluent.io

Our Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
AI isn’t just evolving—it’s reshaping who your customers are, how systems operate, and what real time really means. From machines making purchase decisions to agents increasing query volume across databases, the realities of 2026 are forcing leaders to rethink data architecture and governance strategies at a fundamental level.

In this episode, Joseph is joined by Will LaForest (Field CTO, Confluent), Adi Polak (Director of Developer Advocacy & Experience, Confluent), and independent analyst Sanjeev Mohan to break down critical insights from Confluent’s 2026 Predictions Report. Together, they explore how agentic AI will transform digital commerce, why Model Context Protocol is quickly becoming table stakes, and how context engineering is emerging as the next major unlock for AI systems. The conversation also dives into the acceleration of AI-enabled cybercrime, why enterprises can no longer dismiss data governance, how Apache Iceberg™ is quickly becoming the standard in cold data management, and much more.

If you’re navigating AI and data streaming, this episode offers a grounded, opinionated take on what’s coming next—and how technologies like Apache Kafka® and Apache Flink® will shape what you need to do now.

About the Guests:
Will LaForest
Will is Field CTO for Confluent. He works with customers across a broad spectrum of industries and government, enabling them to realize the benefits of a data-in-motion architecture with event streaming. He is passionate about data technology innovation and has spent 26 years helping customers wrangle data at massive scale. His technical career spans diverse areas, from software engineering, NoSQL, data science, cloud computing, and machine learning to building statistical visualization software, but began with code slinging at DARPA as a teenager. Will holds degrees in mathematics and physics from the University of Virginia.

Adi Polak
Adi is the Director of Advocacy and Developer Experience Engineering at Confluent. For most of her professional life, she has worked with data and machine learning for operations and analytics. As a data practitioner, she developed algorithms to solve real-world problems using machine learning techniques and expertise in Apache Spark, Kafka, HDFS, and distributed large-scale systems. Adi has taught thousands of practitioners how to scale machine learning systems, and is the author of Scaling Machine Learning with Spark and High Performance Spark (2nd Edition).

Sanjeev Mohan
Sanjeev is a recognized thought leader in cloud technologies, modern data architectures, analytics, and artificial intelligence. With a keen focus on emerging trends and technologies, Sanjeev hosts the It Depends podcast and authors regular Medium blogs. He is also the author of Data Products for Dummies. Formerly a Vice President at Gartner, Sanjeev was renowned for his in-depth research and strategic insights, shaping the research agenda for data and analytics globally. Over the past three years, he has led SanjMo, a consultancy specializing in technical advisory services that elevate category and brand awareness for clients.

Guest Highlights:
“There was already an asymmetric relationship between attackers and defenders. This just increases that, because it’s easier to use AI to attack than it is to use it to defend.” — Will LaForest
“We’re about to accidentally DDoS our software and databases, because we’re giving a tool to automate things without thinking if our systems are actually ready for it.” — Adi Polak
“Your proprietary data is your only moat. Everything else—models, performance, tooling—will keep changing.” — Sanjeev Mohan

Episode Timestamps:
03:40 – Predictions Report Overview
05:05 – Prediction 1: The Rise of Agentic Commerce: Machines Are Your New Customers
12:45 – Prediction 2: Leading Platforms Will Offer Model Context Protocol
18:00 – Prediction 3: Context Engineering Is the Next AI Unlock
24:15 – Prediction 4: AI Will Apply Increased Pressure to Existing Databases
30:10 – Prediction 5: AI Will Drive Cyber Crime to Unprecedented Levels
36:15 – Prediction 6: AI Will Accelerate Enterprise Investment in Data Governance
43:25 – Prediction 7: Apache Iceberg™ Will Become the Standard for Cost-Effective Cold Data Management
49:20 – Prediction 8: Your AI Strategy Will Need an Independent Data Plane to Avoid Overcommitting
55:00 – Prediction 9: Early Adopters of Durable Execution Engines Will Gain a Competitive AI Edge
1:00:01 – Prediction 10: Improvements in Generative AI Will Help Businesses Finally Address Legacy Tech Debt

Links & Resources:
Confluent’s 2026 Predictions Report: [link here]
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Will’s LinkedIn: linkedin.com/in/willlaforest
Adi’s LinkedIn: linkedin.com/in/polak-adi
Sanjeev’s LinkedIn: linkedin.com/in/sanjmo
Listen to Podcast: https://pod.link/1792991234
Learn more at Confluent.io
Legacy systems don’t have to hold you back. In high-volume markets where milliseconds matter, MarketAxess—an electronic trading platform for institutional investors—needed a way to scale trade data reliably across on-prem systems and the cloud. That journey—from self-managed Apache Kafka® to MSK to Confluent Cloud—reshaped how their engineering teams build, scale, and govern real-time infrastructure.

In this episode, Andy Steinmann, Software Engineering Manager at MarketAxess, breaks down how his team evolved from a monolithic, socket-driven architecture to a streaming-first platform. He shares why Kafka became the foundation of their 2.0 strategy, how Confluent Cloud eliminated operational drag during peak trading windows, and what it takes to modernize without disrupting a mission-critical financial ecosystem.

You’ll learn:
How MarketAxess scaled real-time trade flows across on-prem and the cloud
How Confluent Cloud eliminated outages and operational bottlenecks
How to approach modernization through iterative “2.0” patterns and dark deployments

If you’re ready to explore hybrid connectivity, governance, performance tuning, and the operational realities of scaling market-data workloads, this is the episode for you.

About the Guest:
Andy Steinmann, Software Engineering Manager at MarketAxess, is a seasoned software engineer with 20+ years of experience in web, mobile, and backend technologies delivering proven solutions to thousands of users. Andy has been the technical lead/architect on many teams and has participated in implementing and adapting agile methodologies to improve productivity.

Guest Highlight:
“One of the big benefits of the partnership with Confluent has not only been the zero cluster outages—but the bigger side of it was really the access to the experts… Somebody to walk that journey with you as opposed to just starting with a Google search or a ChatGPT question.”

Episode Timestamps:
3:20 – Data Streaming Goodness
20:20 – Beyond the Stream
36:15 – Quick Bytes
43:15 – Joseph’s Top Takeaways

Links & Resources:
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Learn more at Confluent.io
In 2025, real-time data moved from important to mission-critical. From billion-event pipelines to AI agents reasoning over fresh data streams, the conversations this year revealed just how fast organizations are rethinking their data infrastructure, and what’s possible when data streaming becomes the foundation.

In this episode, Joseph Morais curates standout moments from the year: the stakes of real-time data, the architectural decisions behind enterprise-grade streaming, and how AI strategy is shifting toward agentic, event-driven systems. You’ll hear leaders discuss architectures designed for zero data loss, solid governance, multi-agent orchestration, and why organizations choose Confluent as their platform for data streaming at scale.

You’ll learn:
How enterprises use Apache Kafka® and Apache Flink® to meet mission-critical guarantees like reliability and data integrity
Why real-time pipelines matter for agentic AI and fresh-signal reasoning
What leaders evaluate when choosing a data streaming platform
How to prepare architectures for multi-agent systems and 2026 AI roadmaps

The Guests:
Cosmo Wolfe, CTO, Metronome
Chris Kapp, Software Architect, HS1 (Henry Schein One)
Alex Haugland, Cursor Engineer, Anysphere
Steffen Hoellinger, Co-Founder & CEO, Airy
Ethan Chan, Co-Founder & CEO, Allium
Joe Pardi, Sr. Director of Global Data Engineering, Covetrus
Rami Al Lolah, Lead Architect - Integration Center of Excellence, Swedbank
Sean Falconer, Sr. Director, Product Management - AI Strategy, Confluent
Jeffrey Jonathan Jennings (J3), Founder, signalRoom
Todd Smitala, Lead Engineer, Association of Professional Flight Attendants

Guest Highlights:
“We are processing events on a second-level latency end-to-end… it'll show up on your end customer's invoice in sub-seconds, and you're cutting users off for fraud, all done exactly once.” — Cosmo Wolfe, Metronome
“Security loves us, right? Because everything is immutable. We know who did everything and we know who can access what and who has been accessing what.” — Chris Kapp, HS1 (Henry Schein One)
“These are not dashboards… If you want to transform your business and take a proactive approach, reacting to signals as they’re happening, you need to shift your mindset and move data responsibilities left.” — Sean Falconer, Confluent

Episode Timestamps:
(00:00) - Looking Back at Data Streaming in 2025
(00:38) - Segment 1: Unique Streaming Use Cases
(06:05) - Segment 2: The Power of Partnerships
(11:23) - Segment 3: Data Streaming and Real-Time AI

Links & Resources:
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Learn more at Confluent.io
Saudi Arabia is rapidly emerging as a global AI powerhouse, a transformation fueled by a fundamental shift in how data is managed, processed, and activated. Mohamed Ahmed Anwar, General Manager of Presales at sccc by stc (Saudi Cloud Computing Company), explains why they adopted Confluent’s data streaming platform: cyber resilience and security, strong governance, and scalability paired with simplified operations at enterprise scale.

Anwar unpacks why data streaming is foundational to modern AI, from launching model-as-a-service offerings to enabling agentic architectures. He highlights how sccc by stc’s strategic partnership with Confluent delivers value extending far beyond open source Apache Kafka®, including connectors, governance with schemas, Kubernetes-based deployment, self-balancing clusters, and Cluster Linking, capabilities that make enterprise operations tractable without an “army of engineers.”

You’ll Learn:
How sccc by stc uses Confluent to deliver a managed data streaming platform that powers AI use cases while maintaining performance, security, and control across public and private sectors
Why data governance, local compliance, and operational scalability are critical within Saudi Arabia
What makes real-time platforms indispensable for enhanced customer experience and operational agility
How data streaming transforms traditional batch into dynamic, scalable AI architectures (e.g., RAG and agentic systems) by ensuring fresh, trustworthy data

About the Guest:
Mohamed Ahmed Anwar is a seasoned technology leader with over 18 years of experience driving innovation across the IT industry. He has held leadership roles at global tech giants including Microsoft, Dell Technologies, and Pure Storage, where he led complex digital transformation initiatives with a focus on delivering business impact. Currently, he serves as the General Manager of Presales at sccc by stc, where he drives strategic efforts to empower Saudi organizations through advanced cloud services, accelerating their growth and digital modernization.

Throughout his career, Anwar has led initiatives in data analytics, cloud adoption, and artificial intelligence, enabling organizations to harness the power of emerging technologies. His passion for mentorship and team development is reflected in his dedication to nurturing talent and building resilient, future-ready teams. Anwar is driven by a belief in the transformative potential of technology, consistently working to bridge strategic vision with practical execution to deliver meaningful, lasting impact.

Guest Highlight:
“Batch computing is like getting your water from a water well. You have a bucket, and you are governed by the size of the bucket and by the speed and the strength in your arms. While on a streaming platform, you have a stream of data. You govern it. You know exactly how you will direct this stream of data, how you will store it, and so on. This actually affected the time-to-value for our customers.”

Episode Timestamps:
1:00 — Guest introduction & company overview
5:55 — Segment 1: Data Streaming Goodness
14:05 — Segment 2: Beyond the Stream
26:50 — Segment 3: Quick Bytes
30:15 — Joseph’s Top Takeaways

Links & Resources:
Connect with Joseph: @thedatagiant
sccc by stc: https://www.youtube.com/@sccc_bystc/videos
Joseph’s LinkedIn
Anwar’s LinkedIn
Learn more at Confluent.io
From the heart of the expo hall at Current NOLA, this special episode drops you into the conversations, the energy, and the breakthroughs as they happened. Host Joseph Morais and co-host Adi Polak talk with data streaming leaders and community voices to unpack how teams are using Apache Kafka® and Apache Flink® to power low-latency, AI-ready applications—covering patterns from usage-based billing and hybrid operations to cost efficiency and streaming governance. You’ll also hear how shift-left processing and governance make AI-ready data possible at scale. Plus, we break down key launches, including Queues for Kafka, Confluent Private Cloud, the Real-Time Context Engine for AI, and more.

You’ll learn:
How modern architectures power real-time billing, entitlements, and fraud detection
Why exactly-once processing and stateful workloads matter at scale
How Unified Stream Manager and Confluent Private Cloud simplify hybrid, multi-cluster operations with a single pane of glass
How Real-Time Context Engine serves governed, low-latency context to agentic AI via MCP

The Hosts:
Joseph Morais, Technical Champion, Confluent
Adi Polak, Director of Advocacy and Developer Experience Engineering, Confluent

The Guests:
Cosmo Wolfe, CTO, Metronome
Kevin Balaji, Sr. Director, Product Marketing, Confluent
Adam Bellemare, Principal Technologist, Confluent
Sam Barker, Principal Software Engineer, IBM
Olena Kutsenko, Staff Developer Advocate, Confluent
Scott Haines, O’Reilly Author and OSS Educator

Guest Highlights:
“If your company is shifting to be more nimble with monetization, and especially if you're switching to more of a consumption-based model, then the underlying systems have to be real time to support that.” — Cosmo Wolfe, Metronome
“So many customers out there… they are running in Amazon and in their data center and somewhere else. They're all looking for ways to monitor and manage all these different clusters in one way.” — Kevin Balaji, Confluent
“There's a reason we call it time travel—because we're going back to a prior point in history to replay data and do something else with it.” — Scott Haines, O’Reilly Author and OSS Educator

Episode Timestamps:
01:00 — Cosmo Wolfe, CTO, Metronome
17:10 — Kevin Balaji, Sr. Director, Product Marketing, Confluent
28:40 — Adam Bellemare, Principal Technologist, Confluent
43:15 — Sam Barker, Principal Software Engineer, IBM
50:30 — Olena Kutsenko, Staff Developer Advocate, Confluent
01:05:00 — Scott Haines, O’Reilly Author and OSS Educator

Links & Resources:
Connect with Joseph: https://www.youtube.com/@thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Adi’s LinkedIn: https://www.linkedin.com/in/polak-adi/
Keynote: Building Intelligent Systems on Real-time Data
Watch Current On Demand: https://current.confluent.io/archive/2025/new-orleans?utm_source=simplecast&utm_medium=podcast&utm_id=tm.content_life-is-but-a-stream
Context-Driven AI Reigned Supreme at Current New Orleans: https://www.confluent.io/blog/context-driven-ai-current-new-orleans/?utm_source=simplecast&utm_medium=podcast&utm_id=tm.content_life-is-but-a-stream
Building Event-Driven Microservices (2E): https://www.confluent.io/resources/ebook/building-event-driven-microservices-2e/?utm_source=simplecast&utm_medium=podcast&utm_id=tm.content_life-is-but-a-stream
Learn more at Confluent.io
How quickly can your business turn usage into revenue? For modern software-as-a-service (SaaS) companies, speed is a competitive advantage. Metronome is redefining what’s possible with a real-time monetization engine that scales to billions of events per day—all powered by Confluent’s data streaming platform and Apache Kafka®.

In this episode, Cosmo Wolfe, CTO at Metronome, joins Joseph Morais to discuss how real-time data streaming enables everything from sub-second billing and fraud prevention to new pricing models and product-led growth. He explains how Metronome processes tens of thousands of invoices per second and billions of events per day, all while keeping 90% of their engineering team focused on building features, not managing operations.

You’ll learn:
How Metronome achieves second-level latency for billing and fraud detection
Why real-time monetization requires a data streaming foundation
How Confluent’s managed data streaming platform lets engineering teams focus on innovation, not infrastructure
What “time to revenue” means—and why it’s the new north star for SaaS growth

If your business depends on billing, pricing, or product usage data, this episode shows how to turn streaming into a strategic advantage.

About the Guest:
Cosmo Wolfe is the Chief Technology Officer at Metronome. He is interested in using technology to unlock data, and is experienced in building and managing web, security, and identity systems and teams. Cosmo is currently reinventing billing at Metronome.

Guest Highlights:
“We are processing events on a second-level latency end to end. So if an event occurs in your business, it’ll show up on your end customer’s invoice in sub-seconds.”
“We think of the core data infrastructure at Metronome as having two axes—time to revenue, which is basically that second-level latency, and the cardinality of data we can support.”
“When you’re making monetization a core part of your product experience, it requires really up-to-date data. And so it requires Metronome, and thus requires Confluent.”

Episode Timestamps:
(00:45) - Metronome and the Importance of Real-Time Data
(05:45) - Data Streaming Goodness
(25:25) - Beyond the Stream
(38:00) - Quick Bytes
(42:35) - Joseph’s Top 3 Takeaways

Links & Resources:
Connect with Joseph: @thedatagiant
Metronome: @getmetronome
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Cosmo’s LinkedIn: linkedin.com/in/cosmowolfe
How Cursor and WarpStream Power AI Productivity: The Future of Coding: How Cursor and WarpStream Power AI Productivity | Life Is But A Stream
Learn more at Confluent.io
Register now for Current NOLA 2025
Software development is changing fast. With Cursor, Anysphere is building an AI-forward IDE that fuses human creativity with machine intelligence. At the heart of this transformation is data streaming—making it possible to train models responsibly, deliver lightning-fast Tab completions, and scale telemetry without breaking engineering velocity.

In this episode, engineer Alex Haugland shares how WarpStream gives Cursor sovereignty over user data, how telemetry and accounting pipelines strengthen product decisions, and why “coding is really just a bug” in how we interact with computers.

You’ll learn:
Why data sovereignty and trust are critical for AI-powered products
How WarpStream abstracts complexity while preserving control over data
Where telemetry and accounting data streaming improve both user experience and business outcomes
How to right-size your data stack and avoid mismatched scale

For senior leaders evaluating AI and streaming strategies, Cursor offers a blueprint for balancing innovation, cost, and trust.

About the Guest:
Alex Haugland is an engineer at Anysphere working on Cursor. He is a massive-scale distributed systems performance engineer, enthusiastic about understanding complex technical systems, organizational systems, and mentoring junior engineers.

Guest Highlight:
“Our business is built on top of data, and a lot of it is data that people have been really generous to allow us to use… WarpStream lets us do that in a way where the data sits entirely inside stuff that we control and have access to. Being able to say, ‘All this stuff is locked down on S3; we have control of it,’ is really, really important to us.”

Episode Timestamps:
(00:45) - Anysphere, Cursor, and Data Streaming Strategy
(04:05) - Data Streaming Goodness
(21:30) - Beyond the Stream
(32:00) - Quick Bytes
(34:45) - Joseph’s Top 3 Takeaways

Links & Resources:
Connect with Joseph: @thedatagiant
WarpStream: @WarpStreamLabs
Cursor: @cursor_ai
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Alex’s LinkedIn: linkedin.com/in/alex-haugland-4ba8893b
Learn more at Confluent.io
Register now for Current NOLA 2025: https://current.confluent.io/new-orleans?utm_source=simplecast&utm_medium=podcast&utm_campaign=tm.content_life-is-but-a-stream
All AI problems are data problems—and one of the biggest is getting AI agents to talk to each other. This special episode with Sean Falconer dives into how agents built by different teams often end up stranded in “intelligence silos,” unable to collaborate or share context. The result? Fragmented AI that struggles to deliver real business value. Tune in to learn why moving beyond isolated AI experiments calls for a robust data streaming backbone—and how Confluent's strategic partnership with Google Cloud sets organizations up for building multi-agent AI systems at scale. With technologies like Apache Kafka®, agents can tap into a central stream of trustworthy data—effectively creating a nervous system for your AI ecosystem. That backbone enables real-time exchange, scalable collaboration, and reliable intelligence across the enterprise—unlocking speed and innovation that siloed AI can’t deliver.About the Guest:Sean Falconer is a Senior Director, Product Management - AI Strategy at Confluent, where he works on AI strategy and thought leadership. Sean’s been an academic, startup founder, and Googler. He has published works covering a wide range of topics from AI to quantum computing. Sean also hosts the popular engineering podcasts Software Engineering Daily and Software Huddle. Guest Highlight:“What is the message between agents? It's just data. You're moving that data from one agent system to another agent system, whether that's a multi-agent system that's collaborating or even these agents that exist in different, you know, parts of the business. You can use data streaming to essentially be the bridge, where those agents’ shared language, the shared communication, becomes events.”Episode Timestamps:*(00:00) - Introduction*(00:34) - Why AI is Fundamentally a Data Challenge*(02:17) - Data Streaming: The Foundation for AI*(07:21) - The “Island of Agents” Problem*(09:46) - Streaming Agents vs. 
Chatbots*(12:35) - The Consequences of Disconnected Agents*(14:09) - The Power of Event-Driven Architecture*(25:04) - The A2A Protocol—Google Cloud and Confluent *(31:19) - Model Context Protocol (MCP)*(34:13) - Agent Development Kit (ADK)*(37:36) - Real-World AI Use Cases*(42:20) - Key GenAI Features on Confluent Cloud *(45:43) - Wrap-Up: TakeawaysLinks & Resources:Connect with Joseph: https://www.youtube.com/@TheDataGiantWhy Google’s Agent2Agent Protocol Needs Apache Kafka®Joseph’s LinkedIn: https://www.linkedin.com/in/thedatagiantSean’s LinkedIn: https://www.linkedin.com/in/seanf/Real-Time AI Agents Powered by Apache Kafka®, Apache Flink®, and Google CloudListen to the podcast: https://flowsto.com/r/9BRiRRl1cLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io.About the ShowHosted by Joseph Morais, Life Is But A Stream is for technology leaders looking to embrace the data revolution and transform their organizations through data streaming. Instant analytics, next level customer experiences, and business innovation that was previously unimaginable: it all starts with Life Is But A Stream. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
In a highly regulated industry where every millisecond matters, Swedbank is shifting its data strategy to move fast without breaking trust. In this episode, Rami Al Lolah, Lead Architect at Swedbank’s Integration Center of Excellence, shares how the bank built a future-proof foundation using Apache Kafka® with Confluent Cloud.From early investments in building reusable data products to shifting left, Swedbank’s strategy goes beyond infrastructure. It’s about enabling interoperability, eliminating data silos, and empowering teams to think differently about how data should flow across a modern financial institution.You’ll Learn:How Swedbank reduces risk by accelerating data availability across teamsWhy “shift left” is a cultural and architectural shift, not just a development patternWhy Swedbank chose Confluent over other solutionsWhere Swedbank sees the biggest opportunities for AI and streaming in the futureWhether you're leading modernization for a financial institution or scaling real-time data for your organization, this episode offers a blueprint for transformation. About the Guest:Rami Al Lolah is a seasoned professional, technologist, and domain expert with 25+ years in the application and systems integration domain. His expertise spans designing, writing, and delivering major integration solutions from world-leading integration technology providers, leading technical/services teams, and directing major integration technical implementations. Guest Highlight: “We need to embrace the [shift left] mindset and methodology, and try to move critical considerations — for example, compliance, security, governance, and data quality — earlier in the development process. This means that when teams build something like a Kafka topic, or even an API or a file-based integration, they are not just focusing on making it work technically. 
They are already thinking about who owns it, who owns the data, what the contract is, what the schemas look like, who will access this data, and how the access will be managed.”Episode Timestamps: *(06:40) - Data Streaming Goodness*(33:15) - The Runbook: Winning Strategies*(41:15) - Data Streaming Street Cred*(47:40) - Quick Bytes*(54:30) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantRami’s LinkedIn: linkedin.com/in/ramiallolahLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Strategic partnerships don’t work without trust. And in the data streaming world, trust begins with transparency.In this episode, Elena Cuevas, Sr. Manager of Cloud Partner Solutions Engineering at Confluent, joins Joseph Morais to unpack what it takes to collaborate with hyperscalers like AWS, Google Cloud, and Microsoft Azure. From managing competitive overlaps to future-proofing enterprise data architectures, Elena shares her battle-tested strategies for driving alignment across complex stakeholder ecosystems.You’ll learn:How to collaborate effectively across competitive cloud partnerships Why transparency builds trust and drives adoption in partner-led salesWhat shift left unlocks for streaming data governanceHow Confluent supports GenAI use cases with real-time data Whether you're working with a cloud provider or building your own ecosystem, Elena’s playbook will change how you think about partnership, transparency, and long-term scale.About the Guest:Elena Cuevas currently leads the Cloud Partner Solutions Engineering team at Confluent, where she and her team focus on strengthening technical relationships with cloud providers.Prior to joining Confluent, Elena spent several years at Google, working directly with some of the company’s largest Google Cloud customers in both pre-sales and post-sales capacities.She is passionate about helping members of historically underrepresented communities see that a path in STEM is within reach if they choose to pursue it. 
Elena has served as an #IamRemarkable facilitator, an instructor for the Computer Science Summer Institute, and is a frequent speaker at universities and high schools.Guest Highlight: “Sometimes we talk about competing and about one or the other, but we do better things when it's ‘and’, and not an ‘or’.”Episode Timestamps: *(07:55) - Data Streaming in a World of Clouds *(14:25) - The Playbook: Winning Strategies*(23:25) - Voices from the World of Data Streaming*(28:30) - Quick Bytes*(31:15) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantElena’s LinkedIn: linkedin.com/in/ecuevascLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Blockchain may be decentralized, but reliable access to its data is anything but simple. In this episode, Ethan Chan, Co-Founder & CEO of Allium, shares how his team transforms blockchain firehoses into clean, queryable, real-time data feeds.From the pitfalls of hosting your own data streaming infrastructure to the business advantages of Confluent Cloud, Ethan reveals the strategic decisions that helped Allium scale from 3 to nearly 100 blockchains, without burning out their engineering team.You’ll learn:Why Allium opted for Confluent Cloud from day one How data streaming supports mission-critical blockchain use casesWhat makes stream sharing a force multiplier for customer successHow Confluent connectors accelerated their integrations with Snowflake, BigQuery, and DatabricksWhether you’re building real-time monitoring tools or reconciling petabytes of cross-chain data, this episode offers a blueprint for scaling with Apache Kafka® and Confluent.About the Guest:Ethan Chan is the CEO and Co-Founder of Allium. With over six years of experience in the AI and natural language processing space, Ethan has come to believe that high-quality data is the key constraint in building the best-performing models. This insight drives Allium’s mission: to provide the highest-quality blockchain data infrastructure for crypto analytics and engineering teams at institutions and funds. Allium focuses on delivering clean, reliable data in a simple, accessible format—data served on a silver platter.Before founding Allium, Ethan was the Director of Engineering at Primer, where he founded and led the Primer Command product and team, and re-architected the core machine learning pipelines powering Primer Analyze. 
Earlier in his career, he lectured at Stanford University and conducted research at the Stanford Intelligent Systems Lab as a graduate student.Guest Highlight: “We definitely always wanted to go for a managed service from the get-go… If you break [streaming], every other engineering company is going to complain and you become infamous—for no reason.”Episode Timestamps: *(01:10) - Allium’s Data Streaming Strategy*(05:20) - Data Streaming Goodness*(26:50) - The Runbook: Tools & Tactics*(29:55) - Data Streaming Street Cred: Improve Data Streaming Adoption *(32:10) - Quick Bytes*(36:40) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Ep 8 - How Airy Powers Real-Time AI Agents with Data Streaming and ConfluentConnect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantEthan’s LinkedIn: linkedin.com/in/ethanyschanLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
For enterprises managing sprawling systems and frequent M&A activity, data latency isn’t just inconvenient—it’s a blocker to business value. In this episode, Joe Pardi, Senior Director of Global Data Engineering at Covetrus, explains how his team replaced fragile data pipelines with a robust real-time data streaming architecture that enables instant decisions across the entire enterprise.The Covetrus team started with open source Apache Kafka® and then shifted to Confluent Cloud for scalability and data governance. Joe shares how the adoption has drastically reduced operational overhead, strengthened data governance, and supercharged innovation. He also emphasizes the strategic importance of data integration in post-merger scenarios and how real-time data streaming helps uncover and address data quality issues early on.You’ll learn:Why Covetrus moved from self-managed Kafka to Confluent CloudHow real-time data streaming helps detect data quality issues fasterWhat it takes to scale a data architecture across hundreds of engineersHow data streaming supports M&A integration and value creationWhether you’re centralizing data across silos or enabling downstream systems with real-time feeds, this conversation offers a blueprint for building data streaming platforms that scale.About the Guest:Joe Pardi is the Senior Director of Global Data Engineering at Covetrus, a leading provider in the veterinary services industry. Based in Portland, Maine, he has held this role since January 2021, where he leads data engineering initiatives to support Covetrus' global operations.Guest Highlight: “The whole key with that acquisition strategy is to unlock the value. 
It's easy to make an acquisition go from 1+1=2, but you're really trying to make 1+1=3… If you're going to do an M&A, you have to get really good at integrating data.”Episode Timestamps: *(05:10) - Covetrus’ Data Streaming Strategy*(06:40) - Data Challenges: Latency and Quality*(26:05) - The Runbook: Tools & Tactics*(31:15) - Data Streaming Street Cred: Improve Data Streaming Adoption *(38:25) - Quick Bytes*(42:15) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantJoe’s LinkedIn: linkedin.com/in/joepardiLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Agentic AI is transforming how modern systems interact with data, and Airy is at the forefront with its real-time approach. In this episode, Steffen Hoellinger, Co-founder and CEO of Airy, discusses the transformative impact of agentic AI for enterprises and why real-time data streaming is a crucial component. From Apache Flink® and Apache Kafka® to the importance of focusing on core business challenges, Steffen breaks down how Airy builds intelligent systems that react to data as it happens.You’ll learn:How Airy’s team uses Confluent to simplify their data infrastructureWhat agentic AI means beyond the buzzwords and why it’s a leap past basic natural language processing (NLP)Why real-time interaction is essential for creating next-generation software experiencesHow business users can create and test AI-driven patterns without writing any codeIf you’re building AI-powered apps—or planning to—this one’s for you.About the Guest:Steffen Hoellinger is the Co-founder and CEO of Airy, where he leads the development of open-source data infrastructure that connects real-time event streaming technologies like Apache Kafka® and Apache Flink® with large language models. Airy helps enterprises build AI agents and data copilots for both technical and business users across streaming and batch data. Steffen is also an active early-stage investor focusing on data infrastructure, AI, and deep technology.Guest Highlight: “Real-time is getting more and more important. Once you understand all the fancy ways people in the market are trying to solve a real-time data problem, you come to the realization that [data streaming] is the best way of doing things. 
It became natural to adopt it early on in our journey.”Episode Timestamps: *(05:10) - Overview of Airy’s AI Solutions*(37:20) - The Runbook: Tools & Tactics*(44:00) - Data Streaming Street Cred: Improve Data Streaming Adoption *(47:50) - Quick Bytes*(50:25) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantSteffen’s LinkedIn: linkedin.com/in/hoellingerLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
What does real transformation look like when the stakes are high? Todd Smitala, Technology Solutions Architect at the Association of Professional Flight Attendants (APFA), shares how his journey of modernizing data systems supports 27,000 union members.With the legacy infrastructure crumbling under peak traffic during a high-volume union vote, Todd’s team turned to cloud-based real-time streaming with Apache Kafka® on Confluent Cloud. The result? A more reliable, scalable system that delivered immediate impact, ensuring every vote counted. In this episode, Todd walks through the technical decisions and leadership alignment that made it possible.You’ll learn:How real-time infrastructure solved a high-stakes voting challengeWhat it took to modernize under pressure and bring leadership on boardHow to frame cost, reliability, and system ownership to drive successful changeIf you're leading a data streaming initiative and need a blueprint for impact, this episode is for you.About the Guest:As a Technology Solutions Architect at APFA, Todd Smitala brings over 10 years of experience as a flight attendant and 20 years in IT, allowing Todd to bridge the gap between technology and the needs of flight attendants. He’s passionate about leveraging technology to create meaningful change, support union members, and enforce contractual legalities. This combination of industry insight and technical expertise empowers Todd to deliver innovations that make a real difference. Outside of work, Todd enjoys scuba diving, adventure travel, hiking, playing music, and dancing at Carnaval in Brazil.Guest Highlight: “We all agreed that Confluent Cloud would be the way we want to go because of how it does hosting. It really does handle all of the background stuff. 
I'm not a Java developer… We didn't have the time to invest in all of that so Confluent Cloud is actually perfect for this type of thing.”Episode Timestamps: *(01:22) - How APFA Is Using Data Streaming to Empower Members *(20:39) - The Runbook: Tools & Tactics*(26:45) - Data Streaming Street Cred: Improve Data Streaming Adoption *(28:30) - Quick Bytes*(33:36) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantTodd’s LinkedIn: linkedin.com/in/tsmitalaLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Whether you're an executive setting the strategy or an architect building the backbone, alignment is the key to turning transformation from a buzzword into results. Rick Hernandez, Principal Technical Architect at EY, shares how to unlock shared vision and turn it into enterprise-wide data streaming success. In this episode, Rick joins Joseph to explore how organizations can connect leadership ambition with technical execution. They explore how top-down buy-in, clearly defined objectives, and strategic alignment with the right technology can give your organization “wings to a tiger.”You’ll learn:How to directly map business pain points to streaming-first solutionsHow real-time infrastructure enables faster decisions and a competitive edgeWhat happens when stakeholders discover their challenges are already solvable through streamingIf you're leading modernization efforts or championing real-time data, this episode is your guide for building momentum across teams.About the Guest:Rick Hernandez is a Principal Technical Architect at EY, specializing in enterprise architecture and digital transformation. He helps organizations implement innovative solutions and optimize IT strategies. Rick’s expertise includes architecture and development in middleware technologies, cloud integration work, business process management, enterprise application integration, and more. Guest Highlight:“Technology transformation is not for one area of a company. 
The overall company needs to actually change to do something better, differently, and more efficiently.”Episode Timestamps: *(01:00) - EY’s Data Streaming Strategy*(05:30) - Data Streaming Goodness: Having a Shared Vision*(22:00) - The Runbook: Tools & Tactics*(26:45) - Data Streaming Street Cred: Improve Data Streaming Adoption *(31:10) - Quick Bytes*(33:45) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantRick’s LinkedIn: linkedin.com/in/rick-hernandez-3298574Learn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Want your real-time data streaming initiative to stick? Success hinges on more than pipelines—it’s about people, governance, and business impact. Jeffrey Johnathon Jennings (J3), managing principal at signalRoom, shares how to bring it all together. In this episode, J3 shares how he’s used impactful proofs of concept to demonstrate value early, then scaled effectively through shift left with governance and stronger cross-team collaboration.You’ll learn about:Why proofs of concept are key to securing buy-in and demonstrating ROI earlyHow data governance as a shared language creates consistency across teamsStrategies for establishing a data streaming center of excellenceThe role of business outcomes in guiding streaming data adoption strategiesIf you’re building or scaling a data streaming practice, this episode goes beyond the technology, showing you how to drive real impact.About the Guest:Jeffrey Johnathon Jennings is the managing principal of signalRoom, a dedicated father, avid traveler, and EDM enthusiast whose creativity and energy shape both his personal and professional life. As a cloud-native data streaming expert, he specializes in integrating ML/AI technologies to drive transformative change and improve business outcomes. With a focus on innovation, he designs scalable data architectures that enable real-time insights and smarter decision-making. Committed to continuous learning, Jeffrey stays ahead of technological advancements to help businesses navigate the evolving digital landscape and achieve lasting growth.Guest Highlight:“We need to speak the same language. The only way to speak the same language is to have a Schema Registry. I don't think there's an option. You just have to do this. 
We share a common language and therefore we build common libraries.”Episode Timestamps: *(01:13) - J3’s Data Streaming Journey*(07:07) - Data Streaming Goodness: Strategies to Demonstrate Value*(26:38) - The Runbook: Data Streaming Center of Excellence *(37:00) - Data Streaming Street Cred: Improve Data Streaming Adoption *(42:35) - Quick Bytes*(45:00) - Joseph’s Top 3 TakeawaysDive Deeper into Data Streaming:EP1—Stream On: Unleashing Innovation with Data Streaming | Life Is But A StreamEP2—Processing Without Pause: Continuous Stream Processing and Apache Flink® | Life Is But A StreamEP3—The Connective Tissue: Shift Left to turn Data Chaos to Clarity | Life Is But A StreamLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantJ3’s LinkedIn: linkedin.com/in/jeffreyjonathanjennings/J3’s GitHub: github.com/j3-signalroom“Fall in Love with the Problem, Not the Solution” by Uri Levine“Ask Your Developer” by Jeff LawsonLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Despite its value, legacy data can feel like a roadblock in a fast-paced digital world—Henry Schein One is clearing the path forward with real-time data streaming. In this episode, Chris Kapp, Software Architect at Henry Schein One (HS1), shares how his team modernizes data management to stay competitive and unlock real-time insights.You’ll learn about:How a tagging strategy, an immutable audit log, and governance keep data secure and reliableThe challenges (and wins) of getting leadership buy-in for data modernizationHS1’s approach to decentralized data ownership, domain-driven design, and the importance of stream processing for scalingThe role of GenAI in the future of real-time stream processingGet ready to future-proof your data strategy with this must-listen episode for technology leaders facing scalability, governance, or integration challenges. About the Guest:Chris Kapp is a Software Architect at Henry Schein One specializing in domain-driven design and event-driven patterns. He has 34 years of experience in the software industry, including roles at Target and Henry Schein One. He is passionate about teaching patterns for scalable data architectures. He’s currently focused on the One-Platform initiative to allow Henry Schein applications to work together as a single suite of products.Guest Highlight:“It's important to collect the data, try to eliminate our biases and go towards what is delivering quickly. The key is to start small, agile, iterative, and build something small with the people that are excited and willing to learn new things. 
If it doesn't work, then be agile, adjust, and find the thing that does work."Episode Timestamps: *(01:18) - Chris’ Data Streaming Journey*(03:35) - Data Streaming Goodness: AI-Driven Reporting & Data Streaming*(21:08) - The Playbook: Data Revitalization & Event-Driven Architecture*(31:55) - Data Streaming Street Cred: Executive Alignment & Engineering Collaboration*(32:03) - Quick Bytes*(40:14) - Joseph’s Top 3 TakeawaysLinks & Resources:Connect with Joseph: @thedatagiantJoseph’s LinkedIn: linkedin.com/in/thedatagiantChris’ LinkedIn: linkedin.com/in/chris-kapp-87868a4Designing Event-Driven Systems eBookDesigning Data-Intensive Applications eBookCurrent 2025—The Data Streaming EventLearn more at Confluent.ioOur Sponsor: Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
In the final episode of our 3-part series on the basics of data streaming, we take a deep dive into data integration—covering everything from data governance to data quality.Our guests, Mike Agnich, General Manager of Data Streaming Platform, and David Araujo, Director of Product Management at Confluent, explain why connectors are must-haves for integrating systems. You’ll learn:Why real-time ETL outperforms the old-school approachHow shifting left with governance saves time and pain laterThe overlooked role of schemas in data qualityAnd more…About the Guests:Mike Agnich is the General Manager and VP of Product for Confluent's Data Streaming Platform (DSP). Mike manages a product portfolio that includes stream processing, connectors and integrations, governance, partnerships, and developer tooling. Over the last six years at Confluent, Mike has held various product leadership roles spanning Apache Kafka®, Confluent Cloud, and Confluent Platform, working closely with customers, partners, and R&D to drive adoption and execution of Confluent products. Prior to his work at Confluent, Mike was the founder and CEO of Terrain Data (acquired by Confluent in 2018).David Araujo is a Director of Product Management at Confluent, focusing on data governance with products such as Schema Registry, Data Catalog, and Data Lineage. David previously held positions at companies like Amobee, Turn, WeDo Technologies Australia, and Saphety, where he worked on various aspects of data management, analytics, and infrastructure. With a background in Computer Science from the University of Évora, David has a strong technical foundation and a record of leadership roles in the tech industry.Guest Highlights:"If a ton of raw data shows up on your doorstep, it's like shipping an unlabeled CSV into a finance organization and telling them to build their annual forecast. 
By shifting that cleaning and structure into streaming, we remove a massive amount of toil for our organizations… Instead of punting the problem down to our analytics friends, we can solve it because we're the ones that created the data." - Mike Agnich

"We've had data contracts in Kafka long before it became a buzzword—we called them schemas… But more recently, we've evolved this concept beyond just schemas. In streaming, a data contract is an agreement between producers and consumers on both the structure (schema) and the semantics of data in motion. It serves as a governance artifact, ensuring consistency, reliability, and quality while providing a single source of truth for understanding streaming data." - David Araujo

Links & Resources:
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Mike’s LinkedIn: linkedin.com/in/magnich
David’s LinkedIn: linkedin.com/in/davidaraujo
What Is a Data Streaming Platform (DSP)
Learn more at Confluent.io

Episode Timestamps:
*(02:00) - Mike and David’s Journey in Data Streaming
*(13:55) - Data Streaming 101: Data Integration
*(40:06) - The Playbook: Tools & Tactics for Data Integration
*(53:25) - Voices from the World of Data Streaming
*(59:33) - Quick Bytes
*(1:05:20) - Joseph’s Top 3 Takeaways
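To make David's "schema as data contract" point concrete, here is a minimal, hand-rolled sketch of the idea: the producer validates each event against the agreed structure before it ever enters the stream, shifting quality checks left instead of punting bad data downstream. This is illustrative only — the schema, field names, and `validate` helper are invented for this example; a real deployment would use Confluent Schema Registry with Avro, Protobuf, or JSON Schema rather than a hand-written check.

```python
# The contract both producer and consumer agree on: field names and types.
# (Hypothetical order schema, for illustration only.)
ORDER_SCHEMA = {
    "order_id": str,
    "amount_cents": int,
    "currency": str,
}

def validate(record: dict, schema: dict) -> bool:
    """Shift-left check: reject a malformed event at produce time,
    instead of letting it reach the analytics team downstream."""
    if set(record) != set(schema):       # missing or extra fields
        return False
    # every field must carry the agreed type
    return all(isinstance(record[k], t) for k, t in schema.items())

good = {"order_id": "A-1", "amount_cents": 4200, "currency": "USD"}
bad = {"order_id": "A-2", "amount_cents": "42.00"}  # wrong type, missing field

print(validate(good, ORDER_SCHEMA))  # True  -> safe to produce
print(validate(bad, ORDER_SCHEMA))   # False -> reject before it hits the topic
```

The governance value David describes comes from both sides reading the same contract: the producer cannot emit a shape the consumer does not expect, so the stream itself stays a single source of truth.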
We’re diving even deeper into the fundamentals of data streaming to explore stream processing—what it is, the best tools and frameworks, and its real-world applications.

Our guests, Anna McDonald, Distinguished Technical Voice of the Customer at Confluent, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools like Flink are game changers.

You’ll learn:
The key differences between stream and batch processing
How different frameworks like Flink, Kafka Streams, and ksqlDB approach stream processing
The role of POCs and observability in real-time data workflows
And more…

About the Guests:
Anna McDonald is the Distinguished Technical Voice of the Customer at Confluent. She loves designing creative solutions to challenging problems. Her focus is on event-driven architectures, reactive systems, and Apache Kafka®.

Abhishek Walia is a Staff Customer Success Technical Architect at Confluent. He has years of experience implementing innovative, performance-driven, and highly scalable enterprise-level solutions for large organizations. Abhishek specializes in architecting, designing, developing, and delivering integration solutions across multiple platforms.

Guest Highlights:
“Flink is more approachable because it blends approaches together and says, ‘If you need this, you still can use this.’ It's the most powerful at this point.” - Abhishek Walia

“If you're somebody who's ever gone from normal to eventing, at some point you probably would have gone, ‘When does [the data] stop?’ It doesn't stop.” - Anna McDonald

“Start with a fully managed service.
That's probably going to save a lot of cycles for you.” - Abhishek Walia

Episode Timestamps:
*(01:35) - Anna & Abhishek’s Journey in Data Streaming
*(12:30) - Data Streaming 101: Stream Processing
*(26:30) - The Playbook: Tools & Tactics for Stream Processing
*(50:20) - Voices from the World of Data Streaming
*(56:13) - Quick Bytes
*(58:57) - Top 3 Takeaways

Links & Resources:
Connect with Joseph: @thedatagiant
Joseph’s LinkedIn: linkedin.com/in/thedatagiant
Anna’s LinkedIn: linkedin.com/in/jbfletch
Abhishek’s LinkedIn: linkedin.com/in/abhishek-walia
Introducing Derivative Event Sourcing
Designing Event-Driven Systems
Learn more at Confluent.io
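The stream-versus-batch distinction the episode opens with can be sketched in a few lines of plain Python. This is not the Flink or Kafka Streams API — just a toy illustration of the core difference: batch waits for a finite, complete dataset and computes once, while stream processing keeps running state and emits an updated result per event, because, as Anna puts it, the data "doesn't stop".

```python
from typing import Iterable, Iterator

def batch_total(events: Iterable[int]) -> int:
    # Batch: assume the dataset is complete, compute a single answer.
    return sum(events)

def streaming_totals(events: Iterable[int]) -> Iterator[int]:
    # Streaming: maintain state and emit a fresh result as each event arrives.
    total = 0
    for amount in events:
        total += amount
        yield total

clicks = [3, 1, 4]  # in a real system this would be an unbounded stream
print(batch_total(clicks))             # 8
print(list(streaming_totals(clicks)))  # [3, 4, 8]
```

Frameworks like Flink and Kafka Streams add the hard parts this sketch omits — fault-tolerant state, event-time windows, and exactly-once delivery — which is why the guests recommend starting with a fully managed service rather than running that machinery yourself.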



