Modern Web

Author: Modern Web

Subscribed: 3,122 · Played: 19,095

Description

The modern web is changing fast. Front-end frameworks evolve quickly, new standards are emerging, and old ones are fading out of favor. There is a lot to learn, but knowing the right things matters more than learning them all. The Modern Web Podcast is an interview-style show where we learn about modern web development from industry experts. We're committed to making lots of useful information easy to digest!
173 Episodes
Rob Ocel and Danny Thompson go deep on intentionality, the developer "superpower" that can speed up your growth, sharpen your judgment, and keep you from getting automated away in the AI era. Rob unpacks a simple loop (state intent → act → measure → review) with real stories, including the ticket he challenged on day one that saved a team six figures, and the "it seems to work" anti-pattern that shipped a mystery bug. Together they show how being deliberate before you write a line of code changes everything: scoping tickets, estimating work, documenting decisions, reviewing PRs, and speaking up, even as a junior.

What you'll learn:
- The intentionality loop: how to set a hypothesis, capture outcomes, and improve fast
- The exact moment to ask "Should we even do this ticket?" and how to push back safely
- Why code is the last step: design notes, edge cases, and review context first
- Estimation that actually works: start naive, compare to actuals, iterate to ±10%
- How to avoid DRY misuse, "tragedy of the commons" code reviews, and stealth tech debt
- Where to keep your working notes (GitHub, Notion, SharePoint) so reviewers can follow your logic
- How juniors can question assumptions without blocking the room or their career

Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
On this episode of the Modern Web Podcast, hosts Rob Ocel and Danny Thompson welcome Miles Ward, CTO of SADA, for an in-depth conversation about the intersection of cloud computing and AI. Miles shares his career journey from early days at AWS and Google Cloud to leading SADA through its acquisition by Insight, offering a rare perspective on the evolution of solutions architecture and cloud adoption at scale.

The discussion covers the realities of cloud "repatriation," why GPUs have shifted some workloads back on-prem or to niche "neo-cloud" providers, and how cloud infrastructure remains the backbone of most AI initiatives. Miles breaks down practical concerns for organizations, from token pricing and GPU costs to scaling AI features without blowing budgets. He also highlights how AI adoption exposes weak organizational habits, why good data and strong processes matter more than hype, and how developers should view AI as intelligence augmentation rather than replacement.

Key Takeaways:
- Miles Ward, former early AWS Solutions Architect, founder of the SA practice at Google Cloud, and now CTO at SADA (acquired by Insight), brings a deep history in scaling infrastructure and AI workloads.
- Cloud repatriation is rare. The main exception is GPUs, where companies may rent from "neo-clouds" like CoreWeave, Crusoe, or Lambda, or occasionally use on-prem for cost and latency reasons, though data-center power constraints make this difficult.
- Cloud remains essential for AI. Successful initiatives depend on cloud primitives like data, orchestration, security, and DevOps. Google's integrated stack (custom hardware, platforms, and models) streamlines development. The best practice is to build in cloud first, then optimize or shift GPU inference later if needed.
- Costs and readiness are critical. Organizations should measure AI by business outcomes rather than lines of code. Token spending needs calculators, guardrails, and model routing strategies. On-prem comes with hidden costs such as power, networking, and staffing. The real bottleneck for most companies is poor data and weak processes, not model quality.

Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
In this Modern Web Podcast, Rob Ocel and Danny Thompson break down the recent string of npm supply chain attacks that have shaken the JavaScript ecosystem. They cover the NX compromise, the phishing campaign that hit libraries like Chalk, and the Shai-Hulud exploit, showing how small changes in dependencies can have massive effects. Along the way, they share practical defenses like using package-lock and npm ci, avoiding phishing links, reviewing third-party code, applying least privilege, staging deployments, and maintaining incident response plans. They also highlight vendor interventions such as Vercel blocking malicious deployments and stress why companies must support open source maintainers if the ecosystem is to remain secure.

Key Points from this Episode:
- Lock down installs. Pin versions, commit package-lock.json, use npm ci in CI, and disable scripts in CI (npm config set ignore-scripts true) to neutralize post-install attacks (a short command sketch follows this entry).
- Harden people & permissions. Phishing hygiene (never click through emails), 2FA/hardware keys, least privilege by default, and separate, purpose-scoped publishing accounts.
- Stage & detect early. Canary/staged deploys, feature flags, and tight observability to catch dependency drift, suspicious network egress, or monkey-patched APIs fast.
- Practice incident response. Two-hour containment target: revoke/rotate tokens, reimage affected machines, roll back artifacts, notify vendors, and run a post-mortem playbook.

Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
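For reference, here is a minimal sketch of the install-hardening commands mentioned in the key points above, assuming a typical Node.js project and CI job. The commands themselves come from the episode notes; the surrounding workflow is illustrative, not something the episode prescribes.

```sh
# Commit the lockfile so every install resolves to the exact same dependency tree.
git add package-lock.json

# In CI, turn off lifecycle scripts to neutralize malicious post-install hooks
# in compromised packages (equivalently, set ignore-scripts=true in .npmrc).
npm config set ignore-scripts true

# Install strictly from the lockfile; npm ci fails if package.json and
# package-lock.json disagree, instead of silently pulling newer versions.
npm ci
```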
In this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson talk with Wes Eklund from AWS ProServe about interviews, practical AI, and the future of developer workflows. Wes shares what trips candidates up in coding and behavioral rounds, how to ask better questions, and why prepping multiple honest STAR narratives matters. Danny introduces the Thrive Framework for behavioral interviews and Rob underscores the discipline required to stand out in a crowded market. The trio then digs into 100 Days of Code in the AI era, smart ways juniors can learn with AI, and how Wes's team uses MCP servers and Amazon Q to speed design, onboarding, and day-to-day delivery. They cover the lull in MCP hype, real security concerns, the "80 percent is a win" mindset when AI accelerates work, and when it actually makes sense to build agents. They close on thin, purpose-built agents, enterprise adoption patterns, and why frameworks like DSPy could reshape moats and costs.

Key Takeaways from this episode:
- Wes explains how candidates often fail because they neglect behavioral prep, and Danny introduces the Thrive Framework as a system to stand out.
- The group debates whether juniors should use AI. Wes frames it as a tool for strategy and reflection, not a shortcut, while Danny emphasizes using it to deepen knowledge and accountability.
- Wes shares how his AWS team leverages MCP servers and Amazon Q to speed design, boost onboarding, and solve problems faster, while Danny highlights enterprise-level use cases like multilingual documentation.
- They discuss whether developers should build agents, the risks of security gaps, and how frameworks like DSPy could make optimized, lightweight agents a new competitive edge.

Chapters
0:00 MCP servers: security reality check
0:33 Modern Web Podcast intro
0:55 Guest: Wes Eklund (AWS ProServe)
2:02 Job hunt & interview mistakes
5:05 Danny's THRIVE framework
7:39 Researching values & STAR stories
11:12 Sponsor + quality & discipline in applications
13:04 100 Days of Code in the AI era
18:03 Using AI at work (MCP + Amazon Q)
23:13 Hackathons & making time to innovate
25:06 MCPs in practice: adoption & security
36:00 Agents: when they help vs. hype — close & links

Wes Eklund on Linkedin: https://www.linkedin.com/in/weseklund/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
Danny Thompson sits down with Marino Wijay, Staff Solutions Architect at Kong and CNCF Ambassador, for a wide-ranging conversation on modern cloud-native development. They start with Kubernetes as the entry point into the ecosystem and explore what it really means to be a CNCF ambassador. Marino explains the difference between an API gateway and a service mesh, when small teams should adopt each, and why managed services often make more sense than running infrastructure yourself.

The discussion then shifts to reliability and observability, with a focus on automation, pipelines, and creating a seamless developer experience. Finally, Marino shares lessons from working with enterprises rolling out AI, covering vector caching, cost optimization, latency concerns, and the importance of data governance when dealing with LLM traffic. It's an episode full of practical advice for builders navigating the realities of APIs, microservices, and AI in production today.

Key points from this episode:
- Kubernetes remains the entry point into the cloud-native ecosystem, giving teams the foundation to operationalize applications and join the CNCF community.
- Marino breaks down the distinction between an API gateway and a service mesh, showing how a gateway like Kong secures APIs at the edge while a mesh like Kuma manages traffic, authentication, and encryption between services.
- For smaller teams, the smartest path is to rely on managed services and an API gateway, introducing a service mesh only when scale and complexity demand it.
- As organizations adopt AI, Marino highlights how vector caching, governance policies, and PII sanitization help control costs, cut latency, and protect sensitive data when working with LLMs.

Marino Wijay on Linkedin: https://www.linkedin.com/in/mwijay/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
Rob Ocel and Danny Thompson sit down with Andy Bell to treat CSS as a craft, not a chore. Andy explains why he mentors the browser instead of micromanaging it, how progressive enhancement keeps products resilient, and which modern features deserve attention right now, including :has(), anchor positioning, and clamp(). The conversation gets practical on grid versus flexbox, why grid is often simpler than people think, and how ecosystems and tooling skew usage. They unpack the real reasons Tailwind spread across teams, where it helps with speed and onboarding, and why core CSS skills plus a clear methodology prevent long-term debt. Expect candid consultancy stories, smarter debugging with today's devtools, and a reminder that play, standards knowledge, and strong communication habits lead to cleaner, more maintainable front ends.

Key Takeaways:
- Andy Bell stresses mentoring the browser instead of micromanaging it, leaning on progressive enhancement and letting it adapt to context.
- Features like :has(), anchor positioning, and clamp() are changing how developers approach layouts, interactions, and responsive design.
- Despite its power, grid hasn't caught on like flexbox, partly due to ecosystem and tooling choices. Andy suggests learning grid first for a clearer foundation.
- Tailwind solves organizational and onboarding challenges, but without solid CSS fundamentals and consistent methodologies, teams risk piling up technical debt.

Andy Bell on Linkedin: https://www.linkedin.com/in/andy-bell-347971255/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
In this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson are joined by Rishab Kumar, Staff Developer Evangelist at Twilio, to explore the evolving landscape of voice and AI interactions. They discuss the rise of conversational AI, how voice interfaces are becoming the natural medium for human-computer interaction, and the tools and best practices for integrating AI into real-world applications. Rishab shares insights from Twilio on building voice-enabled AI experiences, tackling challenges like latency and prompt design, and how AI is shaping the future of productivity and problem-solving. The conversation also highlights community-focused events, like the upcoming Commit Your Code Conference in Dallas, where networking, learning, and giving back to charity take center stage.

Key Takeaways:
- Voice interfaces are becoming more natural and conversational, moving beyond simple commands to context-aware, agentic interactions that can assist with tasks in real time.
- AI is being integrated into real-world use cases, from coding assistants and productivity tools to hands-on guidance for tasks like furniture assembly, car troubleshooting, and lab work.
- Platforms like Twilio provide APIs, Conversation Relay, and integrations with voice models to streamline AI voice interactions, handling challenges like latency, speech-to-text, and interruption management.
- There's a growing need for specialized, reliable AI tools tailored to specific industries and tasks, as well as careful consideration of ethical implications, user trust, and contextual accuracy.

Rishab Kumar on Linkedin: https://www.linkedin.com/in/rishabkumar7/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot: https://ai.thisdot.co/
In this episode, Rob Ocel and Danny Thompson enjoy a conversation with Kilian Valkhof, founder of Polypane, a browser built for developers who care deeply about their craft. The discussion explores the shifting landscape of online developer communities as conversations migrate from Twitter to Bluesky, Mastodon, Discord, and local meetups. Kilian shares how this decentralization has shaped advocacy around accessibility, performance, and front-end principles, while Rob and Danny reflect on what developers lose and gain when there's no longer a single central hub. They also dig into guiding principles for building quality front-end experiences, from usability and accessibility to balancing trade-offs between performance, readability, and SEO.

Key points from this episode:
- Developers are finding their communities scattered across Bluesky, Mastodon, Discord, and meetups, changing how ideas about accessibility and performance spread.
- Practical frameworks like the "rule of three" and "make it run, make it right, make it fast" give developers clearer guidance than vague advice such as "don't repeat yourself."
- Building with craft means going beyond visual accuracy to include accessibility, usability, and small details that improve the overall user experience.
- Teams need to agree on priorities so they can navigate trade-offs between things like accessibility, performance, SEO, and readability.

Kilian Valkhof on Linkedin: https://www.linkedin.com/in/kilianvalkhof/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co/
This episode of the Modern Web Podcast features Cody De Arkland, Head of Developer Experience at Sentry, in conversation with hosts Rob Ocel and Danny Thompson. They explore how Sentry has embraced a culture of experimentation with AI, from grassroots innovation in Slack channels to leadership setting the tone for rapid adoption. Cody shares insights into Sentry's new AI monitoring tools, including MCP support and agent tracing, which give developers visibility into token usage, tool calls, and debugging flows. The discussion also touches on how AI is reshaping developer workflows, the balance between writing code and prompting, and why structured thinking is key to getting useful results.

Key points from this episode:
- Sentry fosters a playful, experimental environment where both grassroots initiatives and leadership drive AI adoption.
- Sentry has rolled out AI monitoring with MCP support and agent tracing to give visibility into token usage, tool calls, and debugging.
- AI is changing how developers approach coding, blending prompting with traditional programming.
- Success with AI depends on framing problems clearly, not just relying on raw prompts.

Cody De Arkland on Linkedin: https://www.linkedin.com/in/codydearkland/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: https://ai.thisdot.co
In this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson talk with Philipp Krenn, Head of Developer Advocacy at Elastic, about how Elasticsearch has evolved from a search engine into a foundation for observability, security, and AI-powered systems. Philipp explains how Elastic approaches information retrieval beyond just vector search, using tools like LLMs for smarter querying, log parsing, and context-aware data access.

They also discuss how Elastic balances innovation with stability through regular releases and a focus on long-term reliability. For teams building with AI, Elastic offers a way to handle search, monitoring, and logging in one platform, making it easier to ship faster without adding complexity.

Key points from this episode:
- Elasticsearch has expanded beyond search to support observability and security by treating all of them as information retrieval problems.
- Elastic integrates with AI tools like LLMs to improve search relevance, automate log parsing, and enable features like query rewriting and retrieval-augmented generation.
- Vector search is just one feature in a larger toolkit for finding relevant data, and Elastic supports hybrid and traditional search approaches.
- Elastic maintains a steady release cadence with a focus on stability, making it a reliable choice for both fast-moving AI projects and long-term production systems.

Philipp Krenn on Linkedin: https://www.linkedin.com/in/philippkrenn/
Rob Ocel on Linkedin: https://www.linkedin.com/in/robocel/
Danny Thompson on Linkedin: https://www.linkedin.com/in/dthompsondev/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: ai.thisdot.co
In this episode of the Modern Web Podcast, hosts Rob Ocel and Danny Thompson talk with Sean Roberts, Head of AX Architecture and Distinguished Engineer at Netlify, about the emerging discipline of Agentic Experience (AX). They explore how AX is reshaping how we design services for AI agents, what makes an agent experience successful, and why traditional user flows often break down in agent-driven systems. Sean discusses the role of MCPs, the challenges of discoverability, and the future of content delivery in an agent-first web. They also dig into real-world examples, like how an agent accidentally took down Netlify's homepage, and debate whether CMSs still have a place in this new landscape.

Key points from this episode:
- Agent experience is already part of every digital service and needs to be intentionally designed to ensure agents can interact effectively.
- SEO still matters, but new practices like lightweight pages, structured content, and llms.txt files help improve discoverability for agents.
- Systems that require human confirmation for basic actions create friction for agents and should be redesigned to allow autonomous task completion.
- LLMs make it possible to turn unstructured content into structured data on demand, which raises questions about whether traditional CMS platforms are still necessary.

Sean Roberts: https://www.linkedin.com/in/developsean/
Danny Thompson: https://www.linkedin.com/in/dthompsondev/
Rob Ocel: https://www.linkedin.com/in/robocel/
This Dot Labs Twitter: https://x.com/ThisDotLabs
This Dot Media Twitter: https://x.com/ThisDotMedia
This Dot Labs Instagram: https://www.instagram.com/thisdotlabs/
This Dot Labs Facebook: https://www.facebook.com/thisdot/
This Dot Labs Bluesky: https://bsky.app/profile/thisdotlabs.bsky.social
Sponsored by This Dot Labs: ai.thisdot.co
In this episode of Modern Web, Danny Thompson chats with MelkeyDev, a Machine Learning Infrastructure Engineer at Twitch, about AI's real-world applications, developer productivity, and the future of careers in Go. They cover everything from the rise of tiny AI-driven teams competing with large enterprises to how system prompts may matter more than model choice. Melkey shares his thoughts on cost-effective LLMs, production pitfalls, and the cognitive downsides of over-relying on AI. The conversation also explores backend development with Go, what makes it great for fast-moving teams, and how new developers can get started.

Key points from this episode:
- AI's real value lies in business use cases. Melkey emphasizes that AI isn't just a productivity tool; it enables small teams to build faster, cheaper, and more effectively than ever before.
- System prompts are underrated. When it comes to LLM performance, prompt engineering often matters more than the model itself, especially for UI generation and agent design.
- Cognitive cost of AI reliance. Referencing recent research, Melkey warns that overusing AI tools can reduce your ability to retain knowledge and perform certain tasks independently.
- Go remains a strong backend choice. Despite being "boring," Go continues to power developer velocity and scalable infrastructure, making it a smart language for backend-focused engineers.

Follow MelkeyDev on Twitter: https://x.com/MelkeyDev
Sponsored by This Dot Labs: thisdot.co
In this episode of the Modern Web Podcast, hosts Rob Ocel and Danny Thompson sit down with Andre Landgraf, Senior Developer Advocate at Neon (now part of Databricks), to explore the evolving role of AI agents in developer workflows. They discuss how more Neon databases are being spun up by agents than humans, what that means for developer and agent experience (DX vs AX), and how tools like MCP and step functions are enabling scalable agent orchestration. The conversation also touches on agent security concerns, real-time vs. async UX, and how developers can build resilient, human-in-the-loop AI systems today. Plus, Andre shares practical insights from building his own personal CRM agent and experimenting with tools like Cortex and Ingest.

Key points from this episode:
- Agents now outpace humans in provisioning databases on Neon, thanks to agent-friendly APIs, early MCP support, and seamless integration with platforms like Replit and v0.dev.
- Developer experience (DX) principles directly inform agent experience (AX); tools designed for simplicity and clarity often translate well to agent interactions, but agents still need unique guardrails like resumability and fine-grained permissions.
- Agent orchestration is the next big frontier, with tools like LangBase, Ingest, and step functions offering patterns for chaining tasks, running agents in parallel, and retrying failed steps—enabling more resilient and scalable AI systems.
- Async UX patterns are crucial for agent-powered apps, especially as LLMs become slower and more complex. Real-time feedback, task progress indicators, and human-in-the-loop controls will define effective agent interactions.

Chapters
00:00 Why apps don't talk to each other
01:44 Meet Andre Landgraf from Neon
02:39 Agents now outnumber humans on Neon
05:03 DX vs AX: Building for agents
08:58 Security and authorization for agents
13:06 What's missing for real adoption
17:06 Building a personal CRM with agents
20:04 MCP as the universal app interface
23:32 Agent orchestration and async UX
26:46 Step functions and background tasks
30:04 Are agents ready for real-time UX?
33:19 Human-in-the-loop patterns
35:59 Where to find Andre

Follow Andre Landgraf on Social Media:
Twitter: https://x.com/AndreLandgraf94
Linkedin: https://www.linkedin.com/in/andre-landgraf/
Sponsored by This Dot Labs: thisdotlabs.com
On this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson talk with Brian Morrison, Senior Developer Educator at Clerk. They cover the state of authentication today, what makes Clerk stand out for small teams and indie builders, and how thoughtful developer experience design can make or break adoption.

Brian shares why bundling tools like auth, billing, and user management is becoming more common, how Clerk handles real-world concerns like bot protection and social login, and why starting with a great developer experience matters more than ever.

The conversation also explores the role of AI in software development and content creation: where it helps, where it hurts, and how to use it responsibly without losing quality or trust.

Key points for this episode:
- Modern auth is about experience, not just security. Clerk simplifies user management, social login, bot protection, and subscription billing with developer-friendly APIs and polished default UIs.
- Bundled platforms are making a comeback. Developers are shifting from handpicking tools to using tightly integrated services that reduce setup time and complexity.
- Developer education needs more care and creativity. Brian emphasizes the importance of visual storytelling, thoughtful structure, and anticipating confusion to help devs learn faster and retain more.
- AI is a productivity multiplier, not a replacement. The group discusses how AI can accelerate development and content creation when used with oversight, but warns against using it to blindly build entire apps.

Follow Brian Morrison on Social Media:
Twitter: https://x.com/brianmmdev
Linkedin: https://www.linkedin.com/in/brianmmdev/
Sponsored by This Dot: thisdotlabs.com
On this episode of the Modern Web Podcast, hosts Rob Ocel, Danny Thompson, and Adam Rackis are joined by Tejas Kumar, host of The Contagious Code podcast, author of Fluent React, and Developer Relations Engineer for Generative AI at DataStax. They unpack the current wave of AI announcements from Google I/O and Microsoft Build, and zoom in on the significance of MCP (Model Context Protocol) as a foundational shift in how AI-powered apps will be built and used.

Tejas breaks down what MCP is, why it's catching on across the industry, and how it could become the HTTP of AI apps. The group explores real-world examples, like AI apps managing your inbox or booking flights without ever opening a browser, and discusses how MCP servers enable secure, agent-driven experiences that can act on your behalf. They also touch on hallucinations, the role of fine-tuning vs. tool integration, and the future of checkout flows powered by AI agents.

Key points from this episode:
- MCP enables structured communication between AI apps and servers, allowing agents to perform real tasks like sending emails or booking flights.
- Users will increasingly interact with applications through natural language, with agents handling workflows behind the scenes.
- Connecting models to tools via MCP helps reduce hallucinations by ensuring actions and responses are grounded in real data.
- Most use cases benefit more from retrieval-augmented generation and strong tool integration than from expensive model fine-tuning.

Follow Tejas on Social Media:
Twitter: https://x.com/TejasKumar_
Linkedin: https://www.linkedin.com/in/tejasq/
In this episode of the Modern Web Podcast, Rob Ocel, Danny Thompson, and Adam Rackis sit down with Ahmad Awais, CEO and founder of LangBase, to talk about agents, context, and the future of AI-assisted software development. Ahmad shares the origin story of Chai.new, an agent that builds agents, and why he believes context, not code, is the true value layer in the AI era. The group unpacks how "vibe coding" is reshaping who can build software, why Chai isn't just another AI assistant, and how agents might evolve into personalized, production-grade tools for everyone, technical or not. Plus: Tailwind analogies, Stanford lectures, sports nutrition agents, and a CLI that went viral in a hospital.

Key points from this episode:
- Ahmad Awais explains that AI agents aren't magic; they're just a new paradigm for writing software. What makes them powerful is their ability to act autonomously with relevant context, not just generate text.
- Chai.new helps developers (and non-developers) create purpose-built agents without needing deep ML expertise. It abstracts complex concepts like memory, retrieval, and orchestration into an approachable interface.
- Ahmad emphasizes that the real opportunity lies in agents tailored to individual users and use cases. Personal agents with custom context outperform generic ones, much like small teams beat massive frameworks for specific problems.
- Chai and LangBase aim to bring AI development to the millions of engineers who aren't AI researchers. With tools like Chai, you don't need a PhD to build powerful, production-ready AI agents.

Follow Ahmad Awais on Social Media:
Twitter: https://x.com/MrAhmadAwais
Linkedin: https://www.linkedin.com/in/mrahmadawais/
Sponsored by This Dot: thisdot.co
In this episode of the Modern Web Podcast, Danny Thompson sits down with Reed Harmeyer, CTO of Skylight Social, and Brandon Mathis, React Native engineer at This Dot Labs. They unpack the technical and strategic decisions behind Skylight's meteoric growth: why they built on the AT Protocol, how they tackled video discovery and scaling challenges, and how a fast-tracked in-app video editor gave them an edge.

Key points from this episode:
- Skylight Social was built on the AT Protocol, allowing users to retain followers across platforms like Bluesky and enabling creators to publish interoperable content in a decentralized social network.
- The team used React Native with Expo to achieve rapid development and cross-platform performance—launching a high-quality, TikTok-like video experience in just days.
- An in-app video editor was prioritized to reduce friction for creators, built using a native SDK wrapped with Expo Modules, enabling features like clip rearranging, overlays, voiceovers, and AI-generated captions.
- User behavior data—specifically watch time—drives content recommendations, not just likes or follows, helping Skylight offer a personalized experience while navigating scaling challenges from hypergrowth.

Follow Reed Harmeyer on Social Media:
Bluesky: https://bsky.app/profile/reedharmeyer.bsky.social
Linkedin: https://www.linkedin.com/in/reed-harmeyer/
In this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson sit down with Julián Duque, Principal Developer Advocate at Heroku, to talk about Heroku's evolution into an AI Platform-as-a-Service. Julián breaks down Heroku's new Managed Inference and Agents (MIA) platform, how they're supporting Claude, Cohere, and Stable Diffusion, and what makes their developer experience stand out.

They also get into Model Context Protocols (MCPs)—what they are, why they matter, and how they're quickly becoming the USB-C for AI. From internal tooling to agentic infrastructure and secure AI deployments, this episode explores how MCPs, trusted environments, and better AI dev tools are reshaping how we build modern software.

Key Points from this episode:
- Heroku is evolving into an AI Platform-as-a-Service with its new MIA (Managed Inference and Agents) platform, supporting models like Claude, Cohere, and Stable Diffusion while maintaining a strong developer experience.
- MCPs (Model Context Protocols) are becoming a key standard for extending AI capabilities—offering a structured, secure way for LLMs to access tools, run code, and interact with resources.
- Heroku's AI agents can perform advanced operations like scaling dynos, analyzing logs, and self-healing failed deployments using grounded MCP integrations tied to the Heroku CLI.
- Despite rapid adoption, MCPs still have rough edges—developer experience, tooling, and security protocols are actively improving, and a centralized registry for MCPs is seen as a missing piece.

Chapters
0:00 – What is MCP and why it matters
3:00 – Heroku's pivot to AI Platform-as-a-Service
6:45 – Agentic apps, model hosting, and tool execution
10:50 – Why REST isn't ideal for LLMs
14:10 – Developer experience challenges with MCP
18:00 – Hosting secure MCPs on Heroku
23:00 – Real-world use cases: scaling, healing, recommendations
30:00 – Common scaling challenges and hallucination risks
34:30 – Testing, security, and architecture tips
36:00 – Where to start and final advice on using AI tools effectively

Follow Julián Duque on Social Media:
Twitter/X: https://x.com/julian_duque
Linkedin: https://www.linkedin.com/in/juliandavidduque/
Sponsored by This Dot: thisdotlabs.com
In this episode of the Modern Web Podcast, Rob Ocel and Danny Thompson talk with Hannes Rudolph, Community Manager at RooCode, to explore how this fast-moving, community-driven code editor is rethinking what AI-assisted development looks like. Hannes breaks down Roo's agentic coding model, explains how their "boomerang tasks" tackle LLM context limits, and shares lessons from working with contributors across experience levels.

Key points from this episode:
- RooCode's "boomerang" architecture breaks complex coding tasks into structured, recursive subtasks, helping AI agents stay focused while avoiding context bloat and hallucination chains.
- Developers can build their own orchestrator and agent modes in Roo, tailoring persona and instructions to fit specific workflows—crucial for long-term productivity.
- Unlike many tools, RooCode shows developers exactly how much each LLM call costs in real time, empowering teams to manage both quality and budget.
- RooCode is deeply community-driven, with user-submitted PRs frequently reshaping priorities. The team emphasizes transparency, collaboration, and accessibility for contributors at all levels.

Follow Hannes Rudolph on Linkedin: https://www.linkedin.com/in/hannes-rudolph-64738b3b/
Sponsored by This Dot: thisdotlabs.com
In this episode of the Modern Web Podcast, Rob Ocel is joined by Danny Thompson, Adam Rackis, and special guest Coston Perkins for a lively discussion on the evolving role of AI in software development. The group swaps thoughts on everything from the rise of AI agents like RooCode and Claude to what makes tools like Vercel's v0 surprisingly powerful for frontend work. They debate Tailwind's dominance as the styling output of choice for AI tools, unpack the implications of Shopify's AI-mandate memo, and tackle the big question: will AI reshape team structures or just amplify developer productivity?

Key points from this episode:
- AI agents in everyday development – The hosts discuss how tools like RooCode, Claude, and Cursor are reshaping daily coding workflows, enabling everything from automated documentation to feature planning and refactoring.
- Vercel's v0 is changing perceptions – Originally seen as a landing page generator, v0 is now appreciated for its live, code-focused interface, showing promise for serious frontend development with real-time editing and deployment.
- Tailwind's dominance in AI output – The conversation dives into why Tailwind has become the styling default for AI-generated components, and whether that's a productivity boost or a future limitation.
- AI's impact on hiring and team structure – The group debates whether AI will reduce developer headcount or empower mid-level devs to produce senior-level output, suggesting AI may reshape team dynamics more than replace them.

Follow Coston Perkins on Linkedin: https://www.linkedin.com/in/costonperkins/
Sponsored by This Dot: thisdot.co
Comments (2)

Helen -

Thanks for sharing! I'd also recommend checking out an interesting guide on React comparing it to Flutter - https://existek.com/blog/flutter-vs-react-native-in-2022/.

Jul 25th

Abhishek Prasad

Is the whole episode uploaded? Because it cuts off midway.

Feb 21st