My Favorite Mistake: Learning Without Blame in Business and Leadership

Author: Mark Graban


Description

My Favorite Mistake is a podcast about learning without blame in business and leadership.


Despite the name, it’s not just my favorite mistake—it’s yours, it’s ours, and it’s what we can all learn from when things don’t go as planned.


Hosted by author and consultant Mark Graban, each episode features honest conversations with leaders, executives, entrepreneurs, and changemakers about a meaningful mistake they made—and what they learned after things went wrong. How they responded. How they improved. How they grew as leaders.


This isn’t a show about failure theater, gotcha moments, or simplistic “lessons learned.” It’s about how real people reflect, improve, and lead better in complex organizations—without scapegoating, shame, or hindsight bias.


What You’ll Hear


• Leadership and management mistakes that reshaped careers, teams, and organizations
• How teams and leaders learn without blaming individuals
• Insights about culture, systems, decision-making, and psychological safety
• Practical lessons drawn from real experience, not abstract theory


Guests come from business, healthcare, technology, sports, entertainment, government, and academia, sharing stories that reveal how learning actually happens.


The Perspective


Mark brings a systems-thinking lens grounded in Lean management, continuous improvement, and psychological safety. The focus is less on who messed up and more on what the system taught us.


Who This Podcast Is For


• Leaders and managers who want to learn from mistakes without blame
• Executives working to build healthier, more resilient cultures
• Professionals who believe improvement starts with reflection, not punishment



388 Episodes
Baseball has always made room for human error. Umpires miss calls. Fans complain. Life goes on. But this season, MLB is rolling out the Automated Ball-Strike challenge system -- ABS -- giving teams two challenges per game to contest ball-and-strike calls. The idea is to reduce bad calls. The likely side effect is a whole new category of mistakes. In this "Mistake of the Week," Mark Graban looks at what happens when correcting human error depends on another human decision -- and what one anonymous coach predicted, vividly, about how this will play out.
A forgotten water heater tap led to an overnight leak, an unexpected ice rink, and a reminder that the real lesson isn’t about blame — it’s about designing systems that catch small mistakes before they spread. A small, human slip led to a big, icy problem in a neighborhood in northwest China. After a woman forgot to turn off the tap on her solar water heater, water flowed unnoticed for nine hours — and overnight temperatures turned the street outside into an accidental skating rink. In this episode of Mistake of the Week, we look past blame and shame to ask a better question: why did the system require perfect memory, instead of detecting the problem or shutting itself off? It’s a story about water leaks, design flaws, and how small mistakes can spread when systems aren’t built to catch them early — along with practical lessons for our own homes about alarms, automatic shutoffs, and mistake-proofing everyday risks. Source news story
After winning gold at the Winter Olympics, skier Breezy Johnson did what champions do — she jumped for joy. And her medal fell off. She later joked, “Don’t jump in them… I was jumping in excitement and it broke,” adding that it was “not, like, crazy broken. But, a little broken.” Other athletes experienced similar ribbon failures during their celebrations. In this episode of Mistake of the Week, Mark Graban looks at what happens when a system fails during the very moment it’s designed to support — and why it’s encouraging that Olympic officials acknowledged the problem instead of blaming the athletes. Because if your medal can’t survive celebration… what exactly was it tested for?

This episode explores:

• Designing for real human behavior (including joy)
• The importance of testing under realistic conditions
• Why admitting a flaw beats assigning blame
• What organizations can learn from a broken ribbon
Most of us pull up to a gas pump on autopilot—until something goes wrong. In this Mistake of the Week, host Mark Graban looks at a real-world systems failure that affected hundreds of drivers across the Denver metro area. Due to an upstream error at a fuel terminal, diesel fuel was mistakenly delivered into the gasoline supply—leading to stalled cars, tow trucks, and costly repairs. Instead of rushing to blame or punishment, Colorado regulators emphasized learning, investigation, and prevention. That response matters—and it offers an important lesson about mistake-proofing, system design, and leadership.

In this episode, Mark explores:

• Why focusing on who made the mistake misses the real problem
• How mistake-proofing works—and where it often fails
• Why downstream safeguards can’t fix upstream system errors
• What leaders can learn from choosing curiosity over blame

Mistakes like this are disruptive and expensive—but they also create an opportunity to improve systems so the same error doesn’t happen again.
A devastating hospital mistake in Glasgow was described by leaders as “human error,” even as they acknowledged that “very rigorous processes” were not followed. In this episode of The Mistake of the Week, Mark Graban examines why suspensions and discipline don’t guarantee improvement — and how gaps between written procedures and real work create hidden risk. Punishment may feel like accountability, but without fixing the system, the same harm remains possible.
What does a failed bank robbery have to do with one of the most cited ideas in psychology? More than you might expect. In this episode of My Favorite Mistake, Mark Graban tells the true story of McArthur Wheeler, a man who believed that rubbing lemon juice on his face would make him invisible to security cameras. Confident in his reasoning—and even more confident in his ability to test it—Wheeler walked into two Pittsburgh banks in broad daylight, fully exposed, certain that his citrus-based logic would protect him. It didn’t. When police later showed him clear surveillance photos, Wheeler’s stunned response became legendary: “But I wore the juice.” That moment caught the attention of psychologist David Dunning, who saw in Wheeler’s mistake something deeper than criminal incompetence. Along with Justin Kruger, Dunning went on to study how people with low skill often lack the awareness to recognize their own limitations—research that became known as the Dunning–Kruger Effect. This episode explores the layered nature of mistakes: flawed assumptions, poorly designed tests, and the dangerous certainty that both are correct. It’s not a story about stupidity. It’s a story about human blind spots—and how easily confidence can outrun competence. Whether in leadership, work, or everyday life, the lesson is universal: it’s not enough to test our ideas. We also have to test how we test them. Because some of the most convincing mistakes are the ones that feel like proof.
In this week’s Mistake of the Week, a company’s HR team accidentally sent a mass termination email to the entire workforce — including the CEO. The culprit was an offboarding automation tool left in the wrong mode, turning a routine test into a company-wide panic. Mark Graban explores what this moment teaches about automation, human fallibility, and the danger of relying on memory in systems that affect people’s livelihoods. Instead of asking, “Who pressed the wrong button?”, the real question is, “Why was this mistake even possible?” A funny story now, but a real lesson in error-proofing or the lack thereof. Because even when no one’s actually fired, the fear can linger long after the email is retracted.
Why do New Year’s resolutions fail so predictably—and what does that teach us about change at work? In this Mistake of the Week, Mark Graban explores why treating change as a test of willpower is a reliable setup for frustration, both personally and in organizations. Drawing on behavioral psychology and leadership examples, the episode connects failed personal resolutions to common organizational mistakes: big announcements, ambitious targets, and too little attention to system design and psychological safety. The takeaway is practical and actionable: instead of trying to boost motivation or eliminate human error, leaders should focus on making the right choices easier and the wrong ones harder—starting small, iterating, and learning forward instead of blaming backward.
Nick Saban calls it “the dumbest decision I ever made” — a fourth-and-one call from the 2001 SEC Championship Game that still sticks with him. In this episode, Mark Graban breaks down why even the greatest coaches make mistakes, what Saban learned from the moment, and how leaders can turn high-pressure missteps into opportunities for trust and growth. Perfect for listeners interested in leadership, football, coaching, and the psychology of mistakes.
Jingle Bells is one of the most recognizable Christmas songs ever written… except it wasn’t written for Christmas at all. In this week’s Mistake of the Week, we unpack one of America’s most enduring cultural misconceptions: the belief that Jingle Bells has anything to do with Christmas. Originally titled One Horse Open Sleigh, the song debuted at a Thanksgiving church service in the 1850s and was inspired not by Santa or reindeer, but by noisy, fast sleigh races in Medford, Massachusetts. No Christmas trees. No North Pole. Just winter racing, youthful chaos, and a catchy melody. Over the decades, repetition turned assumption into “truth,” and a Thanksgiving song quietly shifted into a holiday anthem. It’s a perfect example of how knowledge mistakes spread — harmless, familiar, and rarely examined.

In this 3–4 minute episode, Mark explains:

• Why Jingle Bells was never meant to be a Christmas song
• How repetition and cultural habit transformed it anyway
• What this teaches us about assumptions, organizational habits, and the stories we never question
• Why small knowledge mistakes can persist for generations

If you care about learning, improvement, and understanding how mistaken beliefs take root, this episode offers a fun seasonal reminder: even our most cherished “facts” deserve a second look.
A 32-year-old woman in Switzerland underwent an unnecessary surgery after her lab sample was mixed up at Basel University Hospital. Doctors believed she had cervical cancer. She didn’t — but the procedure went ahead anyway, potentially affecting her ability to carry a pregnancy in the future. In this Mistake of the Week, Mark Graban unpacks how such devastating but preventable errors happen — and why “being careful” isn’t a real safeguard. Drawing on past lab mix-ups he’s written about, Mark explores how system design, workload pressure, and weak error-proofing make these tragedies almost inevitable. This isn’t about bad people or careless workers. It’s about fragile systems — and how hospitals can build processes that catch mistakes before they reach the patient. Because real safety starts with learning, not blaming.
In this Mistake of the Week, Mark Graban breaks down an incident involving an American Airlines A319 on final approach to Phoenix — captured on video with its landing gear still up. A cockpit alert sounded, the crew realized what was missing, and the pilots executed a safe go-around. Their explanation to air traffic control? A perfectly understated: “It wasn’t configured in the appropriate manner.” Mark explores why these near-misses are less about individual oversight and more about systems built to detect — and correct — human error. From checklists to cockpit warnings to the decision to go around instead of pushing forward, this episode highlights why safety depends on catching mistakes early, not pretending they don't happen.  
In this edition of Mistake of the Week, Mark Graban tells a story that didn’t appear in any safety report or headline — it happened on a pickleball court. Early in learning the sport, Mark found his old tennis instincts taking over, leading to a very incorrect serve and a moment of embarrassment. What followed was a small but meaningful lesson in feedback, psychological safety, and the challenge of unlearning deeply wired habits. Supportive coaching, timely correction, and a friendly playing environment turned an awkward mistake into a productive one. Mark reflects on why unlearning is often harder than learning, and how leaders can create conditions where people feel safe enough to improve.
In this week’s Mistake of the Week, Mark Graban tells the story of a Maine hospital system that accidentally mailed condolence letters to 531 very-much-alive patients. The cause? A computer glitch — and a few missing fail-safes. Mark explores what this bizarre mix-up reveals about system design, automation, and trust in healthcare. Beyond the absurd headline lies a familiar pattern: when we blame people instead of learning from process failures, we guarantee more mistakes. So what does “fully resolved” really mean? And what can leaders learn from a mistake that’s literally to die for? If you received this episode through your podcast app and not a séance, you’re doing fine.  
Patrick Engasser spent two years ranked near the bottom of a 615-person sales organization -- broke, in debt, and grinding through trial and error -- before one decision changed everything. He hired a coach. The problem wasn't effort or talent. It was not knowing that was even an option.

Episode page with links and more

In this episode, Patrick shares the mindset shift that had to happen before any strategy could work, how he turned blindness from a perceived liability into a genuine competitive advantage, and what actually separates leaders people want to follow from managers who just have a title.

What you'll learn:

• Why trial and error is the most expensive way to learn -- and what to do instead
• How mindset has to come before strategy in any coaching relationship
• What real leaders do differently when things go wrong
• How to coach people through excuses without damaging the relationship
• What procrastination is really telling you -- and how to interrupt it
• What to do (and not do) when you encounter a guide dog in public

Patrick Engasser is the bestselling author of "If I Can Do It, You Can Do It" and a business coach and motivational speaker who built a seven-figure sales team after starting from zero.
Dr. Jen Fry's favorite mistake is a disagreement with her best friend of over ten years -- a small miscommunication that led to eight months of silence. Neither of them knew how to reconcile it. Then Jen's mother passed away, and her friend sent a card. That single act of reaching out changed how Jen thinks about conflict, reconciliation, and the kind of people worth keeping in her life.

Episode page with transcript, links, and more

Jen is a sports geographer, tech founder, TEDx speaker, and author of I Said No: A No-Nonsense Guide to Setting Boundaries, Speaking Up, and Having a Backbone Without Being a Jerk. In this conversation, she draws on her background as a college volleyball coach, tech founder, and conflict expert to break down what leaders and teams get wrong about conflict, feedback, and boundaries. We dig into why niceness gets weaponized to keep people quiet, why kindness requires accountability, and why people pleasing quietly ruins reputations and results. Jen explains why conflict-avoidant bosses create conflict-avoidant cultures, why anonymous feedback does more harm than good, and the critical difference between being defensive and defending yourself. She also shares what she saw on a high school volleyball video that she wishes she could burn -- and what it taught her about being a better teammate and leader.
In this episode of My Favorite Mistake, Mark Graban talks with Dr. Tyler B. Evans, infectious diseases and addiction medicine physician, public health leader, and author of Pandemics, Poverty, and Politics.

Episode page with links, video, and more

Dr. Evans shares a deeply personal “mistake” — giving up his dream of working in global health abroad to take what he thought was a conventional job in the United States. That decision led him to work with Native American communities in Wyoming, build refugee health programs in New York, and serve in leadership roles during the COVID-19 pandemic. What initially felt like a detour ultimately shaped his career and mission. The conversation explores the politicization of public health, the erosion of trust in expertise, and why solidarity among healthcare professionals may be essential to restoring confidence. Dr. Evans reflects on lessons from seatbelt laws, smoking reduction, and pandemic response — and why public health measures are fundamentally about protecting communities, not restricting individuals. They also discuss how scientific understanding evolves, how leaders can communicate uncertainty responsibly, and why learning — not blame — must guide how we respond to mistakes.
In Episode 339 of My Favorite Mistake, Mark Graban talks with Genevieve Skory, executive coach and former Chief Field Development Officer, about a leadership mistake that many high performers make: confusing performance with alignment.

Episode page with links, video, and more

For years, Genevieve defined winning by revenue and results. Pressure was normal. Constant pivoting felt strategic. Intensity was rewarded. The numbers came in — but so did exhaustion, turnover, and a culture operating in fight-or-flight mode. In this conversation, we explore the hidden cost of performance-at-all-costs leadership, the neuroscience behind fear-driven decision-making, and why teams don’t always tell leaders the truth when the environment feels unsafe. Genevieve shares what changed for her and how she now helps ambitious leaders build sustainable success without burnout. If you’ve ever sensed that strong results were masking deeper misalignment, this episode will resonate.
What happens when a leader realizes their approach caused real harm? In this episode of My Favorite Mistake, U.S. Marine Corps officer and leadership mentor Olaolu Ogunyemi shares a defining moment early in his career—recognizing that his leadership style, while well-intended, crossed a line and made a Marine cry.

Episode page with links, video, and more

Rather than defending his authority, Olaolu reflects on the gap between intent and impact, and how that moment forced him to rethink what effective leadership really looks like. We talk about learning from mistakes, the difference between fear-based compliance and true accountability, and why psychological safety is essential—even (and especially) in high-pressure environments like the military. This conversation explores how leaders grow when they confront mistakes honestly, respond with humility, and commit to changing their behavior—not just their words.
Ray Zinn—longtime CEO of Micrel Semiconductor and the longest-serving CEO of a publicly traded company in Silicon Valley history—doesn’t believe the real problem is making mistakes. He believes the real failure is repeating the same mistake without fixing it.

Episode page with links, video, and more

In this episode of My Favorite Mistake, Ray shares leadership lessons from nearly four decades running Micrel, including why popular slogans like “fail fast, fail often” can actually normalize bad habits, how leaders unintentionally punish learning, and what it takes to build a culture focused on honesty, accountability, and fast problem-solving instead of blame. Ray also reflects on how losing his eyesight in his late 50s fundamentally changed the way he led—forcing him to listen more deeply, trust others more fully, and become a more empathetic leader. Those experiences shaped his approach to leadership and his latest book, The Essential Leader. If you care about learning from mistakes, building strong cultures, and leading without fear or ego, this conversation will challenge—and sharpen—your thinking.