Alex Edmans on Confirmation Bias
Description
<iframe title="Embed Player" src="https://play.libsyn.com/embed/episode/id/30633068/height/128/theme/modern/size/standard/thumbnail/yes/custom-color/87A93A/time-start/00:00:00/playlist-height/200/direction/backward/download/yes/font-color/FFFFFF" height="128" width="100%" scrolling="no" allowfullscreen="" webkitallowfullscreen="true" mozallowfullscreen="true" oallowfullscreen="true" msallowfullscreen="true" style="border: none;"></iframe>
How hard do we fight against information that runs counter to what we already think? While quantifying that may be difficult, Alex Edmans notes that the part of the brain that activates when something contradictory is encountered is the amygdala – “that is the fight-or-flight part of the brain, which lights up when you are attacked by a tiger. This is why confirmation can be so strong, it’s so hardwired within us, we see evidence we don’t like as being like attacked by a tiger.”
In this Social Science Bites podcast, Edmans, a professor of finance at London Business School and author of the just-released May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases – And What We Can Do About It, reviews the persistence of confirmation bias — even among professors of finance.
“So, what is confirmation bias?” he asks host David Edmonds. “This is the temptation to accept something uncritically because we’d like it to be true. On the flip side, to reject a study, even if it’s really careful, because we don’t like the conclusions.”
Edmans made his professional name studying social responsibility in corporations; his 2020 book Grow the Pie: How Great Companies Deliver Both Purpose and Profit was a Financial Times Book of the Year. Yet he himself has encountered the temptation both to quickly embrace findings, even flimsy ones, that support his thesis and to reject or even tear apart research, even robust results, that doesn’t.
While that might seem like an obvious critical-thinking pitfall, surely knowing it’s likely makes it easier to avoid? You might think so, but not necessarily. “So smart people can find things to nitpick with, even if the study is completely watertight,” Edmans details. “But then the same critical thinking facilities are suddenly switched off when they see something they like. So intelligence is, unfortunately, something deployed only selectively.”
Meanwhile, he views the glut of information, and the accompanying rise in polarization, as making confirmation bias more prevalent, not less.
Edmans, a fellow of the Academy of Social Sciences and former Fulbright Scholar, was previously a tenured professor at the Wharton Business School and an investment banker at Morgan Stanley. He has spoken to policymakers at the World Economic Forum and UK Parliament, and given the TED talk “What to Trust in a Post-Truth World.” He was named Professor of the Year by Poets & Quants in 2021.
To download an MP3 of this podcast, right-click this link and save. The transcript of the conversation appears below.
David Edmonds: What do you think about Social Science Bites? Obviously, that it’s the world’s most interesting podcast on social science. Now, suppose you were to read an article that purported to back that up with evidence. Well, given your prior belief, you might be more inclined to believe it, and more inclined to dismiss an article that came to the opposite view. This is confirmation bias, and Alex Edmans of the London Business School has become so concerned about this and other biases that he’s written a book about it, May Contain Lies. Alex Edmans, welcome to Social Science Bites.
Alex Edmans: Thanks, David. It’s great to be here.
Edmonds: The topic we’re talking about today is confirmation bias. This is part of your research on misinformation. But this entire research project seems to be a departure for you. You’re a professor of finance, and I think it’s fair to say that most of your research has been so far on corporate governance.
Edmans: And that’s correct. So I never set out to be a professor of misinformation. So most of my work is on social responsibility. It comes under different names like ESG or sustainable finance, and I believe I was one of the early pioneers of the idea that companies that are good for society also deliver higher shareholder returns. So one of my early papers found that companies that treat their workers well create a great corporate culture, they do better than their peers, and that’s how I got into the idea of sustainable finance.
Edmonds: So treating their workers well means what? Just having a nice atmosphere at work, paying them better than the market rate? What counts as treating your employees well?
Edmans: So, what I measured was inclusion in the list of the 100 Best Companies to Work For in America. So that is a list compiled after surveying 250 to 5,000 employees at all levels, and asking them 57 questions on issues such as credibility, respect, fairness, pride and camaraderie. So certainly quantitative factors, like pay and benefits will affect that, but also qualitative factors such as, does the boss treat you as an individual rather than a factor of production? Do they give you opportunities to step up and to develop within the organization? So all of those intangible factors also matter, and what I found was that companies that did treat their workers well, they did outperform. So it’s not that they’re being fluffy or woke, they are being commercially sensible investing in the most important asset, their people.
Edmonds: That’s a lovely result. That’s the result we want to hear. We want to believe that companies that treat their employees well, treat the planet well, do better. Is it also true that diversity of employment helps, having different ethnic groups represented on the board is good for the company? Because I know some people have done some research on that claim that that is the case.
Edmans: Yeah, so that’s how I got into the topic of misinformation. So being seen as one of the early pioneers on sustainable finance with that paper, then I learned of other papers, which also seem to give similar results that if you do good stuff, then you do better as a company. And one of the things which is good stuff is to be a diverse company. And that’s something that I would love to believe, as somebody who believes in the importance of sustainability. Personally, maybe you don’t get this on the podcast, but I’m an ethnic minority, so I have personal reasons to want that to be true. And there’s tons of studies out there by luminary organizations such as McKinsey, or BlackRock, even a regulator like the Financial Reporting Council, claiming there’s a clear link between board diversity and financial performance. But when you look at those papers, there’s a huge amount of flimsiness to this, the evidence does not at all support the conclusions. And just to give you an example of one study, it claimed to find a strong result — it did 90 different tests. None of those tests gave a positive result. But they just lied in the executive summary, they claimed a result that just wasn’t there. And people just accepted it without checking the tables at the back of the report, because they wanted it to be true.
Edmonds: Lied is a very strong term. So it wasn’t just that they wanted to believe it like all those who read the report, it was actually that they were deceiving their readers.
Edmans: And that’s correct. So they misrepresented the results. So you might think there’s different forms of misinformation. One form you might think is you just conducted the test in a sloppy way, you found a positive result, claimed a positive result, and that result was sloppy. Here, there is actually no result to begin with. So, the results did not find a positive result and they claimed to have found that, and so that is a different level of misinformation. And this highlights how misinformation is such a strong issue. Yes, we can quibble about whether the methodology is correct, did you measure diversity the correct way? But actual misrepresentation of your own results is something that people might be deceiving the readers with.
Edmonds: But this piece of research was so compelling because so many people wanted to think it was true.
Edmans: That’s correct. And this is the big issue of confirmation bias. So, what is confirmation bias? This is the temptation to accept something uncritically because we’d like it to be true. On the flip side to reject a study, even if it’s really careful, because we don’t like the conclusions. So, this study, this idea that diversity improves performance, there are lots of well-meaning people who thought, ‘Yeah, that is just intuitively true. Well, a diverse team makes better decisions and s