Africa's war on misinformation with Abdullahi Alim and Nolita Mvunelo
Description
Africa is on the frontline of a fast-moving battle against digital misinformation, one with profound effects for politics, trust and daily life.
In this episode of We Kinda Need a Revolution, host Nolita Mvunelo talks to Abdullahi Alim, award-winning economist and CEO of the Africa Future Fund, about how social media, YouTube rumours, deepfakes and adversarial AI are reshaping the continent, often out of the global spotlight.
From election hoaxes to ethnic divisions stoked online, they highlight the unique and urgent challenges confronting the continent and the lack of accountability from major tech platforms.
But the conversation is also about hope: practical solutions like investing in education, boosting community resilience and creating spaces for honest, offline dialogue.
Drawing on his own journey from Somalia to a different life in Australia, Abdullahi reflects on how lived experience shapes his vision of the risks and opportunities Africa faces in the digital age.
Watch the episode:
Full transcript:
Nolita: While the world's attention is often elsewhere, Africa is facing a digital war on misinformation. Nations across the continent are facing a quieter but equally dangerous battle for the truth in the age of social media and AI, one that is reshaping politics, trust and power. Welcome to We Kinda Need a Revolution, a special series of the Club of Rome Podcast where we explore bold ideas for shaping sustainable futures. I am Nolita Mvunelo, and today I'm speaking to Abdullahi Alim, an award-winning economist and CEO of the Africa Future Fund. Abdullahi is a leading voice on how disinformation and adversarial AI are reshaping power and trust. These are ideas that he examines in his Foreign Policy essay, "How Africa's War on Disinformation Can Save Democracies Everywhere". In this episode, we dive into the war on misinformation in Africa and ask: what risks lie ahead, what role are young people playing, and what will it take to build resilience and reclaim the digital space? Let's explore what's at stake and what's possible.
Hi, how are you doing? Thank you so much for joining us today.
Abdullahi: I'm good. Thanks. Thanks for having me, Nolita.
Nolita: Our discussion today is going to be on Africa's war on disinformation, but before we get into that, can you please tell us more about yourself and what led you to consider some of these challenges and these potentially existential risks?
Abdullahi: I think every idea needs to be drawn back to its origins, and that also holds for me as a person. I was born in 1992 in Somalia, and I am one of the children of that initial conflict that earned Somalia the unfortunate nickname of a failed state. Going from that early childhood experience in Somalia to where we eventually settled in Australia, in a more low-income bubble, when you are a product of failed systems, be it systems of migration, systems of transportation, systems of housing, you have no choice but to think deeply about how those systems operate to advantage some people and how they operate to disadvantage others. So, I think I've always been a deeply reflective person, even from a young age, and I take that with great responsibility, because my story isn't the norm. I'm the exception to the norm, having had the life that I've had so far, and I want to use that responsibly. And I think that starts not so much with solving things, but with asking the right questions, and that's why I lend myself better to systemic issues, systemic fault lines, like what we're about to discuss today.
Nolita: So as a start, can you please take us through the challenge and the landscape?
Abdullahi: Sure. So I think when we think of disinformation, we think of it through a US- and Eurocentric lens, largely because it's language borrowed from the West. When we think about the large disinformation campaigns that pique media interest, we're usually talking about events around the US election, or perhaps proxy conflicts taking place in Europe between pro-Russian voices and pro-NATO voices. But the world of disinformation actually extends beyond that, and I think it gets the least amount of attention in Sub-Saharan Africa. The least amount of attention, but some of the most profound impacts. Why? Because, for the most part, identity on the continent is still delineated along clan, religious and ethnic lines. So, somebody could be of X nationality, but at the same time, they may have an additional loyalty, especially when conflict arises. At a more granular level, the loyalty could be to their ethnic group, it could be to their religious group, it could be to their clan. Now, when you have an unregulated landscape of that sort, and when you have fewer resources deployed by the big tech companies, who have a large monopoly on the information highway in these parts of the world, what it means is that those regions, and principally Africa in this moment, are most vulnerable and most at risk from the kind of disinformation tactics which seem quite analogue relative to what we typically think of as disinformation. It really could just be somebody edited to look like they've said something when they haven't. It could be a quote attributed to a particular leader of a group. Any of those forms of misappropriated text or deepfakes, anything from one end to the other, can have real-life ramifications.
Nolita: Do you have any specific examples or cases where this has happened, and what has the impact been? I say this in the current context, where there is a lot of conflict right now, and yet Africa doesn't get the same type of global attention at times of conflict.
Abdullahi: I think the example that I can give would be in Ethiopia, because it happened at the worst possible time, when the conflict in Tigray broke out. And of course, this had been brewing for some time. I think it came off the back of a lot of tech companies culling their trust and safety teams, budgets and councils. And what you had was one moderator for, let's say, and I'm giving an arbitrary figure here just to give you the scale, one per million of population. When you reduce it to that level, you're never going to be at the scale necessary to tackle this issue. We saw examples in Ethiopia where one faction would share an image of a leader from another faction, again along ethnic lines, making a particularly provocative statement against them, or suggesting that they were about to incite violence, which they never did. It got so bad that it reached the stage where that misrepresented community leader from the other group was killed on the back of this false assumption. Now, when you look at the death toll in the Tigray conflict, clocking in at something around 600,000 people, you cannot disassociate that from social media and the role of disinformation in this particular form of warfare.
Nolita: So then I sense that there's an element of accountability and infrastructure. What is available for governments, or even for people, to hold platforms accountable for the lack of infrastructure, the lack of moderation and so on? But also, who chooses what gets moderated, what is right, what is wrong, what can be shared and what can't? Are there any initiatives, at the state level or even at the international organisation level, that are addressing some of these challenges?
Abdullahi: Most of the efforts now are calling for more moderation, which would have worked a few years ago, but in the age of AI, it's going to prove quite inconsequential. I'll tell you why. I could literally put out propaganda that calls for and incites violence against even an individual, let alone a particular group, and use the latest of what we call adversarial AI to change and augment the detail of the image from the back end in such a minute way that the naked eye won't see the difference, but a machine might misread it as something completely different. So it might read it as, oh, that's a rose, or that's something that isn't inflammatory. Imagine that at scale. So the question then becomes, where do we go from here? Now, unfortunately, the AI ecosystem is quite closed around the world. A lot of these big companies are running closed models. We're outsourcing this huge responsibility to smaller teams behind these tech companies, who, for the most part, don't have the incentive and may not have the interdisciplinary expertise to be able to tackle this issue at its core. So that, I think, is the number one issue at the moment: we've got closed innovation ecosystems that, as these problems get more and more advanced, as these disinformation tactics become more and more advanced, actually shut the door on a global community of experts, both technical and non-technical, being able to come to the table to figure out how to counter that from an algorithmic perspective. And we're outsourcing this important duty and responsibility to smaller and smaller companies whose main incentive is really just to win the AI race, as it's called. And so, who bears the cost? Unfortunately, it will be the continent. It will unfortunately be parts of the world that don't have that same level of fluency with these kinds of more advanced disinformation campaigns.
I also think, Nolita, we're paying the cost of decades-long poor education systems and decades-long lack of investment, a lack of even just community spaces to heal divides, to create spaces where tension will arise when you bring up narratives and experiences, lived experiences in particular, but not doing