A brief history of AI from ancient times to the present day
Author: Kristy Anamoutou
© Kristy Anamoutou
Description
AI didn't begin with ChatGPT. For curious minds and history geeks, this narrative podcast traces the history of artificial intelligence as a millennia-long human adventure across civilizations.
From 8th-century BCE Greek automata to the 11th-century African Ifá binary system, from Al-Jazari's Islamic robotics to Japanese karakuri—discover how humanity dreamed of intelligent machines millennia before computers.
Each episode (15-20 min) explores AI beyond pure technology: history, culture, philosophy.
https://us.histoire-ia.fr/introduction
49 Episodes
What the Deep Learning Revolution Teaches Us: Conclusion and opening toward the future
From AlexNet to ChatGPT. From DeepMind to DeepSeek. From Mistral to African-language models. From the AI Act to silicon gardens. Six continents. Fifteen years. What does this journey teach us?
Four threads run through it. Exponential acceleration — each year brings capabilities that the previous year would have judged impossible. The global race — AI has become a geopolitical issue where technological alliances reflect political alliances. The concentration of power — a few companies dominate models, data, computing. The ambivalence of creators — those who invented deep learning are among the most worried about its consequences.
But each continent also has its singularity. Africa builds its own models. America created the godfathers and the giants. Asia became the center of gravity. Europe invented the rule and made the exception emerge. The Middle East made gardens bloom. Oceania seeks its place.
This period leaves us a transformative technology — and the responsibility to shape it. The tools are here. The questions are posed. The choices belong to us.
The journey continues — where to, we decide together.
Archipelago of Innovation: How Australia seeks its place in the AI revolution
Australia produces one point six percent of global AI research — but only zero point two percent of patents. The Australian paradox: scientific excellence is not converting quickly enough into economic power.
Publications have doubled in ten years. Patents have quadrupled. CSIRO Data61 hosts one of the largest concentrations of AI expertise in the world. But Australia does not have a large language model comparable to GPT-4 or Claude.
In December 2025, the National AI Plan tried to bridge this gap. The AI Safety Institute was created. Australia joined the international network of safety institutes.
But the Australian choice was different from the European one. No specific AI law. A "light" approach to attract investment.
Oceania seeks its place — between scientific excellence and commercialization, between geographic isolation and global connection. The archipelago continues to build its bridges.
Silicon Gardens: How the desert became an artificial intelligence laboratory
In 2017, the United Arab Emirates appointed Omar Al Olama as Minister of Artificial Intelligence. He was thirty years old. It was a world first.
This was not a symbolic gesture. It was a declaration of intent.
In 2019, MBZUAI became the first university in the world entirely dedicated to AI. In 2022, Falcon LLM proved the Emirates could compete with giants. In 2024, Microsoft invested one and a half billion dollars in G42, the Emirati champion. AI could contribute ninety-six billion dollars to the Emirati economy by 2030.
Israel, for its part, remained the "startup nation." Wiz reached twelve billion dollars in valuation. Nvidia acquired Run:ai. Ilya Sutskever, co-founder of OpenAI, opened a laboratory in Tel Aviv.
Saudi Arabia invested hundreds of billions in NEOM — a futuristic city piloted by AI.
The desert has bloomed. Silicon gardens are transforming yesterday's oil into tomorrow's data.
Rule and Exception: How Europe regulated AI and made Mistral emerge from the improbable
In 2016, AlphaGo defeated the world champion of Go. In 2020, AlphaFold solved the protein folding problem. In 2024, Demis Hassabis received the Nobel Prize in Chemistry. DeepMind, founded in London, had proven that Europe could produce AI excellence.
Then came the rule. On July 12, 2024, the European AI Act was published — the world's first comprehensive regulation of artificial intelligence. Europe was choosing to regulate what it did not dominate.
But the exception emerged where no one expected it. In April 2023, three Frenchmen founded Mistral in Paris. Eighteen months later, the company was valued at fourteen billion dollars. The three founders became the first French AI billionaires.
Europe has drawn red lines — mass facial recognition prohibited, behavioral manipulation banned. It has also proven it can innovate.
Rule and exception coexist. History will tell which prevails.
The New Center of Gravity: How Asia became the beating heart of global artificial intelligence
In May 2023, a Chinese company named DeepSeek was founded. Less than two years later, its models rivaled those of OpenAI — at a fraction of the cost.
The world was surprised. It should not have been.
China had one million six hundred seventy thousand AI-related companies. It was filing seventy percent of global AI patents. Taiwan was manufacturing ninety percent of the planet's advanced chips — the "silicon shield" that makes the island indispensable. India had become the world leader in AI skills penetration.
Morris Chang founded TSMC in 1987 after being "put out to pasture" at Texas Instruments at fifty-two. Fei-Fei Li, born in China, had created ImageNet — the database that launched the deep learning revolution.
Parallel paths are converging. The center of gravity is tipping. Asia is no longer the periphery of global innovation — it is becoming its heart.
Godfathers and Giants: How America created deep learning and became the theater of the AI race
During the winters of artificial intelligence — those periods when no one believed — three researchers persisted. Geoffrey Hinton in Toronto. Yoshua Bengio in Montreal. Yann LeCun in New York. They were nicknamed the "godfathers of deep learning."
In 2019, they received the Turing Award — the "Nobel of computing." In 2024, Hinton received the actual Nobel Prize in Physics. The obstinate ones had transformed the world.
Then came the giants. OpenAI launched ChatGPT — one hundred million users in two months. Anthropic proposed safer AI. Google, Meta, Microsoft entered the race. Training costs reached hundreds of millions of dollars.
But the godfathers also became prophets of concern. Hinton resigned from Google to sound the alarm freely. Bengio advocates for global governance. The joy of creating mingles with the anguish of what is created.
Further south, Latin America leapt forward. Forty percent AI adoption. Argentina saved seventy-two billion liters of water through smart irrigation. Brazil applied AI to agriculture.
The Americas created deep learning — and the questions it poses.
The Quantum Leap: How Africa built its own models and revealed the biases of global AI
The electron, physicists say, does not cross the space between two orbits. It disappears from one and appears in the other. A quantum leap.
Africa made this leap.
In 2007, M-Pesa transformed global financial inclusion from Kenya — before Apple even thought of Apple Pay. In 2023, InstaDeep, founded in Tunis, was acquired by BioNTech for six hundred eighty-two million dollars — the largest acquisition of an African technology company in history.
But Africa did not content itself with adopting AI. It reinvented it.
Masakhane brought together more than two thousand researchers to create natural language processing tools for African languages. Intron Health developed speech recognition for African accents where Western systems failed. Awarri built the first Nigerian large language model.
Timnit Gebru and Joy Buolamwini revealed that facial recognition systems erred up to thirty-five percent of the time for dark-skinned women. AI is not neutral — it bears the mark of its creators.
Africa is no longer waiting to be included in global AI. It is building its own. The palaver tree has gone digital.
The Deep Learning Revolution: How fifteen years upended everything we thought we knew about intelligence
On September 30, 2012, in a bedroom at his parents' house, a Canadian doctoral student trained a neural network on two video game graphics cards. Eight days later, his system shattered all image recognition records. The world of artificial intelligence shifted.
AlexNet. ChatGPT. AlphaFold. DeepSeek. Mistral. These names mark a dizzying acceleration unprecedented in the history of technology. In fifteen years, AI passed from the laboratory to the daily lives of billions of human beings. Machines learned to see, to speak, to write, to reason. They passed bar examinations. They predicted the structure of two hundred million proteins. They defeated world champions at the most complex games.
But this revolution did not have just one epicenter. Africa produced more than two thousand four hundred AI companies. The Emirates appointed the world's first Minister of Artificial Intelligence. France created Mistral, the only credible European competitor to the American giants. China filed four times more AI patents than the United States. India became the world leader in AI skills penetration.
You will traverse six continents. Fifteen years of dizzying acceleration. From the laboratories of Toronto to the factories of Taiwan. From the startups of Tunis to the foundries of Abu Dhabi. From the servers of San Francisco to models in African languages.
And everywhere, the same question: who shapes artificial intelligence — and according to what values?
The godfathers of AI have become prophets of concern. Giants are engaged in a planetary race. Regulators are trying to keep up. The summer of deep learning continues — but no one knows how it will end.
Welcome to A Brief History of Artificial Intelligence, season 6.
All essays are available online.
What the Information Age Left Us: Conclusion and Opening Toward the Deep Learning Revolution
From the CSIRAC in Sydney to the WEIZAC in Rehovot. From the ruins of Berlin to Bletchley Park laboratories. From M-Pesa in Kenya to TSMC in Taiwan. From Dartmouth to ImageNet. Six continents. Sixty-five years. What does this crossing teach us?
Four threads run through this period.
The leap across the abyss. Africa jumped to mobile payments. India leaped toward software services. Taiwan invented an industrial model no one had imagined. Latecomers can become pioneers — if they invent their own path.
The cycles of hope and disenchantment. AI experienced summers and winters. Unfulfilled promises triggered funding crises. But researchers who persisted during the winters prepared the following summers. Yann LeCun, Geoffrey Hinton, Yoshua Bengio — the "godfathers of AI" — worked in the shadows when no one believed.
Continued invisibilization. Betty Holberton and the ENIAC programmers. Rose Dieng-Kuntz and Timnit Gebru. The Argentine ComIC pioneers. Women, minorities, and contributors from the Global South remain underrepresented in the official history — and in the teams that build AI.
The convergence of parallel paths. Nakashima and Shannon. India and Japan. Taiwan and Korea. Asia now manufactures the chips that run global artificial intelligence. Paths traced for half a century lead to the same horizon.
This period leaves us a question: who inherits the digital revolution?
In 2006, Geoffrey Hinton relaunched neural networks. In 2009, Fei-Fei Li published ImageNet. In 2012, AlexNet proved that deep learning worked. The summer that opened would be the longest in history.
But this summer inherits everything that came before — leaps and falls, frugal innovations and extinguished forges, biases that perpetuate themselves, and questions that remain open.
The journey continues.
The Antipodes of Innovation: How Geographic Isolation Became an Advantage
In November 1949, in Sydney, a machine of two thousand vacuum tubes executed its first calculation. The CSIRAC joined an exclusive club: stored-program computers. There were only four others in the world — all in Great Britain or the United States.
Australia had built the fifth.
Trevor Pearcey worked "largely independently of European and American efforts." Isolation became an advantage: without access to others' solutions, he had to invent everything. In February 1948, before the machine even worked, he wrote a prophetic sentence: "It is not inconceivable that an automatic encyclopaedic service operated through the telephone system will one day exist."
The Internet. In 1948.
Graeme Clark had grown up with a deaf father. In 1978, he implanted the first multichannel cochlear device. Rod Saunders heard. Today, more than one million people wear a cochlear implant.
WiFi? The CSIRO team developed a wireless transmission technique that became an essential component of modern networks. When fourteen tech giants tried to invalidate their patent, the CSIRO won — and collected four hundred fifty million dollars.
Google Maps? Born in Sydney. Where 2 Technologies, founded by two Australians and two Danes in an apartment in Hunters Hill. Google acquired them in 2004.
Atlassian? Ten thousand dollars of credit card debt in 2002. Australia's first tech unicorn.
Oceania, at the antipodes of power centers, invented bridges to the entire world.
Gardens of the Desert: How Necessity Made Innovation Bloom in the Middle East
No one expects flowers in the desert. Yet that is where they grow fastest — when the rain finally comes.
In 1954, a six-year-old country undertook to build a computer. Israel had just emerged from its war of independence. The borders were hostile. The economy fragile. The advisory committee included Albert Einstein — skeptical — and John von Neumann — enthusiastic. Some candidates had lost their diplomas in the Holocaust. In 1955, the WEIZAC executed its first calculation.
Lotfi Zadeh was born in Baku, grew up in Tehran, emigrated to the United States. In 1965, he invented fuzzy logic — that way of representing vague concepts that humans handle intuitively. Americans were skeptical. The Japanese seized upon it. Today it is in your air conditioners, washing machines, cars.
Unit 8200 — the Israeli equivalent of the NSA — became, without intending to, the world's greatest startup school. Gil Shwed built the first firewall there. Check Point, Palo Alto Networks, CyberArk — so many cybersecurity giants founded by its veterans.
ICQ — "I Seek You" — was born in a Tel Aviv apartment in 1996. Four young Israelis invented instant messaging. AOL bought it for four hundred million dollars.
Waze and Mobileye revolutionized navigation and autonomous driving. The "Startup Nation" exported eleven billion dollars in cybersecurity in 2021.
The desert has bloomed. Necessity became invention.
Digital Reconstruction: How Europe Invented the Computer, Scuttled Its Future, and Rebuilt Itself
Europe invented the computer twice. The first time in secret. The second time in oblivion.
In 1941, Konrad Zuse completed the Z3 in Berlin — the world's first programmable computer. The Nazi regime saw no use in it. A bombing raid destroyed it. In 1944, Tommy Flowers delivered Colossus to Bletchley Park — the first electronic computer, two years before ENIAC. He was ordered to burn the plans.
Then came the Lighthill Report.
In 1973, a British mathematician with no AI experience published a devastating assessment: "total failure to achieve its grandiose objectives." The government cut funding. Europe had just triggered the first "artificial intelligence winter."
But Europe rebuilt itself.
In Marseille, Alain Colmerauer invented Prolog — the language that would inspire the Japanese Fifth Generation project. At CERN, Tim Berners-Lee created the World Wide Web. In Finland, Linus Torvalds wrote Linux — the system that runs most of the world's servers. In France, Yann LeCun laid the foundations for convolutional neural networks — the technology behind image recognition.
You will discover Donald Michie, a Bletchley Park veteran who built MENACE — a machine that learned tic-tac-toe through reinforcement. Edsger Dijkstra, who invented the shortest-path algorithm. DeepMind, founded in London, whose AlphaGo would beat the world Go champion.
Europe invented, forgot, scuttled — and started over. Its resilience is part of its genius.
Parallel Paths: How Asia Discovered, Invented, and Dominated the Foundations of AI
In 1937, Claude Shannon defended his legendary thesis at MIT. He demonstrated that Boolean algebra could describe electrical circuits. The same year, in Tokyo, Akira Nakashima published the same discovery. Shannon cited him. Then one became a legend. The other was forgotten.
Two men. Two continents. The same idea. The history of parallel paths.
In 1930, in Calcutta, Prasanta Chandra Mahalanobis invented a statistical measure still used every day in machine learning. In 1960, India inaugurated TIFRAC, its first locally designed computer. In 1982, Japan launched the Fifth Generation Computer Project — a dream of revolutionary computing that became "the lost generation."
Then came the leap.
India had COBOL programmers. The West no longer did. The "Y2K bug" became the launchpad for the Indian computer industry. TCS, Infosys, Wipro. Bangalore — the "Silicon Valley of India" — with thirty-eight percent of the country's IT exports.
Morris Chang was "put out to pasture" at Texas Instruments at fifty-two. He left for Taiwan. He invented the "pure-play foundry" model — a company that manufactures chips without designing them. TSMC now enables NVIDIA, AMD, and Apple to exist without owning factories.
You will discover Fei-Fei Li, born in China, creator of ImageNet — the database that launched the deep learning revolution. Kai-Fu Lee, who developed speech recognition, led Google China, and became one of the most influential AI investors.
The parallel paths are converging. Asia manufactures the chips that run global artificial intelligence.
The Forge and the Forgetting: The Summers, Winters, and Invisible Women of American Artificial Intelligence
In 1956, twenty-one researchers gathered at Dartmouth College for an eight-week summer conference. They had an ambitious goal: create a "machine capable of simulating every aspect of human intelligence." They thought they could do it in one generation. They were wrong — by a great deal.
American artificial intelligence experienced summers and winters. The Dartmouth summer, then the first winter when funding collapsed in the 1970s. The expert systems summer, then their collapse when conventional machines caught up. Finally, the deep learning summer — the one still ongoing.
But the American history of AI is also a history of forgetting.
In February 1946, the army presented ENIAC to the press. In the background of the photos, six women manipulated cables — Betty Holberton, Kay McNulty, and their colleagues. They were not introduced. It took fifty years for their names to be learned.
You will also discover Mexico, which received its first computer in 1958 and created the first computer science master's in Latin America. Argentina and its ComIC pioneers — Clarisa Cortes, Cristina Zoltan, Liana Lew, Noemi Garcia. Brazil, which manufactured sixty-seven percent of its computers locally in 1982.
And Chile. Salvador Allende. Fernando Flores who wrote to Stafford Beer. The Cybersyn project — "a sort of socialist Internet, decades ahead of its time," according to The Guardian. The futuristic operations room, destroyed by the September 11, 1973 coup.
America forged artificial intelligence. It also forged forgetting.
The Digital Palaver Tree: How Africa Invented Financial Inclusion and Algorithmic Bias
In every African village, there is a tree beneath which people gather to talk, listen, and decide together. The palaver tree. A patient democracy where decisions are binding only when all parties agree. No majority vote crushing the minority. An inclusive consensus.
Ubuntu: "I am because we are." This philosophy guided Nelson Mandela and Desmond Tutu. And it contains, without knowing it, the principles of distributed systems and digital consensus protocols.
On March 6, 2007, a Kenyan company launched M-Pesa — "M" for mobile, "Pesa" for money in Swahili. Sending and receiving money with a simple mobile phone. In 2006, less than nineteen percent of Kenyans had access to a bank account. M-Pesa brought this figure to eighty percent. Before the West invented Apple Pay, Africa was already paying by mobile.
Then came Ushahidi — "testimony" in Swahili. During the 2007 electoral violence, four technologists created a citizen mapping platform in three days. One hundred thousand deployments in one hundred sixty countries since.
You will discover Rose Dieng-Kuntz, the first African woman admitted to Polytechnique, a pioneer of the semantic web. Timnit Gebru, who revealed that facial recognition systems erred up to thirty-five percent of the time for dark-skinned women — versus less than one percent for white men. Mark Shuttleworth, who named Ubuntu Linux after the philosophy that had inspired him.
Africa leaped across the technological abyss. It invented mobile financial inclusion before the rest of the world. And it posed the first questions about artificial intelligence biases.
The palaver tree has become digital.
The Information Age: From the Ashes of the World War to Leaps Across the Abyss
In February 1948, in a Sydney laboratory, an engineer named Trevor Pearcey wrote a sentence that still resonates: "It is not inconceivable that an automatic encyclopaedic service operated through the existing telephone system will one day exist."
The Internet. Predicted from Australia. Forty years before the World Wide Web.
This period — from 1945 to 2010 — is when dreams became machines. Shannon's circuits took form in microprocessors. Turing's universal machine became the personal computer. Boolean logic became the Internet. And the Dartmouth dream — simulating human intelligence — passed through summers of euphoria and winters of disillusionment before being reborn, transformed.
But this era was also one of leaps across the abyss.
Africa, disconnected from the global telephone network, jumped directly to mobile payments. M-Pesa preceded Apple Pay. India, having missed the hardware turn, leaped toward software services — the Y2K bug became its launchpad. Taiwan invented the "pure-play foundry" and became the world's silicon shield. Israel, a six-year-old nation surrounded by enemies, built one of the world's first computers and became the "Startup Nation."
You will traverse six continents. Sixty-five years of history. From the secret laboratories of Bletchley Park to the Tel Aviv apartments where ICQ was born. From the ruins of World War II to the servers of Google Maps, born in Sydney.
Everywhere, the same question: who inherits the digital revolution — and who is excluded from it?
The AI winter is over. The summer that opens will be the longest in history. But to understand where we are going, we must first understand where we came from.
Welcome to A Brief History of AI, season 5.
What the Age of Revolutions Bequeathed to Us: Conclusion and Opening Toward the Information Age
From the African aquifers to the burned codices of Yucatan. From Ramanujan's notebooks to the secret laboratories of Bletchley Park. From Aboriginal stars to the dried springs of Baghdad. Six continents. One hundred and fifty-six years. What does this crossing teach us?
Four threads run through this period. Epistemicide as policy: everywhere, knowledge was destroyed to justify domination. The exile of geniuses: Ramanujan, Al-Sabbah, Rutherford — all had to leave their homelands to flourish. The invisibilization of contributors: Nakashima, Seki, the women of Bletchley Park — erased because they did not fit the expected image. Parallel discoveries: the same truths emerge in places that know nothing of each other.
This period bequeathed us binary, logic, the universal machine — and their blind spots. From Leibniz to Turing, the path is direct. But other paths could have been taken.
The artificial intelligence we build today bears the imprint of this double history. It speaks the languages that were written, not those that were sung. Its corpora contain Cook's journals, not Tupaia's navigation songs.
The inferno is extinguished. The ashes are still warm. What we build on these ashes depends on us.
The next period — the information age — will inherit these silences. It will also inherit the possibility of repairing them.
The journey continues.
Forgotten Stars: How Oceania Developed Humanity's First Astronomy — and Was Erased
There is an emu that crosses the southern sky. You cannot see it by looking at the bright stars, but by observing the darkness between them.
The Aboriginal peoples of Australia had developed what researchers call humanity's first astronomy. Sixty-five thousand years of observing the sky. Constellations in the dark spaces between stars. The Gawarrgay — the great emu — predicts the breeding seasons of the earthly bird.
Polynesian navigators memorized two hundred and twenty stars to cross the Pacific without instruments. Their body-counting systems, their kinship mathematics represented algorithms before the word existed.
Then came colonization. The legal fiction of terra nullius denied sixty-five thousand years of human presence. Between 1788 and 1900, the Aboriginal population collapsed by ninety percent. The Stolen Generations — children torn from their families between 1910 and 1970 — interrupted knowledge transfer.
On the same soil, Ernest Rutherford was born in New Zealand, discovered the atomic nucleus, and received the Nobel Prize. Alexander Aitken, a New Zealand calculating prodigy, could multiply thirteen-digit numbers in his head.
Two traditions on the same territory. And no bridge between them. Rutherford was knighted with a coat of arms bearing a Maori warrior — an aesthetic symbol, not an epistemic source.
Oceania reminds us that coexistence is not dialogue, that stars can be extinguished in a single generation.
Dried Springs: How the Middle East Bequeathed the Words and Lost the Institutions
Every time a computer executes an operation, it performs an algorithm. The word comes from al-Khwarizmi — a ninth-century Persian mathematician. "Algebra" comes from al-Jabr. "Arabic numerals" still carry the memory of a transmission.
Words survive. Institutions die.
The House of Wisdom in Baghdad was destroyed in 1258. But in the nineteenth century, the Nahda — the Arab Renaissance — tried to make the springs flow again. Rifa'a al-Tahtawi translated two thousand European works into Arabic. Muhammad Abduh reformed al-Azhar. The Bulaq Press disseminated scientific knowledge.
Then colonialism, the Sykes-Picot agreement, and the fragmentation of the Arab world interrupted the momentum.
Hassan Kamel Al-Sabbah was born in Lebanon in 1895. A genius of electrical engineering, he filed more than seventy patents — for General Electric, in the United States, where he had to emigrate. He designed solar turbines, photoelectric cells, and power transmission systems. He died at thirty-nine in a car accident. In Lebanon, a statue was erected. The patents remained American.
In Egypt, Muhammad Ali had built engineering and medical schools, sent students to Europe. The country had the world's fifth-largest cotton industry. Then debt, the Suez Canal, and British occupation ended the modernizing momentum.
The Middle East gave the world the fundamental concepts of calculation. And was prevented from continuing what it had begun.
The words remain. The springs await their chance to flow again.
The Forge and the Inferno: How Europe Invented Artificial Intelligence on the Ashes of the Libraries It Burned
A forge is not only a place of creation. It is also a place of fire.
In 1679, Leibniz conceived the binary system. In 1854, Boole formalized the algebra of logic. In 1843, Ada Lovelace wrote the first computer program for a machine that did not exist. In 1936, Turing invented the universal machine. In 1944, Tommy Flowers completed Colossus — the first electronic computer. Europe forged all the conceptual tools of artificial intelligence.
But the inferno accompanied the forge. The women of Bletchley Park made up seventy-five percent of the staff. Joan Clarke worked alongside Turing on decrypting Enigma. Mavis Batey cracked the Abwehr code at nineteen. Their names were erased for decades.
In Berlin, Konrad Zuse built the Z3 — the world's first programmable computer — alone, in 1941. The Nazi regime was not interested. A bombing raid destroyed it. When history was written, Zuse was barely mentioned.
Colossus preceded ENIAC by two years. But the Colossus machines were destroyed after the war, their plans burned. Tommy Flowers received orders to erase everything. The history of computing ignored this first for thirty years.
Refugees fleeing Nazism — Einstein, Fermi, Gödel — enriched America with what Europe was losing. Elsewhere, European colonialism destroyed the knowledge systems it did not recognize.
Europe forged the tools of AI. It also forged them on the ashes of the libraries it burned.




