Brownstone Journal

177 Episodes
By Roger Bate at Brownstone dot org.
Introduction
Global public health has long been animated by moral purpose and collective ambition. When nations join under the banner of "health for all," it reflects both humanitarian conviction and political calculation. Yet, the architecture of global health governance often produces outcomes that diverge from its lofty ideals. The World Health Organization (WHO), its treaties, and its many partnerships embody both the promise and the peril of global cooperation: institutions that begin as vehicles for public good can evolve into complex bureaucracies driven by competing incentives.
A useful way to understand this paradox is through the old "Bootleggers and Baptists" framework - coined to explain how moral crusaders ("Baptists") and opportunists ("Bootleggers") find common cause in supporting regulation.
In global health, this coalition reappears in modern form: moral entrepreneurs who campaign for universal virtue and institutional purity, joined by actors who benefit materially or reputationally from the resulting rules. But there is a third, often overlooked participant - the bureaucrat. Bureaucrats, whether within WHO secretariats or international treaty bodies, become the custodians of regulation and its moral aura. Over time, their incentives can subtly shift from serving the public interest to preserving and enlarging their institutional mandate.
This essay explores how these three forces - the Baptists, the Bootleggers, and the Bureaucrats - interact within global health governance. It looks at the WHO's Framework Convention on Tobacco Control (FCTC) as a revealing case, and then considers how similar patterns are emerging in the proposed Pandemic Treaty. The analysis argues that moral certainty, donor dependency, and bureaucratic self-preservation often combine to produce rigid, exclusionary, and sometimes counterproductive global health regimes. The challenge is not to reject global cooperation, but to design it in ways that resist these incentives and remain responsive to evidence and accountability.
Bootleggers and Baptists in Global Health
The "Bootleggers and Baptists" dynamic was first described in the context of US alcohol prohibition: moral reformers (Baptists) called for bans on Sunday liquor sales to protect public virtue, while illegal distillers (Bootleggers) quietly supported the same restrictions because they reduced competition. Together, they sustained a regulation that each group wanted for different reasons.
In global health, the same coalition appears frequently. The "Baptists" are the moral crusaders - public health activists, foundations, and advocacy NGOs that promote regulations framed in universal ethical language: eliminating tobacco, ending obesity, halting pandemics. Their arguments often appeal to collective responsibility and moral urgency. They mobilize attention, generate legitimacy, and supply the moral energy that international institutions depend upon.
The "Bootleggers" are the economic and bureaucratic actors who benefit materially or strategically from these same campaigns. They include pharmaceutical firms that profit from mandated interventions, governments that gain moral prestige through leadership in treaty negotiations, and donor organizations that extend their influence through targeted funding. The alignment between moral appeal and material interest gives regulatory projects their durability - and their opacity.
Unlike national policy debates, global health regulation takes place far from direct democratic oversight. It is negotiated by diplomats and sustained by international bureaucracies that answer only indirectly to voters. This distance allows the Bootlegger-Baptist coalition to operate with less friction. The Baptists supply moral legitimacy; the Bootleggers provide resources and political cover. The resulting regulations are difficult to challenge, even when evidence shifts or unintended consequences emerge.
Bureaucrats and...
By Clayton J. Baker, MD at Brownstone dot org.
Way back in the B.C. era (Before Covid), I taught Medical Humanities and Bioethics at an American medical school. One of my older colleagues - I'll call him Dr. Quinlan - was a prominent member of the faculty and a nationally recognized proponent of physician-assisted suicide.
Dr. Quinlan was a very nice man. He was soft-spoken, friendly, and intelligent. He had originally become involved in the subject of physician-assisted suicide by accident, while trying to help a patient near the end of her life who was suffering terribly.
That particular clinical case, which Dr. Quinlan wrote up and published in a major medical journal, launched a second career of sorts for him, as he became a leading figure in the physician-assisted suicide movement. In fact, he was lead plaintiff in a challenge of New York's then-prohibition against physician-assisted suicide.
The case eventually went all the way to the US Supreme Court, which added to his fame. As it happened, SCOTUS ruled 9-0 against him, definitively establishing that there is no "right to die" enshrined in the Constitution, and affirming that the state has a compelling interest to protect the vulnerable.
SCOTUS's unanimous decision against Dr. Quinlan meant that his side had somehow pulled off the impressive feat of uniting Antonin Scalia, Ruth Bader Ginsburg, and all points in between against their cause. (I never quite saw how that added to his luster, but such is the Academy.)
At any rate, I once had a conversation with Dr. Quinlan about physician-assisted suicide. I told him that I opposed it ever becoming legal. I recall he calmly, pleasantly asked me why I felt that way.
First, I acknowledged that his formative case must have been very tough, and allowed that maybe, just maybe, he had done right in that exceptionally difficult situation. But as the legal saying goes, hard cases make bad law.
Second, as a clinical physician, I felt strongly that no patient should ever see their doctor and have to wonder if he was coming to help keep them alive or to kill them.
Finally, perhaps most importantly, there's this thing called the slippery slope.
As I recall, he replied that he couldn't imagine the slippery slope becoming a problem in a matter so profound as causing a patient's death.
Well, maybe not with you personally, Dr. Quinlan, I thought. I said no more.
But having done my residency at a major liver transplant center in Boston, I had had more than enough experience with the rather slapdash ethics of the organ transplantation world. The opaque shuffling of patients up and down the transplant list, the endless and rather macabre scrounging for donors, and the nebulous, vaguely sinister concept of brain death had all unsettled me.
Prior to residency, I had attended medical school in Canada. In those days, the McGill University Faculty of Medicine was still almost Victorian in its ways: an old-school, stiff-upper-lip, Workaholics-Anonymous-chapter-house sort of place. The ethic was hard work, personal accountability for mistakes, and above all primum non nocere - first, do no harm.
Fast forward to today's soft-core totalitarian state of Canada, the land of debanking and convicting peaceful protesters, persecuting honest physicians for speaking obvious truth, fining people $25,000 for hiking on their own property, and spitefully seeking to slaughter harmless animals precisely because they may hold unique medical and scientific value.
To all those offenses against liberty, morality, and basic decency, we must add Canada's aggressive policy of legalizing, and, in fact, encouraging industrial-scale physician-assisted suicide. Under Canada's Medical Assistance In Dying (MAiD) program, which has been in place only since 2016, physician-assisted suicide now accounts for a terrifying 4.7 percent of all deaths in Canada.
MAiD will be permitted for patients suffering from mental illness in Canada in 2027, putting it on par with the Netherland...
By Eyal Shahar at Brownstone dot org.
During the years I served as an associate editor of The American Journal of Epidemiology, I saw the entire spectrum of "peer reviews" - from meticulous, thoughtful critiques whose authors evidently invested several hours in the task to sketchy reviews that reflected carelessness and incompetence. I read friendly reviews by admirers of the authors and hostile reviews by their enemies. (It is not difficult to tell from the tone.) In the practice of science, human beings still behave like human beings.
Matters got worse during the pandemic. Studies that praised the Covid vaccines were quickly certified "peer-reviewed," whereas critical, post-publication peer review was suppressed. As a result, we now have a historical collection of published poor science. It cannot be erased, but it is time to start correcting the record.
Biomedical journals are not the platform. First, there is no formal section for open peer reviews of articles that were published long ago. Second, editors have no interest in exposing falsehoods that were published in their journals. Third, the censorship machine is still in place. So far, I have been able to break it only once, and it wasn't easy.
So, how can we try to correct the record, and where?
Let me make a suggestion to my colleagues in epidemiology, biostatistics, and related methodological fields who preserved their critical thinking during the pandemic. Choose one article or more about the Covid vaccines and submit your peer review to Brownstone Journal. If it is interesting and well-written, there is a good chance it will be posted. I advise cherry-picking: find those peer-reviewed articles that irritated you most, either because they were pure nonsense or because the correct inference was strikingly different. And if you posted short critiques on Twitter (now X) or thorough reviews on other platforms, expand, revise, and submit them to Brownstone. Perhaps we can slowly create an inventory of critical reviews, restoring some trust in the scientific method and in biomedical science.
Here is an example.
A Review and a Re-Analysis of a Study in Ontario, Canada
Published in the British Medical Journal in August 2021, the paper reported the effectiveness of the mRNA vaccines in early 2021, shortly after their authorization.
This research was typical of vaccine studies from that time. Effectiveness was estimated in a "real-world" setting; namely, an observational study during a vaccination campaign. The study period (mid-December 2020 through mid-April 2021) included the peak of a Covid winter wave in early January. We'll discuss later a strong bias called confounding by background infection risk.
The design was a variation of the case-control study, the test-negative design. Eligible subjects underwent a PCR test because of Covid-like symptoms. Cases tested positive; controls tested negative. As usual, odds ratios were computed, and effectiveness was computed as 1 minus the odds ratio (expressed in percent). The sample size was large: 53,270 cases and 270,763 controls.
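As a numerical sketch of that computation - with hypothetical counts chosen for illustration, not the study's actual data - the test-negative arithmetic looks like this:

```python
# Test-negative design arithmetic: the odds ratio compares the odds of
# vaccination among test-positive cases with the odds among
# test-negative controls; effectiveness is 1 minus that ratio.
def odds_ratio(cases_vax, cases_unvax, controls_vax, controls_unvax):
    return (cases_vax / cases_unvax) / (controls_vax / controls_unvax)

def effectiveness_pct(or_value):
    """Vaccine effectiveness, expressed in percent, as 1 - OR."""
    return (1 - or_value) * 100

# Hypothetical counts: 100 of 1,000 cases vaccinated;
# 500 of 2,000 controls vaccinated.
or_value = odds_ratio(100, 900, 500, 1500)  # (1/9) / (1/3) = 1/3
ve = effectiveness_pct(or_value)            # about 66.7%
```

The intuition: if the vaccine works, vaccinated people should be under-represented among those testing positive relative to those testing negative, pushing the odds ratio below 1 and the effectiveness above zero.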
The authors reported the following key results (my italics):
"Vaccine effectiveness against symptomatic infection observed ≥14 days after one dose was 60% (95% confidence interval 57% to 64%), increasing from 48% (41% to 54%) at 14-20 days after one dose to 71% (63% to 78%) at 35-41 days. Vaccine effectiveness observed ≥7 days after two doses was 91% (89% to 93%)."
Like almost every study of effectiveness, the authors discarded early events. As explained elsewhere, this practice introduces a bias called immortal time, or case-counting window bias. Not only does it obscure possible early harmful effects, but it also effectively leads to overestimation of effectiveness. RFK, Jr. alluded to this bias in non-technical terms (see video clip).
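A toy simulation (my own sketch, not taken from the paper) shows how a case-counting window can manufacture apparent effectiveness even for a vaccine with no effect at all:

```python
import random

# Assumptions (mine, for illustration): a completely ineffective
# vaccine and a constant 1% daily infection risk in both groups over
# 60 days of follow-up. Infections in the vaccinated group during the
# first 14 days are discarded, mimicking a ">=14 days after dose 1"
# case-counting window.
random.seed(0)
N, DAYS, DAILY_RISK, WINDOW = 100_000, 60, 0.01, 14

def first_infections(n, counted_from_day):
    """Simulate n people; count first infections on/after the cutoff."""
    count = 0
    for _ in range(n):
        for day in range(DAYS):
            if random.random() < DAILY_RISK:
                if day >= counted_from_day:
                    count += 1
                break  # only the first infection matters
    return count

unvaccinated_cases = first_infections(N, 0)      # all days counted
vaccinated_cases = first_infections(N, WINDOW)   # first 14 days dropped
apparent_ve = (1 - vaccinated_cases / unvaccinated_cases) * 100
# apparent_ve comes out well above zero despite a true effect of 0%.
```

With these parameters the discarded early person-time alone produces an apparent effectiveness of roughly 29 percent for a vaccine that, by construction, does nothing.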
The correct approach is simple. We should estimate effectiveness from the administration of the first dose to later timepoints (built-up immunity). My tab...
By Paula Jardine at Brownstone dot org.
WE SHUT our schools, we emptied our streets, we shuttered our shops not because reason demanded it, but because fear commanded it. We followed the models, not the evidence. We fought a virus by locking down the healthy. We sacrificed livelihoods on the altar of safety. And we were told by Prime Minister Johnson: There Is No Alternative! That was the lockdown myth - a lie wrapped in panic, cloaked in science. And we complied in our homes, in our hospitals, in silence. But history will ask: was it necessary? Or was it, in the end, a fight against reason itself?
We don't need to wait for history to judge, because a central part of the pernicious lockdown myth - the claim, delivered with all the sincerity that biodefense experts such as Sir Jeremy Farrar and Dr Richard Hatchett could muster, that an unprecedented response was justified by the unprecedented uncertainty of an invisible threat that was certainly deadly - can easily be busted.
The myth that lockdowns were unprecedented before 2020 and that the practice began when Wuhan was imprisoned on the eve of CEPI's announcement at Davos 2020 that Moderna had a newfangled vaccine ready to go into Phase 1 trials is just that: a myth. There is a precedent, and a highly instructive one at that.
The first lockdown occurred in April and May of 2009 in Mexico. That it took place little more than a decade before the Covid lockdowns makes it all the more remarkable that it has been erased from collective memory. Like the lockdowns of 2020, it was a shakedown and the fingerprints of one person, the aforementioned Dr Hatchett, are all over it. He was not a public figure in 2009 but he was a key figure advising the White House directly from his pulpit as the Director for Medical Preparedness Policy on the US National Security Council (NSC).
On April 17, 2009, the final day of President Obama's visit to Mexico, the US Centers for Disease Control and Prevention (CDC) issued an advisory after swine flu (H1N1) was detected in two Mexican-American children in California. After showing flu-like symptoms, they'd had nasal-pharyngeal swabs taken as part of a surveillance study. Neither had severe illness and both recovered but the CDC sounded the alarm anyway, saying the children had no known direct contact with pigs.
The detected viruses, said the CDC, showed resistance to existing antivirals, so they were testing two new ones - GSK's Relenza (zanamivir) and Tamiflu (oseltamivir), developed by Gilead Sciences, a company linked to President G W Bush's former Defense Secretary Donald Rumsfeld, and licensed to the Swiss pharmaceutical company Roche - to see if either might work should the H1N1 virus start spreading in humans.
As it was required to under the amended International Health Regulations 2005, the Mexican government reported cases of patients with severe respiratory disease to the WHO in early April and dutifully sent patient samples from these purportedly 'unusual pneumonia cases' to Canada's National Microbiology Laboratory (NML) in Winnipeg, one of the WHO Collaborating Centres for Influenza. On April 23, the NML reported detecting H1N1 swine flu. The Mexican government, which was now reporting 16 deaths from swine flu, swiftly ordered schools and businesses in Mexico City, the sprawling and densely populated capital, to close on April 24 due to the public health emergency.
As Leslie Bassett, the deputy chief of mission at the US Embassy in Mexico City in 2009, said: 'Waking up to a pandemic is like walking toward a beautiful garden and hitting a plate glass door. Without any warning, the expectations you never questioned are violently disrupted. Your brain reels, unable to process the brutal warping of reality. The British call this "gobsmacked." A health crisis professional might describe this as the prelude to a pandemic response.'
A day later, the New York Times reported that the World Health Organization (WHO) was considering raisin...
By Jeffrey A. Tucker at Brownstone dot org.
[The following is an excerpt from Jeffrey Tucker's book, Spirits of America: On the Semiquincentennial.]
It's impossible to speak of American history without reference to the life of the farmer and the land. The experience shaped many generations. It formed the basis for the belief in freedom itself, the conviction that a family can provide for itself through hard work and defend its rights based on the little slice of physical land that the family controlled.
Read any of the writings of the Founding Fathers, and you find an unrelenting romanticization of life on the land. "When I first entered on the stage of public life," wrote Thomas Jefferson, "I came to a resolution never…to wear any other character than that of a farmer."
The idea rattles us a bit. We don't really have an agrarian life anymore. We live in cities, type on laptops, play with digits, farm information, and our only connection with food is the grocery store and restaurant.
Reading Jefferson, then, makes one think: we don't live on farms anymore, so all must be lost. That, of course, is untrue. His point is simply that the agrarian life provides a bulwark, not that you cannot have freedom if it gives way to other modes of living.
And the agrarian life did give way, partly through organic evolution but also through force, which is deeply regrettable. As the Industrial Revolution advanced, fewer and fewer people lived on farms. We moved to the cities. By 1920, it was pretty well done: industry beat agriculture in its overall contribution to American productivity.
For most of my adult life, I made fun of people who had regrets about this. What's wrong with corporate farming? It's feeding the world and we would starve otherwise. We need big companies, huge machinery, oceans of pesticide and fertilizer, and consolidated supply chains. We simply cannot and should not go back.
I've come to change my mind, however, now that I've been so heavily exposed to a critique of industrial food and Big Agriculture. I see now that it is not entirely natural and normal that they would have replaced small farms.
Last year, I drove to the countryside, pulled over at a farmers' market, and had a long conversation with the husband and wife who ran the farm and the meat and vegetable stand. They talked of their struggles with the weather, of course, and dealing with the exigencies of nature.
Mostly, they spoke of the artificial struggles they face. They are hit relentlessly with tax on land, taxes on production, taxes on profits, taxes on everything. There are regulations too. They are prevented from selling directly to stores. They face grueling restrictions on meat processing. The health inspectors drive them nuts. They face constraints on wages, hourly restrictions on labor, and wrangle with bureaucrats constantly.
Without all of this, they are certain that they could make a better go of it. They could compete with the big guys. After all, their products are healthier, more delicious, and just overall better. No question, they said, that they could compete and win on a fair playing field. As it stands, they barely survive.
I've come to appreciate that point of view. Imagine if we suddenly did have a free market in agriculture. No taxes, no regulations, no mandates, no restrictions. Anyone can raise food, process it, and sell it to whomever under any conditions. In other words, what if today we had the same system we had in the time of Jefferson and Washington?
We would see an absolute explosion in small farms. Everyone would be selling eggs. Produce would be everywhere and so would meat. We would learn not to depend on grocery stores and supercenters but on our friends and neighbors. The idea of eating locally would not have to be preached by anyone; it would just become our daily routine again.
This is because everyone prefers local produce over industrially shipped and packaged corporate food. We only have the ubiquity of the latter due t...
By Richard Kelly at Brownstone dot org.
This time 4 years ago, I learned to cut my own hair, with predictable results. This time 4 years ago I was prohibited from going to a barber. Yes, the two statements are connected. I could have just let my hair grow, but it would have annoyed me. As it is, despite the improvement (I think) in my skilfulness, now my haircuts annoy others. Well-meaning comments are graciously accepted, and my usual reply is, "Thanks, I did my best."
I've made only one exception to the self-haircut - on the happy occasion where I was the Father of the Bride. But apart from that, every haircut in the last 4 years has been all my own work.
It's become a ritual, if not quite a sacrament. The result is an 'outward visible sign of an inward bloody-minded determination', and the process is a contemplative homage to the lives and livelihoods, conventions and core values that were utterly destroyed during 'the troubles'.
The ritual takes place in the small garden shed I use as a workshop. Surrounded by large power tools and small hand tools, shirtless, staring into a mirror and protected by a locked door, the hair comes off and drifts to the workbench and the floor. Various other grooming niceties take place before I emerge, with stocks of defiance replenished in equal measure with the sadness remembered.
I don't tend to bring the fight to others, except in the form of wonky haircuts. The fight for accountability, the fight for apology, the fight for truth. But when the fight comes to me, I tend to push back.
I pushed back when a grumpy admissions nurse scolded me for not wearing a mask, and got a reply from the hospital two weeks later confessing all the mask requirements had now been abandoned; I pushed back when I ridiculed the communion wine being presented in an eye-dropper, and in short order we went back to a common cup. I mostly push back when something in the news gets up my nose, like a police chief commissioner complaining that he felt 'bruised' having to implement the ridiculous health orders, like filling skate parks with sand and checking inside people's coffee cups to see if there was any coffee left that justified not wearing a mask.
When the antagonist is not a family member, or a friend, or an acquaintance, pushing back is less risky than when they are. Pushing back against those closer to us is much harder, requiring more skill, thoughtfulness, and, frankly, courage. Likewise, the more subtle the nature of the affront, the more 'nuanced' it is, the harder it is to stand firm and not destroy relationships.
In front of me is a proposal to use our church as a 'pop-up vaccination site' for flu vaccines. Some see it as a great 'missional opportunity'. Presumably, the logic goes, 'Flu vaccines are safe and effective, we will save lives by loaning our meeting room, and vaccine recipients will recognise that we did them a favour by loaning our meeting room, and then they will make the leap and come to faith, somehow, in a sliding-doors moment that would never have happened without our meeting room.'
I'm not convinced. None of the clauses in the logic holds water on its own, let alone in sequence. The flu vaccine doesn't work; the life-saving claim is only supported by conjecture and modelling. There's no guarantee anyone will even give a passing thought to the generosity of loaning our meeting room, and while I won't second guess mystery, I remain sceptical of the likelihood of a 'road to pop-up' conversion.
I won't be within a bull's roar of the pop-up vax clinic, should it go ahead. In that sense, I have no bone to pick with those who might attend it. They can knock themselves out. And I don't worry that some might not come to faith as a result of attending. That's above my pay grade. What troubles me is the outward visible sign of the monstrous social disgrace that was inflicted on us all, and some of us more than others, in the recent past. To have a vax clinic inside the very meeting room from which unvaccinated parishioners were exclu...
By Peter C. Gøtzsche at Brownstone dot org.
It has been surprisingly difficult to get an answer to a simple and highly relevant question: Is aluminium in vaccines harmful? After having studied the best evidence we have, the randomised trials, in great detail, I conclude that the answer is yes.
Like lead, aluminium is a highly neurotoxic metal. We will therefore expect vaccines containing aluminium adjuvants to cause neurological harms if the aluminium enters the nervous system in neurotoxic amounts.
The aluminium in the adjuvant is important for eliciting a strong immune response in non-live vaccines, and the adjuvants' efficacy is related to their toxicity at the injection site. Immune-reactive cells engulf particles of aluminium adjuvant and distribute their load throughout the body, including to the brain, where the cells are killed, releasing their contents into the surrounding brain tissue, where they can produce an inflammatory response.
The precise mechanism of action is not so important, but the data we have on the harms are, and they have been systematically distorted.
False Information from the European Medicines Agency (EMA)
In October 2016, my research group complained to the European Ombudsman about the EMA's mishandling of their investigation into the suspected serious neurological harms of the HPV vaccines. In his reply to the Ombudsman, EMA's Executive Director Guido Rasi stated that the aluminium adjuvants are safe; that their use has been established for several decades; and that the substances are defined in the European Pharmacopoeia.
Rasi gave the impression that the aluminium adjuvants in the HPV vaccines are similar to those used since 1926. However, the adjuvant in Gardasil, Merck's vaccine, is amorphous aluminium hydroxyphosphate sulfate, AlHO9PS (AAHS), which has other properties than aluminium hydroxide, the substance Rasi mentioned. Moreover, its properties are not defined in the pharmacopoeia. AAHS has a confidential formula; its properties are variable from batch to batch and even within batches. The harms caused by the adjuvant are therefore likely to vary. When we investigated whether the safety of AAHS has ever been tested in comparison with an inert substance in humans, we were unable to find any evidence of this.
Rasi mentioned that the assessment of the evidence for the safety of the adjuvants had been performed over many years by the EMA and other health authorities, such as the European Food Safety Authority, the FDA, and the WHO.
However, none of his five references supported his claim about safety. Three links, to EMA, FDA, and WHO, were all dead. One worked two years later but contained nothing of relevance. A link to the European Food Safety Authority was about the safety of aluminium from dietary intake, which has nothing to do with aluminium adjuvants in vaccines. Very little oral aluminium is absorbed from the gut, and much of what is absorbed is eliminated by the kidneys. The last link was to a WHO report that was also unhelpful. It mentioned that the FDA had noted that the body burden of aluminium following injections of aluminium-containing vaccines never exceeds US regulatory safety thresholds based on orally ingested aluminium, which is irrelevant information.
The Randomised Trials Document the Toxicity of Aluminium Adjuvants
As an expert witness for the Los Angeles law firm Wisner Baum, I have read 112,000 pages of confidential Merck study reports. If Merck's aluminium adjuvant causes serious neurological harms, one would expect to see more harm with Gardasil 9 than with quadrivalent Gardasil because it contains five more HPV antigens and more than twice as much adjuvant, corresponding to 500 µg vs 225 µg aluminium.
And this is what we see. Three trials have compared Gardasil 9 with Gardasil, but two of them were so small, only 1,095 patients in total, with only 3 serious adverse events, that they cannot shed any light on this issue. The third trial, however, was large, with a total of 14...
By Tom Markson at Brownstone dot org.
Last week, President Trump set in motion a transformation that will fundamentally redefine American medicine for a generation. In a single news conference, he delivered a direct and sweeping challenge to the entire structure of public health authority.
As a result, the credibility of national health agencies, leading professional associations, and the corporate medical establishment has been shaken at its very core. The talking heads - from cable news "experts," to hospital CEOs, the Kaiser Foundation, AMA, AHA, APHA, ACP, AARP - all of the alphabet soup - will never be looked at the same.
At the President's recent autism briefing, he indicated that public health officials weren't "really letting the public know what they knew" about the strong link between Tylenol and autism. In fact, this information was known for nearly a decade but never disclosed to expectant mothers. For a profession built on transparency and informed consent, this represents a systemic and deeply consequential breach of trust.
The spell of "settled science" has been decisively broken. Because Tylenol is so widely used and present in nearly every home, this debate cannot be dismissed as a fringe dispute over vaccines. The accusation reaches into every household medicine cabinet and challenges the universal assumptions that have long guided routine medical advice.
It's clear that the President views this as a grave ethical violation and an abandonment of the duty to protect the most vulnerable. For his administration, this is not a narrow policy debate. It is a fundamental confrontation with the very foundations of medical authority and the ethical standards that have guided public health for more than a century.
Public confidence was already eroding. During and after the Covid period, childhood booster rates dropped sharply, even before the government adjusted the vaccine schedule. Many Americans sensed that official experts were not telling the full story, as the same agencies dutifully calculated "acceptable losses" from among those taking the Covid shot. Acceptable to whom?
The revelation of evidence linking acetaminophen to autism has ruptured what remained of expert invincibility, particularly as healthcare officials knew of the risk yet continued to recommend the drug while manufacturers protected a market worth billions.
Now, every standard recommendation from the entrenched medical establishment will face relentless scrutiny - as they should. Consider that Americans are the most medicated population on earth with more than sixty percent taking at least one prescription drug, yet chronic illness continues to rise. That contradiction will drive an expansive and sustained reexamination of every official guideline.
Competing experts are already clashing in real time. Traditional gatekeepers are using social media to dismiss the President's announcement while pregnant anti-Trump moms post videos of themselves gobbling up bottles of Tylenol.
But the very existence of this open conflict signals a structural shift. The doctor-patient relationship is poised to reclaim central authority. Informed consent will once again dominate examination rooms and pharmacy counters, and corporate physicians will lose their long-standing dominance.
The Hippocratic Oath embodies a simple mandate: first do no harm. For too long, public health leaders have inverted that principle, promoting chemicals and treatments until harm became undeniable and then relying on public relations to extend profits. The Trump administration has moved caution back to the forefront and, in doing so, has exposed the motivations of those who continue to deny these risks.
This moment is devastating for the traditional medical establishment and for the pharmaceutical industry. It validates the growing suspicion that profit often outweighs public good and raises a larger question: what other risks remain hidden behind institutional walls?
The war of experts i...
By Joseph Varon at Brownstone dot org.
When I was a young medical student, I believed with all my heart that medicine was the highest calling a human being could answer. We were not just training to earn a degree or secure a position. We were stepping into a lineage, inheriting a tradition that stretched back to Hippocrates, Galen, Vesalius, Osler, and countless others who saw the care of the sick as a sacred covenant.
Every time I walked into a ward, I felt both nervous and exhilarated, as if I were entering a cathedral where the human body and spirit were laid bare.
A patient's trust was not a transaction - it was a gift, a profound act of vulnerability. To be allowed into that sacred space was to be given a responsibility greater than anything I had known. We did not speak in the language of "compliance metrics" or "quality indicators." We spoke of healing, of service, of devotion. Medicine was not a career. It was a vocation, a purpose, a life anchored in something deeper than self.
Over the years, however, something shifted. What was once a vocation has been stripped of its soul. It has been rebranded, reframed, and reduced until it barely resembles the profession I entered with such hope. Medicine today is a business enterprise. Patients are consumers, doctors are "providers," and healing has been crowded out by billing codes, liability fears, and the suffocating weight of bureaucracy. The vocation has been replaced by a job, and a job can always be abandoned.
That is what haunts me most.
The decline of vocation did not happen overnight. It was gradual, almost imperceptible at first, like a slow leak in the hull of a ship. Administrators multiplied until they outnumbered physicians. Insurance companies dictated what treatments were permissible, not based on medical judgment but on actuarial tables. Pharmaceutical firms turned research into marketing, blurring the line between scientific discovery and sales strategy.
Hospitals transformed into corporations with CEOs, branding departments, and profit margins to defend. The physician's desk became a computer terminal, and the patient was no longer a soul in need of healing but a data point to be coded and billed. Even the language betrayed the transformation: patients became "units of care," outcomes became "deliverables," and clinical judgment was rebranded as "adherence to protocol."
This hollowing out of medicine's soul reached its most devastating climax during Covid. It was a moment that should have summoned the deepest instincts of our profession. Uncertainty, fear, and suffering filled our hospitals. That is precisely when vocation matters most. The physician is supposed to walk into the fire when others flee. Yet what did we see? Doors closed, clinics shuttered, doctors retreating to their homes, waiting for bureaucrats and government agencies to tell them what to do.
Protocols were enforced even when they harmed. Independent thought was punished. Dissent was silenced. And while patients gasped for air and families begged for help, too many physicians were nowhere to be found.
I remember vividly those early days of the pandemic. There was terror in patients' eyes, but also profound gratitude when they saw a physician willing to step into the room, to touch them, to treat them as human beings rather than contagions. The vocation of medicine means that when everyone else runs out, the doctor runs in. Yet in those months, only a few did. The rest followed orders from afar, citing fear or policy as justification for absence.
Covid revealed what I had long suspected: when medicine is reduced to a job, it can be deserted. But when it is a vocation, it cannot.
This crisis was not an accident. Its roots stretch back decades. The Flexner Report of 1910 reshaped American medicine for better and worse. On one hand, it elevated scientific standards and eliminated substandard schools. On the other hand, it centralized control, tethering medicine more tightly to institutional ...
By Thomas Harrington at Brownstone dot org.
In the Fall semester of 2018, I was given permission to teach at my college's campus in Barcelona, in a program that I had founded nearly two decades before and had visited often in my roles as its academic director and as a frequent leader of its summer programs.
Needless to say, I was excited, as the city and its culture had been a prime focus of my research for several decades. That I would be there at a time when the independence movement was still strong and my book in Catalan on that subject would be released, with all that that would hopefully entail in the way of press interviews and book signings, only added to my sense of anticipation.
But most of all, I looked forward to sharing some of what I had learned about Spain and Catalonia over the years in situ with my students.
At the risk of sounding immodest, I can say that I never had much problem connecting with my students. Of course, I never reached them all. But I almost always managed to get the majority to engage seriously with historical ideas and events and to ponder their possible links to their own lives and cultural circumstances.
That was until that Fall semester of 2018 in Barcelona.
Under pressure from the college to increase Study Abroad enrollments, we had lifted the Spanish-only requirement for the program. While it did increase our numbers, it brought us a very different type of student from those I was used to working with (students courageous enough to attempt serious intellectual work in their second language) - ones much more like the indifferent seat-warmers I had heard my colleagues from larger and less demanding departments serially gripe about back in Hartford.
A week or so into the course, a million-person march for Catalan independence filled the streets of Barcelona (a city with one of the highest population densities in Europe) in a way that was absolutely impossible to ignore.
In the days preceding that September 11 Diada, I had given the students a brief explanation on why this was happening and had encouraged them to go out and observe the always remarkable and highly photogenic mass spectacle.
The following day - in a class centering on the history of Spain and Catalonia - I immediately opened the floor for questions and comments on what they had seen.
No one had anything to say. And no one, and I mean no one, was the least bit curious about what had taken place in the streets of the city the day before in terms of its relationship to politics, history, social esthetics, or anything else. Pure silence and pure indifference.
And things continued in this manner for another several weeks as I presented documents that had, in my classes, long elicited intense curiosity and lively questioning about the social dynamics of identity formation in general, and about the historical particulars of such phenomena within the city of Barcelona and the various "culture nations" (Castile, Catalonia, Galicia, Portugal, and the Basque Country) of the Iberian Peninsula.
Fed up, I finally decided to break the fourth wall; that is, to open up a discussion on the meta-dynamics of the classroom theater in which we were all engaged.
I started things off by saying that it seemed to me that we were playing a game that they had decided beforehand was essentially vacuous and insincere, one in which their role was to politely listen to me and to what they had decided would be my boring and uninspired pro-forma murmurings and, when it came time for papers and exams, to repeat back to me a reasonable summary of my own words in order to get a good grade.
When they got over the initial shock produced by my naming of the game, their tongues suddenly loosened, and one by one they began telling me, each in their own way, that what I had said was more or less spot on.
They then went on to tell me that this was what went on in almost all their classes back on the home campus with what they understood to be the full, if tacit, complicity of their pro...
By James Lyons-Weiler at Brownstone dot org.
On September 30, 2025, the image of Albert Bourla standing in the White House beside President Donald J. Trump stunned large segments of the public. The moment instantly became a lightning rod, drawing condemnation and confusion from those who remembered the unresolved - and in many cases, still unaccounted for - devastations of the Covid-19 response.
My inbox, and those of others who have worked to expose the record, flooded with a single question, usually framed in rage or betrayal: What the F*?
This piece is not an apology, nor an attempt to launder history. We must hold multiple truths at once. What happened in 2020 and 2021 was a global institutional collapse, and many of the facts still buried beneath academic soft-pedaling or regulatory capture are not only real - they are documented.
The 95% efficacy figure behind Pfizer's original mRNA vaccine - marketed with urgency and without full transparency - was the result of a methodological sleight of hand. The trial protocol only counted cases beginning seven days after the second dose. That choice excluded early infections, skewed efficacy upward, and generated headlines that conditioned the world's response. This is not speculation. It is a matter of record in the trial design published in the New England Journal of Medicine.
The case-counting window bias - a form of immortal time bias we've termed the Lyons-Weiler/Fenton effect - was not reported, disclosed, or corrected by regulators, the sponsor, or the academic authors. It was never subjected to adverse scenario modeling. It is still cited.
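To make the mechanics of that counting-window bias concrete, here is a minimal sketch in Python using entirely hypothetical case counts (not the actual trial data). It assumes equal enrollment in both arms, and that infections in the first week after the second dose occur at similar rates in both arms because protection has not yet developed; starting the count at day seven then removes near-equal numbers from both arms and pushes the ratio-based efficacy figure upward.

```python
# Hypothetical illustration of case-counting-window bias (NOT the real trial data).
# With equal enrollment, efficacy reduces to 1 - (vaccine cases / placebo cases).

def vaccine_efficacy(vax_cases: int, placebo_cases: int) -> float:
    """Efficacy as 1 - relative risk, assuming equal person-time in both arms."""
    return 1 - vax_cases / placebo_cases

# Hypothetical counts: early infections (first 7 days after dose 2) occur at
# similar rates in both arms; later infections diverge.
early_vax, late_vax = 8, 8
early_placebo, late_placebo = 8, 162

ve_all_cases = vaccine_efficacy(early_vax + late_vax, early_placebo + late_placebo)
ve_windowed = vaccine_efficacy(late_vax, late_placebo)  # counting starts at day 7

print(f"Efficacy, all cases counted:    {ve_all_cases:.1%}")  # ~90.6%
print(f"Efficacy, 7-day window applied: {ve_windowed:.1%}")   # ~95.1%
```

The point of the sketch is not the particular numbers but the direction of the effect: excluding a window in which both arms accrue cases at similar rates can only raise the reported figure, never lower it.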
Other elements of the Covid-19 response defy benevolent interpretation.
The spike protein sequence appearing in Pfizer's patent holdings prior to public awareness of SARS-CoV-2; the credible findings by Kevin McKernan and others of plasmid DNA contamination in mRNA vaccine vials, including SV40 enhancer elements; the lopsided prioritization of vaccine deployment over therapeutics like ivermectin and fluvoxamine; the use of high-cycle threshold PCRs without Ct reporting, which amplified false positives; and the refusal to distinguish properly between "died with" and "died from" - these are not conspiracy theories.
They are failures. The ethics of care collapsed. The financial incentives prevailed.
The charge that some hospitalists used sedation and early ventilation not out of necessity but to clear beds or reduce infection liability cannot be dismissed out of hand. Ventilator-era mortality was catastrophic. Retrospective chart audits suggest secondary bacterial infections were rampant, often untreated, and that many patients were coded for Covid when sepsis or bacterial pneumonia was the true cause of death. Whether this was manslaughter or negligence is a question for the courts.
But morally, we must say plainly: lives were ended for reasons that had nothing to do with care.
In my view, the evidence of Albert Bourla's opportunistic stock sale - what many rightly call a pump-and-dump - is also real. It is a matter of public record that on November 9, 2020, the same day Pfizer announced top-line results from its pivotal vaccine trial, Bourla sold over 130,000 shares of Pfizer stock under a pre-set Rule 10b5-1 trading plan adopted on August 19. The optics were indefensible.
While technically legal under the rules at the time, the maneuver benefited directly from nonpublic material information about vaccine performance - information withheld from the broader public until after markets opened.
No ethical oversight committee reviewed it. No internal justification was disclosed. Just a silent cash-out on the back of news that moved markets globally. The SEC has since tightened Rule 10b5-1 to avoid these abuses, but Bourla's behavior happened under the old rules, and that should not be forgotten. It remains on the record, and it remains unresolved. I want to remind everyone that all options against Bourla the man, and Pfizer the company, are ...
By Ramesh Thakur at Brownstone dot org.
The decades-old International Health Regulations, as amended last year, came into effect on 19 September. A new Pandemic Agreement, adopted in May, will be opened for signature after a pathogen access and benefit-sharing deal that is expected to be reached next year.
The WHO Pandemic Accords, as the two documents are known, are a good example of the type of global governance initiatives on which there is a consensus among technocratic elites, but against which there is a rising populist revolt. Two other examples that were mentioned by President Donald Trump in his UN address on 23 September are immigration and climate change. The speech was a wide-ranging defence of national sovereignty against globalism.
Flawed Assumptions
Yet, pandemics are rare events that, compared to endemic infectious and chronic diseases, impose a low disease burden. The rationale for the accords rests on the false understanding that the risk of pandemics is rapidly growing, predominantly from increasing zoonotic spillover events in which pathogens move from animals to humans. Well-founded suspicion that Covid arose from gain-of-function research and a lab leak negates the second part of this justification.
The assumption of increasing pandemic risk is also undermined by work from the University of Leeds, which shows that the reports of the WHO, World Bank, and G20 that back the pandemic agenda do not support those agencies' claims. The data show declining mortality and fewer outbreaks in the decade prior to 2020. Much of the recorded 'increase' in episodes reflects improved diagnostic technologies, not more frequent or more serious outbreaks.
Previous major epidemic diseases like yellow fever, influenza, and cholera continue to decline overall. The historical timeline of pandemics shows that improvements in sanitation, hygiene, potable water, antibiotics, and other forms of expanding access to good healthcare have massively reduced the morbidity and mortality of pandemics since the Spanish flu (1918-20) in which fifty million people are believed to have died.
According to Our World in Data, in the 105 years since the Spanish flu, a grand total of 10-14 million people have died in pandemics including Covid-19. To put this in perspective, in 2019 alone, nearly eight million people died from non-Covid infectious diseases. Another 41 million deaths were caused by non-communicable diseases. In the five years 2020-2024 inclusive, 7.1 million Covid-related deaths were recorded.
Projecting the trendlines from 2000-2019 over the five years 2020-2024, we could have expected a total of around 35 million deaths from non-Covid infectious diseases and another 220 million from non-communicable (that is, chronic) diseases.
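A rough back-of-envelope check of this scale comparison can be done with the figures the article itself cites, assuming (as a simplification, since the article projects actual trendlines) that the 2019 death rates simply held flat over 2020-2024:

```python
# Back-of-envelope scale comparison, assuming 2019 death rates held flat over
# 2020-2024 (a simplification; the article projects trendlines, not flat rates).

non_covid_infectious_2019 = 8e6   # deaths from non-Covid infectious diseases, 2019
non_communicable_2019 = 41e6      # deaths from non-communicable (chronic) diseases, 2019
covid_deaths_2020_2024 = 7.1e6    # recorded Covid-related deaths, 2020-2024

years = 5
infectious_5yr = non_covid_infectious_2019 * years  # ~40 million over five years
ncd_5yr = non_communicable_2019 * years             # ~205 million over five years

print(f"Non-Covid infectious, 5 yr: {infectious_5yr / 1e6:.0f} million")
print(f"Non-communicable, 5 yr:     {ncd_5yr / 1e6:.0f} million")
print(f"Covid share of infectious-disease deaths: "
      f"{covid_deaths_2020_2024 / (infectious_5yr + covid_deaths_2020_2024):.1%}")
```

Even this flat-rate simplification lands in the same ballpark as the article's projections, which is the point: recorded Covid deaths are a modest fraction of the infectious-disease burden and a small fraction of total mortality over the same period.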
Calculations by Leeds University's REPPARE project also show how key claims of massive costs from pandemics are inflated whilst the costs of endemic infections are downplayed. Establishing a dedicated, treaty-based, and resource-intensive international machinery to prepare for a low-burden disease of infrequent outbreaks will distort public health priorities and divert scarce resources and finite attention from more urgent health and other goals.
This is bad public policy that fails the basic test of cost-benefit analysis.
Expanded Powers and Increased Resources for the WHO
Covid saw a successful bureaucratic coup that displaced elected governments with unelected experts and technocrats as de facto policy-makers. The pandemic accords give the WHO legal authority to declare an actual or apprehended emergency, and the power thereafter to commandeer resources for itself from sovereign states and to redirect resources funded by the taxpayers of one country to other states, on the basis of what the WHO chief alone considers a mere risk of potential harm.
Many governments argue that other issues like climate change, gun violence, and racism also constitute public health emergencies. These would expand the WHO's remit even more....
By Peter C. Gøtzsche at Brownstone dot org.
Nephrologist Drummond Rennie died on 12 September 2025, aged 89. He was deputy editor at the New England Journal of Medicine and at JAMA, for a total of 36 years.
Drummond's key interest was to improve the quality of medical research. He made numerous outstanding contributions to science and received the 2008 Award for Scientific Freedom and Responsibility from the American Association for the Advancement of Science for promoting integrity in scientific research and publishing and for defending scientific freedom in the face of efforts to suppress research.
Drummond's sense of humour was also outstanding. He told me that he was highly astonished to get an award from the biggest scientific association in the US, which publishes Science: "In my short acceptance speech, I thanked the pharmaceutical industry and my corrupt clinical colleagues for writing my scripts."
Drummond was keenly aware of the dark side of science. When he, in 1986, conceived of and announced the first Peer Review Congress to subject peer review to scientific scrutiny and improve its quality, he wrote:
"There are scarcely any bars to eventual publication. There seems to be no study too fragmented, no hypothesis too trivial, no literature citation too biased or too egotistical, no design too warped, no methodology too bungled, no presentation of results too inaccurate, too obscure, and too contradictory, no analysis too self-serving, no argument too circular, no conclusions too trifling or too unjustified, and no grammar and syntax too offensive for a paper to end up in print."
I met Drummond for the first time at the second Peer Review Congress in Chicago in 1993. The same year, I cofounded the Cochrane Collaboration and opened the Nordic Cochrane Centre in Copenhagen. Drummond was very supportive and became a director of the San Francisco Branch of the US Cochrane Center. We were frustrated that most of the medical literature was unreliable and our mission was to publish critical systematic reviews of trials of the benefits and harms of interventions in healthcare.
Drummond described the old type of a scientific review as being the opinion of a pundit, panjandrum, poohbah, nabob, or lord high executioner, and when the BMJ asked us for advice about a conflicts of interest issue, he noted that if I disagreed with him, he would eat his hat publicly in Tavistock Square "and out in rural Oregon, it's a pretty big hat." I told him he didn't need to eat his hat, which relieved him, "especially as I'd have had to buy the cowboy hat first."
Pfizer's Fraud with Its Antifungal Agent
In 1998, my wife, professor of clinical microbiology Helle Krogh Johansen, and I found out that Pfizer, one of the most criminal drug companies in the world, had rigged a series of trials of their antifungal agent, fluconazole, and we submitted our revelations to JAMA.
Drummond found it uncomfortable and could blush if people praised him, but he was not shy of praising other people. He found our paper "excellent," "wonderful," and "famous," and said he was "very happy to be associated with two such good scientists, and two such brave, open, and honest people." Drummond had these qualities himself.
Pfizer had combined the results for amphotericin B with those for nystatin in a "polyene" group even though it was well known that nystatin is ineffective in patients with cancer complicated by neutropenia. Drummond asked us to confirm this, which we did in a meta-analysis. Moreover, most patients received amphotericin B orally even though it was known that it is poorly absorbed and should only be used intravenously.
It was also unclear if some patients were counted more than once, as the data were sliced and published several times, and as the reports were obscure. The primary investigators didn't answer our questions but referred us to Pfizer which didn't answer them either.
Drummond and I discussed the paper's legal implications at a meeting in Oxfo...
By Jeffrey A. Tucker at Brownstone dot org.
A website specializing in data visuals offered a helpful graphic on global inflation, 2020-2025, with no other comment about how or why this happened. The results are eye-popping and amazing, and a reminder that hardly anyone has fully come to terms with what transpired over five years.
Most currencies in the world took a 25-35 percent haircut, Far East excepted.
That's a technical description that obscures what actually happened. The medium in which most people in the world hold the liquid part of their worldly possessions - the money they earned through hard work and saving - was robbed of a quarter of its value, and more.
Where did it go? After all, the wealth didn't sink in the ocean. It was transferred from one group to another. It went from the poor and middle class to the elites in well-connected industries and government. It was simply sucked away from one sector to another, achieving in a matter of a few years what would have been impossible in normal times.
The forced transfer of wealth went from small business to large, from physical enterprise to digital, from store fronts to online, from citizens to government-connected contractors, from workers to leveraged capital, from families to corporations, from savers to a deeply indebted government, and so on.
You are perfectly free to believe that this was all a mistake. Just bad policy. The world panicked because of a pathogen, and central banks ran the printing presses. Out of compassion for our suffering, legislators rained fresh paper on the population which we used to buy hardware and digital gadgetry, while fostering addiction to online entertainment.
Regrettably and mistakenly, governments criminalized small businesses and subsidized large ones. Inadvertently our communities and extended families were divided and then shattered and replaced with the only technology around, Zooming and TikToking while awaiting artificial intelligence to replace the intelligence lost during school and college closures.
Sadly, the shots that everyone thought would save us made us sicker than ever - surely an earnest attempt gone wrong - while a depressed population got hooked on weed and liquor from shops that remained open, and availed itself of psych drugs newly available through liberalized access via telehealth. The population in the developed world lost three years of life expectancy.
You can believe all of this befell people all over the world at the same time via a series of pathetic misjudgments.
Or you could be more realistic and see that this was not a mistake at all. It was entirely intentional, the unfolding of a dark scheme hatched by an indescribably sadistic ruling class. Indeed, if this had all been an accident, we surely would have heard someone apologize by now.
There is also the planning involved. There was Event 201, the lesser-known Crimson Contagion, and many others. They are usually described in the mainstream press as rehearsals for unplanned contingencies, like resiliency training. Absurd. This was plotted far in advance. We have all the receipts. To realize this and connect the dots does not make you a conspiracy theorist. It makes you a person with the capacity to think.
To deny nefarious motives and schemes makes you impossibly naive to the point of sedation. At best, it makes you ill-read in history.
After five years, what can we say was the plan and purpose of this calamity? We all have our views. Certainly within Brownstone ranks, there are many opinions. We argue among ourselves all the time. Coming up with a clean and clear explanation is not easy because there are so many moving parts and so many industrial opportunists who took advantage of the crisis to cash out.
So we all have our own judgments. Mine is as follows. There were three primary motivations and purposes for destroying the world as we knew it: Political, Industrial, and Pharmaceutical.
Political
In the years before the Covid response, the deep...
By Yaffa Shir-Raz at Brownstone dot org.
As ACIP deliberated without access to full trial data, an even more alarming pattern was already unfolding in the real world. Now, analysis of the FDA's FAERS database reveals 37 infant deaths out of just 991 reports - a fatality signal nearly twice that of other routine vaccines. Why was this not disclosed?
The warning sign was already visible in the clinical trials: infant deaths in the treatment groups were twice as frequent as in the control arms - a signal that should have triggered immediate scrutiny. As documented in a previous Brownstone article, this alarming imbalance was withheld from ACIP during its June 2025 review of Merck's competing RSV antibody, Clesrovimab.
Now it emerges that this was not the only red flag kept from the committee. An analysis of real-world data from the FDA's adverse event reporting system (FAERS) reveals an even starker reality: since Sanofi's Beyfortus (nirsevimab) was approved and added to the US infant immunization schedule in 2023, there have been 1,012 adverse event reports - including 37 infant deaths, a concentration rarely seen in pediatric vaccine safety profiles.
A Disproportionate Share of Deaths
As of September 29, 2025, the FAERS database lists 1,012 adverse event reports for Beyfortus, including 684 serious cases and 37 infant deaths (see Figure 1). This reflects a proportion of reported deaths of 3.6% - far higher than historical norms. A comprehensive CDC surveillance study (1991-2001) found that deaths typically comprised only 1.4% to 2.3% of all pediatric VAERS reports.
A 2023 systematic review covering over three decades of VAERS data similarly found that deaths accounted for just 1.0% of all reports across all age groups, with most years staying below 2%, and only isolated spikes in the early 1990s exceeding that level. Against this backdrop, the proportion of reported Beyfortus cases involving infant deaths appears nearly double the historical average.
The overall severity profile is equally concerning. Of the 1,012 total Beyfortus reports in FAERS, 684 (67.4%) were classified as serious adverse events - defined as hospitalization, life-threatening conditions, disability, or death. As detailed above, this includes 37 infant deaths (3.6%). The remaining serious cases include 415 hospitalizations (40.9%) and 46 life-threatening events (4.5%).
For comparison, the same CDC study found that only 14.2% of reports were classified as serious, while the 2023 systematic review reported hospitalization rates of just 5.8% and life-threatening events at 1.4% of all reports. These benchmarks underscore how disproportionately severe the adverse event profile is for Beyfortus.
While VAERS reports do not establish causality, they are widely used by regulators for signal detection. Importantly, even established passive surveillance systems such as VAERS are estimated to capture only 1-10% of actual adverse events. These patterns, even if preliminary, merit urgent investigation, not dismissal.
A Seasonal Mortality Pattern Hidden from Reviewers
At first glance, one might suggest that the rising number of deaths simply reflects the expansion of Beyfortus use. Yet the timeline tells a more nuanced story - one that reveals a growing and disproportionate signal, even before full-scale uptake.
Before examining the seasonal trends in detail, it is important to understand the scale of uptake. During the 2023-2024 RSV season - the first season in which either nirsevimab or the maternal RSV vaccine was available - CDC data show that only 29% of eligible infants were immunized through either route. State-level coverage ranged from just 11% to 53% (CDC, 2024).
This limited adoption is critical context: if serious adverse events are already emerging at submaximal coverage, what will happen as usage expands?
The year-by-year timeline of deaths provides an instructive contrast:
2023: Beyfortus was rolled out only in October, with a limited three-month ...
By David Bell at Brownstone dot org.
Changing the direction of a dinosaur was, presumably, hard for any who tried. Especially when the direction of the dinosaur was highly profitable to its minders. While paleontology does not fully support the analogy, the picture describes the new Global Health Strategy just released by the US government. Someone is trying hard to return the dinosaur - the largest source of funding for international public health there is - back toward a path that addresses healthcare and real diseases.
Someone else wants to keep it steered on the path preferred by the World Health Organization (WHO), Gavi, CEPI, and the corporate industrial complex that has co-opted public health. Both are trying to look like 'America First.'
Within all this, a thread emerges that does seem to be pushing for a more stable, healthier world. The hope is that the strategy document's confusion just reflects an underlying transition, and the glimpses of a return to common sense and good policy will become more obvious as it is implemented.
The strategy has three pillars, which read as if written by people with very different ideas. The first attempts to claw back what the pandemic industry lost when the US administration defunded the WHO and Gavi. The second aligns with the stated US HHS approach of evidence-based policy and reduced centralization (i.e., good public health). The third plugs (not unreasonably) for US manufacturing, and its future really depends on which of the first two pillars is doing the administration's bidding.
Pillar One: Supporting the Pandemic Industrial Complex
Pillar One, 'Making America Safer,' deals with outbreak risk and essentially reiterates the talking points of the WHO, Gavi, and CEPI, which the current US administration has been defunding.
Whilst the White House is telling us that Covid-19 was almost certainly the result of a lab leak after reckless gain-of-function research (a logical assumption), the strategy document would have the US public believe that pandemics of natural origin (within which they still include Covid) pose an existential threat to Americans in America, and that the US has stopped "thousands" of such outbreaks in recent years.
Ebola. COVID-19. Swine Flu. Zika. The world has experienced multiple epidemics and pandemics in the 21st century, and the threat of a future pandemic is increasing with global connectivity amongst humans and between humans and animals at an all-time high.
This is massively disappointing to read in a serious document. Global data indicate that outbreak mortality, and probably outbreak frequency, declined over the decade pre-Covid, as infectious disease mortality has generally. The last major mortality outbreak likely of natural origin, the Spanish flu, was in the pre-antibiotic era over a century ago. Medical technology has progressed since then, not just propaganda.
We are better at detecting and distinguishing epidemics from the background of disease because we invented PCR, point of care antigen and serology tests, gene sequencing, and digital communications. Much of this came from America, but is here being used against it to purloin more resources on the pretext that if we lacked the technology to detect a pathogen previously, then the pathogen could not have existed.
Does anyone seriously believe that a hundred years of technological development, improved living conditions, and wildlife eradication actually leaves us more vulnerable?
A return to this poorly-evidenced pandemic rhetoric is a win for the pandemic industrial complex and those who see a need to continue what the strategy document calls elsewhere "perverse incentives to self-perpetuate rather than work towards turning functions over to local governments."
The strategy plans to detect outbreaks within seven days, and will staff countries considered to be high risk for this purpose. This is where logic breaks down. If Covid is indeed a product of gain-of-function research, then the focus should ...
By Thomas Harrington at Brownstone dot org.
[The following is an excerpt from Thomas Harrington's book, The Treason of the Experts: Covid and the Credentialed Class.]
Most of us, I suspect, have had the experience of walking into a darkened room we presume to be empty, only to find someone sitting silently in the shadows observing our movements. When this happens, it is, initially at least, an unnerving experience. Why? Because, though we don't often speak about this, there are things we do, think about, and say to ourselves when alone that we would never allow ourselves to do, think about, or say to ourselves in the presence of others.
When seeking to understand what Bourdieu called the "structuring structures" of a culture it helps to have a keen ear for language, and more specifically still, an ability to register the ways in which certain terms have entered or left the culture's everyday lexicon over the course of our lives.
For example, while terms like "fuck" and "suck," which were once reserved for the expression of our most savage emotions, have gone banally mainstream, words like dignity and integrity, which embody timeless and universal ideals, have become surprisingly scarce.
On those few occasions when it is uttered today, integrity is pretty much used as a synonym for honesty. While this is not wrong, I think it gives short shrift to the fullness of the concept lurking behind the word. Viewed etymologically, to have integrity is to be integral, that is, to be "one of a piece" and therefore largely devoid of internal fissures.
In practice, this would mean being - or, more realistically, assiduously seeking to become - the same person inside and out, to do what we think, and think about what we do.
Going back to the example of the dark room above, having true integrity would mean getting to a point where the sudden presence of the other person in the shadows would not disturb us because he or she would be seeing nothing in us that we would not want to be seen, or that we had not displayed openly on countless occasions in public settings.
There is, I believe, also an important existential correlate to this idea of integrity. It might be summed up as the ability to enter into an active, honest, and fruitful dialogue with what awaits us all: diminishment and death.
It is only through a constant and courageous engagement of the mystery of our own finiteness that we can calibrate the preciousness of time, and the fact that love and friendship may, in fact, be the only things capable of mitigating the angst induced by its relentless onward march.
There is nothing terribly new in what I have just said. Indeed, it has been a core, if not the core, concern of most religious traditions throughout the ages.
What is relatively new, however, is the full-bore effort by our economic elites and their attendant myth-makers in the press to banish these issues of mortality, and the moral postures they tend to channel us toward, from consistent public view. Why has this been done?
Because talk of transcendent concerns like these strikes at the core conceit of the consumer culture that makes them fabulously wealthy: that life is, and should be, a process of endless upward expansion, and that staying on this gravity-defying trajectory is mostly a matter of making wise choices from among the marvelous products that mankind, in all its endless ingenuity, has produced, and will continue to produce, for the foreseeable future.
That the overwhelming majority of the world does not, and cannot, participate in this fantasy, and continues to dwell within the precincts of palpable mortality and the spiritual beliefs needed to palliate its day-to-day angst, never seems to occur to these myth-makers.
At times, it is true, the muffled screams of these "other" people manage to insinuate themselves into the peripheral reaches of our public conversation. But no sooner do they appear than they are summarily banished under a concerted rain of imprecation, ...
By Aaron Kheriaty at Brownstone dot org.
The following was published recently in First Things and is reprinted here with permission.
A recent article in MIT Technology Review carries the strange title, "Ethically sourced 'spare' human bodies could revolutionize medicine." Three Stanford biologists and ethicists argue for the use of so-called bodyoids in science and medicine. This infelicitous term refers to hypothetical modified human bodies created from stem cells - bodies that have been genetically altered so that they lack brains, and thus, presumably, are without consciousness.
The authors acknowledge that we do not yet have the technical capability to create such beings, but recent advances in stem cells, gene editing, and artificial uteruses "provide a pathway to producing living human bodies without the neural components that allow us to think, be aware, or feel pain."
Strictly speaking, artificial uteruses are not necessary for the development of bodyoids. Such a reprogrammed embryo could theoretically be created in a lab and implanted in a woman's uterus, as is done with IVF. But the notion that an entity regarded as subhuman should be born from a human mother seems too gruesome even for these bioethical pioneers to contemplate.
The authors admit that many will find the prospect of bodyoids disturbing, but they argue that a "potentially unlimited source" of "spare" human bodies will be immensely useful and should be pursued. We could, for example, harvest the organs of these presumably nonsentient humans and conduct experiments on them in order to test drugs and other medical interventions.
The authors even suggest that it would be more ethical to do drug testing on humans who cannot feel pain, because they lack nervous systems, than on animals that can feel pain. There are other potential benefits for animal species as well, they aver, since we could use animal bodyoids to avoid causing pain and suffering in the cows and pigs we slaughter for food.
Human bodyoids are not entirely within the realm of science fiction. Scientists have recently produced "embryoids," or "synthetic embryos," from reprogrammed stem cells, without the use of sperm and eggs. Embryoids are living entities that seem to develop as human embryos do but that presumably lack the capacity for full human development.
(We do not know for sure that they do, as they are typically destroyed after fourteen days, before the heart and brain have begun to develop.) Just as advocates for embryoids argue that their innovation allows us to avoid the ethical problems associated with embryo-destructive research, so advocates for bodyoids propose to provide us with "ethically sourced 'spare' human bodies."
The Christian ethicist Oliver O'Donovan described "a position too familiar to technological society, that of having achieved something that we do not know how to describe responsibly." In the case of bodyoids, I submit, advocates do not know how to describe them at all. One can hear them stumbling over their words and fumbling with descriptors.
Bodyoids are human bodies. Or rather, human-like bodies. But not human in any morally relevant sense - they lack brains, after all. But sufficiently human that we can harvest their organs for transplant and conduct experiments on them to see how "real" humans would respond to drugs. Indeed, they are of interest to scientists precisely because they are so, well, so very human. But not really. For the most part.
Well, then, what are human bodyoids?
Long before ethicists began to contemplate living - or at least, undead - human creatures who lack all brain function, such entities were explored in science fiction and horror films. The precise name for such a creature is zombie. The concept has roots in Haitian folklore, where the term is zonbi, referring to a person who has been brought back from the dead through magical means to serve as a mindless slave.
The problem with creating zombies, our stories suggest, is that...
By Russ Gonnering at Brownstone dot org.
Ten years ago, the internet was completely captured by "The Dress."
A photo of a dress was displayed. Was it Blue and Black? Was it White and Gold? Everybody had an opinion, and each was definite.
This is distinct from other optical illusions, such as the Rubin Vase, which can be easily reversed by most people.
The Dress prompted a flurry of scientific inquiry. The Journal of Vision, a respected academic ophthalmology journal, launched an open-access special edition attempting to explain these curious findings on the basis of multiple objective measurements of luminosity, color saturation, assumptions about illumination with natural or artificial light, and prior exposure to long or short wavelengths.
Perhaps the most interesting explanation, and one that is more easily understood by a lay audience, is contained in an issue of Wired magazine:
So, when context varies, so will people's visual perception. "Most people will see the blue on the white background as blue," Conway says. "But on the black background some might see it as white." He even speculated, perhaps jokingly, that the white-gold prejudice favors the idea of seeing the dress under strong daylight. "I bet night owls are more likely to see it as blue-black," Conway says.
Importantly, it made no difference whether the viewer of the photo knew that the real dress was blue and black…
Is there a larger lesson to be learned from this? If the illumination and context of a representation of a physical object shape our perception of that reality, can the same be true of other, less tangible things, such as ideas? I believe so.
The last decade has seen a profound and deepening divide on multiple issues. Consider the recent heated exchanges during the meeting of the Advisory Committee on Immunization Practices (ACIP) of the CDC. Unless one takes the time to watch the two days of the meeting, only a summary, most often a news report, is available. One sees only a representation of the actual meeting.
Just as when looking at a photo of The Dress, perception will be greatly influenced by the context and the illumination of the representation. However, this time it is not only the context and illumination of the viewer, but also that of the producer of the news.
Herein lies the problem. How can we ever arrive at a true representation of reality? During the Great Covid Disaster, I kept thinking that if only the actual unbiased data could be shared with those who insisted the virus evolved naturally, or those who believed early treatment was impossible, or those who insisted the mRNA agents were "safe and effective," the impasse could be broken. Alas, that never happened because the source of illumination had been altered from what it was in the past.
With the advent of Postmodernism, the literal definition of truth itself has changed. The truth has been replaced by my truth and your truth. Truth has become an opinion, no more important than whether you like your steak rare or medium.
In the past, we relied on ethical medical science to lead the way to the truth, but is that even possible now? In current medical studies, it often seems that the conclusions come first, and the study is then designed to fit them. A recently published trial of a new drug to treat hypertension had this statement appended to the end:
The sponsor designed and conducted the study, including collection, management, analysis, and interpretation of the data. The sponsor was involved in the preparation, review, and approval of the manuscript and decision to submit the manuscript for publication in collaboration with all authors. The final decision on content was exclusively retained by the authors.
I realize that drug companies are interested in proving their products indeed help people, but if the drug manufacturer "designs and conducts the study, including collection, manageme...
By Peter C. Gøtzsche at Brownstone dot org.
The vaccine area is much more complicated than I realised when I worked in a department of infectious diseases as a young doctor. It never occurred to me that vaccines could be a problem, and I had taken all the recommended ones.
My 2013 book, Deadly Medicines and Organised Crime, hardly mentions vaccines at all, because none of the major healthcare scandals in which drug company fraud led to drugs killing thousands of patients had involved vaccines.
In 2015, the former head of Cabinet in the Danish Ministry of Health asked me to attend a meeting about an ongoing dispute about the safety of the HPV vaccines. He was hoping I would agree that there was no reason to worry about the alleged serious neurological harms of the vaccines.
There surely was, and my 2021 book, Vaccines: Truth, Lies, and Controversy, has a long chapter about the HPV vaccines. It also documents how the influenza vaccines have been hyped beyond belief and beyond the evidence, both by the US Food and Drug Administration (FDA) and by the Centers for Disease Control and Prevention (CDC). I had never had a flu shot, and after having studied the data and considering that influenza is a rare disease, I decided I never would.
My work with the vaccine book made me realise that it is very difficult to get honest information about vaccines. The vaccine area is fraught with censorship, retaliations, and intolerance. I was called an anti-vaxxer even when I only asked questions and when I argued why mandatory vaccinations are unethical.
During the Covid-19 pandemic, Harvard professor Martin Kulldorff was fired for debating official policies. One of the things Martin had said was that children and those with prior natural infection did not need the vaccine. Children are at very low risk of becoming seriously ill after a Covid-19 infection, whereas the mRNA Covid-19 vaccines have killed about 1-2 of every 200 children who developed myocarditis. But it did not matter that Martin was right. What mattered was that he had broken the omertà.
Secretary of Health and Human Services Robert F. Kennedy, Jr. is determined to get rid of the corruption at the CDC. He fired the whole Advisory Committee on Immunisation Practices (ACIP) and installed a new, much better one. There have also been changes at the top, and his Deputy Secretary Jim O'Neill, Acting Director of the CDC, announced on X:
"During the previous administration, CDC lost public trust by manipulating health data to support a political narrative…We have…ended the misuse of the childhood immunization schedule for Covid vaccine mandates."
Most disturbingly, vaccine programmes have not taken the important results by Danish researchers Peter Aaby and his wife, Christine Stabell Benn, into account. They have shown that live attenuated vaccines decrease total mortality more than what can be predicted from their specific effect, while non-live vaccines increase total mortality.
They have also shown that the sequence of vaccinations is important; that it is best to end with a live vaccine; and that the harms of non-live vaccines predominantly affect girls. These results are so groundbreaking that they are on the list of milestones in Nature that starts with the discovery of the smallpox vaccine, which, like the measles vaccine, has saved millions of lives.
Just as for other drugs, we need to look at each vaccine separately to find out if it is worth taking. It is meaningless to divide people into being for or against vaccines. We don't divide people into being for or against humans. It depends on the person.
When Peter gave a talk about vaccines at an international meeting I had arranged in 2019, YouTube removed the video and disregarded our protests. When I interviewed Christine about vaccines for our Broken Medical Science channel and we uploaded it to YouTube, that video was removed too. The videos can be seen here and here. Everything Peter and Christine said was correct, but this is immaterial for the censorsh...