The misinformation crisis isn’t about truth, it’s about trust
Without free speech, academic freedom, and confidence in our experts and institutions, no amount of fact-checking will help.
Joe Biden won the 2020 presidential election.
Vaccines do not cause autism.
There are important biological differences between men and women.
These are, to the best of our current knowledge, facts. And it matters whether or not we believe them. It directly affects our democratic process, our health, and our culture. It could determine whether we’d have found ourselves storming the U.S. Capitol on January 6, 2021. Or whether our children are among the growing number currently contracting the measles. Or whether scholars like the evolutionary biologist Carole Hooven get to keep their academic positions at Harvard.
There are many reasons why people may not believe these facts, or actively rail against them. Tribalism. Echo chambers. Social pressure. Political or ideological expedience. And of course, being misinformed.
Sadly, there are fewer solutions to this problem than there should be. One reason for this is that our expert classes — those of us who should be trusted authorities on these matters — have torpedoed so much of their credibility with the public.
Last week, The New York Times published an op-ed by the sociologist Zeynep Tufekci called “We Were Badly Misled About the Event That Changed Our Lives.” In it, Tufekci discusses how “in 2020, when people started speculating that a laboratory accident might have been the spark that started the Covid-19 pandemic, they were treated like kooks and cranks.” Indeed, she continues, “[m]any public health officials and prominent scientists dismissed the idea as a conspiracy theory, insisting that the virus had emerged from animals in a seafood market in Wuhan, China.”
The idea that Covid came from a lab wasn’t just dismissed as a conspiracy theory. It was also labeled racist, adding a more toxic aura to the notion and making it even riskier for people to entertain it.
Particularly in 2020, when so much was changing so quickly and so many things were unknown or uncertain, disagreements about the origins of the Covid-19 virus were perfectly understandable. Given the chaos and confusion, emphasizing the wrong hypothesis without ample evidence could have had terrible consequences. Refusing to jump to conclusions in a situation like that is, in fact, the right thing to do.
But that’s not quite what happened. As Tufekci notes:
To promote the appearance of consensus, some officials and scientists hid or understated crucial facts, misled at least one reporter, orchestrated campaigns of supposedly independent voices and even compared notes about how to hide their communications in order to keep the public from hearing the whole story. And as for that Wuhan laboratory’s research, the details that have since emerged show that safety precautions might have been terrifyingly lax.
This is just one example, but it clearly illustrates the larger problem affecting our culture, our discourse, and our capacity for knowledge-creation. Those who decried and dismissed the lab leak hypothesis showed an unwarranted level of certainty and arrogance. It’s not as if there had been a massive investigation, with the cooperation of the Chinese government, to get to the bottom of this and definitively establish the origins of the virus. Despite the chaos and uncertainty throughout the pandemic, the lab leak hypothesis was always a reasonable idea. Jon Stewart even joked about it back in 2021.
In this context, platitudes like “trust the science,” or signs saying, “In this house we believe…,” which were repeated like mantras and plastered everywhere, show themselves to be mere tribal signifiers — symbols of a group’s certainty and moral superiority. More than that, they betray a fundamental misunderstanding of what science actually is: the systematic application of doubt.
It’s difficult to overstate just how much damage our academic, scientific, and intellectual elites have done to our universe of shared facts, our institutions, and the public’s ability and willingness to believe them. There is a growing sense among people that trust and confidence in our experts and institutions is unwarranted — and particularly in the last ten years, there has been no shortage of behavior to justify this suspicion.
Cancel Culture, ideological conformity, and the erosion of trust
The paperback edition of Greg and Rikki Schlott’s 2023 book “The Canceling of the American Mind” hits shelves on April 29 with updated data, reflections on FIRE’s 2025 College Free Speech Rankings, and an entirely new epilogue. The book is filled with examples and case studies illustrating how the behavior of our social and intellectual elites has contributed to this crisis of trust in expertise and institutions. This includes Cancel Culture itself, which Greg has previously described as “the military arm of the Anti-Discourse Industrial Complex.”
In the last decade or more, we have seen people getting in trouble for being on the “wrong” side of virtually every hot-button issue in the United States. Cancel Culture has ruined lives. It has cost people their livelihoods. And combined with the constant denial that Cancel Culture even exists, it has understandably fomented a general distrust in academia, journalism, and expertise — the very mechanisms of knowledge creation in our society.
This shouldn’t be surprising. When the penalty for having a disfavored opinion can be life-destroying, trust in the objectivity of experts is inevitably going to take a hit. As Greg mentioned in an early ERI post, “When even a single thinker is punished for their academic opinion or for engaging in thought experimentation, it leads the public to be justifiably skeptical that any expert on that topic is being fully honest.”
And when people don’t trust the experts, they’ll find other people to consult on matters of social and scientific import. Is it any wonder, then, that we find ourselves in this miasma where political, ideological, and cultural groups share less and less empirical reality?
Our expert class has grossly underestimated their own responsibility for this problem. For example, one major issue we’re facing is the replication crisis — or the discovery that many of the findings in behavioral science can’t be replicated in new experiments, seriously compromising our confidence in their veracity. Whether it’s through intentional falsification, incompetence, oversight, or some combination of all of these, it’s a serious blow to the authority of our expert class to discover that much of our data might be unreliable.
For far too long, our intellectual and ideological gatekeepers have taken their expertise, experience, and the trust of the public for granted. And they have abused that trust on the basis of several errors.
The first is the assumption that trusting experts is a moral issue, rather than a political and pragmatic one. They believe that trust in and fealty to them is owed rather than earned. Given their immense cultural and institutional power, this assumption creates several self-fulfilling dynamics that help bolster and perpetuate it. For example, failing to trust experts often reflects negatively on the person who doesn’t trust them rather than on the experts themselves. And by dismissing dissenters as kooks, cranks, or bigots, the experts are not obligated to do anything to reassess their positions, craft stronger arguments, produce better evidence, or take other actions to build and preserve trust.
This is the dynamic that was at play with the lab leak hypothesis in 2020. And as readers of ERI and “Canceling” will know, using labels like “racist” to shut down arguments is part of a more complex tactic Greg and Rikki call the Perfect Rhetorical Fortress. This method for winning arguments without actually winning arguments is bolstered by an assumption called The Great Untruth of Ad Hominem: “Bad people only have bad opinions.”
The second error our expert classes have made is closely tied to this Great Untruth: the embroiling of expertise and authority with political and ideological outcomes. The assumption is that experts aren’t just smart people focused on knowledge creation or grappling with important ideas in our highest institutions, they’re also better people — and better people are always right.
This is also how they explain the overwhelming tendency among the expert class to be politically liberal. Rather than see it as evidence of bias in these institutions, they consider it proof that liberal views are correct and supported by science. This is another reason why the system of ideological hurdles that keeps dissenters and conservatives out of academia — which Greg and Rikki call the Conformity Gauntlet — often goes ignored or unnoticed in these spheres.
As a result of this moral and ideological scaffolding, the expert class is also prone to assuming that their political opinions are automatically correct, leading to moves meant to “do the right thing” but which often backfire. Consider, for instance, the scientific journal Nature endorsing Joe Biden for president in 2020. Subsequent research showed the endorsement did nothing to increase support for Biden; its only measurable effect was to increase distrust in science.
This shouldn’t have been a surprise. Studies have shown that politically or ideologically homogeneous media and institutions increase polarization — but you shouldn’t have needed that data to see how counterproductive a move like that would be. Actions like these politicize everything, making the acceptance of a scientific fact a signal of tribal affiliation more than a pragmatic position based on empirical evidence. It’s not hard to imagine how this destroys trust in science overall.
This same outsized sense of rightness and authority also allows experts to be comfortable with so-called “noble lies.” Anthony Fauci’s claims regarding mask use and effectiveness at the outset of the Covid-19 pandemic are a perfect example. At first, Fauci said masks were ineffective, which comported with the evidence at the time (particularly for cloth masks). He later reversed course, revealing that he had said so to avoid panic and to prevent N95s from becoming scarce for the hospital workers who needed them.
Because experts assume they know better, and because they assume they’re the good people, they give themselves the authority to decide when it’s acceptable to mislead or lie to the public. The problem, of course, is that the truth has a tendency of coming out eventually. And once people know you’ve lied to them, they may never trust you again even when you’re right. In fact, there is something worse about people lying to you when they think it’s for your own good, because it means they’re inclined to lie to you about the most important things.
These errors have caused much of the mayhem we’ve witnessed in the last decade of cultural discourse. The failure of our institutions and our expert classes to be honest, transparent, and competent has eroded a much-needed trust — not just in the academics, journalists, and scientists who engage in this behavior, but also in academia, journalism, and science as endeavors of knowledge creation.
How to restore faith in expertise and save our knowledge-creating institutions
It’s no exaggeration to say that our current cultural situation is unsustainable. We cannot continue siloing ourselves in ideological bubbles, accepting only what confirms our beliefs and rejecting the rest as the other side’s propaganda. We need competent, reliable, transparent methods for determining what is true, and we need those methods to be as free from bias and ideological capture as possible.
Unfortunately, many current efforts at fact-checking and combating “misinformation and disinformation” consist of doubling and tripling down on the same behaviors that got us here. This is especially the case when the government attempts to do the reality-policing. As we’ve seen, top-down methods for labeling and suppressing misinformation are too easy and tempting to misuse and abuse.
The reality is that no amount of fact-checking will work in a context where trust in experts, expertise, our institutions, and one another has disintegrated. Even more distributed methods like X’s Community Notes feature, which has been shown to be effective in pushing back on falsehoods, remain at risk. Recently, Elon Musk said he’s going to “fix” Community Notes because it has been producing results he doesn’t like, and as the owner of the platform, he can.
Still, the problem goes deeper than that. Even if we could somehow guarantee perfect, unbiased fact-checking and correction of mis- and disinformation, it wouldn’t make any difference to the tribes, because a fundamentally different game is being played. That game isn’t about knowledge creation and discovery, but about the supremacy of one ideological group over another.
So how can we really solve this problem? How can we fix the rampant mistrust in expertise, science, our institutions, and one another? The answer is simple, but not easy.
Our expert class and our institutions need to earn back the trust they lost. And they need to do this by consistently showing themselves to be transparent, honest, and competent.
The first and most obvious step is for them to adhere to free speech law, and to promote and fervently protect a robust free speech culture. Free speech law will protect us from government attempts to crack down on disfavored speech and information. But a free speech culture is even more important. We need a society where people are not terrified to say what they really believe, if for no other reason than that it allows us to know that’s what they believe. This idea is what Greg calls the Pure Information Theory of free speech. The reality is that it’s inherently valuable to know the world as it really is, and that requires a free and open discourse.
This is also why academic freedom is critical. A circumstance where people do not feel free to dissent, challenge the prevailing orthodoxy, and engage in good-faith debate is one in which our ability to discern truth and produce knowledge ceases to exist.
To be clear, we aren’t saying that all of our knowledge-creating systems, institutions, and experts are compromised. We’ve cited plenty of research and journalism in this piece, and we stand by their reliability until proven otherwise. The point is that we need more, and more reliable, systems, institutions, and experts to help us generate knowledge and see the world as it truly is.
Of course, many of our current systems are broken or compromised — but they can be repaired and restored, and there are plenty of people intent on making that happen. This is one of the reasons we are optimistic about new National Institutes of Health Director Jay Bhattacharya’s ideas for fixing the replication crisis (despite other concerns with the NIH in general, which we may get into in a future post). It’s also why we look forward to what the University of Austin and other attempts to break the monopoly of higher education do moving forward.
We also need to discuss and consider bigger and bolder ideas for helping us generate knowledge and test existing assumptions. Artificial Intelligence, for example, can be unleashed to check existing scholarship for fake data, p-hacking, and plagiarism in a way that could actually be tremendously helpful.
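As one illustration of what such automated screening might look like, here is a minimal sketch in Python of a single, well-known heuristic: flagging papers whose statistically significant p-values cluster just below the 0.05 threshold, a pattern associated with p-hacking (in the spirit of p-curve analysis). The function name and thresholds here are hypothetical choices for illustration, not any tool's actual method, and a high ratio is a prompt for human review, not proof of misconduct.

```python
# Hypothetical sketch: flag reported p-values that pile up just under the
# 0.05 significance threshold -- one simple heuristic an automated
# scholarship checker could apply at scale before a human takes a look.

def suspicious_p_value_ratio(p_values, threshold=0.05, window=0.01):
    """Fraction of significant p-values landing just under the threshold.

    A disproportionate pileup in [threshold - window, threshold) can be
    a signal worth closer inspection; it is not proof of misconduct.
    """
    significant = [p for p in p_values if p < threshold]
    if not significant:
        return 0.0
    near_miss = [p for p in significant if p >= threshold - window]
    return len(near_miss) / len(significant)

# Significant results spread across the range: low ratio.
clean = [0.001, 0.004, 0.012, 0.020, 0.031, 0.044]
# Significant results piled up just under 0.05: high ratio.
piled = [0.048, 0.049, 0.047, 0.046, 0.049, 0.012]

print(round(suspicious_p_value_ratio(clean), 2))  # 0.17
print(round(suspicious_p_value_ratio(piled), 2))  # 0.83
```

Real tools would combine many such signals (statistical inconsistencies, image duplication, text overlap) and weigh them far more carefully; the point is only that this kind of screening is mechanizable.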
The bottom line is that our solution to this crisis cannot be top-down. It cannot be centralized. It has to rely on a range of individuals, groups, and institutions that have earned the public’s trust and intend to keep it. We also have to recognize, as has often been said, that you never quite get to the truth itself. Rather, our lives and efforts are more about diligently chipping away at falsity — ever homing in on truth through subtraction, but never quite getting there.
We have more responsibility — and more capacity — to check our biases than we think
Our institutions and expert class have a lot to answer for. That much is clear. But the rest of us aren’t off the hook, either. Part of bridging these empirical and ideological gaps will require our own efforts at recognizing our common goals and interests, as well as our individual and collective limitations when it comes to discovering truth.
It can be easy to discuss bias as if it’s something that we just discovered. But bias has always been with us, and luminaries like John Locke, Montesquieu, and others made great strides in examining it. Many of our modern systems and structures, like the scientific method, were created for the specific purpose of checking that all-too-human tendency.
The Founding Fathers thought about bias all the time, which is why they built the separation of powers and divided government into the American system. These structures are designed to find truth in a situation where bias is taken for granted. Academic freedom was meant to be one of those structures, and while it isn’t currently working well, we can fix it. FIRE has been fighting that good fight for more than 25 years.
It’s important to consider the many natural impediments we have to understanding the world as it is, and to develop or adopt systems, technologies, and incentives to help us correct them. It’s a new, constantly changing world, and all solutions should be on the table.
This of course includes the tried-and-true methods, which we should constantly reinforce. For instance, the fundamental idea that preserving free speech for those we disagree with is precisely what protects our own free speech from those who disagree with us. These commitments to fundamental principles will not only allow us to communicate more openly and effectively, they will also remind us of the wide range of common ground we actually occupy despite our differences.
It is also incumbent upon all of us as individuals to recognize the part we play in this ongoing tribal conflict and crisis of trust. The freedom to discover the truth for ourselves — which we should all have, and want — is also the responsibility to behave like rational, reasonable adults, and not tribal lunatics. We all must be just as open to dissent, just as allergic to ideological bubbles and echo chambers, and just as intent on discovering the truth no matter how uncomfortable and inconvenient it might be. After all, the institutions we’ve rightly criticized for their bias and blindness are made up of people very much like ourselves. We aren’t free from their flaws.
The truth is hard to discover, and we can’t do it alone
Fundamentally, our job is to recognize that we cannot discover the truth on our own. It can be fun to log onto social media and play the part of epidemiologist, economist, foreign policy expert, military tactician, or immigration lawyer — but the reality is that we live in a crazy, complicated world. Deluding ourselves about our capacity to navigate and fully understand it doesn’t help. We need facts and knowledge, and reliable systems and structures for producing them, because we can’t reasonably or reliably do it all ourselves. But trust is critical to the effectiveness of those systems and structures. When we lose it, all hell breaks loose.
Greg has spoken before about how analogous our current moment is to the invention of the printing press. It’s easy to forget, given how far we’ve come, just how destabilizing a technology the printing press really was — especially to the authorities of the time. By allowing millions of people to enter the cultural conversation, the printing press drastically changed that conversation’s nature, scope, and complexity. Social media has allowed billions of people to enter the global conversation, and that has inevitably had an exponential impact on every aspect of our lives, including knowledge creation. It has exposed just how fragile that system really is, and shown us just how much work it will take to preserve it.
We should never take it for granted. Believable authority and good reputations are very hard to build and very easy to lose. Sadly, our expert class has damaged their credibility in the eyes of the public far more than they seem to actually understand. It’s going to take serious work to get it back, and it won’t happen overnight. As we continue to barrel our way into an uncertain future, we are going to need institutions to create, disseminate, and safeguard knowledge — and an expert class that shows itself to be worthy of our trust.
SHOT FOR THE ROAD
As mentioned above, my most recent book, “The Canceling of the American Mind,” which I co-authored with the Gen-Z wunderkind Rikki Schlott, will be out in paperback April 29! I expect I’ll be doing another wave of press to promote it, but here are some of the greatest hits from the press tour after the hardcover’s release:
Lex Fridman Podcast in September 2023:
GBNews’ “The Dinosaur House” with John Cleese in November 2023:
And, of course, HBO’s “Real Time with Bill Maher” in December 2023: