TRIBAL EPISTEMOLOGY IS A BIPARTISAN PROBLEM
by LAWRENCE M. EPPARD & STEVEN SLOMAN
“Remember, the firemen are rarely necessary. The public itself stopped reading of its own accord.” —Faber, in Ray Bradbury’s Fahrenheit 451
“When the facts conflict with. . . sacred values, almost everyone finds a way to stick with their values and reject the evidence.” —Jonathan Haidt
“Nowadays, too much information is on offer, most of it bad or wrong, and we spend our time either sifting for gold in the filth or mistaking the filth for gold.” —Ian Bogost
Introduction
In The Poisoning of the American Mind, a new book due out on July 31st, Lawrence Eppard, Jacob Mackey, and Lee Jussim argue that the consumption of misleading information is a bipartisan problem in the United States.
Both conservative and liberal Americans are regularly bombarded with questionable assertions from sources that they believe to be trustworthy and authoritative, sources which often present the information in a manner that appeals to the sacred beliefs of consumers’ in-groups. This increases the chances that many Americans will fall for dubious claims.
David Roberts provides a useful description of the problem many Americans are currently facing, a problem he calls “tribal epistemology”:
“Information is evaluated based not on conformity to common standards of evidence or correspondence to a common understanding of the world, but on whether it supports the tribe’s values and goals and is vouchsafed by tribal leaders. ‘Good for our side’ and ‘true’ begin to blur into one.”
While Roberts was writing about the right wing, we believe the phenomenon he describes now afflicts Americans of all political stripes.
We Humans Are Easy Targets
A significant body of psychological research suggests that humans are prime targets for misleading information. Compelling empirical evidence suggests that people all across the political spectrum have cognitive tendencies that encourage them to seek out news and information sources that mirror their worldviews, avoid ones that don’t, and interpret information through cognitive filters that force an alignment with what they already believe. As social psychologist Jonathan Haidt has observed:
“All groups value the truth. . . All groups hold something sacred. And if you hold something sacred. . . your sacred values are going to conflict with the truth. And when that happens, all groups are the same: they throw truth under the bus, and they go with their sacred values. And that’s where we are.”
Brookings senior fellow Jonathan Rauch similarly explains that “[B]ecause our biases evolved to guide us in some directions and away from others, they do not result in randomly distributed errors. Rather, the errors lead us down predictable pathways, again and again.” He goes on to write that “When facts collide with beliefs which implicate our prestige or define our identity. . . the facts tend to bend.” And social psychologist David Dunning writes:
“Some of our most stubborn misbeliefs arise not from primitive childlike intuitions or careless category errors, but from the very values and philosophies that define who we are as individuals. Each of us possesses certain foundational beliefs—narratives about the self, ideas about the social order—that essentially cannot be violated: To contradict them would call into question our very self-worth. As such, these views demand fealty from other opinions. And any information that we glean from the world is amended, distorted, diminished, or forgotten in order to make sure that these sacrosanct beliefs remain whole and unharmed.”
The more we surround ourselves with low-quality sources of information, the more these common human propensities distort our understanding of reality.
People have always had these cognitive tendencies, and there are various plausible reasons for their existence—some of them probably innate, some likely the result of social learning. One reason is that people likely have hard-wired biological predispositions that lead them to favor certain ideologies—these ideologies exist in the world, and instead of us acquiring them, they acquire us, because our predispositions make them hard to resist and render their alternatives unattractive. Another is that values and beliefs can help people understand the world, form their view of themselves, and/or bind them to loved ones and in-groups—these values and beliefs thus become a cherished part of one’s identity, social bonds, and understanding of the social order. Yet another is that, in early childhood, the information we are exposed to and incorporate into our emerging identity and worldview is highly curated by those closest to us and by the environments where we spend the most time.
These cognitive tendencies weren't as dangerous in previous eras when people were forced to rely on legitimate sources of information on a regular basis, while untrustworthy, partisan, and fringe sources were far fewer and difficult to access. There are considerably more easy-to-access media and information sources today than there were in the 1980s and 1990s, and now many of them are fragmented along partisan lines. Additionally, there are currently some areas of academia that are malfunctioning and helping to proliferate suspect claims. These two realities—partisan news/information sources and malfunctioning academic fields—have made an enormous amount of low-quality information available to the average American.
Lee McIntyre, author of Post-Truth, provides an apt metaphor for the vast ocean of information—much of it high quality, but much of it garbage, too—easily available to the average American:
“There is a scene in Indiana Jones and the Last Crusade where he is in a room with all of these goblets and chalices and doesn’t know which one is the Holy Grail. That’s where we are right now. We have the truth right in front of us, but we don’t know which one it is.”
People’s cognitive tendencies encourage them to avoid and/or misinterpret trustworthy news and information if they do not agree with it—and current conditions allow them to pull this off to a degree that they couldn’t in even the recent past. This allows far more Americans to stick to sources they agree with, however flawed or flat-out wrong, and convince themselves that they have legitimately affirmed a distorted view of reality:
“In an age of cheap and abundant information, it is relatively easy to be a better-informed citizen. But the commitment to become such a citizen requires changes in even small habits that many people are unwilling to make, including reading a reputable newspaper and turning off the gladiatorial propaganda of social media, video postings, and cable shows. If making such changes means feeling less good about ourselves, or even thinking less often about ourselves, many of us will simply refuse to do it.” (Tom Nichols)
This epistemic crisis is incredibly corrosive for our society and culture. Sadly, we do not see an obvious way out of this mess.
All of this misleading information comes in a variety of forms, including misinformation (false information), disinformation (intentionally false information), and malinformation (true information used in a misleading manner).
There are numerous areas where these different forms of misleading information are a problem on both the left and right. One very salient one as we write this in the summer of 2024 is in discussions about transgender issues.
In July of 2024, for instance, Planned Parenthood published a tweet containing the following claims:
Gender affirming care reduces suicide risk. This is an example of either misinformation (if the tweet’s author is genuinely unaware the claim is misleading) or disinformation (if the author is aware of its dubious nature and made the claim anyway). The empirical link between gender dysphoria and suicide risk, and the link between gender affirming care and suicide reduction, have been called into question by several researchers.
Gender affirming care is backed by every major medical association in the U.S. This is an example of malinformation—the information they present may be true, but they conveniently ignore non-U.S. organizations. By carefully curating which medical associations “count” in this debate (not due to the accuracy of their conclusions but instead whether they ideologically align with Planned Parenthood’s worldview), they are able to engineer a misleading understanding of the preponderance of the evidence on this issue. See recent developments in several European countries to understand how fraught this research area is at the moment.
Gender affirming care is essential. This is an example of either misinformation or disinformation, depending on the author’s knowledge, as there are varying scientific perspectives on what gender affirming care should consist of and which version of this care is truly beneficial for gender dysphoric individuals.
Misleading claims like these about transgender issues are widespread on the left. In progressive circles, it is common to hear that a particular version of gender affirming care preferred by leftists is “settled science,” as GLAAD routinely calls it. The intended effect of such claims seems to be to shut down debate by creating the impression that a specific progressive interpretation of transgender issues is backed by “the science” and opponents’ interpretations are discredited by “the science.” Thus, it is implied that the public should trust those like Planned Parenthood and GLAAD who claim to be on the side of “truth” on this issue, not those supposed know-nothings and science-deniers and bigots (and Republicans!) on the other side.
But the problem runs even deeper than the misuse of information by incompetent and/or bad faith actors. People consuming news and information often do not even respond to evidence or arguments. Instead, we rely on our preformed ideas to make sense of what we see. We interpret information to conform to our expectations.
Research dating back to the 1970s shows that people’s beliefs about the colors of balls in an urn shape how they update their beliefs when shown balls sampled from the urn. Different studies from the same era show that those who support capital punishment thought that research validating the effectiveness of capital punishment used better methodologies than people who were against capital punishment. Judgments about the quality of the methods were reversed for studies that found that capital punishment was ineffective. In other words, people thought that studies that agreed with them were better done.
More recently, psychologist Keith Stanovich has reviewed a variety of what he calls “myside biases,” findings that people not only evaluate evidence so that it conforms to their own beliefs, opinions, and attitudes, but we are similarly biased in how we generate evidence and even test hypotheses:
“Research has shown that myside bias is displayed in a variety of experimental situations: people evaluate the same virtuous act more favorably if committed by a member of their own group and evaluate a negative act less unfavorably if committed by a member of their own group; they evaluate an identical experiment more favorably if the results support their prior beliefs than if the results contradict their prior beliefs; and when searching for information, people select information sources that are likely to support their own position. Even the interpretation of a purely numerical display of outcome data is tipped in the direction of the subject’s prior belief. Likewise, judgments of logical validity are skewed by people’s prior beliefs. . .
Many cognitive biases in the psychological literature are only displayed by a subset of subjects—sometimes even less than a majority. In contrast, myside bias is one of the most ubiquitous of biases because it is exhibited by the vast majority of subjects studied. Myside bias is also not limited to individuals with certain cognitive or demographic characteristics. It is one of the most universal of the cognitive biases.” (Stanovich in Quillette)
Often, opinions and attitudes that we support are not really our own at all. As cognitive scientists Steven Sloman and Philip Fernbach explain in The Knowledge Illusion, human beings live within a community of knowledge and the opinions and attitudes that we express are not generated by our own processes of judgment and reasoning but are borrowed from our wider community. Human knowledge is a communal entity; only a tiny fraction of it is contained within the head of any individual. Most of us do not know much about how even basic everyday things like toilets or zippers work despite how vitally important they are in the modern world, never mind how little we understand about how complex social policies and institutions operate. It takes years of study to really appreciate the complexity of, say, the American health care system or the causes and implications of immigration. Few people have the time or ability to conduct such analyses and hence the vast majority of us rely on others for our opinions and attitudes.
Does this lack of knowledge about myriad aspects of our daily lives hold us back? Not necessarily. In fact, it is an advantage:
“A modern society cannot function without a social division of labor and a reliance on experts, professionals, and intellectuals. . . No one is an expert in everything. No matter what our aspirations, we are bound by the reality of time and the undeniable limits of our talent. We prosper because we specialize, and because we develop both formal and informal mechanisms and practices that allow us to trust each other in those specializations.” (Tom Nichols)
If each of us had to master everything we rely upon every day in order to function, our world would have to be extremely limited and technologically basic.
We are able to excel in the modern world not because of our incredibly complex understanding of it, but because of the community’s collective understanding of it and our trust in and reliance on the expertise of others within that community to sustain it. So it is vital that we rely on information produced by the trustworthy members of the larger epistemic system as we make countless decisions in life. And when we begin to avoid these trustworthy members in favor of untrustworthy ones, our society can begin to falter.
Unfortunately, we believe that this is where we are as a society.
Epistemic Shortcomings Afflict Both Sides
Some Americans believe, mistakenly in our opinion, that this is just a right-wing problem. They are of course correct to identify that conservative America is mired in a worrisome epistemic crisis.
Indeed, many on the right hold questionable beliefs on a number of issues. One of the most obvious and egregious recent examples was Donald Trump’s “Big Lie” about the 2020 U.S. presidential election. Even years after the election took place, most Republicans somehow still believed Trump’s claims that his loss to Joe Biden was illegitimate. His lies continued to proliferate in right-wing circles despite not only a lack of evidence but also the fact that Trump is a notorious conman who telegraphed that he would make such a preposterous claim before the election even took place.
Election results they don’t like aren’t the only area where conservatives fall for misleading claims—from the silly Obama birth certificate affair to lies about climate change to the bizarre rise of QAnon and much more, the right wing is guilty of some serious epistemic shortcomings.
In our view, there are two main problems on the right when it comes to misleading information. They are not the only causes of the right’s epistemic shortcomings, but they are two very important ones.
“Fox is not a news channel—it is the right’s Pravda.” —Mona Charen
First, there has been an explosion of low-quality sources of news and information on the right in recent decades—including the rise of partisan talk radio shows (such as Rush Limbaugh and Sean Hannity) and podcasts (such as Ben Shapiro, Charlie Kirk, and Steve Bannon), partisan cable news channels (such as Fox News, Newsmax, and One America News Network), and partisan websites (such as The Daily Wire and Breitbart):
“False, partisan, and often deliberately misleading narratives now spread in digital wildfires, cascades of falsehood that move too fast for fact checkers to keep up. And even if they could, it no longer matters: a part of the public will never read or see fact-checking websites, and if they do they won’t believe them.” (Anne Applebaum)
One of the core institutions within the conservative information ecosystem, Fox News, may bill itself as a news organization, but it is hardly that. Like so many of the partisan media outlets on the right, the primary function of Fox seems to have little to do with journalism as it is properly understood and more to do with promoting the Republican agenda while diminishing progressivism:
“The late conservative Roger Ailes (funded by conservative Rupert Murdoch) created Fox News, a channel that carried, and still carries, mostly talk radio-style right-wing commentary. Like talk radio, it is of the conservative movement, in a way that no mainstream media outlet would ever think of itself as of the left. . . Fox plopped down on cable and dared the mainstream media to say anything about it. It never saw itself as better mainstream media—it saw itself as a conservative competitor to a liberal incumbent. It started mainstreaming conservative talking points and conspiracies, quickly gained a huge (mostly white, mostly old) audience, and, through sheer chutzpah, was accepted as a legitimate news outlet. It’s not that Fox News hasn’t produced some good journalism and good journalists. It’s that the ultimate axis around which the enterprise revolves is partisan. It is an instrument to advance the interests of the conservative movement.” (David Roberts)
Fox cherry-picks stories that conservatives care about (especially ones that make them angry/scared/resentful), frames them in the most conservative-friendly manner possible, and either ignores stories that are uninteresting/unfavorable to conservatives or reports those stories but frames them in a partisan manner.
To quote Mona Charen, “Fox is not a news channel—it is the right’s Pravda.” She goes on:
“All of us indulge the urge, at least sometimes, to hear news that confirms our own views. What Fox’s audience must grapple with is that choosing news is not like other consumer choices. . . If your doctor assured you that your skin lesion was benign because he thought this would be more welcome than the news that it was melanoma requiring immediate treatment, the doctor would be guilty of malpractice and you wouldn’t thank him. When Fox News and its competitors lie to viewers, they are endangering not their physical health but their civic health and the good of the nation.”
There are numerous examples illustrating Fox’s serious flaws, but perhaps none more shocking than its role in ensuring the widespread acceptance of the “Big Lie” among conservative Americans—despite Fox’s leadership and on-air talent admitting behind closed doors that it was nonsense.
Along with the explosion of low-quality sources within their information ecosystem, a second main epistemic problem for American conservatives is that they trust very few news and information sources, and the ones that a majority do trust tend to be partisan and low quality.
For example, in Table 1 below you can see that the only source that a majority of Republicans trusted in a recent Pew Research Center survey was the partisan and low-quality Fox News, while only a minority trusted any of the verifiably trustworthy and high-quality sources (see the Connors Institute’s Media Report Card for ratings of the news).
Far too many conservatives stay within their own partisan, self-contained information ecosystem, “an internally coherent, relatively insulated knowledge community, reinforcing the shared worldview of readers and shielding them from journalism that challenge[s] it.” As David Roberts argues, “[C]onservatives are pulled with increasing gravity into an information vortex that simply has no analogue elsewhere in American politics.” He goes on:
“The right hypes its base up with bullshit—it has for decades—until an already tribally inclined audience has now descended into near-total epistemic closure. It is contemptuous of outside fact-checking, no matter how assiduous, but endlessly gullible toward information shared on the inside. Consequently, it is an easy target.”
And as Harvard Law School’s Yochai Benkler and his colleagues explain:
“What we find in our data is a network of mutually-reinforcing hyper-partisan sites that revive what Richard Hofstadter called ‘the paranoid style in American politics,’ combining decontextualized truths, repeated falsehoods, and leaps of logic to create a fundamentally misleading view of the world. . . By repetition, variation, and circulation through many associated sites, the network of sites make their claims familiar to readers, and this fluency with the core narrative gives credence to the incredible.”
They go on: “It is a mistake to dismiss these stories as ‘fake news’; their power stems from a potent mix of verifiable facts. . . familiar repeated falsehoods, paranoid logic, and consistent political orientation within a mutually-reinforcing network of like-minded sites.”
We are well aware of conservative America’s epistemic problems and believe they are incredibly damaging to our society. But while there has been a considerable amount of attention paid to misleading information on the right, we do not believe there is nearly enough paid to this problem on the left. So while it is true that the right-wing has serious problems with misleading information, we believe there is strong evidence that it plagues the left-wing as well, and it needs to be taken much more seriously by those on the left in positions to actually do something about it.
We want to make clear that we do not attempt to draw an equivalence between the epistemic issues facing each side. We make no assertions about which side has it “worse” due to our honest inability to quantify such a thing. Some will see this as a cop-out, but in our minds it is the honest truth. Between us, we have spent a considerable amount of time attempting to quantify the problems on each side to allow for comparison, but have yet to find a suitable formula.
Even if we somehow could quantify the degree to which each side is damaging society with misleading information, however, we do not think it matters as much as some people may claim. Each side’s information problems would be utterly corrosive to American society even in the absence of epistemic problems across the aisle. Our society tolerates an unacceptable amount of dysfunction as the result of each side’s epistemic problems, and thus both need to be addressed.
Misleading Information on the Left
On the right, as we have discussed, we see a lack of trust in legitimate institutions and a fragmented, partisan media ecosystem as the primary drivers of the misleading information conservative Americans receive. Countless high-quality sources are available, but so are countless low-quality ones, and the latter are the sources that far too many conservatives trust.
The left-wing’s epistemic crisis seems to be different in nature. Yes, there are plenty of partisan news and information sources that flatter the liberal worldview, such as MSNBC, HuffPost, and Vox. But liberal Americans are more likely than conservatives to trust legitimate journalistic outlets, so we’ll need to turn our focus elsewhere to fully understand the left-wing’s epistemic crisis.
We think the bigger problem for liberal America is that, while they may be more likely to trust legitimate sources of information, (a) those outlets are often unaware that they are spreading misinformation because it appears to be backed by “the science,” and (b) without understanding the preponderance of the evidence, these outlets often pick and choose “the science” that backs their arguments. Either way, it appears to the unknowing public that they are being presented with arguments that reflect the preponderance of the empirical evidence when they are not.
We believe one of the most important contributors to this problem is certain malfunctioning segments of academia, such as the social sciences, where much of the misleading information is created in the first place.
The problem is not with the epistemic system as a whole—it remains impressive and unparalleled and should be allowed to continue to grow and better our world. The problem is that certain areas of this system—perhaps the academic social sciences most obviously—have gone off the rails a bit and need to be fixed. We see at least five major issues plaguing academia, issues which play a major role in the left’s epistemic problems.
The modern academy is not functioning properly.
First, university faculties and administrations do not represent the true diversity of America, with a wide range of important mainstream perspectives underrepresented. This is glaringly true for political diversity (see Figure 1 below), with far too many left and far-left perspectives, and not enough centrist and center-right perspectives to correct for the left’s ideological blind spots:
“The university professoriate is overwhelmingly liberal, an ideological imbalance demonstrated in numerous studies conducted over the last two decades. This imbalance is especially strong in university humanities departments, schools of education, and the social sciences; and it is specifically strong in psychology and the related disciplines of sociology and political science.” (Keith Stanovich)
As Greg Lukianoff and Jonathan Haidt explain, when the epistemic system is working properly, there is enough diversity of perspectives to make sure that ideas are properly vetted:
“Each scholar suffers from confirmation bias—the tendency to search vigorously for evidence that confirms what one already believes. One of the most brilliant features of universities is that, when they are working properly, they are communities of scholars who cancel out one another’s confirmation biases. Even if professors often cannot see the flaws in their own arguments, other professors and students do them the favor of finding such flaws. The community of scholars then judges which ideas survive the debate. We can call this process institutionalized disconfirmation. The institution (the academy as a whole, or a discipline, such as political science) guarantees that every statement offered as a research finding—and certainly every peer-reviewed article—has survived a process of challenge and vetting. That is no guarantee that it is true, but it is a reason to think that the statement is likely to be more reliable than alternative statements made by partisan think tanks, corporate marketers, or your opinionated uncle. It is only because of institutionalized disconfirmation that universities and groups of scholars can claim some authority to be arbiters of factual questions.”
But the modern academy is not functioning properly. The critical ideological imbalance at today’s universities is no doubt limiting the quality of both the scholarship produced by academics as well as the education that students receive: Some research questions get investigated and others are avoided, some methods utilized and others ignored, information gets interpreted in biased ways, and some legitimate viewpoints are marginalized while others are amplified:
“[W]hen the majority of scientists in a discipline share the same sacred values, then the checks and balances of peer review and peer skepticism that science relies upon can fail. Peer review, critical engagement, skepticism, and the other virtues of science. . . become tyrants that promote and protect the sacred values of the scientific community.” (Jonathan Rauch)
This leads to a partisan understanding of many issues in some disciplines: “[S]tudents in politically homogenous departments will mostly be exposed to books and research studies drawn from the left half of the range, so they are likely to come down to the ‘left’ of the truth, on average” (Lukianoff & Haidt). Lukianoff and Haidt go on to note that: “[V]iewpoint diversity is necessary for the development of critical thinking, while viewpoint homogeneity (whether on the left or the right) leaves a community vulnerable to groupthink and orthodoxy.”
As an example, consider implicit bias. In many quarters, due in large part to a psychological demonstration called the Implicit Association Test (IAT), it is now taken as a given that most people suffer from implicit attitudes that they are not aware of. Most prominently, many people are thought to be implicit racists and this idea is taught widely in courses on sociology and social psychology and in diversity seminars. The problem is that the evidence for it is weak and disputed. But many academics are either unaware of, ignore, or misinterpret this reality, in large part because of their prior ideological commitments. In doing so, they convince a large portion of the American public that “the science” has authoritatively decided an issue when it has come nowhere close to doing so.
“[V]iewpoint homogeneity (whether on the left or the right) leaves a community vulnerable to groupthink and orthodoxy.” —Greg Lukianoff & Jonathan Haidt
There are many topics where academics make claims that go far beyond (and sometimes completely contradict) what the evidence will support: not only implicit bias, but research about systemic racism, police shootings, microaggressions, free markets, poverty, sexism, the gender pay gap, sex differences, transgender issues, single parenthood, IQ, and more.
The problem is widespread in academia and typically errs on the side of supporting leftist ideologies. And we know that ideologies distort our understanding of the world and have massive blind spots. They do not just offer analysis of what is going on, they also traffic in sacred values, absolute values about which actions are correct and which are prohibited. As Steven Sloman argues in forthcoming work, reliance on such values leads to intransigence. As people on the right see their sacred values regarding issues like abortion, immigration, and taxation violated, and those on the left witness violations of their sacred values in the form of racism, homophobia, and other forms of real or perceived victimization, the resulting outrage causes the two groups to grow farther apart until they are consumed by antipathy and loathing. Because of its homogeneity, academia is contributing to this cycle.
If there is enough ideological diversity in academia, flawed partisan ideas will have a difficult time gaining traction without being revised in a more nuanced and objectively more accurate direction. But while the academy has long had problems with its liberal imbalance, it is even worse today than in the past, increasing the number of ideological blind spots among its members. This extreme imbalance allows for highly questionable empirical standards of evidence for truth claims in some fields. There just aren’t enough skeptical voices pushing back: “[W]e can’t count on ‘institutionalized disconfirmation’ anymore because there are hardly any more conservatives or libertarians in the humanities and social sciences” (Lukianoff & Haidt).
A second major issue plaguing academia, we believe, is that the standards of entry into some malfunctioning academic disciplines are clearly too low. In some disciplines, one can teach in a college classroom and publish in academic journals with a very poor understanding of research and data. In Lawrence’s field of sociology, for instance, you can earn a Ph.D. from a respectable program and spend an entire career teaching and publishing in a tenured university position without ever developing even a rudimentary grasp of quantitative research methods. Many sociologists do exactly that. At many universities across the U.S., sociology professors present information to their students that they incorrectly assume is strongly supported by empirical evidence because, while the professors themselves may not know or understand the research behind the claims, the authorities in their field have assured them it is sound. This information passed peer review, was published by leading journals and book presses, was accepted by the larger field, made its way into textbooks, and so on. Most sociology professors have not seen the empirical research behind much of the information they present in class, likely would not fully understand it if they did, and are unlikely to question it anyway because it aligns with current left-wing social justice assumptions.
We’re not picking on sociology—we believe a number of academic disciplines face this same problem—it’s just that we can speak confidently about our own disciplines because we have seen all of this firsthand.
As we have stated, the social sciences seem to be the worst of the malfunctioning disciplines. As Keith Stanovich argues:
“If you are a person of high intelligence, if you are highly educated, and if you are strongly committed to an ideological viewpoint, you will be highly likely to think you have thought your way to your viewpoint. And you will be even less likely than the average person to realize that you have derived your beliefs from the social groups you belong to and because they fit with your temperament and your innate psychological propensities. University faculty in the social sciences fit this bill perfectly. And the opening for a massive bias blind spot occurs when these same faculty think that they can objectively study, within the confines of an ideological monoculture, the characteristics of their ideological opponents.”
He goes on:
“We now have entire departments within the university. . . that are devoted to advocacy rather than inquiry. Anyone who entered those departments with a ‘falsifiability mindset’ would be run out on a rail—which of course is why conclusions on specific propositions from such academic entities are scientifically worthless.”
And as Lee McIntyre argues: “It is unfortunately true that a good deal of social science today is unreliable, due to its infection by political ideology. Even in universities, in some fields there is no clear line between ‘research’ and political advocacy.”
The third and fourth problems we identify are that the standards of evidence at many academic journals and book presses are low and/or biased, and that academia’s incentive structure requires even poor researchers to publish, producing too much bad research that sustains too many low-quality academic journals.
“[T]he more that America polarizes, the more it contains not one but two Overton windows, the ‘red’ window and the ‘blue’ window. Speech that is squarely mainstream in Red America is completely out of bounds in Blue America, and vice versa.” —David French
And fifth, we come to the chilling effect of cancel culture. In the most ideologically homogenous fields, questioning the dominant narrative can be quite dangerous for one’s career and reputation. Keith Stanovich writes that:
“Identity politics advocates have succeeded in making certain research conclusions within the university verboten. They have made it very hard for any university professor (particularly the junior and untenured ones) to publish and publicly promote any conclusions that these advocates dislike. Faculty now self-censor on a range of topics. The identity politics ideologues have won the on-campus battle to suppress views that they do not like.”
David French argues that:
“[A] person can be cast out of polite society for saying something completely conventional, normal and in good faith. . . [T]he more that America polarizes, the more it contains not one but two Overton windows, the ‘red’ window and the ‘blue’ window. Speech that is squarely mainstream in Red America is completely out of bounds in Blue America, and vice versa.”
He goes on:
“Americans have read story after story (from across the political spectrum) of activists, corporations and colleges targeting individuals for speech that is squarely within the mainstream of either progressive or conservative thought. In other words, dissent—even thoughtful dissent—has become dangerous, in both right- and left-leaning America.”
There are numerous examples of the toxic consequences of cancel culture on both the left and right in America as well as in the academy.
Michigan State University Vice President of Research and Innovation Stephen Hsu, for instance, published research on intelligence differences between racial groups, questioned the practice of diversity in hiring, and was accused of promoting research questioning the existence of racial bias in police shootings, among other alleged transgressions. According to Hsu, after a campus petition calling for his firing gained steam, the university’s president asked him to resign his VP position, and he did.
Another example is Gordon Klein, a professor at UCLA who declined a student’s suggestion that he consider altering his final exam format, schedule, or grading policies for African American students in the wake of the killing of George Floyd in the summer of 2020. He was placed on mandatory leave after his email response went viral and a Change.org petition calling for his firing accumulated over 20,000 signatures.
These five problems we identify (and we were not the first to identify them), along with others we have surely missed, interact to allow questionable claims to be disseminated as trustworthy information—carrying the symbolic weight of being backed by “the science”—throughout left-wing circles, where they distort liberal America’s perception of reality: “For decades, Critical Theories had been confined to humanities and Studies departments of universities. But the ideas have spread to other disciplines and the outside world, where they have been picked up by activists and the press” (Dorian Abbot et al.).
Academics, partisan media personalities, activists, and politicians on the left are guilty of frequently spreading claims (that they insist are scientifically authoritative) that further a social justice agenda without realizing/acknowledging the preliminary, weak, or nonexistent empirical support behind their assertions. Many liberals who hear/read these claims will believe them because they see them repeated often by various credentialed sources who they trust, they assume the claims are backed by credible evidence, and they fit their worldview/make them feel good.
Additionally, for a liberal to oppose these claims would be to align oneself with “bad” people on the other side (supposed bigots, know-nothings, etc.). To correct some flawed but popular social justice claims is to oppose the noble goals of one’s tribe and/or to signal that one does not take the problem seriously. As Dorian Abbot and his colleagues argue, “[I]n the ‘right’ circles, one can make almost any ridiculous claim, as long as one frames it as advancing ‘Social Justice.’”
Some of the most glaring examples concern claims about race and gender, as Jonathan Haidt explains: “On the left, including the academic left, the most sacred issues involve race and gender. So that’s where you find the most direct and I’d say flagrant denial of evidence.”
Michael Jindra and Arthur Sakamoto make a similar argument:
“In complex areas like the study of racial inequality, a fundamentalism has taken hold that discourages sound methodology and the use of reliable evidence about the roots of social problems. We are not talking about mere differences in interpretation of results, which are common. We are talking about mistakes so clear that they should cause research to be seriously questioned or even disregarded. A great deal of research. . . rigs its statistical methods in order to arrive at ideologically preferred conclusions.”
They go on:
“[I]deologically driven abuse of statistics happens all across the social sciences. Why? In left-leaning academic discourse, there are strong biases toward ‘structural’ causes, in part because scholars face strong pressures to avoid ‘blaming’ people and cultures for social problems. But social theory must recognize both structure and agency, alongside intermediary forces of social influence such as culture. . . Again, we are not talking about normal differences in the interpretation of results. We are talking about clear errors, or at least very poor scholarship that should not have passed peer review. It is easy to question some of these results because they often don’t make intuitive sense. . . Research simply shouldn’t be directed by a priori ideological commitments. It should follow the evidence. Often, that evidence won’t lead to clear-cut or definitive results. Some of these articles should be candidates for retraction, but retraction is rare. . . Some scholars even received major promotions, perhaps partly because their findings fit favored narratives. Instead, papers that violate ideological beliefs, more than those with errors of fact, receive pressure for cancellation, often from Twitter activists.”
The authors note that in the social sciences, the entire system of research, funding, publication, and promotion strongly values findings that support current social justice goals. Because so many of the career rewards (and sanctions) are aligned with these goals, “people will go to extraordinary lengths to achieve them.”
Our Golden Age of Information
Imagine for a moment that you were to travel in a time machine back a century or more into America’s past. You greet somebody you encounter there and ask to be taken to their most impressive library. This person honors your request, and upon arrival he/she brags to you about the immense knowledge contained within the library’s walls. You then retrieve your smartphone from your pocket (with a noticeable smirk on your face) and explain to your host that this small device in your hand gives you access to exponentially more information than their library could ever hope to hold. Your new acquaintance would be left speechless (if he/she believed you).
You then hop back into your time machine, blast some Huey Lewis, and get up to 88 mph as fast as possible, leaving him/her bewildered as you disappear back to the future—all without kissing your mother!
It might seem an odd argument in a piece about misinformation, but we nonetheless contend it is true: Americans have easier access to high-quality factual information, and more of it, than ever before. As The Atlantic’s David Frum quipped: “I was promised flying cars, and instead all I got was all the world’s libraries in my pocket and the ability to videochat 24-hours a day for free with my grandchildren on the other side of the world.” This should in fact be a golden age of information.
Americans have easier access to high-quality factual information, and more of it, than ever before.
The scale and quality of knowledge production that occurs in the modern world is a marvel and a historical breakthrough. As Jonathan Haidt explains, our modern epistemic system is:
“a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals. English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury. Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking. Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence. Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history, linking together the world’s best universities, private companies that turned scientific advances into life-changing consumer products, and government agencies that supported scientific research and led the collaboration that put people on the moon.”
The network of knowledge-producing institutions within our modern epistemic system is something to behold, unimaginable to our ancestors. Many major advances take place each year in areas such as science, technology, and medicine. And there are more high-quality news and information outlets than ever before. People from different historical eras would be awestruck to find out that we have so many high-quality sources of information available to us at all times in our pockets.
Of course, our epistemic system regularly gets things wrong. News outlets miss some stories and botch others. Peer review will sometimes fail. Questionable findings are sometimes accepted as settled fact for decades. Some let their biases pollute their work or their evaluation of others’ work. Some fail to meet high standards on a regular basis and/or violate established norms. There are cases of outright fraud. Many make claims far greater than the empirical evidence warrants—and some make claims diametrically opposed to the best available evidence. Some attack those whose findings make them uncomfortable.
All of these things are true, yet on the whole, the modern epistemic system eventually self-corrects and gets it right at a far greater rate than any alternative way of knowing. Whatever mistakes are being made at the current moment, one can be assured that we are closer to “the truth” and “reality” now than we were 50 years ago (and they were closer than those 50 years before them and so on). Our understanding of reality at any given moment is always imperfect, always provisional, and always tentative. It can and will change in the future as more information becomes available. We keep working, always inching closer and closer to the truth, year after year after year.
The modern epistemic system, with its “open-ended, depersonalized checking by an error seeking social network,” is “the only legitimate validator of knowledge”:
“Other communities, of course, can do all kinds of other things. But they cannot make social decisions about objective reality. . . [This assertion] goes down very badly with lots of people and communities who feel ignored or oppressed by the Constitution of Knowledge: creationists, Christian Scientists, homeopaths, astrologists, flat-earthers, anti-vaxxers, birthers, 9/11 truthers, postmodern professors, political partisans, QAnon followers, and adherents of any number of other belief systems and religions. It also sits uncomfortably with the populist and dogmatic tempers of our time.” (Jonathan Rauch)
The system’s logic and structure ensure that, even though errors are made, the larger system will eventually identify, correct, and learn from a great many of these mistakes:
“The advantage of the reality-based community is not that it catches every error immediately, but that it catches most errors eventually, and many errors very quickly. No other regime can make that claim, or come anywhere close.” (Rauch)
The modern epistemic system’s track record is unmatched by any other way of knowing.
Epistemic Secession
Concerning the astonishing amount of information being produced by our modern epistemic system, Jonathan Rauch writes:
“[B]y organizing millions of minds to tackle billions of problems, the epistemic constitution disseminates knowledge at a staggering rate. Every day, probably before breakfast, it adds more to the canon of knowledge than was accumulated in the 200,000 years of human history prior to Galileo's time.”
So if it is so great, then what’s the problem?
Well, if all of the information it produced were reliable, or if people only stuck to the high-quality sources, there would indeed be nothing to worry about. The problem, we believe, is not with the system overall but with certain segments of it that are currently malfunctioning.
Despite the unprecedented amount of high-quality information provided by our epistemic system, and the fact that we believe the vast majority of the system is working just fine, the malfunctioning portions we have discussed in this piece (such as some partisan media outlets on the right and some academic disciplines on the left) are nonetheless exposing Americans to far too much misleading information.
Despite enjoying easier access to more high-quality information than at any other point in history, we also have easy (and often easier) access to more low-quality information than ever before.
Coming of age in an increasingly polarized society, and adrift in a vast ocean containing both (a) more easily accessible high-quality information than ever before and (b) an explosion of easily accessible low-quality information, many people create “ideological silos” for themselves—meaning they seek out news/information sources they agree with, surround themselves mostly with people they agree with, avoid sources and people that tell them things they disagree with where possible, and struggle to identify what is reliable information and what is not.
In these silos, millions of Americans become addicted to questionable information because it makes them feel good, they agree with it, and/or they heard it from somebody they like/trust. This can lead red America and blue America to develop different understandings of epistemic authorities, methods, and standards of evidence, and drift “into disconnected moral matrices backed up by mutually contradictory informational worlds.”
Ideological silos can overwhelm people with information that is favorable to their side’s beliefs, hide information that questions them, and provide them with constant ideological affirmation from people they love, respect, and/or trust. Inside of ideological silos, partisan messages are repeated back to people constantly, while the silo prevents them from confronting contradictory messages as often.
When red America and blue America debate important issues, their divergent understandings of the truth can seem to the other side like a different dialect or another language altogether—try this experiment: use the term “BIPOC” across various social settings and notice the different reactions you get! As Jonathan Haidt writes, “It’s been clear for quite a while now that red America and blue America are becoming like two different countries claiming the same territory.” It can almost seem like “epistemic secession.”
These conditions just didn’t exist in the 1980s—partisan cable news and talk radio were less prevalent, there wasn’t widespread access to personal computers or the internet, and there were no smartphones or social media.
“People have always had different opinions. Now they have different facts.” —Anne Applebaum
If you were a kid in the 1980s, your parents might read a mainstream newspaper and watch a little of the network news at night—most of their friends had the same media diets and a much more shared understanding of reality than today. It was much more difficult to avoid trustworthy sources and the shared facts of the larger culture than it is today, and it was harder to access partisan sources and communities that would affirm fringe beliefs.
Modern conditions do not force such collective media habits, and many people choose a less trustworthy information diet that helps shape an alternate understanding of the truth:
“[T]he old newspapers and broadcasters created the possibility of a single national conversation. In many advanced democracies there is now no common debate, let alone a common narrative. People have always had different opinions. Now they have different facts.” (Anne Applebaum)
And as Greg Lukianoff and Jonathan Haidt write:
“Long gone are the days when everybody watched one of three national television networks. By the 1990s, there was a cable news channel for most points on the political spectrum, and by the early 2000s there was a website or discussion group for every conceivable interest group and grievance. By the 2010s, most Americans were using social media sites like Facebook and Twitter, which make it easy to encase oneself within an echo chamber. . . Both the physical and the electronic isolation from people we disagree with allow the forces of confirmation bias, groupthink, and tribalism to push us still further apart.”
Where Do We Go From Here?
In The Poisoning of the American Mind, Lawrence Eppard and his colleagues discuss a variety of ideas about how to navigate the future. But it isn’t clear they will be implemented or that they would even work.
We face an uncertain road ahead as a nation.
Organizations that promote themselves as “news organizations” should not spread obvious misinformation and disinformation. How do we ensure this? Is there a reasonable way to regulate these organizations that would not run afoul of the law, the Constitution, or the wishes of the general public, and that would pass both houses of Congress? We’ve yet to hear solutions that would clear all of these thresholds.
And we are well aware of how such power to regulate misleading information could be abused by partisans on both sides who choose to incorrectly label legitimate information as misinformation/disinformation simply because it is unfavorable to their side.
But we are in an epistemic crisis that is destabilizing our nation, and we believe we should at least explore every possibility. . . urgently.
Social media platforms have made efforts to stop the spread of misleading information—we should encourage the continuation and strengthening of these efforts.
Academia needs much more political diversity, much higher standards of entry into teaching and research positions, more political diversity in journal and book press editorial positions, serious reforms to the peer review process, more meta-analyses, more replication, and more retraction.
It might make sense to make all anonymized research data and anonymous peer review reports available for free online. It might also be a good idea to make the abstracts of all submissions that academic journals receive (even the ones they reject) publicly viewable.
We can also create new university centers and institutes specifically designed to welcome heterodox viewpoints and provide ample space for open scholarly debate—such as the new Connors Institute for Nonpartisan Research and Civic Engagement at Shippensburg University.
Serious reform in academia will take a long time to accomplish in such an ideologically homogenous industry. Luckily, there are already efforts under way in the field of psychology to try to right the ship. We hope these continue and are expanded and spread to other malfunctioning fields.
In the meantime, Americans are going to have to protect themselves.
Aware that many academic fields are malfunctioning, they will need to view paradigm-shifting claims cautiously and try their best to gauge the entirety of the debate, not just what one discipline or subdiscipline is saying.
Americans must consume more information from credible sources. The most efficient strategy for guarding against misleading information, we believe, is not to try to fact-check every story—most of us have neither the skills nor the time to do so.
Yes, you should fact-check where possible. But for efficiency’s sake, you should at minimum consume information only from sources whose content has recently been scrutinized by independent, professional analysts using rigorous, objective, and rule-based methodologies and been deemed high-quality.
At the Connors Institute we have identified several trustworthy news and information sources that have been shown to provide accurate information with limited bias.
Don’t lock yourself in an ideological silo—regularly expose yourself to legitimate center-left, centrist, and center-right perspectives.
Steelman your beliefs: frequently look for ways to rigorously challenge them with the best possible counterarguments and evidence.
Be actively dispassionate about information, even (and especially) when it is about people and institutions that you like. Try not to become emotionally invested in ideas. Don’t let beliefs about the social order become convictions or a part of your identity wherever possible.
Turn off cable news like CNN, MSNBC, Fox News, and Newsmax. Avoid partisan websites like HuffPost and Daily Wire. Go to the gold standards like the Associated Press and Reuters.
When you see a particularly important news story, a controversial story, and/or a hard-to-believe story, see if it is being covered by multiple credible sources and covered in the same manner. If not, explore why.
Remain doubtful about the certainty of your beliefs. Know what you don't know and the limits of your understanding—we all have huge blind spots in our knowledge and worldviews. Identify your beliefs that you can’t explain in an expert manner, let this teach you some humility, and look for ways to work toward a better understanding.
Always ask yourself: What new information would be necessary to make me acknowledge this deeply held opinion is wrong? Then be on the lookout for such disconfirming evidence. Look forward to admitting you are wrong and changing your mind. It means you are learning something. A person who never changes their mind is a person who refuses to grow and learn. The best writing goes through many drafts, each time getting better. Even after a piece is published, writers know they could keep making it better with endless subsequent drafts. Your beliefs are like pieces of writing—keep revising them and making them better and more nuanced and more accurate.
Value listening and learning over winning arguments, looking smart, and/or avoiding embarrassment. If your beliefs aren’t convictions or a part of your identity, it is easier to admit when they need revising.
Parting Thoughts
There are no “right” or “wrong” answers to many of the biggest challenges facing our country. There are facts and data that support a variety of positions, but how this information should be prioritized is subjective.
But whatever we decide to do, we should insist that the information we use to make our decisions is factual and of the highest possible quality.
Jonathan Haidt warns that:
“American democracy is now operating outside the bounds of sustainability. If we do not make major changes soon, then our institutions, our political system, and our society may collapse.”
The epistemic challenges we’ve discussed aren’t a hoax or overblown, and solving them isn’t an exclusively Republican or exclusively Democratic concern, no matter how many times irresponsible and/or bad faith actors make such claims.
All Americans stand to benefit a great deal from any progress that we make. None of us—left, right, or center—want to live in a world where this situation continues to spin out of control. We must solve this epistemic crisis together.
Above all: value the truth, tell the truth, and reward those who do the same.