Thursday, May 4, 2023

Why can’t Americans agree on, well, nearly anything? Philosophy has some answers

  Does wearing a mask stop the spread of COVID-19? Is climate change driven primarily by human-made emissions? With these kinds of issues dividing the public, it sometimes feels as if we Americans are losing our ability to agree on basic facts about the world. There have been widespread disagreements about matters of seemingly objective fact in the past, yet the number of recent examples can make it feel as though our shared sense of reality is shrinking.

  As a law professor, I’ve written about legal challenges to vaccination requirements and COVID-19 restrictions, as well as what counts as “truth” in court. In other words, I spend a lot of time mulling over how people define truth and why U.S. society has such a hard time agreeing on it these days.

  There are two ideas that can help us think about polarization on matters of fact. The first, “epistemic pluralism,” helps describe U.S. society today, and how we got here. The second, “epistemic dependence,” can help us reflect on where our knowledge comes from in the first place.


Many takes on ‘truth’

  I define epistemic pluralism as a persistent state of public disagreement about empirical facts.

  When it comes to things that can be proved or disproved, it’s easy to think that everyone could come to the same factual conclusions if only they had equal access to the same information – which, after all, is more freely available today than at any point in human history. But while unequal access to information plays a role, it is not so simple: Psychological, social, and political factors also contribute to epistemic pluralism.

  For example, psychologist and law professor Dan Kahan and his collaborators have described two phenomena that affect the ways in which people form different beliefs from the same information.

  The first is called “identity-protective cognition.” This describes how individuals are motivated to adopt the empirical beliefs of groups they identify with in order to signal that they belong.

  The second is “cultural cognition”: people tend to perceive greater risk of harm in activities they already disapprove of for other reasons – as in debates over handgun regulation or nuclear waste disposal, for example.

  These effects are not reduced by intelligence, access to information, or education. Indeed, greater scientific literacy and mathematical ability have been shown to actually increase polarization on scientific issues that have been politicized, such as the cause of climate change or the benefits of gun control. Greater skill in these areas appears to make people better at interpreting the available evidence in favor of their preferred conclusions.

  Beyond these psychological factors, there is another major source of epistemic pluralism. In a society characterized by freedom of conscience and freedom of expression, individuals bear “burdens of judgment,” as the American philosopher John Rawls wrote. Without the government or an official church telling people what to think, we all have to decide for ourselves – and that inevitably leads to a diversity of moral viewpoints.

  Although Rawls was focused on pluralism of moral values, the same is true of beliefs about matters of fact. In the U.S., legal rules and social norms attempt to ensure that the state cannot constrain an individual’s freedom of belief, whether that be about moral values or empirical facts.

  This intellectual freedom contributes to epistemic pluralism. So do factors such as educational inequalities, the proliferation of information from untrustworthy sources online, and misinformation campaigns. All together, they provide ample opportunity for people’s shared sense of reality to fragment.


Knowledge takes trust

  Another contributor to epistemic pluralism is just how specialized human knowledge has become. No one person could hope to acquire the sum total of all knowledge in a single lifetime. This brings us to the second relevant concept: epistemic dependence.

  Knowledge is almost never acquired firsthand; it is transmitted by some trusted source. To take a simple example, how do you know who the first president of the United States was? No one alive today witnessed the first presidential inauguration. You could go to the National Archives and ask to see records, but hardly anyone does that. Instead, most Americans learned from an elementary school teacher that George Washington was the first president, and they accept that fact because of the teacher’s epistemic authority.

  There’s nothing wrong with this; everyone gets most of their knowledge that way. There’s simply too much knowledge for anyone to independently verify all the facts on which we routinely rely.

  This is true even in highly specialized areas. Replication is essential to science, but scientists don’t personally replicate every experiment relevant to their field. Even Sir Isaac Newton famously said that his contributions to physics were possible only “by standing on the shoulders of giants.”

  However, this raises a tricky problem: Who has sufficient epistemic authority to qualify as an expert on a particular topic? Much of the erosion of our shared reality in recent years seems to be driven by disagreement about whom to believe.

  Whom should a nonexpert believe about whether a COVID-19 vaccine is safe and effective? Whom should a Georgia voter believe about the legitimacy of their state’s results in the 2020 election: Sidney Powell, an attorney who helped Donald Trump’s legal team try to overturn those results, or Georgia Secretary of State Brad Raffensperger?

  The problem in these and other cases is that most people are unable to determine the truth of such matters on their own, yet they are also unable to agree on which experts to trust.


Curious ‘scouts’

  There isn’t a simple solution to this problem. But there may be rays of hope.

  Intelligence alone doesn’t decrease people’s tendency to let their group identities sway their view of the facts, according to Kahan and his colleagues – but highly curious people tend to be more resistant to that pull.

  Rationality researcher Julia Galef has written about how adopting a “scout” mindset rather than a “soldier” mindset can help guard against the psychological factors that lead our reasoning astray. In her description, a soldier thinker seeks information to use as ammunition against enemies, while a scout approaches the world with the goal of forming an accurate mental model of reality.

  There are many forces pulling our collective understandings of the world apart; with some effort, however, we can try to reestablish our common ground.


  About the author: James Steiner-Dillon is an associate professor of law at the University of Dayton.


  This article was published by The Conversation
