
Health Facts Aren't Enough. Should Persuasion Become a Priority?

(The Upshot: The New Health Care)

In a paper published early this year in Nature Human Behaviour, scientists asked 500 Americans what they thought about foods that contained genetically modified organisms.

The vast majority, more than 90%, opposed their use. This belief is in conflict with the consensus of scientists. Almost 90% of them believe GMOs are safe — and can be of great benefit.

The second finding of the study was more eye-opening. Those who were most opposed to genetically modified foods believed they were the most knowledgeable about this issue, yet scored the lowest on actual tests of scientific knowledge.

In other words, those with the least understanding of science had the most science-opposed views but thought they knew the most. Lest anyone think this is only an American phenomenon, the study was also conducted in France and Germany, with similar results.


If you don’t like this example — the point made here is unlikely to change people’s minds and will probably enrage some readers — that’s OK because there are more where that came from.

A small percentage of the public believes that vaccines are truly dangerous. People who hold this view — which is incorrect — also believe that they know more than experts about this topic.

Many Americans take supplements, for reasons that are varied and rarely backed by hard evidence. Most say they are unswayed by experts who contradict manufacturers’ claims. Only a quarter said they would stop using supplements if experts said they were ineffective. They must think they know better.

Part of this cognitive bias is related to the Dunning-Kruger effect, named for the two psychologists who wrote a seminal paper in 1999 entitled “Unskilled and Unaware of It.”

David Dunning and Justin Kruger discussed the many reasons people who are the most incompetent (their word) seem to believe they know much more than they do. A lack of knowledge leaves some without the contextual information necessary to recognize mistakes, they wrote, and their “incompetence robs them of the ability to realize it.”


This helps explain in part why efforts to educate the public often fail. In 2003, researchers examined how communication strategies on GMOs — intended to help the public see that their beliefs did not align with those of experts — wound up backfiring. All the efforts, in the end, made consumers less likely to choose GMO foods.

Brendan Nyhan, a Dartmouth professor and contributor to The Upshot, has been a co-author on a number of papers with similar findings. In a 2013 study in Medical Care, he helped show that attempts to provide corrective information to voters about death panels wound up increasing belief in them among politically knowledgeable supporters of Sarah Palin.

In a 2014 study in Pediatrics, he helped show that a variety of interventions intended to convince parents that vaccines didn’t cause autism left the most concerned parents even less likely to say they would vaccinate their children. A 2015 study published in Vaccine showed that giving corrective information about the flu vaccine made patients most worried about side effects less likely to get the vaccine.

A great deal of science communication still relies on the “knowledge deficit model,” an idea that the lack of support for good policies, and good science, merely reflects a lack of scientific information.

But experts have been giving information about things like the overuse of low-value care for years, to little effect. A recent study looked at how doctors behaved when they were also patients. They were just as likely to engage in the use of low-value medical care, and just as unlikely to stick to their chronic disease medication regimens, as the general public.


In 2016, a number of researchers argued in an essay that those in the sciences needed to realize that the public may not process information in the same way they do. Scientists need to be formally trained in communication skills, they said, and they also need to realize that the knowledge deficit model makes for easy policy but not necessarily good results.

It seems important to engage the public more, and earn their trust through continued, more personal interaction, using many different platforms and technologies. Dropping knowledge from on high — which is still the modus operandi for most scientists — doesn’t work.

When areas of science are contentious, it’s clear that “data” aren’t enough. Bombarding people with more information about studies isn’t helping. How the information contained in them is disseminated and discussed may be much more important.

This article originally appeared in The New York Times.
