The Fallacy of the Cognitive Fix

I was once tasked with spearheading the NC Museum of Natural Sciences’ “Arctic Sea Ice Day,” an event meant to draw attention to the dwindling summer sea ice in the Arctic Ocean. Our goal: communicate what’s causing the loss and what can be done about it.

So I hauled up our taxidermy polar bear, got an iPod to play the narwhal song on repeat, and printed off dozens of copies of what I thought would best communicate our goals: NASA’s graphs of the Arctic’s changing ice cover over the last hundred years. I thought those J-curves would be dramatic. But after talking with a few climate-skeptic museum visitors who, to my great surprise, remained climate skeptics after seeing my graphs, I realized how misguided my attempt was. Most people don’t change their minds because they read a new fact. As my ill-conceived museum table demonstrated, changing someone’s mind, counterintuitively, is almost never just about education.

In “Navigating Environmental Attitudes,” Heberlein (2012) discusses this idea in the context of what he calls the “cognitive fix.” The idea—that you can hand someone information and expect them to change their behavior—has found little support in the literature, at least when administered as a remedy in its own right rather than in combination with other strategies for effecting change. Hand someone a floodplain map and don’t be surprised if they ignore it and build a house 20 yards from a river, at massive risk to themselves and their investment. Why? The cognitive fix is inherently flawed because people are not machines. Our motivations are not purely fact-based, and our behaviors have a mosaic of upstream determinants, including attitudes, values, and beliefs.

For example, studies (Kahan et al., 2012) have shown that a person’s level of scientific literacy is not predictive of their concern about climate change. However, a person’s societal worldview—whether social hierarchies should be rigid or flexible, for example—is highly predictive of their opinions about climate change. This is an example of a person’s belief system informing their attitudes toward a perceived risk (climate change). This idea, known to social scientists as cultural cognition, demonstrates how beliefs and values form the core of someone’s perceptions about the world, and how they can supersede even their knowledge of the subject at hand. For such a person, a belief in the value of “freedom,” for example, manifests as a negative attitude toward a problem like climate change, whose solutions will necessarily impinge on that freedom. Throwing facts about climate change at such a person via the cognitive fix model is unlikely to budge their upstream belief system, and so will fail to change their behavior. So how else can you change someone’s mind?

One strategy is to skirt the cognitive fix altogether by opting for a “structural fix.” Instead of focusing on education, focus on changing the status quo—the circumstances the individual inhabits. This could involve, for example, changing home mortgage policies for bankers such that they are encouraged to internalize the negative externalities associated with home development in flood-prone areas. As Heberlein discusses, this change to the financial structure kept homeowners from developing in the floodplain, without specific lessons about the perils of such construction.

This is not to say that the cognitive fix always fails. One strategy to make facts more effective is to leverage their emotional resonance (“affect”). Suppose instead of showing my museum guests a graph, I had spent my time waxing poetic about Robert Peary’s arduous 1909 trek to the North Pole, the incredible wildlife he saw, and the brutal, romantic feeling of the cold wild. Such a narrative might shift someone’s attitude organically, in a way that reading facts never could, especially if my framing (“it was a wild, icy fantasyland of old”) resonates with their cultural values. This is the power of affect.

Direct experience can also be leveraged to initiate behavioral change, because a direct experience has a better chance of reaching someone’s beliefs. It’s one thing to read “there is no planet B.” It is quite another to watch from Apollo 8 as one speck of blue in a sea of black, holding everything you’ve ever known, rises over the lunar horizon. This direct experience, since dubbed the “Overview Effect,” has profoundly shaped the environmental ethics and stewardship of many astronauts who have experienced it. It’s much harder to facilitate such a dramatic attitude shift with a pithy environmental bumper sticker.

As hard as it may be to hear for committed environmental educators, museum exhibit designers, and others, education in isolation is not an effective means of achieving pro-environmental attitudes and behavior. However, leveraging education in combination with other strategies (narrative framing, affect, structural fixes, etc.) can establish a framework for a much more effective, empirically based approach to effecting environmental change.