These times remind us that millions of Americans have easily succumbed to magical thinking, to the embarrassment of much of the nation. Magical thinking happens when a view is reinforced more by the agreement of others than by hard fact. It happens in every conceivable realm of American life: medicine and vaccines, allegations of “voter fraud,” rumors about celebrities, and—of course—our national politics.
There is a clear and convincing explanation for this collective refusal to notice the obvious. We can continue to call the determination to believe a falsehood “magical thinking.” But a better term is “fantasy chaining.” Let me explain.
Years ago, social scientist Robert Bales noted that groups of people put together in a room to solve a problem often reach a moment when there is a convergence of views around a preferred narrative. In many cases folks in the group didn’t have the facts or knowledge to make a judgment, but they had the support of other like-minded people around them. Think of a jury reaching a verdict on a case based on a shared prejudice. From this and other observations, Bales developed the idea of Interaction Process Analysis to track this convergence of opinions, building in part on Sigmund Freud’s work in The Interpretation of Dreams. It was good, but not quite clear enough. And any theory resting on Freudian assumptions needs a lot more grounding.
Years later, communication theorist Ernest Bormann at the University of Minnesota refined Bales’ ideas into a theory called Fantasy Theme Analysis. His work produced a convincing model that was at last fully developed and impressively predictive.
Basically, Bormann acknowledged that—in the absence of good information—we tend to rely on members of our reference group and on our natural compulsion to spin narratives that allow us to move from tentative claims like “I suspect” to the certainties of “I know.” That’s what a fantasy theme makes possible. He also noted the obvious: that it is easy for group fantasies to “chain out” to others with similar views.
Fantasy theme analysis helps us understand the contagion that happens when incomplete information combines with our hard-wired impulse to see the world in sets of self-contained stories. Each comes with actors, motivations, preferred narratives, and final outcomes. We hate incomplete narratives, as when an airplane accident is explained by nothing more than bad weather. So we are happy to construct our own secondary narratives, regardless of what solid evidence might oblige us to believe. We want to have human agents in the picture and at least partly responsible.
Here’s another example I have used in a textbook and in my classes. I was sitting in my office one day in the 80s with a copy of the New York Times opened up on my desk. A colleague dropped by and, at the same time, we both noticed the paper’s front-page picture of the new Soviet version of a space shuttle. The Buran spacecraft looked exactly like the American version. Same wing shape. Same color. Same size. And without missing a beat we both blurted out the view that “they must have stolen the American design.” End of story. We “knew” it and we were ready to fill in the blanks. The similarity of the shape was enough to accept the fantasy of a theft of our plans. All the while, we pretty much ignored the physics of space flight, which mandates similar design parameters for any earth-to-space vehicle.
With group fantasies, the world always makes sense. Without them, we would have to live with the continuous uncertainties mandated by incomplete information. In my field there is a Latin phrase that describes humans as homo narrans: the species that tells stories. That is our priority. Truth is far back in the pack. Facts are optional and often downright inconvenient for humans. It feels better and it is much easier to bolster each other’s fantasies.
In time, and the sooner the better, more Americans will rejoin the reality-based world.
Top Photo Credit: <a href="https://www.freepik.com/photos/background">Background photo created by valeria_aksakova – www.freepik.com</a>