Fantasy and Denial: Our Public Health Challenges

Routine protocols for dealing with fast-spreading viral diseases have been reimagined as partisan ploys designed to destroy personal freedom.

In the last year my colleagues who specialize in health communication have had to face a landscape of public opinion fissures beyond what most could have imagined. Health communication explores how we can acquire information that will allow us to make better personal choices. One of the triggers for this area of study was the realization decades ago that it was costing much more to treat seriously ill smokers than to educate them about the risks. Public health practitioners realized that they needed the help of people trained in the arts of shaping public opinion.

Many who do this kind of research could be forgiven for thinking that the route to control of a treatable malady is a straight path: convert the best expert advice into coherent messages, and high levels of compliance will follow. But no. By nature, humans are fantasists more than critical thinkers. If vaccine resistance has taken many of us by surprise, it is perhaps because we thought we understood the power of credible medical advice. But ask members of our species about the material causes of a particular result, and some will manage to weave together alternate narratives that have no basis in fact.

There is irony in living in an information-rich age that also supports bubbles of completely loony “truths.” We never need to look far. Many of us have seen folks look a reporter in the eye and assert that the seditious acts of January 6 against certifying a new Democratic president were the work of the Democratic Party. Where do you start with these people?

The near-eradication of polio is a representative case. In the mid-1950s, Americans anxiously lined up their children for the first vaccine against the highly infectious disease that left thousands paralyzed. As medical historian David Oshinsky has noted,

“If you had to pick a moment as the high point of respect for scientific discovery, it would have been then. After World War II, you had antibiotics rolling off the production line for the first time. People believed infectious disease was [being] conquered. And then this amazing vaccine is announced. People couldn’t get it fast enough.”

By early 1960, polio in the U.S. had been all but eliminated.

But this is not 1954. The pandemic has turned into an unforeseen worldwide experiment in how to manage the rapid transmission of a deadly virus while fighting off the dross of misleading messages. As it has turned out, and in spite of advances in immunology, helpful advice has had to outpace the lightning dissemination of misinformation, frequently built on fantasies that a Hollywood screenwriter would have thought too outrageous. Perhaps eight to ten percent of the population remains untethered from the facts, expressing phantom fears and demanding unobtainable guarantees of perfect safety.

Although many of us may owe our lives to the COVID vaccines that became available last March, a countermovement has stoked widely expressed doubts about the helpfulness of masks, social distancing, and the vaccines themselves. It has expanded well beyond the small core of anti-vaxxers and conspiracy theorists who have always been around. If we want to know how misguided public attitudes can be, we need look no further than the many narratives that have turned treatment into a government plot. Routine protocols for dealing with fast-spreading viral diseases have been reimagined as partisan ploys designed to destroy freedom of action.

Thousands are going from hospital ICUs to their graves with the belief that COVID was a governmental plot.

As this is written, only 48 percent of the residents of Alabama have gotten two doses of a COVID vaccine, with the predictable result of abnormally high per-capita death rates. Indeed, judging from the New York Times’ database, which draws on the CDC and other sources, some states like Idaho may not even know how many of their citizens have received vaccines. And the inequities of care within a single state can be vast. In Texas, 82 percent of the residents of Webb County are vaccinated, but only 21 percent in Gaines County.

Core public health best practices for controlling the spread of disease have been known for decades, granting some variation for local factors like weather, the mobility of the population, and the variability of medical care. Even so, it is settled science that immunization and wearing facemasks can reduce the spread of infectious disease in Burlington, Vermont, as well as in Miami. But against the uniformity of guidelines lies the darker immutability of human conduct. Again, our dilemma is that prior beliefs and fantasies are difficult to dislodge even with sound evidence. Overlay this resistance with new fantasies that treatments are surreptitious political tools of thought control, and suddenly medical staffs have been forced to deal with wild speculation as well as disease. Indeed, a small percentage of patients, but perhaps thousands, are going from hospital ICUs to their graves with the belief that COVID was a governmental plot.

The Great Appreciators

Informed criticism is clearly diminished as a cultural mainstay, in part because we have made it so much easier to produce and distribute simulations of cultural products.

This is an era in American life when the young seem more interested in becoming content creators than content appreciators. To be sure, this is a broad and inexact distinction. But it is clear that a large segment of younger Americans today is ready to self-identify as musicians, songwriters, filmmakers, writers, or audio producers, without much experience or training. The results are predictably modest: unplanned videos, under-edited and self-“published” books, magazine-inspired blogs, or derivative music produced in front of a computer. Without doubt, serendipity has always had a place in producing wonderful new talent. But it is also true that more of us want to count ourselves as part of the broad media mix made possible by nearly universal internet access. It’s now hardly surprising to meet a middle schooler who edits their own videos or, after a fashion, curates their own web presence. As YouTube demonstrates, self-produced media content is unmistakably popular.

If this first quarter of the new century is the age of the content producer, it seems that, broadly speaking, the last half of the previous century was an era for witnessing and reflecting on breathtaking talent. The decline of this impulse is a loss. An appreciator is more than a consumer. Appreciators understand the history and conventions of a form, and they take an equal interest in exploring how new works can build on and stretch even the most stale of cultural ideas. Their best work can be cautionary or encouraging, or it can fire us with the enthusiasm that comes with new insight. Productive analysis can help us fathom what we do not yet understand.

Pauline Kael

In the previous century, critics and essayists covering all kinds of art were ubiquitous. Periodicals and big-city newspapers routinely published considered assessments of trend-setters in popular culture, fiction, television, theater, and film. Some combined their pieces in book-length studies of the period that are still worth reading. Michael Arlen and Neil Postman wrote insightful analyses of news and entertainment television. Pauline Kael and Roger Ebert were among many popular reviewers producing novel assessments of films and the film industry. They were matched by music critics like Michael Kennedy, Dave Marsh, Gene Lees, and Donal Henahan, who provided appraisals of performers and performances. Their counterparts in the visual arts included writers like Robert Hughes, Walter Benjamin, and Jerry Saltz, all exploring the vagaries of talent and cachet in that enigmatic world.

Among countless publications, readers pored over this criticism in the pages of The Dial, The New Yorker, Gramophone, the Paris Review, Harper’s, The Atlantic, the New York Review of Books, and Rolling Stone. And no self-respecting daily newspaper considered itself complete without its own music and film critics. Bigger-city papers also added performance reviews of dance, along with urbanists’ assessments of a city’s newest additions to its skyline.

Even beyond the daily fare of book and theater reviews in many twentieth-century news outlets, there was an entire world of appreciators with appetites for reconsidering the rivers of culture that came from distant headwaters. For example, Gramophone was founded in 1923 by the Scottish author Compton Mackenzie, who understood that there was an appetite for essays about the composers and performers captured on the new recordings of the time. He proved the unlikely proposition that many wanted to read about music almost as much as they wanted to hear it.

Criticism Has Diminished as a Cultural Mainstay

Susan Sontag

With video and digital media still mostly in the future, Americans in the last century had the time and the will to know the backstories of the cultural products of the day. Indeed, writers like Norman Mailer, Susan Sontag, Joan Didion, and Janet Malcolm became intellectual thought leaders. They helped to explain what artistic mastery should look like. And they had counterparts among academic thinkers such as T.W. Adorno, David Riesman, Marshall McLuhan, and Kenneth Burke, whose deeper cultural probes would soak into the fabric of the nation’s undergraduate curriculum. Sampling the output of so many professional appreciators would keep liberal arts students preoccupied for years, and sometimes forever.

Gregg Toland image from a scene in Citizen Kane

To be sure, our interest in understanding the nation’s cultural output has not vanished. But criticism is clearly diminished as a cultural mainstay, in part because we have made it so much easier to produce and distribute simulations of cultural products. I use the word “simulations” because the impulse to be a content producer often bypasses the intellectual labor that goes into value-added art. So many today proceed without a grounding in the canons of a particular form: its histories, possibilities, and innovators. I suspect the desire to be an immediate practitioner in a realm that is barely understood is usually fed by the promise of fame. The result, as my colleagues in film sometimes lament, is that students want to be producers of video stories before they have considered the durable conventions of narrative: for example, the norms of a written screenplay, or how this first written map is converted into the visual “language” and grammar of film. To cite a specific case, it would be useful for a young filmmaker to know how cinematographer Gregg Toland used light and shadow to create the unmistakable visual palette of Citizen Kane (1941), or how Steven Spielberg and John Williams exploited the tricky business of musical underscoring to leave audiences suitably terrified by Jaws (1975).

In our schools and colleges, the equipment to make art is frequently made available to students who have only a rudimentary understanding of how it might be used. The youthful conceit that progress is made by setting aside what has come before is mostly an excuse to avoid the work of contemplation that creates competence and a lasting passion for an art form.