Category Archives: Problem Practices

Communication behavior or analysis that is often counter-productive


Over-Predicting

[Netflix has just released a documentary on internet privacy and the Cambridge Analytica scandal. The film by Karim Amer and Jehane Noujaim tells what is now a familiar story about how data-mining is used to target consumers and voters. It’s a sobering story, but a key assumption in the documentary is that an organization that knows some of your likes and preferences essentially has the keys to the kingdom of your mind. In this view, it’s just a small step from using those preferences to develop ‘psychographic’ appeals we are apt to find irresistible. While there’s some truth here, the film and the social sciences generally overpredict direct persuasive effects based on known audience preferences. It’s easier to explain this process than to successfully carry it out. This newly re-edited piece, originally entitled “The Deterministic Mind,” argues against that assumption.]

In his book How the Mind Works (1997), psychologist Steven Pinker notes that the very idea of science assumes that there are direct causes for any material effect. Ask an experimental psychologist about the nature of a particular behavior, and the conversation will eventually drift toward its possible social or familial roots. For most of us this science template has probably infused itself into the ways we make sense of the everyday world. If you do “Y,” how likely is it that you will get “X”? Identifying causal chains seems like the very definition of mental rigor. The alternative assertion that “it’s not so simple” never seems as satisfying, and is less likely to get noticed and published.

Our accounts of how the world works are anchored in our faith that things can only get better when contributing factors are revealed and quantified. After all, events without apparent causes are disquieting. A tree that falls and kills a passerby tests our capacity to accept events beyond our direct control. It’s natural to want to prevent such deadly events.

We are especially prone to overestimate our abilities to shape and control the behavior of others. A chain of causation that can be filled in makes the world seem more knowable. It’s the kind of expertise we sell to others if we are in the sciences, marketing, or sales. In media analysis this assumption is sometimes discredited as “the magic bullet theory.”

We cherish lexicons of determinism. For example, we easily classify people into personality types, where labels (“neurotic,” “needy,” “depressed,” “obsessive,” to name a few) strive to be useful outcomes from knowable origins. But why Aunt Millie has a personality disorder is still anyone’s guess. Similarly, when a plane falls out of the sky we resort to the same template for making sense of what has happened. When we ask “what went wrong?” we expect a single precipitating cause to be named. Only later do accident investigations usually reveal multiple problems that combined to create a disaster.


In our rushed and over-communicated age we rely heavily on the simplistic and deterministic. 

Consider a different kind of example. Imagine you are a neuroscientist. How long could you retain your professional credibility if you took the risk of acknowledging that the mind is partly “unknowable”? Neuroscientists study the brain and generally shun discussion of the “mind,” the useful label for what the brain has given its owner by way of a wealth of experiences and perceptions. What I see in my ‘mind’s eye’ is likely not what others see. But how do we find the causes of those mindful thoughts? A brain scan won’t cut it. Consciousness can’t be reduced to predictable neural pathways. And so the idea of mind muddies the scientific impulse toward the measurement of particular effects. Thus the brain sciences generally remain silent on this rich idea, preferring to study the organ of thought rather than thought itself.

This kind of problem is why the search for first causes tends to force us toward the absurdly technical or the overly simplistic. On the simplistic side, compressed ideas about why things happen do yield answers: usually good enough to see us to the end of the day, but not very reliable as a basis for lasting understanding. The shorthand vocabulary of causes that we inevitably use gives us dubious deterministic links that we nonetheless cling to. And so we conclude that Muslims cause terrorism and that targeted appeals usually work. Each labeled category is pushed by an arrow that points to a list of supposed causes, producing “answers” that in their narrowness are sometimes hardly worth knowing.

The best response to an unfolding set of complex human events is uncertainty. Even with the need for simplicity in our busy lives, we have to save room to let in the messiness that is part of the human condition. Instead of imagining arrows, we need to think of webs. A web is a better representation of lines of influence that are complex and pass through rooms of intermediate and unknown causes. If we want to be a little smarter, all that is required is the resolve to give up the short-term thrills of unearned inevitability.


The Dunning-Kruger Effect

[A 2016 piece originally titled “Happily Misinformed” describes a feature of our age that seems even more apt now than when it was first published. How can we explain people who hold ersatz opinions in contradiction to established facts and evidence? Here’s an updated version of that piece.]

In his sobering 1989 study, Democracy Without Citizens, Robert Entman dwells on the irony of living in an information-rich age among uninformed citizens. There is a rich paradox in a culture where most members have a vast virtual library available on their computers, yet would struggle to pass a third-grade civics test. According to the Annenberg Public Policy Center, only one in three Americans can name our three branches of government. And only that same lone third could identify the party that controls each of the two houses of Congress. Fully a fifth of their sample thought that close decisions in the Supreme Court were sent to Congress to be settled.

Add in the dismal results of map-literacy tests of high school and college students (“Where is Africa?” “Identify your city on this map.”), and we have just a few markers of a failed information society.

As Entman noted, “computer and communication technology has enhanced the ability to obtain and transmit information rapidly and accurately,” but “the public’s knowledge of facts or reality have actually deteriorated.” The result is “more political fantasy and myth transmitted by the very same news media.” We seem to live comfortably without even elementary understandings of the forces that affect our lives.

This condition is sometimes identified as a feature of the Dunning-Kruger effect, a peculiarly distressing form of functional ignorance observed by two Cornell psychologists. Their basic idea is that many of us seem not to be bothered by what we don’t know, a blind spot that allows us to overestimate our knowledge. Dunning and Kruger found that “incompetent” individuals (those falling into the lowest quartile of knowledge on a subject) often failed to recognize their own lack of skill, failed to recognize the extent to which they were misinformed, and did not accurately gauge the skills of others. In short, a person’s ignorance can actually increase rather than decrease their informational confidence. If you have an Uncle Fred who is certain that former President Obama is a Muslim born in Kenya, or that vaccines cause autism, you have an idea of the kind of willful ignorance this represents.

The Boundaries of the Unknown Grow the More We Know

Think of this pattern in an inverted sense: from the perspective of individuals who truly know what they are talking about. Even for the well-informed, the more they know about a subject, the larger the circumference of the boundary that delineates the unknown. It takes considerable knowledge to know what you don’t know. That’s why those who have mastered a subject area are often the most humble about their expertise: their expanded understanding of a field gives them a sense of the many areas that remain to be explored.

All of this makes listening to the truly ignorant a measure of our forbearance. We are left to vicariously feel the shame of someone who is not smart enough to feel it himself. As with our President, we wonder why he doesn’t feel more embarrassment over his exaggerations and falsehoods. Has he never really read about the substantive policy accomplishments of F.D.R. (the FDIC, Social Security) or L.B.J. (successful anti-poverty programs, the Civil Rights Act of 1964, the Voting Rights Act of 1965)? Can slapping tariffs on goods and subverting long-standing values of equality and fairness be enough to count as great leadership?

The Dunning-Kruger effect shows itself in other ways as well. A key factor is our distraction by all forms of media, everything from texting to empty-headed television programming, which leaves us with little time to be contributing members of the community. When the norm is checking our phones over 100 times a day, we have perhaps reached a tipping point where we have no time or energy left to fill in our own informational black holes.

The idea of citizenship should mean more. In the coming election cycle it’s worth remembering that nearly half of eligible voters will probably not bother to vote. And even more will have no interest in learning about presidential and legislative candidates. Worse still, this is all happening at a time when candidates have been captured by a reality-show logic that substitutes melodrama for more sober discussions of how they intend to govern. Put it all together, and it’s clear that too many of us don’t notice that we are engrossed with a sideshow rather than the main event.