When did the idea of a direct conversation with another become so problematic?
For those of us who study human communication, direct face-to-face conversation is usually the fundamental model for understanding all other forms. When two or more people are in the same space addressing each other, their exchanges are likely to contain all of the critical yardsticks for measuring successful interaction. These essential processes include awareness of the other, the potential for immediate and unfiltered reciprocity in an exchange, and access to all the visual and verbal feedback that comes with direct person-to-person contact. All other channels of communication—including the devices that extend the range of human connectivity—alter or diminish one or more of these processes. And though it may not seem like it, altering or reducing a conversational asset is a big deal.
Until the advent of widespread electric telegraphy in the 1850s, direct communication with another person in the same space had always anchored human communities. The very idea of a sociology of human relationships is mostly predicated on the expectation that we have direct and real-time access to each other.
Even so, the default model for understanding how we maintain our social nature is increasingly at odds with the ways we now live. What has changed most dramatically is the preference of younger Americans, who are less eager to seek out conversation as a problem-solving tool.
The most interesting research on this subject comes from Sherry Turkle at M.I.T., who has documented the drift of the young away from direct interaction and toward alternate channels that enlarge connectivity but diminish communication richness (Reclaiming Conversation, 2015). The platforms are well known: Instagram, Facebook, Twitter, and others. Under the misnomer of “connectivity,” changes in technology and adjustments to them are slowly schooling younger generations to prefer communication that is mediated and intentionally isolating. We are kidding ourselves if we believe the false equivalency that lets “social media” substitute for living in the social world.
Turkle notes a wholesale flight away from direct conversation and toward electronic messaging. In the words of many of her interviewees, meeting directly with someone is “risky,” “too emotional,” “an interruption,” and “anxiety producing.” As a high school senior she interviewed observed, “What’s wrong with conversation? I’ll tell you what’s wrong with conversation! It takes place in real time and you can’t control what you are going to say.”
Responses like these suggest a desire to escape the burdens of acquiring the rudiments of what psychologists sometimes call “social intelligence”: the ability to navigate the many essential and unavoidable relationships that unfold in real time.
It has always been true that some conversations are difficult. But this kind of face-work is also the essential work of a complex adult life. As Turkle notes,
Many of the things we all struggle with in love and work can be helped by conversation. Without conversation, studies show that we are less empathic, less connected, less creative and fulfilled. We are diminished, in retreat. But to generations that grew up using their phones to text and send messages, these studies may be describing losses they don't feel. They didn't grow up with a lot of face-to-face talk.
Of course there is always a risk that the old will assume progress has been overtaken by regression. To paraphrase the Oscar Hammerstein lyric from Oklahoma!, it’s easy to believe that “things have gone about as far as they can go.” Even so, it’s worth remembering that forms of mediated communication are usually not additive but reductive. Texts, e-mails, and even video games start with various fundamentals of communication, but almost always take something away. It may be immediacy. It may be full interactivity. But the most consequential loss of all is the reduced intimacy that comes when humans are not in the same space breathing the same air.
From a communications point of view, concealing one’s identity is usually a breach of faith. But now anonymous comments are consumed as a form of entertainment, as if the collective id of a society has broken free of its constraints.
Like many children in the 50s, I thought it was a great stunt to sneak up on a neighbor’s house, ring the doorbell, and then go hide in the nearby bushes. Serious crime it wasn’t. But it did inadvertently teach my younger self the not-very-useful lesson that sometimes we don’t have to take ownership of our actions.
It’s never good when we can abuse the trust of others without paying a price. In our times we find the adult equivalent of hiding in the bushes every time we scan the often dubious comments that follow stories on internet sites. Many regrettably permit anonymous comments: Slate.com and Washingtonpost.com, to name two. To their credit, others such as Huffingtonpost.com and Facebook do not. We invite trouble when a person is free to weigh in on almost any topic without claiming their own name as the marker of a basic social responsibility. Screen names and avatars allow us to escape the moral consequences of owning our comments, depriving the human recipients of our criticism of the right to know who we are. Is it any wonder that many reactions to online stories are ill-considered, inflated and mean-spirited? Some swirl in virtual cesspools of rhetorical maliciousness.
From a communications point of view, concealing one’s own identity used to be considered a fundamental breach of faith. This is the dark stuff that arises from whisper campaigns, witch-hunts, and those awful unattributed pamphlets alleging communist treason that ruined the careers of so many artists in the 40s and 50s. Character assassination by proxy is never pretty, and can’t help but make us a coarser culture.
I’ve written before about these writers who exist on the wrong side of the borders of civil discourse, most recently just after the bomb attack at the 2013 Boston Marathon. The complaints are still valid:
Typical are the monikers used by individuals who responded to a Slate.com story about the recent Boston bomb attacks. Slate was careful and responsible in its reporting. But as with most news sites, the individuals who signed on to make comments concealed their identities. Readers heard from “Celtic,” “ICU,” “ddool,” “roblimo,” “Dexterpoint,” “Lexm4,” and others. “Celtic,” for example, noted that the suspects were “Muslims,” expressing mock surprise that any of them would produce “terrorist actions.” “Dexterpoint” decried “lefties” who he imagined to be anxious to confirm that the terrorists were not Muslims.
At its worst, this is the territory of the unqualified conclusion and the fantasized conspiracy: often a stream-of-consciousness unburdening of personal demons unchecked by the kind of self-monitoring individuals usually apply in the presence of others. Turned outward, this reactive rhetoric is often a jumble of histrionics from persons who seem to want a stage and an audience, but lack the mettle to do more than offer taunts from behind the curtain.
Aristotle observed that an individual’s character is perhaps their most valuable asset. He subscribed to the conventional view that you reach others best when you offer an olive branch and the assurance of your good name. Instead, the oppositional language of denigration fills a simpler expressive need. What was once the art of public comment on national and community issues now seems more like an unintended registry of disempowerment. It’s easy to account for the attractions of screeds posted with abandon and without interest in preserving even the remnants of a civil self. But if there are pleasures in delivering anonymous and wounding responses, they make a mockery of the familiar cant that the “internet wants to be free.”
The functions of criticism and accusation come with the duty to be fully present. To engage in these forms without disclosing our identity amounts to a kind of moral crime.