Putting Only 45 Cards on the Table

Image source: Annalect

This is what the erosion of personal choice can look like. We may not act on the best options available, because others are choosing which options we ever see.

In the West we cherish the idea of individual freedom. We act on the belief that we have 'agency': that is, the choices that matter are ours alone to make. It's a basic tenet of American life. Within the broad boundaries set by a civil society, no one has the right to deny or ignore those choices. But we are beginning to hear more from mathematicians and others about "decisions" that are not as informed or self-generated as we may think.

Hannah Fry, a mathematician at University College London, has written about "tiny decisions on our behalf" that can be made without our blessing or our awareness. In an interview reproduced in Vox, she notes that algorithms now affect many functions within our lives: "From what we choose to read and watch to who we choose to date, algorithms are increasingly playing a huge role. And it's not just the obvious cases, like Google search algorithms or Amazon recommendation algorithms" (Vox, October 1, 2018).

The broad palette of options presented to us by internet-based material has obviously been preselected: first, and most obviously, by advertisers who pay for high search placement, but also by the algorithms internet providers use to ostensibly match our interests. If you have puzzled over why certain Facebook feeds go to some people and not to others, you may have a growing sense that someone else is dealing the cards from a stacked deck. What someone sees at a given site is always something of a mystery: partly a function of the digital footprints we leave every time we double-click, but also the work of algorithms we cannot inspect. We already know this, but it's easy to forget about choices we never see.

By shifting decisions to mathematical formulas composed of triggering conditions we do not know, we have essentially given up some of our autonomy.

At its most basic, an algorithm is just a formula for selecting content that seems appropriate for a given consumer or class of consumers. It's an efficient gatekeeping tool. And, to be sure, we have always had gatekeepers channeling some content in our direction and filtering out other items. But most of those decision-makers, especially in the news business, presumably use journalistic or source-credibility standards for winnowing content. Yet based on what I've seen in various feeds, it's clear that those standards have been replaced by triggers that have little to do with the quality of a given story. For example, I see lots of celebrity gossip from unknown "publications" in my Google Play feed, even though my interests lie elsewhere.
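To make that kind of gatekeeping concrete, here is a toy sketch in Python. The weights, field names, and engagement signals are my own invented assumptions, not any platform's actual formula; the point is only to show how a trigger-based selection rule can rank clickable gossip above higher-credibility reporting.

```python
# Toy content-selection "algorithm": ranks stories by engagement triggers,
# not by source credibility. Purely illustrative; all weights and fields are invented.

from dataclasses import dataclass


@dataclass
class Story:
    title: str
    source_credibility: float   # 0.0 (unknown outlet) .. 1.0 (vetted newsroom)
    click_rate: float           # how often similar users clicked stories like this
    recency_hours: float        # hours since publication


def trigger_score(story: Story) -> float:
    """Score driven almost entirely by click behavior and freshness.
    Note how little weight credibility gets -- that is the point being illustrated."""
    return (0.7 * story.click_rate
            + 0.25 * max(0.0, 1.0 - story.recency_hours / 48.0)
            + 0.05 * story.source_credibility)


def select_feed(stories: list[Story], k: int = 3) -> list[Story]:
    """Return the k highest-scoring stories -- the only cards put on the table."""
    return sorted(stories, key=trigger_score, reverse=True)[:k]


if __name__ == "__main__":
    candidates = [
        Story("Celebrity spotted at cafe", source_credibility=0.1, click_rate=0.9, recency_hours=2),
        Story("In-depth budget analysis", source_credibility=0.9, click_rate=0.2, recency_hours=6),
        Story("Local election explainer", source_credibility=0.8, click_rate=0.3, recency_hours=12),
    ]
    for s in select_feed(candidates):
        print(f"{trigger_score(s):.2f}  {s.title}")
```

In this sketch the gossip item scores highest even though its source credibility is the lowest, and the reader of such a feed only ever sees the stories that score well; the rest of the deck never reaches the table.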

By shifting decisions to mathematical formulas composed of triggering conditions, we do not know what we have essentially given up. Even a system genuinely based on probabilities and past behavior is bound to yield results that are narrower than they should be. So when we are given "choices," ranging from the best Asian restaurant "nearby" to the most qualified news source for a specialized story, the recommendations rest on criteria we generally do not know.

All of this suggests that we have less to fear from robots than from programmed servers that only appear to be offering targeted information. In the 21st century, this is what the erosion of agency can mean: too often we are acting on options that have been set by others.