At the BBC, the approximate equivalent of going to the wrong door to greet a visitor is not that unusual. Such is the nature of setting up newscasts using “smart” A.I.
Those fearing what will happen when artificial intelligence takes over more complex human functions can find plenty of evidence that humans will still matter. Advanced A.I. technology offers astounding opportunities to pass off fakes as real. For example, film scenes are now often composed by putting actors against a green screen in an empty studio and electronically inserting a digital background from virtually anywhere. These kinds of economies used to be obvious in films using rear-screen projections. Somehow even the great Hitchcock didn’t see how hackneyed they looked. But it can now be hard to tell if an actor is indeed gazing over a spectacular view of the Golden Gate, or just clutching a hand-rail mock-up in Culver City.
Most of us already deal with corporate A.I. on an almost daily basis. But the synthetic nature of chatbots is pretty easily revealed in their inability to listen, and their laughable indifference to the complex human cues of our “otherness.” (“Press 1 to hear these choices again” is often about as good as it gets.)
Computer Code Calling the Shots
In both funny and interesting ways, nothing represents the increased chaos that awaits us all more vividly than the “smart” cameras that have been used by BBC television news. These key devices occasionally go rogue, leaving confusion in their wake. To be sure, there are still news readers trying their best in the relatively new spaces within London’s Broadcasting House. But the management of what is arguably the world’s best broadcast news organization has remained committed to producing daily newscasts with software that manages most sound and video on their news sets: first, in the large circular space of what was Studio E, and more recently, in a newer version of it on a lower level. The original set encircled a news reader in a ring of automated cameras on rails, with sometimes funny outcomes. Without planning it, BBC World News occasionally runs its own version of “The Show That Goes Wrong.” Certainly not all the time, but still often enough to be enshrined in any number of YouTube clips.
The most obvious problem was that the cameras-on-tracks would leave news presenters to chase down a place in front of whichever one was “live” at the time. Sometimes a presenter was only partly in the frame. At other times a rogue camera crossed into a shot, leaving viewers puzzled and presenters apologizing for the unwanted intrusion. Not infrequently, a news presenter was the last to know where a camera was aimed and whether their mic was on. As one cheerfully noted while trying to run to another part of the set, “You can pretend that you haven’t noticed.” Others complained of “gremlins” running the show. When things do not go as planned, the results can be the approximate equivalent of going to the wrong door to greet a visitor. Interestingly, the current group of automated cameras from the Norwegian company Electric Friends even has a face-reading capability.
Luckily, the BBC’s computer bugs are usually self-revealing, and a useful reminder that our intelligence is reasonably quick in detecting situations that lack veracity.
We are well into an era when idiot computers have made a hash of some routine functions. The real danger is when their presence is not easily detectable. A new vocabulary will need to be developed to communicate our displeasure at the appearance of misrepresentations and robots passed off as the real thing. Given their nature, electronic fakes can be obvious or harmless, but they can also be another form of wire fraud passed off by human originators as the real thing.