All posts by Gary C. Woodward

Midcentury Musical Innovators

It would have been fun to have been in the room when this natural garage tinkerer stumbled onto what has since become the standard practice of multi-track recording.

One of the pleasures of a writing project on how sound has been used and misused has been the discovery of some unlikely heroes who promoted key audio innovations. Here are five.

Bing, a Mic, and a Tape Recorder.  In his day, Bing Crosby was a mega-star with hugely popular recordings and radio shows.  In the 1930s his relaxed approach to singing in front of a microphone redefined what a public performance could be. In the older form of vaudeville, the energy of the performer did most of the ‘selling.’  In radio, the microphone made those over-the-top performances unnecessary.

Crosby eventually grew tired of the time constraints of live radio. So it was natural to seek a way to record his shows in advance and edit them before broadcast. The problem was that, in this period, powerful radio executives thought it was unacceptable for their networks to foist recordings on an unsuspecting public.  They expected their stars to show up in listeners’ living rooms in real time. Bing eventually broke the “no recorded programs” taboo, gaining more time to do other things in life, like playing golf.  The solution came in the form of the Ampex Corporation’s development of tape recorders modeled on pioneering work done by the Germans during World War II. Tape was a far superior medium for recording than “cutting” records directly to heavy shellac masters. In addition, it was possible to easily edit the quarter-inch PVC tape to delete mistakes or add additional material. Crosby liked tape so much he bought shares in the company and gave away recorders to friends. The networks eventually succumbed.

Les Paul and Mary Ford ‘invent’ Modern Recording

Guitarist Les Paul received one of Bing’s tape machines.  At the time Paul was beginning to use a recently improved Gibson electric guitar: one of a new breed of solid-bodied instruments with two “pickups” that amplified the sound of the strings electronically rather than through the acoustic body. Because of a technical mistake he made while learning to use the recorder, Paul discovered that he could record sound over an existing track. It was then just a small additional step to record parts of one song on different tracks that played back together, often with a slight delay.  It would have been fun to have been in the room at the time when this natural garage tinkerer stumbled onto what has since become standard in the industry: multi-track recording.  We may no longer use tape for all recordings; computer audio files do more or less the same thing. But we still count on the ability to “build up” a recording in a studio from many separate tracks.
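The “building up” of a recording from separate tracks can be pictured as simple addition of signals. A minimal sketch, assuming audio as plain lists of samples (the function name and values below are hypothetical, not Paul’s actual process):

```python
# Hypothetical sketch of "sound on sound": mixing a new part into an
# existing track, optionally offset by a short delay in samples.

def overdub(base, new_part, delay=0, gain=0.5):
    """Return base with new_part mixed in, starting `delay` samples late."""
    length = max(len(base), delay + len(new_part))
    mixed = [0.0] * length
    for i, sample in enumerate(base):
        mixed[i] += sample            # keep the original track
    for i, sample in enumerate(new_part):
        mixed[delay + i] += gain * sample  # layer the new part on top
    return mixed

# A "melody" track plus a delayed "rhythm" part, as in Paul's layered hits:
melody = [0.0, 1.0, 0.0, -1.0]
rhythm = [0.5, -0.5]
track = overdub(melody, rhythm, delay=2)
```

Digital audio workstations do essentially this today, summing many tracks sample by sample, which is why Paul’s accidental discovery still underlies modern studio practice.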

Adding rhythm and bass tracks in a multitrack recording of his own previously recorded melody line turned Les Paul into a one-man band, with no set of innovations coming together more clearly than in “How High the Moon,” released in 1951. Mary Ford’s slightly delayed voice melded with the multi-tracked guitar to produce a groundbreaking hit that wore out jukeboxes across the nation.

A Surprising Audio Pioneer.  Anyone visiting the Sony Studios in Culver City will find what is perhaps the most honored room for recording music in the United States, the Barbra Streisand Scoring Stage.  It is surely an honor to have her name associated with the space, which was the location where most of the spare-no-expense MGM scores of the 50s were recorded, not to mention music for more recent films. The largest session ever done on the historic stage used an 80-piece orchestra with a 100-person choir (Empire of the Sun).

       MGM-Sony-Streisand Scoring Stage in Los Angeles

The official reason for the honor was to give credit to the singer who regularly used the space to record the soundtrack for Funny Girl and many of her 49 Gold Albums. There’s also a less official backstory. When Streisand was finishing her version of A Star is Born in 1976, she wanted to use the still-new Dolby surround format for the film.  But that meant pushing for better playback equipment in theaters scheduled to show the release. Many of them were still barely able to reproduce stereo sound. Using her considerable clout, Streisand demanded that bigger theaters improve their audio systems.

Please don’t cut me in two with that flashlight.  One classic piece of Hollywood lore is how the fabled sound designer Ben Burtt created the iconic effects of “lightsabers” slashing through the air. The distinctly electronic noise from the Star Wars series is now burned into our cinematic memories. The sabers were the preferred weapons of the future, but also a throwback to the swashbuckler films of the 1930s and 40s. Every action film needed a master duelist who could slice his way to dominance against villains who were as unlikable as Lord Vader.

Burtt came up with a blend that included sounds of an old movie projector motor he remembered from his days at USC, in addition to a nasty interference hum discovered when his microphone got too close to an old television set. Back then, a household filled with radios and televisions was an endless source of spurious electric interference that could rival Victor Frankenstein’s laboratory. That intrusive hum picked up by the microphone gave off a noise of bleeding electrons not so different from what someone might hear standing near a high-voltage substation. But a lightsaber was not fixed in place. Its sound needed to change when it sliced through the air. Burtt found that if he took what he recorded from his two sources and played them back, he could then wave another mic around and near the speaker, creating a Doppler effect in which the pitch rises and falls slightly as the mic moves by.  It’s indeed movie magic when a given sound turns a flashlight into an iconic movie weapon that has taken on a life of its own.
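The pitch change Burtt exploited follows the textbook Doppler relationship for a moving observer near a fixed source. A small illustrative sketch (the function and figures are mine, not Burtt’s):

```python
# Doppler shift for a moving observer (the waved microphone) near a
# stationary source (the playback speaker): f_obs = f * (c + v) / c,
# where v > 0 means the mic is moving toward the speaker.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def observed_frequency(source_hz, mic_velocity_ms):
    """Frequency heard at a mic moving at mic_velocity_ms toward (+)
    or away from (-) a fixed speaker emitting source_hz."""
    return source_hz * (SPEED_OF_SOUND + mic_velocity_ms) / SPEED_OF_SOUND

# Waving the mic toward the speaker raises the pitch of a steady hum;
# pulling it away lowers the pitch, producing the saber's "swing."
toward = observed_frequency(440.0, 2.0)   # slightly above 440 Hz
away = observed_frequency(440.0, -2.0)    # slightly below 440 Hz
```

Even a couple of meters per second of hand motion is enough to produce an audible wobble in pitch, which is why the trick worked with nothing more than a speaker and a handheld mic.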

For more on recording and film sound, see the open-access file available on this site.

Notes on Viewing Art from Another Era

                   Gregory Peck as Atticus Finch

I see the question of how we treat the moral and ethical shortcomings of earlier art as partly understandable in terms of the intentions of the original artists.

When looking at the past, it is easy to react with shock at what we see as huge moral lapses on display in older cultural products. No one today can look at a filmed scene with a character in blackface without a cringe and a sense of regret. Portrayals of race in public art and media are always an interesting case.

Even though the very different film To Kill a Mockingbird was considered a breakthrough account of justice denied to African Americans (book 1960, film 1962), a modern view is that it’s still a “white narrative,” with a focus on Atticus Finch as the redeeming moral agent. The cultural rot that denied African Americans the right to live as equals was mostly overlooked in the story. We surely needed Mockingbird. Atticus is a wonderful character; his behavior enacts a form of empathy that is timeless. But we must turn to other sources to understand how depression-era Alabama was often a prison for its black residents.

And then there are the antique forms of racism intentionally or accidentally on display in other films like Guess Who’s Coming to Dinner (1967) or the older Gone With the Wind.  It’s interesting that HBO will only show Victor Fleming’s 1939 multi-Academy Award winner with the understated disclaimer that “it denies the horrors of slavery.” But that warning is hardly sufficient to prepare us for what is now a completely sentimental and tone-deaf narrative about the southern insurrection. In fact, live long enough, and it’s hard to find a film or novel that does not offer plots and characters that were once admired but are now labeled as racist or sexist.

On a parallel track, some younger academics and musicologists are throwing in the towel on the great canon of western classical music, unable to comfortably explore the works of Brahms or Haydn–among many others–who indirectly benefited from eighteenth- and nineteenth-century patrons who counted humans as their property. Does our interest in exploitative and unjust social structures require that we abandon the artistic products made possible by these benefactors?

                    Paul Robeson as Joe in Showboat

I see the question of how we treat the moral and ethical shortcomings of older art as partly understandable in terms of estimations of the intentions of the original artists. This is soft ground to be sure, since knowing the motivations of others is fraught with misperceptions. But if the answers are difficult to find, the questions are always worth asking. Did an artist seek to harm or exploit their subjects? Were characters created in ways to devalue their social identities? Were whole groups written off as less worthy of our understanding? As noted, it can be hard to know. But it is easier to feel better about the music of an old epic like Showboat (1936), knowing that Oscar Hammerstein’s lyrics enact his deep respect for diverse cultural pathways. One of the best moments in the work is given to Joe, an African American deck hand who sings about the misery of his limited life. “Old Man River” can be heard as a lament of racial injustice: a theme that is also carried out in a plot line that challenges the absurdities of old miscegenation laws.

By contrast, the character of Mammy in the screen epic Gone With the Wind is more one-dimensional.  She is an African American house slave whose opinions are freely given, but her character is never fully fleshed out. The key film process that asks us to align ourselves with one or two characters is loaded in favor of the privileged Scarlett, played by Vivien Leigh. For the record, Hattie McDaniel, who won an Academy Award for the role, was required by the Ambassador Hotel, where the ceremony was held, to sit at a separate table away from others in the cast, suggesting something less than benign intentions from her Hollywood peers. More and more, the film has become watchable only as a dead specimen about the ostensible inconveniences faced by white folks at the end of the Civil War.

Our tendency in the current climate of identity politics is to dismiss the earlier work of others if we detect personal slights that extend to a whole class of individuals. This rejection is not necessarily a bad impulse, but it can become an intellectual sleight of hand that mirrors the racist logic we abhor in others.  Any time one is meant to stand for all, we are on even softer ground. These kinds of synecdoches distort the actual world and obviously ignore individual differences.

But on a broader scale we can still note that those living in previous eras are indeed captives to the norms and horizons of the culture they were in. It seems obvious that we must permit creators of art to exist within the parameters of the only world they knew. We too easily exempt ourselves from this limitation, preferring the alternate fiction that we are free agents not captive to the low horizons of our own tribe. In due course our heirs will correct this misimpression; some of what we take for granted will surely be rebuked by future generations.

In the meantime, thorny issues remain. Do we let Thomas Jefferson off the hook for not releasing all of his slaves, even on his deathbed?  His behavior was not necessarily a violation of the norms of the landed class at the time, but I can’t ignore this ethical lapse. It is clear from his central role in drafting the Declaration of Independence that his horizons extended high enough to see the atrocity of owning others. He gave voice to values of equality and freedom; he only needed to enact them in his own behavior. Fully honorable intentions should have allowed him to leave this world not with a final business decision, but on a note of grace.