The Perverse in the Popular

At its best, American popular culture possesses a vitality that belies the facile criticisms of both Right and Left. At its worst – as in Jerry Springer’s daytime talk show, in which private misery and family dysfunction become public spectacle, a cockfight with psyches instead of roosters – popular culture seems to pose incalculable risks to what used to be called public morality.

In discussing both the vitality and the danger, we keep returning to the same dispiriting clichés. There’s more sex and violence than ever, yet sex and violence sell. Young people are being exposed to material that would have shocked their grandparents, yet there seems no way to protect them from it. We call for positive programs, yet our mass obsessions – murder trials, political scandals – focus almost entirely on the negative. Not surprisingly, we throw up our hands.

At this juncture it is natural to turn to the scholars in the social sciences and the humanities who study popular culture and the electronic media. Popular culture includes novels, magazines, and other printed matter, but in most discussions the term chiefly refers to the realm of electronic media: radio, records, films, television, video games, and now the ubiquitous Internet. Many of our received ideas about popular culture so defined come from three sources of academic expertise. Communications theory focuses on the psychological impacts of media. Cultural studies is concerned with the role of popular culture in reinforcing and expanding the existing social order. Traditional philosophy emphasizes the perennial difficulty of sustaining excellence, or even decency, in a culture seemingly devoted to the lowest common denominator.

Each of these intellectual perspectives contains more than a grain of truth. But none addresses the most serious problem facing popular culture: the democratization, now on a global scale, of what I call “perverse modernism.” To the familiar vices of the popular audience – notably, vulgarity and kitsch – perverse modernism has added a new twist: a radically adversarial stance toward society, morality, and art itself. That stance has gone from being the property of a tiny avant-garde a century ago to being an accepted part of the cultural mainstream today.

Perverse modernism is not the whole of modernism, by any means. But it is the easy part. Millions of people who cannot grasp the formal innovations of cubism have no trouble grasping the publicity stunts of dada. To the extent that today’s popular culture uses shock and scandal as a way of attracting attention and boosting sales, it is the child of perverse modernism. The “cutting edge” keeps shifting, of course. To perform in a bra was considered shocking when Madonna did it in the early 1980s; by the late 1990s it was part of the Mexican-American singer Selena’s “mainstream” image. Even many creators of popular culture who are not on the cutting edge assume that “pushing out the envelope” of sex and violence is the very definition of “creativity.”

Communications theory begins with what the media scholar W. Russell Neuman calls “the perception of a helpless mass public.” Many of our received ideas about media, both inside and outside the academy, come from Marshall McLuhan’s bold hypothesis that “the medium is the message” – that the electronic media, like the print media before them, have the power to retool the human sensorium and, by extension, transform human consciousness.

McLuhan was by turns optimistic and pessimistic about this transformation, so it should come as no surprise that communications theory today has its optimists and its pessimists. In this era of the Internet, the optimists dominate. They predict a bright future in which every human being on the planet will be “empowered” by instant access to every other human being and to the species’ shared information cornucopia. The pessimists, whose heyday coincided with the rise of television, foresee a gloomier future, in which the endless distractions of the screen will bring the death of literacy, reason, and civilization as we know it.

Both optimistic and pessimistic communications theorists embrace McLuhan’s somewhat paradoxical assertion that the human mind is weaker than the media it creates for itself. How well grounded is this assertion? Neuman ventured an answer in The Future of the Mass Audience (1992), the product of a five-year study conducted for several major media companies. Noting that McLuhan raised important questions, but that it was “not his style” to research the answers, Neuman surveyed the available evidence and found what advertisers and educators already knew – that most human beings are “obdurate, impenetrable, resourcefully resistant” toward any message, regardless of medium, that does not fit “the cognitive makeup of the minds receiving it.”

Anticipating the vast potential of the Internet, Neuman suggested that the same pattern of obduracy would be repeated. To judge by the evidence (including a decade of dot-com overreaching), the Internet has not caused a radical change in the way people relate to media. Despite the ubiquitous image of the perpetually cybersurfing teenager, the vast majority of us mortals do not seek complex interactivity or deep information retrieval. Writes Neuman: "The lesson from the mass psychology of media behavior is that learning is partial, for the learner is selective and semi-attentive. The mass citizenry, for most issues, simply will not take the time to learn more or understand more deeply, no matter how inexpensive or convenient such further learning may be." People want from the Internet what they have always wanted from media: easy access to material of general interest and, especially, entertainment. The pattern may change with the next generation. But then again, it may not.

Is that regrettable? Only if you were hoping that the new media would transform human nature for the better. If you were expecting the opposite, it should be reassuring to think that such a transformation is likewise beyond them.

While communications theory zeroes in on individual psychology, cultural studies focuses on the political and social impacts of media, and it too has its pessimists and its optimists. The pessimists take their cue from the Frankfurt School – that band of influential German-Jewish émigré intellectuals, spooked by the Nazis’ skillful use of radio and film, who argued during the 1930s and 1940s that American “mass culture” was itself a new totalitarianism, all the more powerful for being so subtle. In the minds of Theodor Adorno, Herbert Marcuse, and other Frankfurt School thinkers, American popular culture could not possibly produce true works of art, because all of its products were by definition commodities manufactured by the advanced capitalist “consciousness industry.”

The optimistic branch of cultural studies emerged in the 1960s, when the leading lights of the German New Left, Jürgen Habermas and Hans Magnus Enzensberger, seized upon the ideas of another Frankfurt School theorist, Walter Benjamin. In a famous 1936 essay, “The Work of Art in the Age of Mechanical Reproduction,” Benjamin had argued that the electronic media, especially film, could in the right hands (not Hollywood’s) be used to mobilize the masses in favor of socialist revolution. During the 1960s, Benjamin’s idea inspired British and American cultural theorists who had grown up with television and movies, not to mention rock 'n' roll, to begin a passionate debate about whether particular works of popular culture were liberating or repressive, marginal or hegemonic, oppositional or dominant, and so on ad dialecticum.

Although its sex appeal has since faded somewhat, this branch of cultural studies now rules within the humanistic disciplines. Its academic practitioners place all “cultural products” – including objets d’art as traditionally defined, along with the artifacts of popular culture – on the same level, as specimens to be analyzed, not evaluated. Indeed, the concept of evaluation is itself regarded (theoretically, at least) as another datum to be analyzed.

This approach is not altogether bad. We live in an incredibly complex and dynamic cultural economy that delivers all kinds of objects, images, texts, and performances to all kinds of people, who respond to them in all kinds of ways. The intricate workings of this economy are fascinating, and as far as I can tell, cultural studies is the only field that makes a serious effort to map them.

But as anyone knows who has read an academic paean to the “transgressive” antics of Madonna, cultural theorists do not refrain from making judgments of value. What they do refrain from is basing those judgments on the standards of excellence worked out by artists (and critics) within a certain tradition. Instead, they apply their own standards, which begin with the assumption that all cultural products are ultimately about power and possess value only to the degree that they attack the established social order. The result, when translated into public discourse about the arts, is the now familiar culture war between moralists who insist that kitschy television shows like Touched by an Angel are genuine art because they preach family values, and academic apologists who celebrate decadent horror films like Hannibal because their graphic depictions of gross criminality promise to épater le bourgeois.

It would be nice to think that traditional philosophy provides the key to understanding what’s wrong with popular culture. But here again, there is a pronounced academic tendency to miss the point. Because most traditionalists in the humanities dismiss popular culture as the unappetizing fruit of democracy and commerce, they sidestep the urgent question of what makes it good or bad.

What would constitute a democratic model of excellence? I can sketch only a faint outline here. But one aspect would be the lack of a single center, of a geographically and aesthetically authoritative capital. In all high civilizations, the existence of a center has been a deeply rooted expectation. Even the rebellious romanticists and modernists who dissented from the Académie Française quickly re-created it in their own image. It was a short step for the impressionists from the Salon des Refusés to the walls of the Louvre. The alternative, it has always seemed, is relativism and a long messy slide into decadence and chaos.

Such worries apply with special force to popular culture, which is generally understood to have no center, no tradition, and certainly no understanding of excellence apart from profitability. But is that understanding accurate?

It has long been evident that, for good or ill, American elite culture lacks a capital. No matter how hard the practitioners of cultural studies try (and some of them try pretty hard), they have not proved convincingly that standards of artistic excellence in the United States emanate from a single (and by definition repressive) social-economic-political center. There is, of course, the National Endowment for the Arts. There is, of course, New York City. But there are also Chicago, San Francisco, Milwaukee, and a hundred other regional centers where good work is being done, and any one of them may well generate the next big trend.

It’s not just the geography of cultural production that is decentralized and in flux. What else could one expect in a society committed to the moral and political equality of its citizens and to a marketplace model of culture? The question is whether such a society necessarily drives out excellence. The novelist Ralph Ellison noted that “in this country, things are always all shook up, so that people are constantly moving around and rubbing off on one another culturally." He admitted that this can be confusing, even disquieting. “There are no easily recognizable points of rest, no facile certainties as to who, what, or where (culturally or historically) we are,” he wrote, adding that “the American condition is a state of unease.”

Yet as Ellison went on to argue, American diversity and unease are more often than not the parents of American excellence. Jacques Barzun, no admirer of popular culture, lends weight to the case when he reminds us that “the arts” are at best fragmentary and plural – not monolithic, as implied by that grand but misleading abstraction, “Art.”

It is not relativism but realism to make the same observations about popular culture. The entertainment industries are full of cultivated, intelligent people who think about their work in a much more traditional way than academics do. Recording artists ponder melody and rhythm; film and television scriptwriters wrestle with plot and dialogue; production designers worry about color, texture, and line; actors and directors compare themselves with admired predecessors in film and theater. The language these people speak is a craft language, directly descended from that of the older performing arts. In other words, each craft has its own center of excellence.

These people understand the depredations of commerce. But they also strive for that rare prize, the chart or ratings or box office success that is also a work of art. Such miracles don't happen every day, or even every year. But they do happen. And what's more, they last. In this time of dispute over the elite cultural canon, there is surprising agreement about what belongs in the canon of popular culture. The songs of Cole Porter, the compositions of Duke Ellington, the films of John Ford, the comic strips of Walt Kelly, the novels of Dashiell Hammett, and the 39 episodes of The Honeymooners that ran on CBS between 1955 and 1956 are just some of the works now described, without irony, as classic.

Given this sanguine picture of popular culture, why not stop worrying and learn to love it? What, after all, is the problem? The problem is perverse modernism. Not postmodernism (as some call it), because every item on the cultural agenda that currently bedevils us – rejecting tradition, attacking standards, provoking the audience, blurring the line between high and low and between art and life, and (last but not least) commandeering the mass media for subversive purposes – has been present since the dawn of modernism. This is the révolté impulse in modernism, rooted in the belief that if an artist makes the right anarchic gesture in the right place at the right time, he or she will help to spark social and political revolution. In this spirit, the German expressionist playwright Frank Wedekind staged scatological one-man shows in Munich’s Café Simplicissimus at the end of the 19th century, the Italian futurists called for the razing of Venice in the years before World War I, and the dadaists between the wars turned cabaret into the precursor of what we call performance art.

Severed from any viable expectation of revolution, the bold, outrageous gesture remains the true and only form of “creativity” for many people who have the wherewithal to know better (critics and pundits), and many more who do not (teenagers). In its present form as the guiding impulse of cutting-edge popular culture, perverse modernism goes beyond the usual run of sex and violence into a deliberate, intellectualized attempt to make sex and violence as offensive as possible. That means treating such primal experiences (the stuff of all great art, after all) in ways that are unfeeling, indifferent, detached from the consequences of actions, and contemptuous of moral concerns.

Perverse modernism would be a nonstarter today without obscenity. Gone are the days when audiences could be provoked by free verse, loose brush strokes, pounding rhythms, or vivid descriptions of lovemaking. In America, most people accept the right of the artist to do whatever he or she wants, because they know all too well that even if some fussbudget tries to drag an artist into court, the law contains a loophole big enough to drive a Hummer through. If 2 Live Crew’s As Nasty As They Wanna Be, Robert Mapplethorpe’s X Portfolio, and other controversial landmarks of the past 20 years can all be said to have “serious artistic value” in the eyes of the law, then blood-soaked video games and pornographic Web sites are home free.

That Americans are still (mildly) shocked by obscenity does not mean that the culture is still puritanical. Obscenity violates our sense of shame, and in puritanical cultures the slightest reference to the body causes undue shame. But shedding puritanism does not require that we extirpate all shame, or that we abandon the concept of obscenity.

By obscenity I do not mean hard-core pornography but something broader, a concept that encompasses violence as well as sex, and that does not exempt material judged to be of "serious artistic value." I take this definition from the political theorist Harry M. Clor, who makes it the basis of a principled argument for greater censorship. But that is not my purpose. My purpose is to expose perverse modernism for the cheap gimmick it has become.

In Clor's view, obscenity does not reside in any particular bodily functions or conditions, but in the angle of vision taken toward them: Obscenity “consists in a degradation of the human dimensions of life to a sub-human or merely physical level... Thus, there can be an obscene view of sex; there can also be obscene views of death, of birth, of illness, and of acts such as...eating or defecating. Obscenity makes a public exhibition of these phenomena and does so in a way such that their larger human context is lost or depreciated.”

D.H. Lawrence made the point very lucidly when he said that repression and obscenity are two sides of the same coin. Repression, he argued, led to "sex in the head," or the inability to move beyond fantasy. Hence the infantile preoccupation with pornography that is, in Lawrence's famous judgment, "an attempt to insult sex, to do dirt on it."

When challenged for trading in obscenity, today’s perverse modernists wrap themselves in the mantle of the great modernists – Flaubert, Stravinsky, Manet – who suffered opprobrium and even censorship because of their formal innovations or sexual candor. But that is nonsense. The great modernists were original without being obscene; today’s charlatans are obscene without being original.

Our situation is unprecedented because never before in the history of culture has so perverse a view of art been so widely popular. One could argue that this is good news, because as perverse modernism flows into the mainstream, it faces something it has never had to face before: a plebiscite. Although I would not place undue faith in the artistic judgment of the millions of consumers who will cast the deciding votes, my Ellisonian side says better they than the “arts community,” with its mindless reverence for offense. In the past, at least, the philistine public has weighed the claims of art against those of civility, decency and morality.

Yet a plebiscite may also be bad news, because as the grim history of the last century shows, the worst kind of culture war is between artists who hate morality and moralists who hate art. Push the envelope hard enough, and you invite popular revulsion, which can lead all too swiftly to backlash, censorship, and worse. Despite the silliness of those in the arts community who equate any moral criticism with totalitarian repression, playing with fire is still playing with fire. To judge by the atmosphere at many college campuses in recent years, the human urge to censor is alive and kicking.

Equally distressing is the widespread failure of cultural stewardship among prominent citizens who seem to find it more advantageous to fan the flames than to dampen them. Two years ago, Mayor Rudolph Giuliani touched off a media firestorm by attacking the Sensation exhibit at the Brooklyn Museum of Art, with its now infamous painting of an African Madonna replete with elephant dung. But if Giuliani was really concerned about the religious sensibilities of New York Catholics, why didn’t he act 10 months earlier, when his administration signed off on the proposal to mount the exhibit?

I’m not suggesting that Giuliani conspired with the sponsors and organizers of Sensation. But surely these sophisticated individuals understood that they were investing in a publicity windfall. The pattern is all too familiar: Third-rate art is shot into orbit by a first-class media blitz. In exhibitions such as this, you can forget the mediocre objects on display. The point of the exercise, the real masterpiece, is the PR.

To repeat, it was one thing when the outrageous public gesture shocked a small number of haute-bourgeois café and gallery goers. It is quite another when the same mentality dominates the makers of popular culture. Last May, Robert Wright, the president of NBC, wrote a letter to his industry colleagues complaining about the unfair advantage HBO’s hugely successful series, The Sopranos, enjoys in the race for audiences and awards. What did Wright point to as the reason for the series’ success? Not to its extraordinarily high level of writing and acting but to the regulatory environment that allows cable programs to show more (you guessed it) sex and violence.

Is The Sopranos a huge hit because it offers bigger doses of sex, violence, and profanity than network shows? Think about it for a minute. If the formula were really so simple, then wouldn’t every trashy program be a hit? This is the intellectual fallout from perverse modernism: a preoccupation with “pushing out the envelope” that excludes from consideration any other definition of what makes a program good and successful in the marketplace. Yet last year, when The Sopranos triumphed in the ratings and swept the Emmys, the producers of the show had consciously reduced its quotient of sex and violence.

The real danger is this: As the game of artist vs. moralist intensifies, it will drive everyone else off the stage. Jesse Helms against Robert Mapplethorpe, the Reverend Donald Wildmon against Marilyn Manson, the Gay & Lesbian Alliance Against Defamation against Eminem. Who benefits? The answer is obvious: the players. Politicians and preachers get to posture on C-SPAN; fat-cat art dealers and auction houses get fatter; Hollywood titans get to quote from the ACLU edition of the First Amendment; Johnny-come-lately dadaists, neglected outer-borough museums, and obscure record labels hit the big time; and a legion of lawyers get to sling the kind of dung that does not come from elephants.

And who suffers? Again, the answer is obvious: in the elite arts, the many poets, painters, and performers who strive to move audiences, not disgust them; in popular culture, the countless hardworking craftspeople (and handful of genuine artists) who go to work every day hoping to create not just another product but something of lasting value. And, of course, the rest of us suffer too – the vast, unwashed, imponderable democratic audience, whose good judgment may or may not lead us out of this predicament.

(First published in the Wilson Quarterly, summer 2001)

March 31, 2005 7:19 PM

1 Comment

Powerful analysis of a nasty predicament. One of the main problems with expecting the "vast unwashed" public to get us out of trouble is that so many young people are never taught to recognize quality in the arts in the first place. If corporate radio stations systematically exclude excellence, and the movies and television continue to follow the perverse-modernist model, where are the young supposed to learn the difference between an Ellington or a Dylan and the latest cheap thrill? Taste needs examples on which to exercise itself in order to develop, and the mere fact of widespread availability does not guarantee exposure to excellence in art. Most of my (college) students don't know Ellington, Basie, John Ford, or the Honeymooners even exist.
