main: August 2007 Archives

This week has yielded a number of reports and opinions about the direction of daily newspapering, even a prediction of what the industry will look like in a mere 13 years.

Over at Editor & Publisher, the industry organ, this report shows that the industry's financial performance has been even worse than predicted. Fitch Ratings anticipated a bad year for classifieds, but not one as bad as it has been, especially for publicly traded newspapers. The report cited four major chains in particular: Gannett, Tribune, McClatchy and Dow Jones.

July results from publicly traded newspaper companies, in particular, "are obviously cause for concern," Fitch says.

Is anyone really surprised?

Meanwhile, one of the papers owned by McClatchy, the Minneapolis Star-Tribune, got a big fuck-you this week when a new web-based publication in the Twin Cities announced that it had hired more than 20 veteran journalists, some of whom had taken buy-outs from the Star-Tribune.

Oh, and two of the 20-plus reporters and editors are Pulitzer Prize winners. One of those is John Camp, aka John Sandford, author of the "Prey" mystery novel series. The name of the publication is MinnPost (minnpost.com). It has about a million dollars in start-up money. It plans to launch later this year. By the way, it's a nonprofit. Shareholders need not apply.

A bold step somebody needed to take.

While newspapers have taken a beating since the first of the year, so has literary criticism. Morris Dickstein recounts the book ban, so to speak, in this optimistic blog post over at the National Book Critics Circle blog, called "Critical Mass." The damage includes:

1. Teresa Weaver got the axe as book review editor of the Atlanta Journal-Constitution

2. The Associated Press closed its book review desk

3. The Raleigh News & Observer eliminated the post of full-time book review editor

4. And cutbacks at the Minneapolis Star Tribune, the San Diego Union-Tribune, and, most dramatic of all, the Los Angeles Times Book Review, which ceased being a stand-alone Sunday section and was folded into the Ideas section

Fortunately, for people who like to read about books, there's this development in New Haven, Conn., where an intrepid bunch of writers is sick of books getting the knife. They've started (and hope, by the slimmest of threads, to continue) a publication called The New Haven Review of Books. Its founding editor is Mark Oppenheimer, former editor of that city's alternative weekly newspaper and a contributor to the Huffington Post. He said they were inspired by the decline of newspaper coverage of books, but also by a renewed interest in -- wait for it --

Localism.

This model won't replace the big-city, big-time book reviews; we still need them. And unless some angel comes along to fund another issue, this may be the last you hear of the New Haven Review of Books. But we're in an age of renewed attention to localism and regionalism, and book reviews -- like farmers' markets, or even local currencies -- can do their part.

A resurgence of localism and a renewed push for grassroots literary criticism (or arts journalism, or fill in the blank with whatever aesthetic, philosophical or intellectual pursuit -- in a public forum -- you choose) amount to the kind of analysis-based journalism recommended by Mitchell Stephens in that terrific piece for the Columbia Journalism Review in January. I keep referring to it, but it's good stuff. He writes:

The extra value our quality news organizations can and must regularly add is analysis: thoughtful, incisive attempts to divine the significance of events -- insights, not just information [my emphasis]. What is required -- if journalism is to move beyond selling cheap, widely available, staler-than-your-muffin news -- is, to choose a not very journalistic-sounding word, wisdom.

It seems almost spooky that we get similar thinking from this piece by Dave Morgan, the chairman of Tacoda, a marketing firm specializing in something called "behavioral targeting." He goes out on a limb to envision what the newspaper industry will look like in 2020, less than 13 years from now. I don't know what "behavioral targeting" is, but Morgan is on the cutting edge. Here's an edited-down version of Morgan's prediction.

What does this say for critics and journalists? I've added emphasis to highlight points we've addressed here in Flyover.

• All media will be digital. There may still be some analog components in the supply chains of media companies, but analog will be a very small part of the business. There have been great advances in the development of digital, paper-like materials that are readable and can connect to digital networks; most important, I believe that the eco-consciousness that we are beginning to see is here to stay.

• Consumer attention will continue to fragment. Our news and information products won't be large, comprehensive and "averaged" for mass consumption as they are today in a newspaper*. Consumers will get best-of-breed information services from many different providers.

• There will be many, many different digital media devices. Many of these devices will be portable; all will be networked. And most devices will permit users to communicate and create, not just consume.**

• Media brands will matter -- but old brands will matter less. Consider how fast Yahoo and Google were able to build well-known, trusted brands.

• News and information applications and services will be more important than underlying data and news. Discovering, editing, synthesizing, analyzing news and information and advertising is what will attract and retain consumers.*** Sending someone to a city council meeting for three hours to file a four-paragraph recitation of events will be worthless in 2020.

• Competition will be fierce, particularly in large metro markets. In a world where digital distribution is cheap to start and cheap to expand, the barriers to entry that have benefited newspapers for many decades won't exist in 2020. The competition in the local news, information and advertising business will be fierce.

• There will be lots of winners. Consumers will be the big winners. They will get more, better, more diverse and much more accurate news, information and advertising than they receive today. Advertisers will also win. They will pay much less to reach their target consumers, with more relevant messages and offers than they can provide through today's analog media channels.

• Newspaper companies are very likely not to be winners. The characteristics of the companies that will win in 2020 are very different than the characteristics of newspaper companies today.

Footnotes:
* Web 2.0: good for the arts, good for journalism
** Adjusting to an active audience
*** Analytic journalism

August 31, 2007 8:53 AM | | Comments (2)

I'm going to depart from the two entries I have in partial draft for today's post.

Instead of talking about weightier issues, I'm going to tell you about a party I went to last night. And if you'll bear with me, you'll see that it's not totally off subject for Flyover as the party had to do with both art and journalism.

The soiree took place at the historic Gem Theatre in downtown Detroit, in a beautiful section of the city near the new field for the Lions. More than 200 people packed into the cabaret-style theater, enjoying a cocktail reception, a show, and then a dessert reception.

The show was the Oscar Wilde Award Night, sponsored and produced by a Detroit weekly, Between the Lines, to recognize excellence in local professional theater. The newspaper puts on an excellent party, and representatives of the theater community from Detroit, Ann Arbor, Jackson, and Lansing turned out to celebrate.

The three reviewers from the newspaper reviewed 96 productions from 31 different professional companies. When announcing the nominations, Don Calamia, staff arts critic, wrote:

For despite the second worst economy in the nation that seemingly kept a noticeable number of paying customers out of the seats and a governor who reneged on several million dollars of previously promised grant money, not one theater that Curtain Calls reviewed over the past few seasons shut its doors for financial reasons - despite numerous rumors to the contrary.

Instead, the Williamston Theatre set up shop in a sleepy little town near Lansing, and Who Wants Cake? snuck into Fabulous Ferndale with The Ringwald, a renovated home all its own. And StarBright Presents Dinner Theatre doubled its venues, one in southern Oakland County and another in northern Macomb County.

Call them crazy - and you wouldn't be the first - but these brave souls reflect the attitude of ALL Michigan thespians who believe the mitten state is a great place to live, work and raise a family - despite some major obstacles. So they stay and struggle - oftentimes for little money and even less recognition.

The staff at this newspaper understands its community and the environment it works in. More than one person expressed genuine surprise at receiving an award because, as they said, they didn't know anyone knew them. There was amazement and gratitude that someone saw them and recognized their work.

For any in the journalism world who might wonder whether the arts community has noticed the cutback in arts coverage, let me share a moment with you. The master of ceremonies asked the newspaper's publishers to come to the stage to give out the publisher's award of excellence. Before the two women arrived at the lectern, the audience was on its feet, giving them a standing ovation. They recognized the commitment this newspaper has made to its art coverage and were eager to express their appreciation for it.

Earlier in the evening when I was talking to those same publishers about how thorough their arts coverage is, one of them said, "We're gay, we have to cover theater." A nearby actor came back with, "No, you don't have to, that's why it's great that you do."

Between the Lines is a newspaper that recognizes how vibrant the theater community is and how much coverage matters. Earlier this year, Calamia went to the publishers and said, "We need to do more." This from the critic who reviewed more than 70 shows, outstripping the two Detroit dailies and every other newspaper in town. The publishers agreed with him. So they're soon launching Encore Michigan, a website with daily updates and new reviews every Monday morning.

Between the Lines is a shining beacon, lighting the way to what's possible for newspapers. They demonstrate how to be part of the community while providing outstanding arts coverage. They understand their role in the ecosystem. While the artists may not always like what they say, they're grateful that someone is out there talking about their art, letting them know that they were heard, and telling others what is happening.

August 30, 2007 10:04 AM | | Comments (3)

Howdy folks, long time since I chimed in here. It's been a crazy month in and out of Montana for me. The first week of the month found me heading off to a family reunion in Kentucky, where I indulged in old ham and fried catfish, caught some bluegill, and generally lazed around by a lake. I returned to discover my adopted state on fire. Friends were evacuated from their homes, Missoula was smothered in a dense pillow of smoke, my four-month-old baby was grumpy from breathing the lousy air, and half my co-workers were on vacation, leaving me with double my normal workload. Crazy times.

Fortunately, some things didn't fall apart in my absence, notably Flyover, which has seemed on fire in its own way, what with spirited discussions of generational issues and interesting analysis of intellectual hoity-toityness and so on. I feel almost intimidated to wade back in.

So I thought I'd share a personal experience that some here might appreciate. A couple of weeks ago, at a neighborhood barbecue, I had one of those surprising conversations that only happen once in a while -- where out of the blue, in an unexpected place, you bond with someone over an insight.

It began with my neighbor complaining about what he perceived as a dearth of great political music today. Gone are the days, he argued, when acts like U2 and the Beatles and Neil Young (his personal favorite) unleashed battle cries for the politically motivated and musically engaged youth, cries that rallied masses of people out of complacency and into political action.

I pointed out that such music is still being made, by well-known bands ranging from Michael Franti & Spearhead to John Mellencamp to Eminem.

"But it just doesn't seem like that music is really making a difference," he complained. "It's not what everybody is listening to."

"But," I pointed out, "it's only really been during the 20th century that 'everybody' could listen to one particular song and use it as a form of cultural currency. That was as much a result of the concentration of media power and the entertainment industry as it was about the importance of that music. What you're seeing is just a return to the natural state of culture, where not everybody listens to the same thing because tastes are increasingly localized and dispersed. The only difference is that, today, 'local' means something different."

Our conversation veered farther afield, but I will stop there because I'm curious: Am I the only person who believes that we'll never see another mass cultural phenomenon like the Beatles or the Stones? In a world where music by my band is theoretically as easy to access as music by 50 Cent or the greatest recording ever of Mahler's Second Symphony, and where the influence and presumed authority of a handful of critics at the New York Times and Rolling Stone is being undermined by tens of thousands of bloggers, isn't it now a given that tribalism and taste dispersion are the new cultural paradigm?

And if so, how should that affect what we do as journalists and critics?

August 28, 2007 9:31 PM | | Comments (6)

The following exchange is with Stephen Durbin, a sharp-eyed nature photographer whose work, to my unsophisticated eye, is in the Ansel Adams vein (is that right, Stephen?). Anyway, he lives in Bozeman, Mont. We started this give-and-take about my post Monday, which argued that literary and aesthetic intellectuals are being replaced by scientists as public intellectuals and that perhaps the reason for that is our inability to argue on solid ground, i.e., we're for the most part trapped in, as my historian friend called it, "a whirlpool of relativist goo." The bottom line for Stephen is that agreeing on solid principles would be fine, but it's not necessary. What matters is the power of the arguments we make.

Stephen Durbin, Aug. 27, 2007
One thing that scientists learn early on is how slippery the notion of "truth" is. Look no further than the current debate on Edge (an excellent citation on your part) regarding global warming. They also learn that the duality "objective truth" vs. "one [truth] being just as 'good' as the other" is a false one; we must look at the arguments. So I'll subscribe to your points 1, 2, and 4, but please don't shoot yourself in the foot by harping on 3. That's just asking for another brand of orthodoxy like the one Gary rightly complains of.

Stoehr, Aug. 27, 2007
I think you might misunderstand what I mean by "objective truth," Steve, or perhaps I haven't made myself very clear. You're right in that global warming is a topic of debate and the "truth" of the issue will be fleshed out according to the quality, character and power of the arguments.

I'm not talking about controversy, however; I'm talking about external reality, like gravity, as Sokal said in his piece for Lingua Franca, explaining why he submitted a parody to Social Text. His complaint was that the editors of that journal did not question his assertion that gravity was a social construction. How can gravity be a social construction? Yet the editors swallowed it whole. And how can gravity, or our concept of gravity, possibly have political implications? It's gravity, for Chrissake!

I entered graduate school only a couple of years after the so-called Sokal Affair. I was told Sokal didn't prove anything about lax intellectual rigor among postmodern theorists. Instead, I was told he actually proved the postmodernists right.

The "orthodoxy" you say I'm asking for is the kind that uses some common sense, but the orthodoxy of postmodernism -- and any other school of thought, for that matter, that flies in the face of the completely obvious -- can make the argument -- a logical and erroneous argument -- that the world is flat.

The "orthodoxy" I'm advocating is one that's basic and fundamental: Let's agree on what external reality is (i.e., that gravity is a force with established scientific causes), let's get away from the relative, subjective kind of thinking that's paralyzed the intellectual left from taking action for so long, as Gary P. notes above, and let's engage the public again about things that matter. --J.S.

Stephen Durbin, Aug. 28, 2007
If you want to see a truly "religious" argument, try asking some physicists what exactly gravity is (don't even think of asking why there is gravity). I agree there are things that are reasonable for us all to accept about gravity, but I believe that: 1) even the question of which things is very hard to understand for a non-physicist; 2) even if we all agreed, that wouldn't get us very far towards engaging with the public about things that matter. The problem is not that we don't all agree on the same version of "external reality," but that some apparently fly to the opposite extreme that all positions are equal (or think their audience does). I don't believe in either extreme, but in the power of argument. By all means, let someone argue that the world is flat. If we can't counter that with a more convincing argument, then we're also out of touch -- or lack courage. I think you're right that there is sometimes a reluctance to make those arguments we believe in for fear of offending. That may not always be wrong, but it can certainly go too far.

August 28, 2007 6:34 PM | | Comments (0)

The AP reports today that a new web-based newspaper run by a non-profit organization has enlisted more than 20 veteran reporters in the Minnesota area who had taken buy-outs from one of the main dailies, The Star-Tribune. Two of the writers are Pulitzer Prize winners. The "newspaper" will launch later this year on Minnpost.com.

Once again, innovation springs forth from the American Outback. And the one thing I like most is that the publication will specialize in news and insight articles, a sign that the website is going in the direction suggested by Mitchell Stephens in the Columbia Journalism Review earlier this year in a piece advocating something he calls "analytic journalism."

As Stephens writes:

The extra value our quality news organizations can and must regularly add is analysis: thoughtful, incisive attempts to divine the significance of events -- insights, not just information. What is required -- if journalism is to move beyond selling cheap, widely available, staler-than-your-muffin news -- is, to choose a not very journalistic-sounding word, wisdom.
Here's more historical precedent: In the days when dailies monopolized breaking news, slower journals -- weeklies like The Nation, The New Republic, Time -- stepped back from breaking news and sold smart analysis. Now it is the dailies, and even the evening news shows, that are slow. Now it is time for them to take that step back.

Elsewhere, Stephens writes of an American newspaper committed to deploying this strategy: The Times Herald-Record, in Middletown, N.Y.

Stephens continues:

No one is suggesting that reporters pontificate, spout, hazard a guess, or "tell" when it is indeed 'too soon to tell.' No one is suggesting that they indulge in unsupported, shoot-from-the-hip tirades.
"It's not like talk radio," explains one of the champions of analytic journalism, Mike Levine, the former executive editor of the Times Herald-Record in Middletown, New York. Levine died earlier this year. "But it's not traditional American journalism either."

Update: This post seems to be getting a lot of clicks thanks to a link from Minnpost.com. So I thought I should take advantage of the attention to point out that innovations of the kind taking shape in Minnesota may be one of the trends we'll see in the coming years in newspapers.

I don't mean, however, that newspapers will go digital (I think that's inevitable, at least in part) or that they will be better run as non-profits (I think this is debatable, to be determined). What I mean is that innovation won't come from the coasts where Big Media is so big and so cumbersome and so lumbering as to be unable to adapt nimbly to change.

That means innovation will come from the Land We Love here in Flyover -- the cities and towns between the coasts, between the megalopolises of New York and Los Angeles, Washington, D.C., and San Francisco. This is where newspapers are smaller and nimbler.

Also, the stakes are much higher, because the status quo is much more fiercely protected in small towns. The problem for Flyover newspapers is that maintaining the status quo, in my view, won't keep readers. MinnPost is just one example of smart people coming together who see something is wrong in their world and who want to change it for the better.

They can't do it through traditional means, because the traditional means, in the case of MinnPost's new newsroom staff, asked some of the more than 20 veteran journalists -- including two Pulitzer winners -- to bow out gracefully. Thanks for your time, here's your gold watch and here's your buy-out from the Star Tribune and the Saint Paul Pioneer Press. See ya.

There's still a lot of important work to be done, and Joel Kramer, editor of MinnPost, knows it and has the acumen to persuade others of the need for tough, fair and probing journalism. That's how he raised more than $1 million in startup funds from four local families who clearly believe in the cause. And that's how he persuaded top talent to lend a hand.

My feeling is that this trend will continue for another reason.

That reason is brain drain.

As the industry continues to cut jobs and resources and to transform its newsrooms into places antagonistic to creative, determined and intelligent journalists, people -- the best and brightest -- will leave. (I'm thinking of the recent comments by a Wall Street Journal reporter in reaction to Rupert Murdoch's purchase of the paper; he said he and others wouldn't stay at a WSJ-cum-Fox News.)

Where will they go? People will settle into other lives and careers, no doubt. They'll head for foundations, universities and nonprofits. What's likely is that there will be an opportunity here for someone with resources, vision and the ability to lead, organize and motivate.

MinnPost is doing that. So are the New Haven Review of Books and the New Haven Independent, projects started by people sick of the way news and literary heritage are being covered by mainstream media. Neither claims to be a replacement for mainstream journalism, but both should be watched and learned from, because they are a sign of what's happening.

Part of what's happening is that people are simply turning away from Big Media. I don't think newspapers will go the way of the CD and suffer from the perception of being nearly without value. But people are turning away and have been turning away for about 20 years.

That's why there are so many "alternative" weeklies. Even though many are now owned by a national company (New Times), they are still a viable alternative to a city's dominant daily. And they're viable, even vital, because of what they have always done: followed the course explained by Mitchell Stephens in the CJR by practicing interpretive or analytic journalism.

And let's not forget about the blogs. As Lisa Harris notes in the comment below, that's where a lot of the future is going to be, recapturing the spirit of a proud American tradition.

August 27, 2007 9:13 AM | | Comments (1)

I'm going to attempt another of those free-wheeling posts in which I try to make some connections among articles, ideas and writers I've been reading lately. What I hope to accomplish is the beginning of a proposal, a modest call for attention, to establish a new movement for intellectuals -- those who think and feel something is not right in the world of art, literature and creativity.

A menu of possible assertions:
1. Intellectuals need to talk less with each other and more to everyone else
2. Scientists have taken the traditional place of the public intellectual
3. Intellectuals need to re-establish the self-evident reality of objective truth
4. As newspapers recede, and the traditional hubs of intellectual activity recede with them, a new grassroots movement of intellectuals is needed to take their place.

Act 1: Theoretical bullshit
I'll start with something that I've returned to often (here and here and here, for instance): the disconcerting intellectual phenomenon that asserts that there is no such thing as objective reality, that epistemology is subjective, that facts are conditional.

I suppose I keep writing about it because without a fundamental agreement about what truth is -- and, for that matter, what constitutes deception, equivocation, obfuscation, bullshit and outright lies -- how can we as critics, as mere human beings, accomplish much that is constructive, meaningful and significant?

Please don't get me wrong. I lean left, not right. I'm not trying to defend the high walls of Western Civilization. In fact, I make the argument that intellectuals need to re-establish the self-evident reality of objective truth as someone who was once ensconced in the Ivory Tower.

During my time at the University of Cincinnati, I became indoctrinated by literary critical theory. I came to believe in the postmodern condition of American culture. I believed in my ability to "read" anything like a "text," even non-semantic things like fashion, architecture and medical procedures. I suspected Enlightenment ideals like Reason and Truth were vestiges of imperial European colonization. Every subject -- whether porn, pulp fiction, romance novels or comic books -- boiled down cynically to struggles for political and social power.

While I am grateful for postmodernism as a strategy for dismantling, or deconstructing, formerly entrenched ways of thinking, it's no humanist philosophy. There's little concern for people in it; there's little concern for morality in it. While postmodernist readings of, say, advertisements for Marlboro cigarettes (which I smoked) made "logical" sense, I felt that at its heart, postmodernism was a game of rhetoric, an argument over words and their struggle for meaning.

I left class many, many times feeling a kind of existential disorientation, a kind of out-of-body experience caught between a world constructed by language and a language that's always indeterminate. Hence, the world was indeterminate, like an illusion. If the world is indeterminate, possessing no ontological center independent of human consciousness, authoritative truth matters very little. Instead of one truth, there were many truths, with one being just as "good" as the other.

This kind of thinking is not exclusive to universities, or to people interested in and sensitive to intellectual inquiry. This postmodern relativism has trickled down to popular culture as well. Consider the book "Thank You for Smoking," Christopher Buckley's brilliant 1994 parody of Big Tobacco's downfall. The main character, Nick Naylor, is a master of postmodern relativism. No matter how guilty he is of the sins of spin, by the judgment-free rules of postmodernism (it's a descriptive strategy, not a prescriptive one), his truth is as valid as any other, even if it's destructive bullshit.

And even if this kind of thinking is becoming passé in academe, which it is, its influence lingers beyond the hallowed halls. Consider this response to our dearly departed Molly Ivins, who had offered one last cautionary tale about letting the amateur efforts of bloggers be confused with the professional, gritty and pain-in-the-ass tenacity of beat reporters. This reader was responding to Ivins' suggestion that bloggers try their hand at reporting a highway accident, just to see how difficult, challenging and rewarding determining the truth can be.

"If there is no objective truth, but only subjective truth (hence your five-car pile-up analogy) -- then what difference does it make if someone was a reporter or not? I am able to state subjective truth at a moment's notice -- it's always true for me!"

Act 2: The sins of our intellectuals
I don't think that it's overstating the case when I say that this kind of thinking is the result of academics and other intellectuals abandoning objective truth. And this attitude doesn't stop with fiction and the cranky comments of a Molly Ivins fan.

Harry G. Frankfurt, the moral philosopher formerly at Princeton, said the attitude is ubiquitous among a great many writers, artists and intellectuals in his 2006 treatise titled "On Truth," a follow-up to his bestselling book, "On Bullshit." In it, he said that "we live in a time when, strange to say, many quite cultivated individuals consider truth to be unworthy of any particular respect. ... this attitude -- or, indeed, a more extreme version of it -- has become disturbingly widespread even within what might naively have been thought to be a more reliable class of people."

He continued:

"Numerous unabashed skeptics and cynics about the importance of truth ... have been found among best-selling and prize-winning authors, among writers for leading newspapers, and among hitherto respected historians, biographers, memoirists, theorists of literature, novelists -- and even among philosophers ...

And:

"These shameless antagonists of common sense -- members of a certain emblematic subgroup call themselves 'postmodernists' -- rebelliously and self-righteously deny that truth has any genuinely objective reality at all. They therefore go on to deny that truth is worthy of any obligatory deference or respect. ... the postmodernists' view is that in the end the assignment of those entitlements is just up for grabs. It is simply a matter, as they say, of how you look at it."

In other words, it seems the intellectuals have failed us.

How can we talk about issues, debate points of view, engage in any kind of public conversation if there is no agreement on a reality independent of human whim, desire, interest, folly, fear and ignorance? The intellectuals are supposed to talk about our country's important issues. Instead, for the past 30 years, they've turned inward, addressed themselves, left the pulpit to the pundits and undermined our ability to talk coherently, objectively and constructively about the things that matter most.

The failure of the intellectuals, some say, has led to America's cultural and political decline. Dana Gioia, the chairman of the National Endowment for the Arts, noted in a widely read speech to graduates at Stanford University in June that such decline has occurred even as our economy has flourished and renewed itself since the '60s.

" ... surely artists and intellectuals are partly to blame. Most American artists, intellectuals, and academics have lost their ability to converse with the rest of society. We have become wonderfully expert in talking to one another, but we have become almost invisible and inaudible in the general culture."

Gioia continued:

"This mutual estrangement has had enormous cultural, social, and political consequences. America needs its artists and intellectuals, and they need to reestablish their rightful place in the general culture. If we could reopen the conversation between our best minds and the broader public, the results would not only transform society but also artistic and intellectual life."

In 1963, the novelist and chemist C.P. Snow wrote a book that provided a vision of just the kind of intellectual transformation Gioia talks about. It was called "The Two Cultures: A Second Look," a follow-up to his 1959 book "The Two Cultures." In the first book, Snow described the division between literary intellectuals and scientists, how neither understood the other. In the second, he envisioned a middle way, a "third culture," a consensus in which intellectuals talked with scientists and scientists with intellectuals, each feeding the expertise and creativity of the other.

But that never happened.

Act 3: Being replaced by scientists
"The traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often nonempirical. It uses its own jargon and washes its own laundry. It is chiefly characterized by comment on comments, the swelling spiral of commentary eventually reaching the point where the real world gets lost."

Those are the words of John Brockman, author, impresario and book agent for Richard Dawkins and Steven Pinker, writing on his website, Edge. Note such words as "reactionary," "nonempirical," "the real world gets lost."

In 1996, Alan Sokal did something that illustrated just how far the real world had gotten lost in the hyper-jargon of literary theory. A physicist at New York University, Sokal submitted a paper to Social Text, an academic journal devoted to the discussion of postmodern literary theory. In it, he argued that quantum gravity is a social construction with profound political implications.

In other words, it was utter nonsense. I'm not really sure I've paraphrased the paper well. But it doesn't matter, because the point is that Social Text thought he was serious, lending credence to suspicions that such things as honesty and truth don't matter. As Sokal wrote in an article in Lingua Franca explaining his "experiment":

In the first paragraph I deride "the dogma imposed by the long post-Enlightenment hegemony over the Western intellectual outlook": that there exists an external world, whose properties are independent of any individual human being and indeed of humanity as a whole; that these properties are encoded in "eternal" physical laws; and that human beings can obtain reliable, albeit imperfect and tentative, knowledge of these laws by hewing to the "objective" procedures and epistemological strictures prescribed by the (so-called) scientific method.

Why did Sokal do this? To make a point:

... What concerns me is the proliferation, not just of nonsense and sloppy thinking per se, but of a particular kind of nonsense and sloppy thinking: one that denies the existence of objective realities, or (when challenged) admits their existence but downplays their practical relevance. At its best, a journal like Social Text raises important questions that no scientist should ignore -- questions, for example, about how corporate and government funding influence scientific work. Unfortunately, epistemic relativism does little to further the discussion of these matters.
In short, my concern over the spread of subjectivist thinking is both intellectual and political. Intellectually, the problem with such doctrines is that they are false (when not simply meaningless). There is a real world; its properties are not merely social constructions; facts and evidence do matter. What sane person would contend otherwise? And yet, much contemporary academic theorizing consists precisely of attempts to blur these obvious truths -- the utter absurdity of it all being concealed through obscure and pretentious language.
Social Text's acceptance of my article exemplifies the intellectual arrogance of Theory -- meaning postmodernist literary theory -- carried to its logical extreme. No wonder they didn't bother to consult a physicist. If all is discourse and "text," then knowledge of the real world is superfluous; even physics becomes just another branch of Cultural Studies. If, moreover, all is rhetoric and "language games," then internal logical consistency is superfluous too: a patina of theoretical sophistication serves equally well. Incomprehensibility becomes a virtue; allusions, metaphors and puns substitute for evidence and logic. My own article is, if anything, an extremely modest example of this well-established genre.

Postmodernism had already been under attack by right-wing jeremiahs like Allan Bloom in "The Closing of the American Mind." What Sokal's experiment showed, however, was that postmodernism is not just a tool for exposing the power structures of the status quo, to be naturally attacked by defenders of that power, but also, at its core, a poor and perhaps even harmful foundation for intellectual inquiry.

While the editors of Social Text, including the luminous scholar Andrew Ross, author of the near-impenetrable tome "No Respect: Intellectuals and Popular Culture," were busy accepting a hoax as serious scholarship, John Brockman was getting to work communicating with real people about things that really matter.

According to this piece in the London Guardian titled "The new age of ignorance," Brockman has done more than anyone to break down C.P. Snow's divide. But instead of encouraging literary intellectuals to communicate with scientists and then, in turn, communicate what they find to an engaged, educated reading public, Brockman has devised a "Third Culture" that doesn't need any help, thanks.

"'The Third Culture' consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are," he writes on his website, Edge.

He continued:

The role of the intellectual includes communicating. Intellectuals are not just people who know things but people who shape the thoughts of their generation. An intellectual is a synthesizer, a publicist, a communicator. In his 1987 book "The Last Intellectuals," the cultural historian Russell Jacoby bemoaned the passing of a generation of public thinkers and their replacement by bloodless academicians. He was right, but also wrong. The third-culture thinkers are the new public intellectuals.

In short, the scientists don't need the intellectuals anymore.

They're doing it themselves.

The Guardian article also notes that Ian McEwan is one of the few novelists to contribute to the Edge's ongoing debates and that he suggests the project is not so far removed from the "old Enlightenment dream of a unified body of knowledge, when biologists and economists draw on each other's concepts and molecular biologists stray into the poorly defended territory of chemists and physicists."

Why can't literary and aesthetic intellectuals talk like this anymore?

Act 4: The new intellectuals
Brockman, via the Edge and the Edge Reality Club, a kind of scientists' salon, is doing wonders for advancing the national conversation about science and scientific thinking. There are more magazines devoted to science than ever before, more hunger for science and more books about science, even some that advance atheism.

But what about the literary and aesthetic intellectuals? What about them? They are around, but their influence seems to be shrinking even more drastically, thanks to the dwindling exposure given to them by Big Media, especially newspapers.

Book sections have traditionally been the forum for such conversations and we all know where these are going: newspapers in LA, Chicago, Minneapolis, Raleigh and Atlanta have all either sacked their books editors, reduced their book pages, consolidated them or even moved them from their historical place on Sundays.

Newspapers, in short, are not going to cut it. So what to do?

Perhaps an answer can be found in a new grassroots publication in Connecticut. Called the New Haven Review of Books, it is the work of numerous writers in that city who believe someone has to pick up where the newspapers have left off.

As Mark Oppenheimer, a former editor of the New Haven Advocate and author of "Knocking on Heaven's Door: American Religion in the Age of Counterculture," writes, these are times that require innovative thinking by innovative people who live just about everywhere, not just in LA and New York.

In an age of shrinking book-review holes in newspapers, we're going to have to find new ways to get the word out about great books. Some of those ways will be local, and small in scale. We may never publish another issue of the New Haven Review (our motto is "Published Annually at Most"), but by just publishing once, we've made a statement in support of literary culture. Wouldn't it be cool if other small- and medium-sized towns -- Austin, Des Moines, Albany, etc. -- decided they wanted local book reviews, too? [italics mine] Maybe such reviews would feature local writers doing the reviewing, the way ours does, or maybe they would feature reviews of books by local authors. Either way, they would be reminders that major urban publications do not have to be the sole instruments for book reviewing.
And that leads us to the second statement that even one issue of a small, local book review makes: there are writers everywhere. Just here in New Haven and the surrounding towns we managed to round up Alice Mattison, Bruce Shapiro, Debby Applegate, Deirdre Bair, Jim Sleeper, Amy Bloom, and a couple dozen other greats. Many of us have never even met one another. We don't have a literary "scene" in this modest city; there is no cocktail-party circuit. But there are writers.
This model won't replace the big-city, big-time book reviews; we still need them. And unless some angel comes along to fund another issue, this may be the last you hear of the New Haven Review of Books. But we're in an age of renewed attention to localism and regionalism, and book reviews -- like farmers' markets, or even local currencies -- can do their part.

Localism and regionalism: Where to find the new literary and aesthetic intellectuals.

Update: I wanted to quickly add a short list of websites from writers and thinkers and people who care about life beyond themselves, from around the blogosphere. Blogging, like the New Haven Review of Books, is an obvious place for new intellectuals to rise up. Some of those sites include:

Crooked Timber
Quick Study
Culture Wars (in the UK)
Spiked (in the UK)
TPFCafe

August 26, 2007 11:13 PM | | Comments (3)

Here's an item that's near and dear to our hearts here in Flyoverland.

Because newspapers are largely owned by national conglomerates that fail to give communities what they really need, and because these same newspapers are gutting their news staffs and, moreover, trashing their coverage of books, literary culture and intellectual heritage, we now have this: a scrappy book review from Small Town USA.

Called the New Haven Review of Books, it embodies the thinking of Mark Oppenheimer, a contributor to the Huffington Post: if we can't get what we want from the dominant newspapers, we do it ourselves. Also, New Haven happens to be home to a stable of terrific writers. Oppenheimer asked them to write reviews, essays and reflections free of charge, just to make some points.

One of those points is this: There are smart, literate and savvy people living everywhere in this country, not just in the major urban centers, and if we can just get them together, ask them what they really believe in and find some kind of square-peg-in-round-hole consensus, brilliant things can happen.

In his own words:

We may never publish another issue of the New Haven Review (our motto is "Published Annually at Most"), but by just publishing once, we've made a statement in support of literary culture. Wouldn't it be cool if other small- and medium-sized towns -- Austin, Des Moines, Albany, etc. -- decided they wanted local book reviews, too? Maybe such reviews would feature local writers doing the reviewing, the way ours does, or maybe they would feature reviews of books by local authors. Either way, they would be reminders that major urban publications do not have to be the sole instruments for book reviewing.

Here's to Oppenheimer and his cohort. May the Bard smile upon you.

August 25, 2007 9:16 AM | | Comments (0)

In a mood of complete impulsiveness, we've decided to offer a question of the week.

Who knows where this will go, if it goes anywhere. No doubt it will die on the digital page. Nevertheless, to honor those intrepid literary and aesthetic pioneers before us, we offer this question while keeping in mind that we'll never get answers to pressing questions unless we ask.

So here it is, our question of the week. The floor is wide open.

What accounts for the minimal role of value-judgments in contemporary reviews?

Discuss amongst yourselves . . .

Update: Glenn Weiss, of Aesthetic Grounds, asks that we clarify what we mean by "value-judgments." It's a slippery term indeed. But here we simply mean a critic's reluctance to apply, or outright avoidance of applying, his or her judgment to the art in question.

But then again, we here in Flyoverberg also want the question to be as open to interpretation as possible. So perhaps we also need to decide what "value-judgments" are, how to apply them, in which circumstances and for what purpose. A conversation about coming to a consensus might be healthy, too.

Your thoughts?

Update No. 2: Glenn Weiss adds his two cents at Aesthetic Grounds. Bottom line? The question about values is a question of identifying and establishing values. Read more here . . .

I encourage our best and brightest to join or to form different groups around values. How else can we have a diversity of human experience with the coming global end to traditional geographic diversity? If we don't form more groups, then we may be left only with economic diversity.

Update No. 3: Late Sunday night, book/daddy came from the shadows to offer this tidbit in response to our Question of the Week. Check out his response in our comments section, based on Gail Pool's book, "Faint Praise: The Plight of Book Reviewing in America."

August 24, 2007 11:09 AM | | Comments (5)

The flawed logic of the relativist approach to art criticism? . . .

On John Carey's 2006 book, "What Good Are the Arts?"

Carey has a keen nose for elitism of any kind and delights in exposing what he regards as fraudulence in the art world. This includes, naturally enough, excesses and absurdities of the avant garde, such as the Italian artist Piero Manzoni, who produced, as works of art, labeled tins of his own excrement ...

By concentrating on such stunts, Carey shows little desire to produce a balanced historical account of modernism. He traces our current bad, elitist attitudes back to Kant, who, he argues, thought of beauty as a mysterious supersensible property of art works. Kant believed that standards of beauty were "absolute and universal." For Carey, Kant's aesthetics is a "farrago of superstition and unsubstantiated assertion," [emphasis added] and he cannot understand why anybody ever believed any of it.

What's worse, Kant infected western thought with the intellectual disease of imagining that works of art are somehow "sacred" objects set off from the rest of ordinary experience. Carey's account of Kant strikes me as embarrassingly partial and confused, but I doubt if Carey cares: for him, Kant is just another way to denounce the preciousness endemic in the art world.

Ruskin is dismissed for treating art as religion and a higher moral realm. "Taste," Ruskin declared, "is not only a part and an index of morality -- it is the ONLY morality." [italics mine]. Carey has little trouble demolishing this sentiment, though I wish he had found more original examples than Hitler's tastes in opera and the fact that Auschwitz guards could listen to classical music at night and go back to killing people by day.

All of the problems Carey discovers with art and its uses lead him to a surprisingly empty definition when he finally gets to it: "My answer to the question, 'What is a work of art?' is 'A work of art is anything that anyone has ever considered a work of art, though it may be a work of art for only that one person'." [mine] This aesthetic solipsism may be one way to respond to the snobbery of the likes of Kenneth Clark, but it does not do much to advance Carey's argument. After all, if art is whatever you say it is, then the Tate can hardly be faulted for buying Manzoni's feces.

From Denis Dutton's review for the Press in New Zealand in 2006

August 23, 2007 2:29 PM | | Comments (0)

The terrible consequences of the short-sighted management of daily newspapers . . .

Through this kind of management, they've told their employees they don't care about them, they've told their readers they don't care about them, and they've sold out their duty to serve as the Fourth Estate for a quick buck. But even more than that, these CEOs have justified it all by claiming it's their fiduciary duty to stockholders that's made them do it. I don't agree. To defend themselves under the guise of protecting the stockholders makes no sense. If people bought in at $50 per share and the changed economic realities mean that shares are worth only $40 apiece, squeezing out short-term profits to artificially raise the stock price only means that people are going to get screwed if they don't sell their stock in time. Is it a CEO's responsibility to help his current boss (read: shareholders) swindle his future boss (read: shareholders) by stripping all the future value out of a company? I don't think so, and I think what's happened is morally reprehensible.

From "Greedy Vultures: Why corporate management of daily newspapers spells disaster for the news business, community debate and our way of government" by Jeff vonKaenel, published in the Sacramento News & Review in December 2006.

August 23, 2007 2:22 PM | | Comments (0)

I'm very glad that Bridgette and Jennifer have continued talking about some of the differences between generations. It's not a conversation that you hear enough about, especially in media dominated by the sensibilities, interests, fears and anxieties of the Baby Boomer Generation, those born between 1946 and 1964.

There has been an off-blog conversation going on that's concerned about sparking a "generational flame war," as one of us called it. But we think there is merit in discussing the differences between boomers and those of us in Gen X and Gen Y (all four of us in Flyover are in Gen X, though I might be on the cusp; I'm 33).

The first thing I'd like to point out is this: As the boomers get grayer, there's a gradual but ever-hastening uptick in what I'll call the "rhetoric of declension." That is to say, the ubiquitous talk that aims to persuade and, oddly, to reassure the participants of the conversation that the world, indeed, is going straight to hell.

My parents started me on that tone, as evangelical Christians who believed the emergence of a beastly Antichrist could come any day and that the Son of God would then return to take back with him to heaven the twice-born and Lord-we-hope-so everlasting souls of the Saved.

Then there's everybody else saying: Moral values are in decline, education is in decline, the government is in decline, the arts are definitely in decline. I'd hear this from parents and family and teachers and professors and parents of my friends.

It's like some grand paradigmatic narrative that all the adults in my adolescent life bought into. And the benchmarks of this teleological tale -- the story and its characters are, after all, slouching toward one inexorable outcome -- came when JFK and Bobby and George Wallace and the Gipper got shot.

This rhetoric of declension is everywhere, and it's getting more intense as boomers grow old, retire to the Sun Belt (which is here on the Georgia coast, by the way) and begin to wax nostalgic for the good old days when things weren't so hectic and people were just plain nice.

If it's growing more intense in everyday life, the rhetoric is virtually screaming when it comes to the arts, where it is especially pernicious. George Cotkin, in an insightful piece for the Chronicle of Higher Education, provides an excellent example of how deep it goes by characterizing the state of cultural criticism from the viewpoint of those who think it's going down the tubes.

The state of cultural criticism today, in the view of many, is debilitated, perhaps even moribund. For [Sven] Birkerts, Alvin Kernan, Russell Jacoby, and others, there once existed a lively, deep, public, and engaged cultural criticism. Great critics -- Lionel Trilling, Philip Rahv, Clement Greenberg, Alfred Kazin, and Dwight Macdonald -- roamed the roadways of criticism, stopping to dispense sage or impassioned judgments and to uphold standards. What happened?

According to this line of thought, our present generation of cultural critics, arriving after the assault of postmodernism and the increasingly widespread commercialization of culture, has been cast adrift, without any firm basis for judgments. Publications and institutions to support serious criticism, in this view, either no longer exist or are few in number.

Critics today, it is also claimed, are too cozy behind the ivied walls of academe, content to employ a prose style that is decipherable only to a handful of the cognoscenti. The deadly dive of university critics into the shallow depths of popular culture, moreover, reveals the unwillingness of these critics to uphold standards. Even if the reasons offered are contradictory, these Jeremiahs huddle around their sad conclusion that serious cultural criticism has fallen into a morass of petty bickering and bloated reputations.

Cotkin goes on to say that this is a revisionist perspective of cultural history and that there was no Shangri-La of Literary Criticism, that there was in-fighting, vanity and narcissism even among the ranks of High Modernists perched atop the Ivory Tower.

Cotkin was inspired by an essay by Sven Birkerts, a well-known literary critic who in 2004 (when Cotkin wrote his piece) saw occasion to once again mourn the decline of intellectual discourse after the Partisan Review closed up shop.

But even the venerable Partisan Review wasn't immune to displays of self-aggrandizement, folly and cravenness, as Cotkin points out. Besides, there are many outlets for intellectual inquiry today. Cotkin's list includes Reason magazine, the New York Review of Books, the Claremont Review and the Boston Review.

Remember, Cotkin's piece was written three years ago. Much has happened since then. I would add to that list NextBook, Prospect, SignandSight and the many thoughtful and skillfully managed blogs we read -- PBS's MediaShift, Culture Lust, Theatre Ideas, Bookforum's Shelf Space et al. and, of course, Artsjournal's blogs.

Back to my original point: generational differences. Where do you think this rhetoric of declension leaves those of us in Gen X and Gen Y when we have to deal with this gargantuan belief that everything's going to tarnation?

What's left for us when we're persuaded to believe that the reason the theater, the orchestra and the dance company are struggling is that people don't support them like they used to?

What's left to believe in if all of this is true?

I hope it's clear that I'm being somewhat hyperbolic and somewhat facetious when I say all this. But I stress -- somewhat. With the media being run by boomers and arts groups being consumed with the theory of "get-'em-while-they're-young," I get a sense that Jennifer was onto something when she said it looks like Gen X/Y is being written off.

I also get a sense that boomers don't understand how Gen X/Y perceives the world. Moreover, many boomers, as they get older and more entrenched in their ways, seem practically drunk on the sweet nectar of nostalgia for having been the generation that "changed the world."

Elton John provided a nice example of the latter when he expressed his wish for the internet to be shut down because it keeps "people from going out and being with each other, creating stuff."

"Instead they sit at home and make their own records, which is sometimes OK but it doesn't bode well for long-term artistic vision.

"It's just a means to an end.

"We're talking about things that are going to change the world and change the way people listen to music and that's not going to happen with people blogging on the internet.

"I mean, get out there -- communicate.

"Hopefully the next movement in music will tear down the internet.

"Let's get out in the streets and march and protest instead of sitting at home and blogging.

"I do think it would be an incredible experiment to shut down the whole internet for five years and see what sort of art is produced over that span.

"There's too much technology available."

Note that to be considered a social force one must "march and protest." I don't know about anyone else, but I'm really tired of this. It's as if Elton and his ilk didn't know that the social upheavals of the 1960s were fought on the op-ed pages as much as on the streets. And they used words in newspapers, just like bloggers do.

The reason Boomers don't understand the way Gen X/Y thinks has to do with, oddly enough, the Grammys. This year, we put a wire story about the results right on the front page the day after the Grammys. The reason we did that has a lot to do with the cynicism at the heart of the rhetoric of declension: Pop music is what those kids like. Put it on the front page and maybe they'll read the paper. Maybe if we keep putting all this celebrity crap out there, these whipper-snappers and their confusing ways might step up like they're supposed to and be the new generation of newspaper readers.

Well, they don't read the newspaper. What's the point? They already knew who won and who lost the Grammys hours before the press run that night. There are those who might argue that popular music is popular for a reason: lots of people buy the CDs of these artists. But that doesn't take into account the fact that Gen X/Y are not buying CDs. They're downloading. The people buying CDs are boomers, mostly.

And they're already subscribing to the paper.

August 22, 2007 1:14 AM | | Comments (9)

I'll jump into a conversation kicked off by Bridgette last week and continued by numerous commenters and John. Bridgette wrote about how authenticity is one of the hallmarks of what Gen X arts audiences are seeking. She used a Michigan theater company as an example of a group that's pursuing a successful and artistically worthwhile strategy.

Commenter Mike Boehm and my co-blogger John raised the issue of trying to reach Gen X through their kids. Is this strategy for arts marketing advisable? Does it even work?

To be blunt, my own take is that writing off Gen X adults in favor of their kids is misguided and dispiriting. First off, the obvious: not all X-ers have kids. I'm one of 'em. So trying to reach me about the arts, a political issue, what-have-you "through my kids" is a non-starter.

Second, I think John was on to something when he stated: "Is it possible that at some point, given the intensity of arts marketing being leveled at children, the arts will be something that's considered child's play? Something kids do, not adults?"

I don't know if things are quite that drastic (and John's not necessarily saying that they are), but I do worry about a view of art as primarily something enriching for kids, an educational nugget to be digested along with long division or something to boost ACT scores in a testing-focused climate.

Don't get me wrong: I am not knocking arts education for kids. I am all for it, both in the schools and through whatever parents may have the means to provide after school. I am a product of public schools and, although I don't recall much being available on the elementary level, I did have access to art and music in middle school and ceramics and creative writing electives in high school. Through my parents, I had ten years of private piano lessons, plus a year or two of ceramics courses at the local art museum. As a middle-class kid, I was lucky in that regard. I know how important these formative experiences can be.

Yet I believe the arts--both witnessing and doing--are equally important to adults. That is why I believe we can't write off a segment of the adult population as arts participants. One figure whose ideas guide me in that area is Wisconsin's Robert E. Gard.

Gard (1910-1992) was the author of The Arts in the Small Community, Grassroots Theater: A Search for Regional Arts in America and dozens of other books. He championed rural arts and the belief that everyone had something to contribute; all people's lives could be enriched by making art. He argued that all people had a right to create art that was an expression of themselves and their places.

Gard was the recipient of the first rural arts development grant from the NEA. His project nearly wasn't funded, but no less than Leonard Bernstein, who was sitting on the review panel, argued in favor of it and the rest is history. What Gard started in 1964, the School of the Arts at Rhinelander--a summer art school for adults in a northern Wisconsin community of about 8,000--continues to this day.

August 21, 2007 6:00 AM | | Comments (6)

I spoke today to the director of the Beaufort Performing Arts Center in Beaufort, S.C. Among other things, we talked about how she has succeeded in attracting Gen X and Gen Y to the theater by appealing to their parental instincts.

"I get them through their kids," she said.

She organizes story hours, puppet theaters, interactive performance art. And every fall there's the ubiquitous "Nutcracker" in which every child and her cousin auditions for and gets a part, packing the theater with proud parents.

This venue director is not alone in targeting children. Every major arts organization in the Savannah region devotes significant resources to attracting children -- by bussing kids in for concerts, cordoning off a section of an art museum for kids or partnering with other arts groups to provide arts activities.

The thinking is: Get them hooked on the arts when they're young. But does this really work? Or is this a desperate strategy of an arts community in desperate need of a strategy that works in the face of the millions of other cultural options being offered to children? Is art special enough to warrant special attention? And if it is, can this attention be maintained into adulthood?

Is it possible that at some point, given the intensity of arts marketing being leveled at children, the arts will be something that's considered child's play? Something kids do, not adults?

Our man in LA, Mike Boehm, had some thoughts on this and other things that he shared with us last week. I wanted to bring them out front for further discussion. -- J.S.

Re: marketing to Gen X: The common wisdom based on RAND and other studies is that when it comes to creating interest in the arts, if you don't catch 'em young, you lose 'em for good.

Obviously, it's important to optimize marketing approaches to the cohort born in the '60s and '70s, but this is a generation that, generally speaking, got the short end of the stick in its arts education, for assorted economic and political reasons.

They also came up at a time when the arts were being culturally marginalized. The Boomer vibe was influenced by the highbrow Kennedy "Camelot" aura and a celebrity culture in which Richard Burton and Elizabeth Taylor were the ultimate Hollywood glamour couple, and Beverly Sills, Leonard Bernstein and Nureyev and Baryshnikov were mass-media figures; the latter two not just great dancers, but Cold War heroes.

Gen X, to the extent it was paying attention, saw political leaders make a whipping boy of the NEA; they were sold Hollywood glamour couples who were not apt to lead them to Tennessee Williams and Edward Albee, and instead of the Beatles' omnivorous example of prominently using classical influences to enrich rock, they grew up with the pointless musical debate over what it meant to be a true punk, and whether music that didn't hew to some outsider Independent/Alternative party line could ever be "authentic."

I don't think anything changed in the '80s and '90s that might have enabled kids born in those decades to get a contact high on the arts by inhaling the cultural air around them.

So: should the institutional arts groups practice a form of triage and put everything into educational programming and educational lobbying, in hopes of winning the children born in this century and the tail end of the last one?

Under this strategy, they could rope in some Gen X and Y parents via their kids, all the while praying that the Boomers and pre-Boomers now sustaining the arts as audiences and donors will live long and prosper.

Or do they spread scarce resources more thinly and target the Xs and Ys along with the oldsters and the kiddies? Not an easy call.

Overhanging the arts, along with society as a whole, is the question of whether America's leaders will tackle the health care and Social Security crises in a way that avoids all-out generational warfare and reaches an equitable and economically sustainable solution.

Suggestion for cultivating Gen X-Y and doing a worthy deed: lifetime passes to museums, theaters and concert halls for Iraq War veterans. Maybe lifetime newspaper subscriptions, too.

August 20, 2007 3:30 PM | | Comments (4)

Art is doing philosophy? . . .

An interesting set of ideas about art, its context and its relation to philosophy comes from the American philosopher and art critic Arthur Danto. What makes something a work of art is not, says Danto, to be found by looking at its obvious properties. Danto believes that what "makes the difference between a Brillo box and a work of art consisting of a Brillo box is a certain theory of art. It is the theory that takes it up into the world of art, and keeps it from collapsing into the real object which it is."

What are we, however, doing when we ask about the difference between a Brillo box in a supermarket and a Brillo box in an art gallery? Danto's answer is that we are asking a philosophical question. Art now prompts us to do philosophy. Much of art today is about boundary testing of 'art': "Can this object be considered art?", "What is art?" Danto argues that art is doing philosophy; art is collapsing into philosophy.

G.W.F. Hegel in the nineteenth century declared that art would in future no longer be a predominant mode of expression for human beings. Danto seems to agree: Art has nothing left to do. It has run itself out, and has as its only project a philosophical one, the definition of art. And that would much better be left to the philosophers.

From "Aesthetics and Philosophy: A Match Made in Heaven?" by Anja Steinbauer for the September/October 2006 issue of Philosophy Now

August 20, 2007 11:50 AM | | Comments (1)

The most reasoned discussion of blogging's value yet . . .

But [bloggers] are, more often than not, trademarks of the kind of journalism that makes a difference. And if there is anything bloggers want more than an audience, it's knowing they are making a difference in politics. They are, to give them their due, changing what is euphemistically called the national "conversation." But what is the nature of that change? Does it deepen our understanding? Does it broaden our perspective?

From "All the noise that fits: The hard-line opinions on weblogs are no substitute for the patient fact-finding of reporters" by Michael Skube for the Los Angeles Times

August 20, 2007 8:15 AM | | Comments (1)

A new critical consensus? . . .

The democratization of criticism -- as in the Amazon system of readers' evaluating books -- is a messy affair, as democracy must be. But the solution to the problems of criticism in the present are best not discovered in the musty basements of nostalgia and sentiment for the cultural criticism of a half-century gone. Rather the solution is to recognize, as John Dewey did almost a century ago, that the problems of democracy demand more democracy (against the corporatization of culture), less nostalgia for a golden age that never was, and a spirit of openness to what is new and invigorating in our culture.

From "The Democratization of Cultural Criticism" by George Cotkin for the Chronicle of Higher Education.

August 20, 2007 3:32 AM | | Comments (0)

This past week I've spent a lot of time researching for two books I'm working on. Both are college textbooks, one on hospitality housekeeping management and the other on the world of spas.

An article in Lodging Hospitality on "Reaching Out to Generation X" talks about marketing to Generation X:

Turns out they're married and have kids and are pushing 40. So even though they remain high on style and don't know where (let alone how) to draw the line between business and leisure, they've settled down and are looking to nest wherever they travel. That drive reflects the times they came of age, when many of their Boomer parents divorced and many of their Boomer dads lost their jobs. No wonder they value community, connection and connectivity. No wonder it's de rigueur to appeal to their need for the total experience. No wonder product-driven marketing is so 2000.

Community, connection, and connectivity. And they're talking about business marketing and not art?

Later, the article says:

Hospitality marketing rarely addresses Gen X's complicated approach, Rach suggests. It also doesn't take into account this demographic's fine-tuned, ironic sense of humor. "They don't want advertising that tries to fool them or promises things that can't come from that product," she says, calling a recent campaign for Sprite that ordered, Obey your thirst, a successful marketing effort. "The idea was that if you were thirsty you needed to drink something, not that by drinking it you would become something.

"Marketing in this industry tends to be extraordinarily product-focused, when what this generation is looking for is an experience based upon relationships," she says.

As one of those Gen X members who is pushing 40, I found this resonated. I have a finely tuned bullshit radar. Marketing that lies simply doesn't work. My generation grew up on commercials and learned early on that our toys didn't live up to the hype. What we look for really is about connection and community. We want to belong and we want to have meaningful, authentic relationships.

Authenticity is a word they don't address, but one I would add to the list. Neither art nor advertising is an excuse for lies. If you're not real, if you're not authentic, then you're not likely to get our ear for very long. Once we figure out you're fake, you're done.

We have had several new theater groups spring up in the past ten years in Lansing, most of them founded by members of Generation X or those on the border. The ones that have succeeded are those that are committed to the ideas of connection and authenticity.

This past season I was wowed by a production of Hedwig and the Angry Inch done by a relatively new group called Peppermint Creek Theatre Company. Later, when I was talking to someone about the show, they expressed the opinion that there was a generational difference in those who liked the show. I'm starting to come around to that view.

In many ways, it comes down to the idea of connections. Why would I, a Midwestern, white, middle-aged mom who grew up in the suburbs feel such a deep, strong connection to a character like Hedwig? Perhaps it is because I ignored the character's self-created hype and looked at the heart of the person being portrayed. Sure, I've never had a botched sex change operation, but I (and every other person on this planet) have struggled with issues of identity and self-definition. I've taken extreme actions to escape what seemed to be an inescapable situation only to learn that a little patience would have mitigated the problem. I've been disappointed that others don't see me the way I see me. These are the things that Hedwig is about far more than the bitterness a transgendered performer feels about having been left with a mutilated penis.

Hedwig is a powerful musical because it makes those connections with individuals. It doesn't arrogantly preach to the audience, telling them that they're too obtuse to get it. If you don't like us, don't ask us to join in the experience of your art.

The show's director and artistic director have a deep respect for their audience, a respect that shows in every production they do. They're not preaching to an audience they think is hateful and stupid. They're inviting people they respect to engage in a dialogue about the world we live in.

It's not about the product. It's about the relationship.

Take a look at their season last year. They did The Pillowman, The Goat, or Who Is Sylvia?, 9 Parts of Desire, Hedwig and the Angry Inch, and The Last Days of Judas Iscariot. Between them, the plays take on the torture of children, bestiality, the war in Iraq, transgender issues and theology.

It could have been disastrous. Had they been preaching from a pulpit with a contemptuously dismissive attitude, it would have been a failure. The audience would have soon figured it out and stayed away. Instead, Peppermint Creek begins with respect, a respect that is obvious in every interaction with them. They trust their audience to handle challenging topics and don't try to make a controversy out of every brave choice they make.

The result is that their houses were packed last year and they reaped multiple awards from everyone who gave them out. Despite the subject matter of their plays, I was never shocked at what I saw. I was engaged. I saw things that mattered to me and that stayed with me long after the play was over.

Theater has much to offer that Generation X wants. So why aren't they coming to the theater? Perhaps there needs to be a change in marketing focus for the arts world as well. Quit hyping the product. Tell us about your true value--the connections that we will make with each other and with the art form. Don't tell us what we'll get out of theater, because you have no way of knowing. Tell us, instead, that there will be a unique experience every night because we will be a part of the creation. Invite us to come engage our brains and hearts in our community and trust us.

Theater is unique in what it offers. If it can find a way to focus on these connections rather than the product, who knows what revitalization it might find?

It's a topic for another blog entry, but a case might even be made that this is why there is such a groundswell of theater throughout the country even while it stagnates in its traditional homes. Could it be because local theater is communicating the message of connection and community rather than just the really great product it's selling on its stage?

August 16, 2007 2:22 PM | | Comments (5)

The times they are a-changin' . . .

From Tim Adams' interview with James Watson, the co-discoverer of the double helix, the molecular form that DNA takes. Adams wrote the piece for the London Guardian last month, updating the debate C.P. Snow framed in his classic essay on the "Two Cultures," one literary-intellectual and one scientific.

I asked [James Watson] how he thought the climate of scientific research had changed since he made his fateful discovery of the structure of life in 1953. As ever, he came at the question from an unusual angle. He doubted, he said, that in today's world, he and Francis Crick would ever have had their Eureka moment.

'I recently went to my staircase at Clare College, Cambridge and there were women there!' he said, with an enormous measure of retrospective sexual frustration. 'There have been a lot of convincing studies recently about the loss of productivity in the Western male. It may be that entertainment culture now is so engaging that it keeps people satisfied. We didn't have that. Science was much more fun than listening to the radio. When you are 16 or 17 and in that inherently semi-lonely period when you are deciding whether to be an intellectual, many now don't bother.'

Watson raised an eyebrow, fixed me again with a look. 'What you have instead are characters out of Nick Hornby's very funny books, who channel their intellect in pop culture. The hopeless male.'

As James Watson knows perhaps more clearly than anyone alive, biology works in mysterious ways.

August 16, 2007 11:19 AM | | Comments (0)

The future of old-school journalism . . .

Most of what matters about the coming media age is already being decided outside of traditional newsrooms, on YouTube and countless other Web sites, or in the advertising agencies that calibrate Google search results with the mouse-clicking habits of young consumers. Perhaps Google or its ilk will find it profitable or desirable to fund independent, expert foreign correspondence; or to support investigations of corporate and government power; or to train the sort of journalists who feel free to call out their employers' pay packages on the proverbial front page, although there are no signs of this yet. Or perhaps the Sulzbergers and the Grahams can adapt their public trusts successfully to the new technologies. And even if their efforts fail to become profitable these families might still preserve their newsrooms' independence by converting them into nonprofit foundations, similar to what the Poynter family did with the St. Petersburg Times, in Florida.

From "Read All About It," the comment in this week's New Yorker on the takeover of the Wall Street Journal by media baron Rupert Murdoch by Steve Coll

August 16, 2007 7:31 AM | | Comments (0)

From Lori Ortiz, a dance critic and new reader of Flyover. She comments here on yesterday's post ("Hyperlocalism et al.: a note of caution") about the dubious nature of "citizen journalism" and "reader reviews."

Her last sentence really struck a chord with me.

Company blogs promoting their product over others, and screen names for specially interested groups and individuals, are putting holes in public trust.

Have you ever checked out "reviews" of hotels? It's too easy for someone, or more than one, to plant a bug in the consumer's bonnet about the competition.

A new program, just out, tracks Wikipedia sources. That should help a bit. The best defense is to seek many sources, and credentials.

The responsibility falls on the consumer to look for the truth. [emphasis added]

August 16, 2007 6:50 AM | | Comments (1)

Another take on the issue from nearly four years ago . . .

Are bloggers journalists? Will they soon replace newspapers?

The best answer to those two questions is: those are two really dumb questions; enough hot air has been expended in their name already.

From "Emerging Alternatives: Blogworld, the New Amateur Journalists Weigh In" by Matt Welch for the Columbia Journalism Review September/October 2003.

August 15, 2007 3:00 PM | | Comments (0)

Can readers really tell the difference? Or does this devalue journalism? . . .

Once again, good thinking from our man in Lexington, Ky., Rich Copley, in reaction to this morning's post about the potential pitfalls of hyperlocalism, citizen journalism and many of the other current trends in newspapering.

I actually got into a conversation with a musician about this last week. He was saying that his big concern with the Internet wasn't so much that people were stealing his music but that with services like MySpace and YouTube, "anyone can be as legit as anyone else. You can watch someone's video online, but that doesn't tell you whether they can stand up in front of an audience and deliver the same thing live," or whether it's been digitally manipulated in some way.

I thought, "I know exactly what you are talking about," because the whole citizen journalist-reader review thing makes it harder for readers to discern between a trained professional journalist and the person who just posts something on a lark or to advance an agenda. I wish papers would be more careful not to compromise their good brand names, but that doesn't seem to be happening.

August 15, 2007 11:14 AM | | Comments (1)

Perhaps it's time the case was made for a new definition . . .

It has been on the endangered list for ages, as management experts and talking heads pecked at it. ... "Culture" used to be the word used to describe activities such as listening to Bach or going to the theatre and art galleries. ... Now when you see the word "culture" it is generally being used as an all-purpose insult for people who have fallen behind the times and are reluctant to embrace new practices introduced by boardroom barbarians or government gorillas. "Management culture", "institutional culture", "workplace culture", "four-wheel-drive culture", "a culture of neglect"... In the past week, I have heard governments being accused of having "a culture of resisting disclosure of information", welfare authorities said to have "a culture of indifference" that affects vulnerable children, and an "anti-adoption culture" standing in the way of people wanting to adopt from overseas.

Culture has become what you have when you have a problem. ... "Culture" has become a virus, infecting government, union and PR statements and emails sent around by management, HR and consultants whose earnings are doubtless linked to the number of times they use it negatively in their final reports.

From "The respectable word that fell in with the wrong crowd" by Jenny Tabakoff for the Sydney (Australia) Morning News.

August 15, 2007 8:03 AM | | Comments (0)

Art is being separated from the people who make it. News is being separated from the people who report it. The consumer has control; she is in charge; she doesn't really care where her media -- news, music, movies -- comes from.

Just as long as she can get it.

Case in point is the most recent report regarding illicit downloading. This study from Europe finds that teenagers there are downloading songs illegally because everyone's doing it, even their parents. The more people who do it, the more people will do it. That's why, despite being aware of the potential for legal repercussions, teenagers have little fear. In fact, they say they'll likely do more.

I've been drawing a lot of parallels between the music industry's woes and the issues facing the newspaper industry. The reason is that the changes taking place in one seem to mirror those in the other -- and the changes center on the consumer.

As I noted last week, a fundamental component of that change is a shift in position, in the case of newspapering, of the reader from a place of passive consumption to active participation. It seems any effort to discover a new business model to save an ailing newspaper industry will have to address this shift in the reader's position.

When USA Today launched its new website earlier this year, it addressed this issue head on by aiming to "create a community around the news." In this way, USA Today was structuring itself similarly to social networking websites like MySpace and Facebook and Bebo, which give users tools for active participation.

"Using the new features, users can see other news sources directly on the USA Today site; see others readers' reactions to stories; recommend content and comments to each other; interact using comments and in public forums, upload digital photographs to the site; write arts and culture reviews of their own; and interact more with the newspaper's staff," according to the paper's PR in March.

By giving users so many ways of interacting with the news, the newspaper and each other, the hope is to drive traffic toward the website and keep it there. The thinking is that the information and virtual experience will be so valuable that readers will endure advertising, as they do with TV, if they want to get at the goodies they value.

It sounds like a great idea and goodness knows we need good ideas right now. But I'm skeptical. As I've mentioned before, the new technologies of Web 2.0 are having the same effect on us right now as the phonograph did in the early 20th century: Just as the phonograph changed the way we hear music, Web 2.0 is changing the way readers use the news. There is a fundamental change in consciousness underway.

As the new USA Today website makes clear, there are already serious attempts in the works to harness that change. And it may turn out to be an emulated model. Indeed, if it catches on, "creating a community around the news" will be a major shift from the way 20th Century Mass Media was structured.

Is it a good shift? I don't know. But one thing's for sure: Just as capital will naturally try to consolidate, capital will also try to get something for nothing.

What do I mean? I mean those in control of most of the capital will naturally try to seize a non-marketplace idea and press it into the service of the marketplace.

Let me put it this way: The spirit that animates believers in Wikipedia is a good spirit in essence. That spirit wants to see information be free, democratic, accessible and used in ways that benefit the lives of the countless people who most need it.

That's a non-marketplace idea and it's a great one. The reality, however, is human nature. If anyone can access Wikipedia, then anyone can commit whatever acts of dishonesty, slander, et al. that they wish (at least until it's detected and corrected).

Wikipedia has been compared to the Encyclopedia Britannica. Some say it's just as accurate. Some don't. The vital difference, however, is mediation: One has the benefit of experts in their fields sifting through information, making it quality stuff.

The question isn't about accuracy, really, because as I said before Wikipedia will fix acts of dishonesty. The real question is more fundamental: Which do you trust?

I think this "creating a community around the news" thing is interesting, but I fear it could create a situation like Britannica vs. Wikipedia. (I know it wouldn't be exactly like this; no one can vandalize news reports by USA Today staff like they can on Wikipedia; I'm just trying to make a point here, jeesh.) Will "communities around the news" challenge our trust, too?

I'm also worried that newspaper companies looking to do journalism on the cheap -- and we know there are plenty of those out there -- may take the otherwise positive notion of "creating community," a notion that appeals to our innate American sense of fairness and equality, and use it for far less noble ends.

The buzz words now are hyperlocalism, citizen journalism, iReporting and so on, but mostly hyperlocalism and citizen journalism. On CNN, they're calling it "iReporting" or something like that. Anyway, it sounds great, so empowering to the community, but guess what? It's also really cheap.

Cheap is the magic word to boardroom VPs of publicly traded news companies looking for a rationale to acquire information, images and video that previously required paying a professional journalist a salary and benefits to obtain.

Now, backed up by the ideology of "hyperlocalism" et al., execs can talk up the case for more "reader-generated content" while talking down the fact that it's free of charge. We arts journalists should be concerned. We're already feeling pretty insecure about our relative value to the news hole. Well, with the new USA Today website, users can "write arts and culture reviews of their own."

Cheap is as cheap does but that don't matter when shareholders come a-calling.

And while management is focusing on getting more readers involved in the newspaper (everyone is blogging, blogging, blogging), they're failing to provide the human and financial resources needed to give readers something to read.

In my newsroom alone, the staff has shrunk by half since I arrived. Many have left for other newsrooms, some have been fired, others have left the field entirely. One reporter left to head a nonprofit in education. Another to be a PR flack for the city of Savannah. One is taking a post with a research foundation. And another, a veteran sportswriter, resigned yesterday to work for the local TV station.

Local TV news! That's got to be a new low.

Readers are taking notice, too. We have this feature called the Vox Populi. Readers can call in to voice their opinions anonymously. A recent caller complained about the lack of reporting on local news and issues in the Savannah Morning News: "Why do you have a multimillion dollar building but only run stories from the AP?"

I guess I'd better get used to repeating myself, but here it is again: While I understand why some are celebrating the potential of emerging technologies, I'm still skeptical. We don't live and work in a vacuum. While the technology, like the phonograph, may change our consciousness, we still have to make a living.

I don't mean to sound alarmist or even Marxist. I just think there's reason to worry as long as profit-driven, growth-oriented companies run newspapers. They are going to do what's best for shareholders, not journalism and not for readers, as John S. Carroll, the former editor of the LA Times, noted in a speech cited in Russell Baker's essay on the state of newspapering.

Bottom line: the bottom line trumps all, and if we're not careful amid our celebration of the potential benefits of emerging technologies on arts journalism, we may, like our friends the musicians, end up finding out later on that no one's getting paid.

August 15, 2007 6:22 AM | | Comments (3)

I've never understood those music fans who have an obsessive need to know their idol's favorite color or birth date. I adore Rufus Wainwright and Andrew Bird, but I don't care much about their personal lives. Their music is what matters to me. Similarly, I seldom seek out information on the personal backgrounds of writers I admire, but this profile of Whitney Gould, from the current issue of Milwaukee Magazine, caught my attention.

Gould is the architecture critic for the Milwaukee Journal Sentinel. It feels good simply to type that because so few papers have architecture critics, and Gould is an especially fine one. A Wisconsin native, she writes with intelligence, passion and a clear point of view. First and foremost, I think of Gould as someone who demands authenticity and honesty in architecture. She doesn't despise faux-historical buildings out of some middlebrow sense that they're ticky-tacky; she objects because virtually all architecture worth its salt is, in some way, genuinely of the time and place in which it was built.

The Milwaukee Magazine profile, despite some odd word choices ("grandmotherly," anyone?), is well worth reading. Gould's family background and entry into journalism are intriguing.

And for a taste of Whitney Gould's own writing, a recent column called "We should care about good design" directly addresses why the public should be invested in the built environment surrounding them. Indirectly, it also makes a case for architecture criticism, since people in Gould's profession have a plum opportunity to raise issues and shape the debate. And through periodic online chats, Gould engages in real give-and-take with the public.

Gould closes her column with a quote from Winston Churchill ("We shape our buildings; thereafter, our buildings shape us") and this observation of her own: "[Buildings] affect the quality of life in our neighborhoods; they establish the identity of our cities; they color our work days. If we don't make it our business to care about such things, we will deserve the awful results."

I'd argue that in smaller to mid-size cities (like Madison, where I live), good architecture is even more crucial since a single major project has a bigger impact on the overall look and feel of the city.

Since Flyover (namely, John) has been in a quotin' mood lately, I'll throw out a favorite of my own, something I've had tacked near my computer for ages:

"Our culture is first of all an urban one, the city the place of our history and our social life--factors that have impressed themselves inextricably upon the face of the houses, but also in the structures of the streets and plazas." (Christoph Schreier, from the exhibition catalog "Thomas Struth: Strassen--Fotografie 1976 bis 1995)

This idea of the city as the locus of our history in a very physical way, with that history literally written upon its face, has long affected me, and it's why I share Gould's view that good architecture matters--as does thoughtful writing about architecture.

(Note: The Schreier translation from the German is mine. Here's the original for my fellow Germanophiles: "Denn unsere Kultur ist zuallererst eine städtische, die Stadt der Ort unserer Geschichte, unseres soziales Leben, Faktoren, die sich unauslöslich in das Gesicht der Häuser, aber auch in die Strukturen der Strassen und Plätze eingeprägt haben.")

August 14, 2007 6:00 AM | | Comments (1)

Why historians, and the rest of us, avoid Big-C Culture . . .

What happened to [historians'] assumption that cultural history was crucial to comprehending America, past and present? Basically, the post-World War II conceptions of what constituted both culture and history crumbled in the 1970s. The civil-rights and women's movements, together with the more-relaxed immigration laws that inspired a new wave of ethnic migration, largely from Latin America and Asia, forced historians to ask: Whose culture? Whose history? The answers led not only to a sharper focus on the social history of those groups previously neglected by scholars and teachers, but also to an anthropological definition of culture [my italics]. What counted now was the culture of daily life -- how people behaved in saloons and department stores, what kinds of clothes and cosmetics they bought, whether they were active or passive when they listened to the radio, and above all how they were manipulated by the ideology of consumerism . . .

. . . Traditional cultural history was clearly under assault by the 1970s and 1980s. But, ironically, no cluster of scholars did more to undermine the field than the cultural historians themselves [mine]. While [Warren] Susman continued to highlight cultural issues in his collection of essays, Culture as History: The Transformation of American Society in the Twentieth Century (Pantheon, 1984), [Christopher] Lasch became more interested in psychology and social criticism, as in such best-selling books as The Culture of Narcissism: American Life in an Age of Diminishing Expectations (Norton, 1978).

Then [Lawrence] Levine published his most influential book, Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America (Harvard University Press, 1988). He portrayed the high-cultural venues of the late 19th century -- theaters, opera houses, concert halls, libraries, and art museums -- as sanctuaries for the rich. Having failed to elevate the tastes of the masses, who were seduced by disreputable entertainment like vaudeville and the movies, the wealthy (according to Levine) escaped into their own luxurious asylums, shielding themselves from the chaotic and alien babble in the streets. Behind closed doors, they resolved to serve as the sentinels of high culture [italics mine], guarding the fortress of art, literature, and music. Thus, for Levine, high culture became less a shared possession of the entire society than a refuge for snobs . . .

. . . Levine surely did not intend to turn his colleagues and students away from cultural history. Indeed, he continued to write about American culture throughout his career. But Highbrow/Lowbrow implied that high culture was inherently esoteric, class-bound, and somehow "undemocratic" [my italics]-- in short, antithetical to the values social historians championed.

Yet if high culture seemed elitist in the eyes of many American historians, popular culture was insufferably commercial [italics mine] -- and therefore equally distasteful as a subject of study. In the earliest days of cinema, as some historians noted, movies had been aimed at an immigrant, working-class audience. But soon the moguls took over (though they were immigrants, too) and converted an egalitarian art form into a money-making machine. Similarly jazz and the blues were once the creations of African-American musicians and performers with deep ties to the black communities in Chicago and New Orleans. Then white record producers, promoters, and agents transformed an authentic folk music into just another big business. So to write or teach about popular entertainment meant that you wound up exploring not the history of culture but the history of capitalism [my italics].

From Richard Pells' "History Descending a Staircase: American Historians and American Culture" for the Aug. 3 edition of The Chronicle of Higher Education.

August 14, 2007 1:26 AM | | Comments (1)

Just what you'd expect from an artist, right? . . .

According to a column in Sunday's Pittsburgh Tribune-Review, a local art teacher named Jaison Biagini lost his job after winning a radio contest. The contest was hosted by the satellite radio program called "Bubba the Love Sponge," whose host apparently enjoys spraying carburetor cleaner on penises (I'm just repeating what I've read). Biagini's prize was a trip to Tampa, where, as an artist and teacher of art, he naturally wanted to visit the Salvador Dali Museum. What got him in trouble, though, was his escort: a woman named Akira, an online porn actress.

The oh-so-clever columnist, Eric Heyl:

For an online hussy, Akira certainly has varied interests. Visit her site (Google it yourself, this is a family newspaper), and you'll discover there is more to her personality than all the bawdy stuff.

"My favorite things in life are weed, the Internet, getting naked and just being downright bad," she states on the site. There. Doesn't that fracture the stereotype of the typical adult Internet starlet?

Monessen Superintendent Cynthia Chelen acknowledged Friday that district officials "received feedback" from the community regarding Biagini's less-than-lurid encounter with Akira, portions of which were broadcast on Bubba's show.

Some people apparently felt it wasn't appropriate for a high school teacher to keep such company.

All this sounds pretty funny until we learn more about Biagini.

In a brief interview, Biagini, confined to a wheelchair after an accident several years ago, denied he was forced out. He said he has contemplated retirement for two years.

Asked why he is leaving now, after 14 years with the district, he said, "I have severe arthritis in my shoulders. My range of motion is limited. I thought it might get better over the summer, but it hasn't."

And then this, the pithy and downright mean-spirited ending:

Despite Biagini's explanation, you can't help wondering if he would still be teaching had he preferred the work of someone other than Dali.

The outcome might have been decidedly different, had he just gone to the Andy Warhol Museum [located in Pittsburgh].

Man can't walk. Dude's riddled with disease. Likely his dick don't work. But ridiculing him publicly is OK. He brought it on himself for wanting companionship and the feeling of being wanted by an attractive woman who presumably gave him her undivided attention.

Getting fired doesn't inspire in the columnist any sense of injustice either. For what reason? This is just the kind of degenerate bohemian godless hippie shit you'd expect from some kind of degenerate bohemian godless hippie artist. We don't care if he's crippled, endures chronic pain, suffers from crushing loneliness, depression and despair. Weirdos deserve what they get.

Such is the kind of understanding of the arts we face here in the American Outback.

August 13, 2007 9:43 AM | | Comments (2)

A sentiment still alive today? . . .

"Art is from the outset naturally not for the people. But one wants to force it to be. Everyone is supposed to have their say. For the new bliss consists of the right to speak: free speech! Oh God!"

Arnold Schoenberg, quoted in Alex Ross's book, "The Rest Is Noise," due out in October.

August 13, 2007 7:46 AM | | Comments (0)
"That does bring up a subject I wonder if you guys might address on Flyover: For journalists who do blog, what kind of ground rules and expectations have publications given them, and how have they worked blogging into their work schedules?"

That's from Rich Copley, the copious culture writer for the Lexington Herald-Leader in Kentucky. He was responding to last week's post about the journalistic value of blogging.

His question -- a very pragmatic one, I might add, pragmatism being a rare topic of discussion here on Flyover -- has come up a lot in my newsroom and I'm sure in many others.

When any new means of communication arises, the natural question, for those of us who believe in and comport ourselves according to a relatively strict set of ethical guidelines, is: "What are the guidelines?"

Furthermore, what are the ethics of blogging, the legal strictures of blogging, the tone of voice, the appropriate subject matter? More fundamentally, perhaps, is the question of objectivity: As cultural journalists, to what degree can we assert professional points of view, knowledgeable perspective? To what degree does the critic's voice live among the impartial fact-finding of the journalist? This question obviously applies to writing for print, but is it different for blogging?

I agree with Rich that Flyover would be a good place to sort out these questions. Or at least to figure out what sorts of questions to sort out. Please let us know what you think. After we've gotten a number of responses, I'll put them all together in one post for easy reference. -- J.S.

August 13, 2007 6:16 AM | | Comments (5)

Who knew he could be so lighthearted . . .

O O O O that Shakespeherian Rag--

It's so elegant

So intelligent

From T.S. Eliot's "The Waste Land," a snippet of popular song recalled amid a wife's mental breakdown. The levity doesn't last long. She accuses him of never talking to her:

'Speak to me. Why do you never speak? Speak.

'What are you thinking of? What thinking? What?

'I never know what you are thinking. Think.'

His unspoken reply:

I think we are in rats' alley

Where the dead men lost their bones.

August 12, 2007 6:17 AM | | Comments (1)
Does anyone else find it ironic that we're conducting a discussion of the trustworthiness of blogs by means of a blog? If I'm to believe some of the statements I see here, I shouldn't believe the statements I see here.

It's also ironic that I find myself playing the role of devil's advocate/apologist for blogs, since I very seldom look at blogs for the kind of journalism -- or even the kind of topics -- that I would expect to find in a newspaper. About the only significant exception was during the early war period, when I believe I learned more about the situation in Afghanistan and Iraq through firsthand blogs than through the mainstream US media.

Personally, I suspect that the reports of the death of the Press are exaggerated, but I certainly share Mike Boehm's concerns. I would suggest that "How much is information worth?" is not the only question. We need to also ask, "How will we know?" Trust is central. Without claiming these questions have been resolved by any means, I note that many people buy freely from web sites and ebay who wouldn't have dreamed of it five or ten years ago.

Steve Durbin

Dear Steve,
You've noted this a couple of times, so I wanted to let you know we're listening. You're right, it is ironic. Or it would be if we were talking about all bloggers. But we're not.

Many bloggers, for instance Terry Teachout and Alex Ross, are as good at what they do as anyone who works in conventional newspapers. There are others, as Teachout has pointed out numerous times, who are not part of the media machine, but who nevertheless offer insight and careful commentary on music, visual arts, fiction, etc.

The bloggers Ivins had in mind, and I would guess that Mike Boehm takes issue with, are the armchair journalists who opine from their keyboards but don't do, and wouldn't know how to do, the work of shoe-leather journalism.

I presume Molly and Mike don't mind a little pontification from bloggers since people have been shouting into the void in the letters-to-the-editor pages and the op-ed pages of newspapers for a long time. That's what Americans do really well. We let you know what we're thinking.

The concern, which I share with Mike and the late, great Molly, is the apparent movement -- or at least the abundant yammering right now, hopefully faddish -- regarding bloggers, and other "citizen journalists," ushering in some kind of new era of journalism, especially political journalism.

But voicing an opinion based in facts gathered by reporters on the street does not a journalist make.

I agree with you that analysis, commentary, observation and insight are valuable and I applaud and encourage each. If more people offer intelligent commentary on the communities they live in, more power to them. At risk of pointing out the obvious, that's what we do here at Flyover: We offer our perspective on the arts and arts journalism from our vantage points in the American Outback. Granted, some of us might offer some reporting but nothing like what we do for the newspapers we write for.

Again, writing an opinion is different from reporting the facts, but not too different. And as Ivins noted, opinion writing should be approached with the same schemata (though I'm sure she wouldn't go anywhere near that word; I'm not sure I should go near it either) used by the reporter covering a seven-car pile-up on the highway -- balanced perspectives, established credibility, accuracy, fairness, piercing the veil if need be, being able to impartially see if a veil exists in the first place that might need piercing, etc.

Writing opinion is about providing clarity, synthesis and, ideally, about extracting a clear sense of meaning from the ambiguities of reality for the benefit of others, not oneself.

Molly Ivins always gave voice to the issues most important to those who had no voice. That's what made her special. That's what separates her from the Blogger Legion. That's why people, many who shared her sympathies and points-of-view, found they could trust her.

That said, this report suggests that blogging may be a fast-fading trend, at least among the quick-hit opinion-mongers, as Molly might say.

Time will tell.

With sincerity and respect,
J

August 11, 2007 6:18 AM | | Comments (1)

This is from Mike Boehm, the hard-boiled (meant with ironic affection) arts reporter at the LA Times. He comments on Flyover regularly. What he has to say matters. Like Gary Panetta's comment today, this comment is too good and too relevant to be lost in the comments section. Mike B. was inspired to write to us by one of our Hinterland Diary entries on Molly Ivins.

Ivins expressed her wish that all bloggers -- or "opinion-mongers," as she calls them -- were required to spend time as beat reporters. That way, she said, they'd have more understanding of the difficulty of gathering facts and writing about the truth. Per the usual, however, Mike touches on an array of topics, mostly the true value of committed, impartial and ethical journalism. Thanks, Mike. -- J.S.


God, I wish all the bloggers who think they can generate the news as well as comment on it were forced to spend a month interning as cub reporters. See what it's like to cover a cop shop or a courthouse, folks, and smell a news story in the flood of daily crime reports, lawsuits and indictments. Or try to get at the facts of an environmental rape being covered up under reams of verbiage in a developer's bogus environmental impact report.

Just because you can punch up the info in a few keystrokes on your computer doesn't mean the info itself was easy to come by, or to shape into clear and coherent form, complete with context that makes its significance intelligible. It takes skill, commitment, experience and a passion for the democratic process to be a good reporter.

It can't be done properly from the sidelines, when your day job permits, and, I.F. Stone notwithstanding, it is very rarely done without the support of a well-funded corporation or institution.

Molly Ivins, rest her soul, surely knew that; there wasn't anybody more independent and feisty, yet she was not, to my knowledge, a freelance journalist.

Yes, the corporations and institutions devoted to the news are in a panic, er, a period of transition, so it has become fashionable to declare them easily replaceable on the cusp of a great new era of cheap and easy information.

New technologies may make it easier to extract oil or precious metals, but to say that they alone permit the extraction of facts against the wishes of those who have an interest in hiding facts is to take a naive leap of faith.

A society that's naive about the capital and personal commitments required to mine pertinent facts (I'm deliberately not saying "truths," because pertinent facts are hard enough to come by) will get the kind of information and the kind of governance it deserves.

The end of the press as an Estate standing apart from the other powers that be will make the other powers more powerful and immune to public opinion (and yes, I'm aware that all too many news outlets fail miserably to maintain the independence and skepticism toward power that our system presumes as the press's role, but many good ones are still up to the task).

Can you imagine a blogger trying to report on a car company, or a drug company, or a corrupt sheriff, or the iffy business practices of a major arts philanthropist, without the backup of libel lawyers and a corporate budget to pay them?

Can you imagine what would have become of Woodward and Bernstein's newsgathering if they'd been independent freelance reporters and the Nixon administration had the DC police trump up some penny-ante trespassing or drunk-driving charges against them?

Maybe the future lies in large new newsgathering institutions based on the Web -- newspapers by other means -- but I certainly hope it doesn't lie in some fanciful utopian free-for-all of passionate amateurs whom the powerful will be able to swat away at will.

The bottom line issue is: How much is information worth, and will the public ante up what it takes to supply the local/national/international/cultural information needs of a robust democracy?

August 10, 2007 12:02 PM | | Comments (6)

This is from Gary Panetta, the culture writer and resident philosopher at the Peoria (Ill.) Journal Star. He offers further commentary on this morning's post. It's too good to get lost in the comments. Thanks, Gary. -- J.S.

Thank you for this post and for raising the issue again.

"Truth" can be broken down in several ways.

(1) What's logically true or true by definition, i.e. mathematics. Start with certain premises and certain conclusions must follow.

(2) What's empirically true, i.e. what is factually the case. The standard here is "true beyond reasonable doubt." No history can claim to be the definitive, once-and-for-all time account of an important event. But some histories are more factually accurate than others, some histories are better informed than others. Just because we lack godlike, absolute knowledge doesn't relieve us from the responsibility of making reasoned choices about what we believe.

(3) What's morally or ethically true, i.e. what reflects the deep truths about what makes us human and what helps us succeed or fail to live together.

This kind of truth is difficult because it doesn't exist in the abstract -- it is always embodied in the concrete. Unless I've experienced a loving act or deed, abstract discussions of love will do me no good. Unless I can embody love or justice in my own life, just thinking about such things abstractly is pointless.

This is why the arts, especially theater and storytelling are so important. They put flesh and blood into what are otherwise abstract discussions about conscience and struggling with conscience, with guilt and the need for atonement, generosity and tight-fistedness, etc.

Such things can't be embodied in our lives or in the arts once and for all -- they have to be embodied again and again. We have to discover what they mean again and again.

Notice the word "discover." I may think I know what it means to love or to sacrifice, but experience is going to teach me whether I was correct in my presuppositions. In other words, there is such a thing as "objective truth" although we humans have a hard time grasping it.

We forget sometimes the continuity of human experience. King David is a product of an ancient, alien culture that I don't understand. But his temptation to lust and murder is perfectly contemporary. Non-Western societies have no trouble grasping the essentials of Shakespeare. There is something called collective human wisdom -- so much the worse for us if we ignore it.

August 10, 2007 10:10 AM | | Comments (0)

A familiar scene of art in America today?

"The entire discourse surrounding the Viennese avant-garde demands skeptical scrutiny. Certain of these 'truths' -- fatuous generalizations about women, obnoxious remarks about the relative abilities of races and classes -- fail to impress the modern reader. . . . As in prior periods of cultural and social upheaval, revolutionary gestures betray a reactionary mind-set. Many members of the modernist vanguard would tack away from a fashionable solidarity with social outcasts and toward various forms of ultranationalism, authoritarianism, even Nazism. Moreover, only in a prosperous, liberal, art-infatuated society could such a determinedly anti-social class of artists survive, or find an audience. The bourgeois worship of art had implanted in artists' minds an attitude of infallibility, according to which the imagination made its own laws. That mentality made possible the extremes of modern art."

From Alex Ross' new book, to be published in October, "The Rest Is Noise: Listening to the Twentieth Century."

August 10, 2007 6:50 AM | | Comments (0)

In need of more than just 'essential truth'

A couple of weeks ago, I wrote a post touching on multiculturalism and its tendency toward a kind of relativism; it inspired incredibly thoughtful comments. Two have stayed with me, because they are about the center that continues to lack the strength to hold. That is, the primal human need for universal principles seems to be challenged daily by the dynamic forces of contemporary American society. I'm not alone in this observation.

From Gary Panetta:

"The trendy philosophies undergirding multiculturalism, ironically, abet this market-driven mentality. By 'multiculturalism,' I do not mean the inclusion of non-white or non-male artists and writers into what used to be the literary and artistic canons. I mean instead of all those obtuse academic arguments that try to maintain that one thing is just as good as another and that all social relations boil down to power.

"Such arguments are not progressive but reactionary. If 'excellence' is strictly relative, then there's nothing wrong with letting the marketplace decide what's good. You can only critique the market if you have some place to stand outside of the marketplace. But no such place exists if the very idea of excellence (or justice, or mercy) is reduced to mere opinion. Sanity becomes statistical."

From David Sokolec:

"The fear of somehow ignoring the next Van Gogh has made idiots of us all, and when a doctoral candidate and the author of a book entitled 'You Are Not the Boss of My Words' can say, as was reported in today's New York Times in an article on a children's book that "perfect grammar is whatever works," then we have a very dim future ahead.

To blame any of this on multiculturalism, however, is a mistake. Multiculturalism simply tried to show that art and stories from countries or segments of society not usually heard from, or coming from a non-Western tradition was as valid as the more traditional western based art.

This seems to me a good idea. The problem is that no standards by which that art was judged were applied. That it was a good idea does not mean that any given work was good simply because it came from that segment.

Again the refusal to establish some means for judging the work, in fact, does a disservice to everyone involved. Were we to send a high school band and the New York Philharmonic to some non-Western country, we would rather hope they heard the difference in quality."

For David and Gary, and in appreciation of those seeking more than "the essential truth" and the rewards of conversation beyond the relativist confines of what is "true for me," as James Frey might say, I offer this piece. -- J.S.

A million little pieces of postmodernism
Three books shed new light on the troubled state of "essential truth"

When Oprah Winfrey expelled James Frey from her book club last year after he confessed to fabricating large portions of his best-selling memoir, "A Million Little Pieces," the move sparked a nationwide conversation about the importance of truth and the categorical value of not being indifferent to it.

At that point, Frey was the latest, and most conspicuous, in a long line of fabricators that included Jayson Blair of the New York Times, Stephen Glass of the New Republic and Jack Kelley of USA Today, all of whom were exposed as having invented stories, undermining journalism's raison d'etre.

None of these garnered the same kind of attention as Frey, because none of them had ever been given Winfrey's seal of approval, which means big money is at stake. Frey's book sold 2 million copies in three months. It was the fastest-selling book in the history of Winfrey's club.

Not long ago, Winfrey might have accepted Frey's defense that his book, about his battle with addiction and search for redemption, represented the "essential truth" of his troubled life. That details were inaccurate, or made up in toto, was irrelevant to a kind of profound truth one cannot express when hemmed in by mundane facts.

In the mid-1990s, no one, including Winfrey, seemed to suffer pangs of conscience when John Berendt, after the publication of "Midnight in the Garden of Good and Evil," admitted to "rounding the corners" of fact to provide better narrative grip. The book ranks among the best-selling works of nonfiction of all time.

Something has changed, and 2006 may prove to be the benchmark of that change. This week, the New York Times' website posted the most widely read book articles of 2006. Third from the top, beating out the 100 most notable books of the year, is, you guessed it, Winfrey's rebuke of Frey. Of the 10 most viewed stories, in fact, three are about Frey's fall from grace.

In light of this, two new books -- one a work of philosophy, the other a history -- take completely different approaches to truth. These books follow a landmark debate, recently reprinted, between two eminent 20th-century philosophers arguing over the essential question of human nature, a conversation that's perhaps more relevant to the Frey affair than we think.

The first is Rohan Kriwaczek's "An Incomplete History of the Art of Funerary Violin" (Overlook Press, $24.95, 224 pages). On the face of it, the book, published last year in the United Kingdom and this month in the U.S., is a work of historiography outlining the rise and fall of a lost funeral music derived from the Protestant Reformation and suppressed by the Vatican by the middle of the 19th century.

Sounds interesting. Problem is, the book, energetically written and elaborately illustrated, is a complete fabrication, a hoax, according to the New York Times and the London Guardian. The newspapers report there was no such funereal music and no such papal conspiracy to eradicate it.

The author and publisher have since admitted the book is spurious. Peter Mayer, the book's publisher, told the Guardian that he didn't know whether it was true or false, but that either way it's "a work of extraordinary nature." Kriwaczek said his critics misunderstood his intention to write a "serious artistic statement" and a "musical philosophy."

As with Frey's memoir, what matters to them is the book's "essential truth," not its factuality. Unlike Frey's, however, Kriwaczek's tome is not a personal meditation on subjective experience, but a history that's supposed to include footnotes and a means of verification. It doesn't, and it can't.

Qualified truth like this worries people like Harry G. Frankfurt, a former professor of moral philosophy at Princeton University and the author of our second book, "On Truth" (Knopf, $12.50, 112 pages), published in November.

Kriwaczek's book comes during an era in which, Frankfurt writes, "what a person regards as true either is a function merely of the person's individual point of view or is determined by what the person is constrained to regard as true by various and inescapable social pressures."

Frankfurt calls purveyors of this viewpoint "shameless antagonists of common sense -- members of a certain emblematic subgroup of them call themselves 'postmodernists' -- (who) rebelliously and self-righteously deny that truth has any genuinely objective reality at all."

We expect this kind of attitude from politicians, businessmen and publicists, Frankfurt writes, but recently a "more reliable class of people," including journalists, historians and memoirists, many of them self-regarding postmodernists, has tolerated or endorsed a relativist attitude toward truth.

"As for the entitlements of deference and the respect that we ordinarily assign to fact and to truth, the postmodernists' view is that in the end the assignment of those entitlements is ... simply a matter, they insist, of how you look at things."

Is postmodernism really to blame for the likes of Frey and Kriwaczek? Does a 35-year-old intellectual movement skeptical of universal truths and Enlightenment ideals (like truth, justice and objectivity) but devoted to a quantitative assessment of the nuances of power explain how someone like Frey could believe the "essential truth" to be true enough and someone like Kriwaczek could believe a made-up history to be an artistic statement?

Perhaps.

By happenstance, our third book, reprinted in September, is one of the benchmarks of postmodern thought, "The Chomsky-Foucault Debate on Human Nature" (New Press, 128 pages, $14.95). The book, among many other things, strips down the concept of human nature into two camps.

One, expressed by Noam Chomsky, the linguist and political commentator, believes human nature to be a universal attribute. The other, posited by French philosopher Michel Foucault in this 1971 debate on Dutch television, holds that the existence of human nature doesn't matter.

What matters for Foucault and the postmodernists is how the concept of human nature is employed by those in positions of power. Taking a Realpolitik stance on ideals, they say: What is just or true depends on the socio-political position of whoever is telling you something is just or true, and on the socio-political context in which the statement is made.

Thus, in this postmodern view, truth (or justice, or objectivity or fill in the blank with the universal principle of your choice) depends, to paraphrase Frankfurt, on how you look at it.

A major flaw of this brand of thinking, of course, is moral bankruptcy: The postmodernist never has to take a stand. In the bargain, a principle like truth becomes malleable, supplying Frey with the intellectual undergirding for claiming his memoir is mostly true, never mind that some of it is false.

Which is why Frankfurt felt it necessary to follow up his 2005 treatise "On Bullshit" with this new book. A sequel was needed, he said, because the first book, while defining why indifference to truth is deleterious, failed to explain why truth itself is important.

The strength of his case rests on his assertion that truth has practical utility, and that judicious application of the truth is the hallmark of a just society. "How could it possibly flourish, or even survive, without knowing enough about relevant facts to pursue its ambitions successfully and to cope prudently and effectively with its problems?"

On balance, though, postmodernist thinking can be argued to have done more good than harm. Many Americans used to think they were being objective in their assessments of African Americans. We now know that perspective was erroneous, if not criminal.

And Frankfurt does allow for the possibility of something good to come from something like Frey's and Kriwaczek's books, "showing, in other words, what conclusions those statements would rationally warrant if they were actually true rather than false."

However, why bother? Frankfurt writes: "This display of reasoning might be an entertaining and even, perhaps, an impressive exercise ... Under ordinary conditions, however, there would not be much point to it."

This article appeared in slightly different form in the Savannah Morning News Jan. 28, 2007

August 10, 2007 6:50 AM | | Comments (2)

If only more liberals sounded like this . . .

"Ah, for the good old days before the hologram [i.e., Big Media] and its hyperstimulation of 'consumer affluence,' the days of 'America's teeming masses,' that sweat-soaked, beer-farting mob of ordinary working Americans who didn't have a pot to piss in by today's standards, much less a credit card, but still knew bullshit when they saw it. Guys that looked like William Bendix and were unapologetic about earning their bread by their mitts and never heard of the word lifestyle. Women in curlers who would have laughed Martha Stewart off the map. Them was Americans, bub!"

From "A Feast of Bullshit and Spectacle: The Great American Media Mind Warp" by Joe Bageant, a lefty liberal born and bred in the Land of Dixie who's (ironically) waxing nostalgic for the good ole days when working-class America stood tall and didn't wake up everyday needing to face the "artificial collective product of the corporately 'administrated' modern state economy." He is the author of "Deer Hunting With Jesus: Dispatches from America's Class War." He wrote this piece for Alternet.

August 9, 2007 9:27 AM | | Comments (1)

Beware the 'rhetoric of relativism' . . .

If there is no objective truth, but only subjective truth (hence your five-car pile-up analogy) -- then what difference does it make if someone was a reporter or not? I am able to state subjective truth at a moment's notice -- it's always true for me!

A reader commenting on a column about the decline of American newspapers by the late Molly Ivins. The comment takes issue with Ivins' assertion that bloggers are no replacement for what traditional journalistic organizations do. She wrote:

Bloggers are not news-gatherers, but opinion-mongers. I have long argued that no one should be allowed to write opinion without spending years as a reporter -- nothing like interviewing all four eyewitnesses to an automobile accident and then trying to write an accurate account of what happened.
From "The Slow Death of Newspapers" published March 23, 2006.
August 9, 2007 7:17 AM | | Comments (1)

What the hard sciences have to say about culture . . .

"The doctrine of the noble savage--the idea that humans are peaceable by nature and corrupted by modern institutions--pops up frequently in the writing of public intellectuals like José Ortega y Gasset ("War is not an instinct but an invention"), Stephen Jay Gould ("Homo sapiens is not an evil or destructive species"), and Ashley Montagu ("Biological studies lend support to the ethic of universal brotherhood"). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler."

From "A History of Violence" by Steven Pinker for Edge

August 9, 2007 6:09 AM | | Comments (1)

Last fall, my husband and I began playing a game with several diverse groups of friends. We tried to find five books that everyone in the group had read cover to cover (skimming didn't count)--and children's books were included. Most of the time, we couldn't do it.

One particular instance stood out to me. We were at a wedding reception, and all of the people at the table were well-read, highly educated people involved in one way or another with the artistic community. Many of them had liberal arts degrees; a few were teachers. There was an age range of about 15 years, touching upon different generations but still close enough to expect that we would have similar cultural experiences. We were able to come up with only four books--most of them children's books, though, oddly, the relatively obscure "Maus" was the one adult choice. (The others were "Tom Sawyer," "Green Eggs and Ham," and "Charlotte's Web.")

When my husband and I played the game alone, the list stretched longer than we were able to keep track of mentally. Part of what contributes to the strength of our marriage is that we have a huge foundation of shared ideas. We've spent most of our lifetime engaged in critical discussion about those ideas. This has given us a common vocabulary, a vocabulary that lets us share humor at life's events and work intuitively together in times of crisis.

What's true for a marriage can be true for the wider society. It's part of the role that art plays. It gives us a common language to speak so that we can appreciate the admirable qualities of those we live with, and it enables us to work together when challenges are laid before the community.

I'd also venture an argument that it is why mass media has been so compelling. It brings people together and helps them form a commonality of experience and language. People easily slip into conversations about "Lost" or "American Idol" because they presume that the people they are speaking to have seen it.

It's also the appeal of the Web because people can start a conversation by sending the link. It's far easier than buying someone a book, waiting for them to read it, and then having the discussion. It's even easier than listening to the radio, going to the movies, or watching television.

Art has struggled in recent years in part because it doesn't reach the mass audiences that popular culture does, nor does it have the easy accessibility of Web offerings. However, within its communities, it forms a far tighter bond because the experience tends to be of greater intensity. It also ties people more directly to others whose faces and voices they recognize.

Several years ago I attended an outdoor production of Shakespeare's "Richard III" at the Michigan Shakespeare Festival. The scenes where Richard and Richmond exhort their troops were done on opposite sides of the stage, switching back and forth between them. On this particular night, a storm had been raging in and out. However, the audience stayed despite the pouring rain, so the actors continued to perform. During that scene, while Richard talked of bad omens, the wind picked up and whipped his banner off its stick, blowing it away into a puddle of water while Richmond's continued to wave proudly. Later, lightning cracked the sky above Richmond's troops as they marched in from the voms to the final, fatal battle. It was special effects by God that night.

Years later, anyone who was in that audience has an immediate connection to all others who were there. We may not know each other's names, but we talk about our shared experience with a passionate "remember when" that would rival any family reunion.

Those who hear the stories about live productions also get the chance to share in the experience, and when the story is compelling and memorable enough, it becomes part of the shared culture that ties them to their neighbors.

It's one reason that I hesitate to measure art by the numbers. What happens with high art is important--even if it is not directly experienced by the masses or creating a profit that rivals other businesses or forms of entertainment. Art must be able to sustain itself because it, in turn, sustains the community and the people who live in it.

August 9, 2007 2:00 AM | | Comments (7)

Maybe he wasn't so bad after all . . .

"A prince should also show his esteem for talent, actively encouraging able men, and honouring those who excel in their profession. Then he must encourage his citizens so that they can go peaceably about their business, whether it be trade or agriculture or any other human occupation. One man should not be afraid of improving his possessions, lest they be taken away from him, or another deterred by high taxes from starting a new business. Rather, the prince should be ready to reward men who want to do these things and those who endeavor in any way to increase the prosperity of their city or their state. As well as this, at suitable times of the year, he should entertain the people with shows and festivities [italics mine]. And since every city is divided into guilds and family groups, he should pay attention to these, meet them from time to time, and give an example of courtesy and munificence while all the time, nonetheless, fully maintaining the dignity of his position, because this should never be wanting in anything."

From "How a prince must act to win honor," chapter 21 of Niccolo Machiavelli's "The Prince," translated by George Bull.

August 8, 2007 6:58 AM | | Comments (0)

A purely electronic culture
Last week, I argued that new technologies -- iTunes, Google and YouTube, for instance -- are like the phonograph of the late 19th century. A new, strange and innovative kind of technology, the phonograph, with its ability to bring music into the home, would go on to transform our cultural consciousness by changing the way we hear (and perform, compose, conceive, understand and interact with) music.

It's no surprise recordings -- wax cylinders, phonographs, magnetic tape, CDs -- have affected how we value music and what we value in a live performance. (Flamboyant sonorities rather than subtle sounds were able to pierce through the hiss and crackle of those early records.) Recordings also raised the standards of performance (musicians are more likely to compare themselves to a recording than to a live performance).

And recordings may have accidentally groomed us for surprise, as Alex Ross observed in a 2005 piece for the New Yorker: "The reigning unreality of the electronic sphere can set us up for a new kind of ecstasy, once we unplug ourselves from our gadgets and expose ourselves to the risk of live performance."

That might be an optimistic point of view. Ross also noted, more pragmatically, that Glenn Gould might still be proved right. The pianist predicted, in a 1966 essay titled "The Prospects of Recording," that given the power and perfection of advanced recording technology, "the concert would eventually die out, to be replaced by a purely electronic music culture."

Change in consciousness, change in media
It's almost too obvious to point out that technology is changing the way we experience music, movies, news and more over the transom of the 21st century. I haven't owned an iPod long and already I wonder why I have so many CDs, so many CD cases, so many record jackets for which trees were felled. CDs take up so much space and they constantly need reorganization. I lose them, I lend them, I forget I even own them. There's just no good way to organize something so corporeal.

Anyway, the change in consciousness following the rise of the phonograph can be compared to the change in consciousness that we are currently feeling in the wake of Web 2.0, with one big difference: While the phonograph changed the world of sound, the emerging internet technologies changing our lives are reconfiguring the entire world of media: music, movies, journalism, publishing, what have you.

This change of consciousness might end up being so total that the rise of mass media during the 20th century, as my colleague Joe Nickell deftly observed last week, may seem increasingly like an anomaly as media history continues to unfold.

I really don't want to sound extreme here, but writers, journalists and commentators are already talking about this change. David Shumway, in this piece for the Chronicle of Higher Education, laments in a way the decline of mass media, because it means the decline of the once sacred status of the rock icon.

And as illicit downloading continues unabated (in fact, it's increasing, according to this piece from the London Guardian), the power of musicians -- or the record label or the music publishing company -- to control the product gets lost in a feedback loop: the more music is downloaded illegally, the more commonplace the practice becomes, and the less likely people are to pay for music at all.

Shumway writes: "In popular music, the decline of a genuine mass audience has meant that it is harder and harder for a performer to attain recognition beyond his or her niche. Those whose recordings now top the charts usually seem to be the least culturally significant, often lacking either the musical distinction or the political commentary that one can still find among less popular performers. But the bigger issue is that even this music reaches a small fraction of the total audience. One could argue that the term 'popular music' itself has become outdated because no style of music reaches a broad enough audience. My undergraduate students typically know the music from my college years -- the Beatles, Eric Clapton, Joni Mitchell, Led Zeppelin, and so on -- but it is often difficult to find more than a few who are all familiar with the same current releases. As a result of this audience fragmentation, popular music and its performers have lost the cultural centrality they once enjoyed, and that means that fewer people are interested enough to pay for the product."

Surviving by suing
A few facts that bear repeating: CD sales dropped by 20 percent in early 2007. Sales of individual songs went up slightly, but sales of whole records went down. (It appears, to me anyway, that people have figured out that you no longer have to buy the entire record to get two or three good songs.) Some reports say that 1 billion songs are traded illegally every month. I've read elsewhere that the number is more like 2 billion.

The downward spiral is so out of control that the Recording Industry Association of America has, according to this Post article, resorted to sending letters to college kids threatening legal action if they don't stop swapping music on their university computers. The Post goes on to say that the RIAA is putting more pressure on college administrators to keep kids from using college networks to exchange music they didn't pay for.

It's as if no one from the RIAA stopped to think about the obvious: that if a student can't download music on his school computer, he'll go to a local coffee shop with free wireless access and do the same thing sans letters from asswipe attorneys.

The record industry, I believe, doesn't want to change with these new technologies because it likes the old way better. The old way certainly made more money. But there's more to it than that. The old way made more sense: The music industry was in control, and with control came profits. A lot of that control centers on copyright, but while copyright likely won't change, a bigger thing has: Big Media.

A little fairy tale
Once upon a time, bands made music. Then they recorded the music for a record label. The label then sent the band over to MTV to make a video to go along with the record's newest "single" -- an archaic word describing a new song on the radio.

Meanwhile, the label made deals with all the radio stations owned by one or two corporations and with all the national celebrity-driven magazines also owned by one or two corporations. Sometimes the record label and the radio stations and the magazines were owned by one company, making all this much much easier.

Anyway, the song, the video and the magazine spread would all come out at the same time, driving record sales through the roof. Then the band would go on tour, and all the regional newspapers and publications would scramble to get interviews with the band, because, you know, the band is on all the radio stations, the MTV and all the magazines at the Kroger -- the band, given all this hubbub, must be popular.

It's easier to sue
This is what we used to call, with deep nostalgic feeling, Big Media. The old days are coming to an end, and I suspect the industry knows it. That's why it's putting abundant resources into fighting court battles instead of finding new business models. Eminem is suing Apple over similar control issues, and so is the National Music Publishers' Association, which is getting on YouTube's ass about music used in homemade videos without permission. As if YouTube controls what its users do.

When acquiring music without having to pay for it becomes so commonplace that nearly half the population, according to that Guardian report, doesn't fear legal action, then it's mainstream. And once something gets to that point, the moral foundation has changed: Sure, it's still illegal, but it's acceptable. As with speeding, you can enforce the law against many people some of the time, but not all of the time. It's just not possible.

But the music industry isn't alone in feeling threatened. Now newspapers are beginning to look more interested in taking positions against the "threat" of new technologies, like Google. In May, the San Francisco Chronicle reported that it was losing more than $1 million per week. As a result, it cut 100 newsroom jobs. In response, media scholar Neil Henry, in a piece appearing on that paper's op-ed page, called Google a threat to journalism's role as a public trust in a free society.

"It stands to reason that Google and corporations like it, who indirectly benefit so enormously from the expensive labor of journalists," Henry argued, "should begin to take on greater civic responsibility for journalism's plight. Is it possible for Google to somehow engage and support the traditional news industry and important local newspapers more fully, for example, to become a vital part of possible solutions to this crisis instead of a part of the problem?"

It's a good question, but not one Roger Moore, the movie critic, wanted to debate. He advised in a letter to the Poynter Institute that newspapers and newspaper guilds take the trail being blazed by an anxious music industry -- let the lawsuits begin.

"It seems to me that while the big media companies might be reluctant to go after some of Google's billions, no newspaper would be facing Wall St. pressure if investors knew Google and Yahoo, et al., were going to have to part with a big portion of their billions in paying copyright fees, click-charges, or whatever, to those same companies Wall Street has written off as a doomed industry."

An active, not passive, audience
I don't know if newspapering is doomed, but there's certainly a change in consciousness underway, as I've said. A fundamental component of that change is a shift of positioning for the reader, in the case of newspapering, from the place of passive consumer to active participant. And it seems that any effort to discover a new business model to save an ailing newspaper industry will have to address this shift in the reader's position.

Doug McLennan, our host here on Artsjournal, hinted at this shift in positioning in his response to Roger Moore's complaint at Poynter. He said lawsuits are a false solution; the real responsibility for dealing with this change of consciousness lies on the shoulders of newspapering's publishers.

"If I was pointing fingers, I'd aim squarely at the business managers who are so locked into the old ways of doing things that they don't even understand what the new issues are, let alone solutions to them," McLennan wrote. "Journalists are being failed by those whose job it is to figure out the business side, and now journalists are paying the price for that lack of vision. Like somehow cheapening the product and giving readers less is going to attract more customers."

"Fast-moving web companies [like Google and Yahoo] have learned to move with audiences and make those audiences part of a community. Newspapers, for the most part, hold on to rigid models and jump on new tools (everybody blog now!) without understanding how those tools can be used."

Making the audience "part of a community" is a theme you see often in writings by commentators trying to understand the changes in Big Media. Neal Gabler wrote a piece headlined "The movie magic is gone" for the LA Times some months ago and he hit the nail on the head. With movies -- as with music and newspapers -- the producers of culture are not in charge.

The consumer is.

What's Brad Pitt got that I ain't got?
Stars used to be the center of attention, Neal Gabler wrote for the LA Times, but now it's the audience. Just as the music is being separated from the people making the music, the glamour of Hollywood and of being a celebrity is being separated from Hollywood and its actors.

"[Movies used to] provide a common experience and language -- a sense of unity. In the dark we were one. Now, however, when people prefer to identify themselves as members of ever-smaller cohorts -- ethnic, political, demographic, regional, religious -- the movies can no longer be the art of the middle. The industry itself has been contributing to this process for years by targeting its films more narrowly, especially to younger viewers. In effect, the conservative impulse of our politics that has promoted the individual rather than the community has helped undermine movies' communitarian appeal.

"All of this has been hastened by the fact that there is now an instrument to take advantage of the social stratifications. To the extent that the Internet is a niche machine, dividing its users into tiny, self-defined categories, it is providing a challenge to the movies that not even television did, because the Internet addresses a change in consciousness [my italics] while television simply addressed a change in delivery of content. Television never questioned the very nature of conventional entertainment. The Internet, on the other hand, not only creates niche communities -- of young people, beer aficionados, news junkies, Britney Spears fanatics -- that seem to obviate the need for the larger community, it plays to another powerful force in modern America and one that also undermines the movies: narcissism.

He continues: "It is certainly no secret that so much of modern media is dedicated to empowering audiences that no longer want to be passive. Already, video games generate more income than movies by centralizing the user and turning him into the protagonist. Popular websites such as Facebook, MySpace and YouTube, in which the user is effectively made into a star and in which content is democratized, get far more hits than movies get audiences. MySpace has more than 100 million users worldwide, and Fortune magazine reported that 54 million of them spend, on average, 124 minutes on the site for each visit, while 11.6 million users spend 72 minutes a visit on Facebook. YouTube's most popular videos attract more than 40 million hits, which is substantially larger than the audience for all but a very, very few movies.

And: "But these sites are arguably not only diverting viewers who might be attending the movies, they are replacing one of the movies' functions: If stars' lives are superseding movie narratives, audiences are superseding the stars. Who needs Brad Pitt if you can be your own hero on a video game, make your own video on YouTube or feature yourself on Facebook?"

August 8, 2007 12:02 AM | | Comments (0)

A familiar feeling some days . . .

Rip looked, and beheld a precise counterpart of himself, as he went up the mountain: apparently as lazy, and certainly as ragged. The poor fellow was now completely confounded. He doubted his own identity, and whether he was himself or another man. In the midst of his bewilderment, the man in the cocked hat demanded who he was, and what was his name?

"God knows," exclaimed he, at his wit's end; "I'm not myself--I'm somebody else--that's me yonder--no--that's somebody else got into my shoes--I was myself last night, but I fell asleep on the mountain, and they've changed my gun, and every thing's changed, and I'm changed, and I can't tell what's my name, or who I am!"

From Washington Irving's "Rip Van Winkle"

August 7, 2007 6:30 AM | | Comments (1)

What good critics try to do . . .

"The leader of the troop unlocked his word-hoard"

The hero Beowulf poised to speak in Seamus Heaney's bestselling 2000 translation of the epic Old English poem, "Beowulf."

August 6, 2007 1:01 AM | | Comments (0)

Form over function . . .

"The essence of art is form: it is to defeat oppositions, to conquer opposing forces, to create coherence from every centrifugal force, from all things that have been deeply and eternally alien to one another before and outside this form. The creation of form is the last judgment over things, a last judgment that redeems all that could be redeemed, that enforces salvation on all things with divine force."
The socialist philosopher and literary critic Georg Lukács, quoted in Alex Ross' upcoming book, "The Rest Is Noise: Listening to the Twentieth Century," which will be published on Oct. 23, 2007.
August 5, 2007 12:35 PM | | Comments (1)

I inadvertently turned off the comment link for my post on Wednesday. I'm not confident that many folks would want to comment on a very brief literary history of flatulence (as well as its tenuous connection to high culture, business culture, counterculture, mass culture and the rise of hip consumerism). But I did want readers to have the option. So here, with my apologies, is the post with the comment link fully enabled. Enjoy! -- J.S.

August 3, 2007 6:20 AM | | Comments (1)

Our readers often inspire me as much as our colleagues do. Something that resonated with me this week was Steve Durbin's comment on Joe Nickell's post. He said that "people work for their passions, as well as for money." I'd have to agree with him. I'm not sure there is anything other than passion worth spending one's lifeblood and precious time on. I know that one of the reasons I do corporate writing is to subsidize the type of writing that I want to do--including arts writing.

Someone whose work has long resonated with me is Dorothy Sayers. She once wrote a play dealing with the topic of work and why we do it. Given that she considered art to be her work, I find it particularly germane to the discussion that has been going on here. She argued that we need to "estimate work not by the money it brings to the producer, but by the worth of the thing that is made."

Certainly artists and journalists alike will often say that they are called to a higher purpose than simply a bottom line. I need to make enough money writing so that I can continue to write--which means feeding my family and paying my bills. I'm not writing so that I can get rich (I wouldn't complain if that were to happen, but that would be a pleasant side effect, not the end goal).

Many artists would tell you the same thing. They're not creating so that they can become filthy rich, they're creating because they have to. To not create would be to psychically damage themselves. There is economic necessity that must be met, but as Sayers promotes, payment should be that which allows people to continue doing the work that they're doing.

This is often at odds with the capitalist society we live in. It's certainly at odds with the exaltation of acquisition above all else. It's perhaps where art often suffers the most as it is difficult to "possess" a performance.

When I was very young, my father was careful to ensure that I could distinguish between political and economic systems--and, because it was in the midst of the Cold War and we attended what I later learned was a very conservative church, to understand that none of them was good or evil; they were just ways of doing things.

However, our culture doesn't always draw such fine distinctions and we often try to apply our economic system to our politics and our politics to our culture. Rather than allowing them to influence each other as part of an ecosystem, there is a tendency to force one system's philosophy upon the other, to believe that what works for one will work for all.

Art suffers when we force upon it the same economic model that businesses operate under. If artists' goal must be the making of money rather than the making of art, they're going to fail. This doesn't mean that artists can be oblivious to economic factors or lack all business sense, but it does mean they must choose their model carefully. How they manage their finances will say something about who they are and whether their art will be sustainable.

Art has traditionally relied heavily on government support and money from individual and corporate donors. There is an understanding that not all art will be commercially viable, able to succeed solely on the price paid by those who consume it. We've accepted this because art has a societal value that far outweighs the burden of cost that any individual can afford to bear.

Indeed, like education, society reaps the benefit of art even when it is not the direct consumer. My life is made better when my neighbor goes to the symphony, even if I do not. As a member of my community, I want to see our communal dollars support what benefits all of us. I want to live in a society where people share a commitment to creation and to connection. Those are things that will spill over into politics and economics. Those are the things that will bring about a better world.

At the NEA Institute last winter, Ben Cameron talked about how theaters make communities a healthier place. He quoted a study that said high school students who have been in a single play are 42 percent less likely to support racist behavior than those who have not.

If the arts were to operate on a purely capitalist model that encourages greater consumerism, they would miss out on their higher calling, the calling that makes them truly relevant and of value to the community.

The money has to be there, but it can't be the reason or the goal. Rather it is the set piece which makes the play possible, not the story itself.

August 2, 2007 7:51 AM | | Comments (2)

One of the great things about "Flyover" is that the four of us -- Joe, Bridgette, Jennifer and I -- have a lot in common, but there's enough difference in our points of view that things never get dull. Case in point was Monday's post by our man in Montana (that's Joe). It was a thoughtful counterargument to a post I wrote last week.

My piece attempted, in a very ambiguous way, to connect a few changes occurring in the cultural landscape: the decline of newspapers, especially arts journalism at regional newspapers; the continued problems facing high art; the rise of amateur culture in the Age of Web 2.0 and its attendant attitude of anti-intellectualism; and a brain drain that I sense is taking place in newspapers everywhere.

Joe evidently felt this was doom and gloom. Perhaps it was. He also thought I was positing a position. I'm not sure I was, but maybe that's a moot point. The fact that my "Connections" post was able to draw so many intelligent comments, including Joe's, tells me there's something here and something important worth debating.

Among Joe's many thought-provoking notions was one conceiving of mass media as a historical anomaly. I think he's right. I also think he's right in saying the problems we are facing as a culture stem from expectations established during mass media's long, long reign: We were the gatekeepers for a while, but now we're not. We used to force readers (or viewers or listeners) to pay attention to advertising as we enticed them with our stories, pictures and design. But that's slowly losing its potency.

My editors kill me a little bit every day when they talk about developing a good "package," a word that to my ear rings with nostalgia for the good old days of mass media. They do so despite evidence suggesting that people who read our website -- a shift in reading patterns my editors want to see more and more of -- don't give a rat's ass about the "package." They want stories and their multi-media accoutrements.

Celebration or conquest?
As I have noted in previous posts, the bottleneck has indeed been broken, as Joe reiterates, and now the floodgates are open. "This is bad news for bad artists and second-rate journalists. It is good news, I tend to believe, for the truly skilled, the deeply passionate, the innovative, and the informed among us," Joe wrote Monday.

But here is where Joe and I part ways: Joe says these recent cultural changes, particularly the rise of Web 2.0, are cause for "celebration." I see the point. But celebration? Maybe. I guess I'm too skeptical about the forces of capitalism seizing control of these changes.

Indeed, the rise of a wireless, on-demand world is only beginning to reshape our consciousness, and the opportunities for the innovative are only beginning to present themselves. This shift has historical precedent. The social upheavals of the 1960s were also thought to have brought America to the gates of a new world. As Morris Dickstein wrote, however, in "Gates of Eden," his classic survey of the literary works of that time, the gates to the new world remain closed, for now.

"The cold war, the bomb, the draft, and the Vietnam war gave young people a premature look at the dark side of our national life, at the same time that it galvanized many older people already jaded in their pessimism," Dickstein wrote. "Both the self and the world proved more resistant than the activism of the sixties dared to hope, but the effects of a decade of struggled are there to be seen."

Dickstein's "world [that] proved more resistant" than expected is partly, I would wager, the incredible adaptability of Corporate America. It recognized the shift in cultural sensibility and moved to exploit it. This is covered by Thomas Frank in his brilliant analysis of the last half century, "The Conquest of Cool: Business Culture, Counterculture and the Rise of Hip Consumerism":

" ... For some, Ken Kesey's parti-colored bus may be a hideous reminder of national unraveling, but for Coca-Cola it seemed a perfect promotional instrument for its 'Fruitopia' line, and the company has proceeded to send replicas of the bus around the country to generate interest in the counterculturally themed beverage.

He continues: "Nike shoes are sold to the accompaniment of words delivered by William S. Burroughs and songs by The Beatles, Iggy Pop, and Gil Scott Heron ("the revolution will not be televised"); peace symbols decorate a line of cigarettes manufactured by R.J. Reynolds and the walls and windows of Starbucks coffee shops nationwide; the products of Apple, IBM, and Microsoft are touted as devices of liberation; and advertising across the product category sprectrum calls upon consumers to break rules and find themselves."

And: "The music industry continues to rejuvenate itself with the periodic discovery of new and evermore subversive youth movements and our televisual marketplace is a 24-hour carnival, a showplace of transgression and inversion of values, of humiliated patriarchs and shocked puritans, of screaming guitars and concupiscent youth, of fashions that are uniformly defiant, of cars that violate convention and shoes that let us be us ..."

Read on for "Fart jokes: a very brief literary history" and "A change in consciousness"

August 1, 2007 7:31 AM | | Comments (0)
