Whipped Cream vs. Hamburgers

September 11, 2011

Invisible by Paul Auster

It’s ten years after 9/11, and are we still postmodern? You’d think a whack of hard reality would have propelled us into a new literary trend, but Paul Auster’s latest, Invisible, which I read because it was lying around my house, remains as metafictional, elusive, shape-shifting, and frustrating as the best/worst of the genre.

I’m not going to summarize or review the novel, since those tasks have been accomplished so fully elsewhere. You can match a rave from the New York Times (http://www.nytimes.com/2009/11/15/books/review/Martin-t.html) with a pummeling, snarky putdown from The New Yorker (http://www.newyorker.com/arts/critics/books/2009/11/30/091130crbo_books_wood) and end up with the same bemused feeling that I had after finishing the book.

Entertaining as some of Auster’s devices may be, the postmodern conviction that reality is ultimately unknowable doesn’t, in my view, excuse plot turns that are arbitrary and unconvincing or characters who intrigue us into taking an interest but then fizzle out without much development.

This is literary whipped cream. In post-9/11 America, I’d rather have a hamburger.

The Swerve

In the past week, while dealing with earthquakes, hurricanes, tornadoes, floods, a lame dog, and fears of a second recession, I had the chance to read Stephen Greenblatt’s The Swerve: How the World Became Modern. Perhaps this proves that I have the same response to potential calamity as Art Dennison, the protagonist of my most recent novel: hide in a quiet room and read.

After I was intrigued by a piece based on Greenblatt’s book in the August 8 New Yorker (article summary here), a friend kindly passed along an advance reading copy. A Harvard scholar, Greenblatt is known for what I would call crossover works, ones that combine scholarship with popular appeal. His previous book, Will in the World: How Shakespeare Became Shakespeare, made a splash by combining Bard biography with a colorful portrait of Elizabethan England. In The Swerve he takes two historical figures about whom even less is known, the Latin poet Lucretius and the fifteenth-century papal secretary Poggio Bracciolini, and spiraling outward from those two, spins an intriguing tale of Classical philosophy, medieval copyists, Renaissance book hunters, and radical changes in the way we look at the world.

Intellectual history is great fun when written well, and Greenblatt is one of the best at it. His central point is that Poggio’s accidental discovery of a complete version of Lucretius’ De rerum natura (On the Nature of Things)—accidental in that Poggio was combing monastery libraries for copies of ancient texts but not that one in particular—played a central role in breaking down medieval religious thought and freeing the great minds of the Renaissance. It’s long been known that Lucretius influenced the humanists, but Greenblatt implies that this one masterful poem, in which Lucretius expounds on the Epicurean vision of the universe, smashed more barriers than any other Renaissance rediscovery and thus opened a broad road to modernity. Though I don’t know how many scholars of the period will grant Lucretius this extraordinary importance, I’m personally willing to allow it for the sake of the story.

One chapter offers an extended bullet list of the “elements that constituted the Lucretian challenge” to Church-molded thought. Here are some of the bullet points, without the author’s commentary:

  • Everything is made of invisible particles. [Lucretius didn’t call them atoms, but he was drawing on the atomism of Democritus and others.]
  • All particles are in motion in an infinite void.
  • The universe has no creator or designer.
  • The universe was not created for or about humans.
  • Humans are not unique.
  • The soul dies.
  • There is no afterlife.
  • All organized religions are superstitious delusions.
  • The highest goal of human life is the enhancement of pleasure and the reduction of pain.

From a churchman’s point of view, this list moves from suspicious to upsetting to burn-at-the-stake heretical, and of course many humanists played with these notions without committing outright to them. Still, even with clever concealments and subtle emendations, such ideas had the power to swerve society from medievalism toward modernity.

Greenblatt emphasizes that the Lucretian version of Epicureanism, like Epicurus’ own, did not equate living for “pleasure” with pursuing uninhibited passions and impulses. Rather, the highest pleasure comes from philosophical contemplation that “awakens the deepest wonder.” Kind of like hiding in your room to read while the hurricane roars by.

For me, one takeaway from this book is the renewed realization that human thinking goes in cycles, with old notions constantly being resurrected and reshaped. The body and soul may both dissolve into the mud, but ideas get into the water table and seep out in strange new places.

In fact, I’ve recently unearthed in the basement of my ancient house a mildewed manuscript that disintegrated as soon as I touched it. Only a few fragments remain legible, one of which seems to be attributed to a certain Sammus Gridlius, a previously unknown scribe of the first century A.D. Roughly translated from the Latin, it goes like this:

All that’s thunk has been thunk before. All that’s writ has been writ before. Peace, brother.

Circumstantial evidence indicates that a rediscovery of this text by John Lennon in 1968 influenced the late Beatles songs and consequently all of contemporary culture. I’m still working on proof for that theory.

In much of the new work I’m seeing, especially for the theatre, preposterous exaggeration seems to be the rule. Many writers seem to believe that the mainstream realistic tradition, loose as it may be, can’t convey the absurd cruelties of our world. So we get blood sprayed all over the stage in a grotesque parody of violence. Or wildly implausible plots, such as the one in which a former Guantanamo prisoner tracks down his ex-interrogator to demand half of her liver.

Most such work, however clever it may be, leaves me cold, or at best tepid. There’s little warmth in my heart for the characters, who often aren’t human enough to care about. The themes have less subtlety than Philadelphia politics.

Western literature has plenty of strong anti-realist precedents to draw on, including recent practitioners like Albee and Stoppard, and revivals of Ionesco or Beckett can still seem fresh and vital. So I’m wondering: what is it that makes some such work provocative, interesting, and magical, and other stuff just weird and off-putting?

One recent production offers a clue: Inis Nua’s version of Dublin by Lamplight, a play by Michael West written in 2004 to mark the centennial of the Abbey Theatre. Set in 1904, the year it mock-celebrates, the play features characters who are takeoffs on W. B. Yeats, Lady Gregory, and other landmark figures in the Irish cultural/political revival. The main plot, a melodrama, centers on the attempt to found the “Irish National Theatre of Ireland,” a redundantly named group whose first play will be a heroic melodrama about the mythical Irish hero Cúchulainn. Most of the characters fit stereotypes, such as the wardrobe assistant who yearns to be a star and the drunken actor who wants to blow up the British king.

Inis Nua’s staging, modeled on the original version developed by the author and collaborators at Corn Exchange Theatre Company in Dublin, is as exaggeratedly nonrealistic as anything else I’ve seen. The actors wear painted whiteface masks with grotesque features. They move in a jerky, staccato manner, and when one speaks, the others snap their heads toward the speaker like puppets on strings. In addition to overplaying all their emotions, the characters describe to the audience what they are doing and feeling.

The original director at Corn Exchange, Annie Ryan, is known for her commedia dell’arte style, and in Tom Reing’s version at Inis Nua many other influences could be pointed out: vaudeville, kabuki, physical theatre, “metatheatre,” Charlie Chaplin, you name it.

These techniques are played for broad comedy, and the script is indeed hilarious—lots of one-liners, not to mention the ludicrousness of the plot lines. Even with no more than a rudimentary knowledge of Irish history, the audience understands the satire on a romanticized, pretentious, knee-jerk, and ultimately violent form of nationalism.

If it stopped there, the play would be a reasonable success in a Brechtian way. The audience, fully conscious of the actors as actors and the play as an artifice, gets the intellectual point. And yet this work manages something more. After all the farce and ridiculous melodrama, the bomb goes off, the police retaliate, and it turns out that the wrong people are killed. As the Sunday Independent put it in reviewing the Corn Exchange production, “The laughter is effectively and sharply choked into silence, the actors’ white faces, clown mouths, and marionette movements serving only to heighten the sense of loss and futile violence.”

I came away feeling the play’s meaning as much as thinking it, and I could tell that most others in the audience did the same. How was that possible when we had been kept so deliberately at an emotional distance?

I can pick out a few possible reasons: (1) Because the play was so funny, the audience was hanging on every line; even if we were “alienated” in a Brechtian sense, we were mega-attentive. (2) Audiences have become accustomed to rapid switches between comedy and tragedy, as in the evening news. (3) The real tragedy dawned in one fine moment, almost understated; no fake blood splashed around the stage. (4) One character who died—I won’t say who it was—was actually, in spite of all the stylization and stereotyping, rather likeable as an individual, although I didn’t realize this until the body hit the boards.

“The mantra of Corn Exchange,” says Tom Reing in a video clip, is that “you have to dance on a razor’s edge between the grotesque, the heartfelt, and anything-for-a-cheap-gag. That makes it limitless.” Yeah, and that dance suits an age when reality seems inconceivable and our best heroes come from comic books. But if you don’t get the dance steps just right, the “limitless” turns into the merely ludicrous or gruesome. All those involved in Dublin by Lamplight, from the original creators to the new American troupe, deserve a tip of Charlie Chaplin’s absurd little hat.

Though the run in Philadelphia has ended, Inis Nua will remount Dublin by Lamplight in New York at the First Irish Festival in September.

Celebrating Death?

May 3, 2011

If we’re done chanting “USA! USA!” in response to the death of Osama bin Laden—as Phillies fans did during an otherwise frustrating game Sunday night—maybe it’s time to do some serious reflection, as Annette John-Hall does in her column in today’s Philadelphia Inquirer. “I simply can’t celebrate death,” she writes, “even if it’s the demise of a despot who orchestrated the deaths of so many innocent people.”

Revenge tastes good but is less filling.

Tiger mom and orchid child?

After the recent flap about so-called tiger moms who drive their children to achieve no matter what emotional turmoil it entails, it was interesting to stumble across a December 2009 Atlantic article by David Dobbs about the complex links between genetics and behavior. Though that issue probably littered our dining-room table at some point, my days of keeping up with magazines are long past, and I missed the piece when it was new. However, I’m heartened to think that in this one area I’m less than a year and a half out of date. (Don’t ask about my musical tastes or fashion sense.)

Dobbs first lays out the genetic theory of “vulnerability” that has become part of our cultural consciousness—the idea that some gene variants predispose us to specific diseases. Applying this to psychological maladies, he points to research indicating that certain genetic variations, such as one governing serotonin processing, enhance our risk for depression, anxiety, drug addiction, etc. These genes interact with the environment, so that the malady occurs “if, and only if, the person carrying the variant suffers a traumatic or stressful childhood or faces particularly trying experiences later in life.”

Then Dobbs goes on to develop a newer idea:

Recently, however, an alternate hypothesis has emerged from this one and is turning it inside out. This new model suggests that it’s a mistake to understand these “risk” genes only as liabilities. Yes, this new thinking goes, these bad genes can create dysfunction in unfavorable contexts—but they can also enhance function in favorable contexts. The genetic sensitivities to negative experience that the vulnerability hypothesis has identified, it follows, are just the downside of a bigger phenomenon: a heightened genetic sensitivity to all experience.

This point leads to a metaphor distinguishing two types of children: the “dandelions” who do pretty well regardless of circumstances, and the “orchid” children “who will wilt if ignored or maltreated but bloom spectacularly with greenhouse care.” The startling idea here is that the SAME genes that make us vulnerable to neurosis or psychosis also make it possible to achieve great success. Supporting research comes from studies of both children and rhesus monkeys—the rhesus being the only other primate that shares humans’ adaptability to new environments. The reasoning ultimately reaches the conclusion that our “risky” genes endure because natural selection, far from working to eliminate these traits, actually favors them:

a genetic trait tremendously maladaptive in one situation can prove highly adaptive in another.… To survive and evolve, every society needs some individuals who are more aggressive, restless, stubborn, submissive, social, hyperactive, flexible, solitary, anxious, introspective, vigilant—and even more morose, irritable, or outright violent—than the norm.

I think of John Adams, the second U.S. president, who was apparently so bipolar that at certain key moments he froze into inaction; but when he came out of the funk and got to work, he was extraordinary.

So what does this mean for tiger mothers versus liberal, child-centered, nicey-nicey Dr. Spock mothers? Amy Chua, whose memoir and Wall Street Journal article set off the controversy, has raised high-achieving children by, she tells us, not settling for less. Screaming at the kids, insulting them, threatening, forcing them to do what they hate (practice the violin, for instance)—all of these techniques are within bounds for Chua. Should we conclude that, if her daughters have “orchid”-type genes, their overstressed, abusive childhood will eventually make them socially or emotionally disturbed?

Maybe not. In his works on education, Parker Palmer emphasizes that various instructional “techniques” may work, but the sine qua non is “teaching with heart and soul,” connecting with your students and also with your own spiritual self. A key to Chua’s approach, if we can believe her, is that she does in fact connect with her kids; they never doubt how much she cares about them. Contrast this with a tolerant, permissive, but emotionally distant parent—which type is more traumatic for the child? 

I suspect that the unengaged type, the skittish kitty mom who’s too distracted to pay attention, or the tomcat dad who’s never home, is far more dangerous to youthful psyches than a tiger parent.

And—to return from felines to plants—if the defining characteristic of orchid children is their sensitivity to different conditions, we might also suppose that they are (a) highly variable among themselves and (b) attuned to small variations in the environment. What looks on the surface like a harrowing childhood to you or me may not be so damaging to some orchid kids who thrive on tiny but well-placed raindrops of affection.

As the saying goes, it takes all kinds, and now the geneticists agree. Still, I never want to take a music lesson from Amy Chua.

The author, metaphorically (photo by Jjron, from Wikipedia)

During the periods when I’m working too hard, which occur far more often than they should—a form of self-flagellation because, as my wife points out, no one required me to accept so many large jobs—a “last straw” feeling often overtakes me, the sense that, like the proverbial camel, I can’t manage one single additional task, no matter how tiny, or I’ll snap. Can I figure out why the copier insists it’s jammed when there’s nothing visible stuck in it? No, I can’t, not another chore, there’s just too much, it’s impossible, ask me next month or next year, or trash the damn thing, I don’t care, but I can’t spend five minutes on it, I can’t possibly do one more thing…

Of course I know this is nuts, a result of overstress, a psychological imbalance, and when the work eases a bit, I recover my good humor and willingness to delve inside the copier in search of a stray paper clip.

Yet the more I look around at our culture, the less unusual my fits of stress seem to me. Not only do we have crazies who have gone well beyond the snapping point—the ones who shoot up malls or schools—but we sprout flaming lunatics on the radio and in Congress, nasty snipers on the Internet, road rage on the freeways… So many of us seem just two or three straws from the breaking point.

Lately I’ve been sampling advance proofs for a forthcoming book from Temple University Press, American History Now, edited by Eric Foner and Lisa McGirr, a volume that attempts to sum up the ways current historians view major eras and themes in the nation’s history. The main audience, I assume, is scholars from other fields who want a brief overview of what the historians have been up to lately. A chapter by Kim Phillips-Fein of NYU takes on the near-impossible challenge of summarizing recent scholarship on the era defined as “1973 to the Present.” The post-1973 years, she writes,

have been viewed as a time of economic uncertainty and widening inequality as compared to steady growth; as an epoch of ambivalence, skepticism, and even hostility toward politics, in contrast to the idealism and optimism of the 1940s, 1950s and 1960s; and perhaps most of all, as an age of conservatism that rejected the liberalism of the postwar years.

Of course, economic uncertainty, inequality, and even skepticism are hardly unique to our era; these are recurring themes throughout the country’s history, dating to before the Revolution. What’s different now, I think, is that we’ve become terribly anxious and insecure—more so, for instance, than in the 1950s when we had Joe McCarthy, nuclear bomb scares and sheriffs clubbing civil rights leaders; or in the 1960s, when we killed two Kennedys, Martin Luther King, Malcolm X, hundreds of thousands of Vietnamese and tens of thousands of our own soldiers. Is it 9/11 and terrorism that have made us so jittery, brought us so close to the edge?

I think it goes deeper than that. The “economic uncertainty” that Phillips-Fein points to probably plays a role. We got accustomed to being prosperous, and now that we seem to be running faster and faster merely to stay in the same place—those of us who still have jobs, that is—we lack the emotional resilience to handle the situation. Then there are the many social changes we’ve absorbed in a few short decades. It’s great, for instance, that we can define a “family” in more ways than before, so that two adults of the same sex with three adopted children from various countries are as much a family as any other; yet all of our families, old-fashioned and new-, have become fragile, with divorce rates approaching 50 percent for first marriages and soaring far above that for second and third marriages. (Luckily, by the time you have a sixth wedding, as my father did, your life will probably end before your marriage.) The best way we’ve found to reduce the divorce rate is for couples not to marry in the first place, a solution that doesn’t do much for psychic stability and calm.

It’s a truism that, after all the gains made by women and minorities in the past decades, we have plenty of stick-in-the-muds uncomfortable with such change, or worse, severely ticked off about it. Yet I also wonder if our recent social advances make the gainers themselves feel more secure. If you’re a minority person in a corporate position previously closed to those of your ethnicity/gender, does your triumph bring peace of mind, or does it give you one more thing to worry about?

It’s my theory that such economic and social conditions have produced a long-running, low-level anxiety that is always with us, magnifying our personal and public stresses and bringing us closer to the snapping point. You might test this idea next time you’re in a traffic jam. Doesn’t one part of your psyche wish you could whip out an assault rifle and start shooting?

Being untrained in psychology, social science, history and just about every other subject matter, I’m totally unqualified to form such opinions. But this too is a prevailing American characteristic, the having of strong, unfounded convictions. There’s a reason bigoted talk-show hosts are popular.

… Oh hell, dammit, the smoke alarm over the stairwell started beeping again. I changed the battery two weeks ago! Now I’ll have to search for another new battery, fetch a ladder from the basement, climb up there and perch precariously while I try to fix the thing with clumsy fingers—or else drive to the hardware store and buy a new alarm, maybe the whole thing is defective—but that store sold me this cheap junk in the first place—no, it’s too much, dammit, I can’t take this anymore! Gimme a broom, I’m gonna knock that stupid piece of crap off the ceiling, take it out on the porch and jump up and down on it till it’s a thousand plastic splinters…

There! Whew! Now I feel better. I’m leaving the shards on the porch to teach all goddamn smoke alarms a lesson.

What was I saying?