Friday, May 27, 2011

I had this idea in my head for a blog post that would explore my conflicted emotions over having decided to stop running in races. Pretty much every spring I’ve signed up for a local 5K that benefits Poplar Spring Animal Sanctuary, a wonderful place in Poolesville, Maryland, where scores of farm animals live out their lives in bucolic bliss rather than being slaughtered for food or abused by people who see them as commodities. When race day rolled around earlier this month, however, I was registered for the walk, not the run. I expected this to make me feel at least a little guilty, sad and old.
But it didn’t at all. In fact, it made me feel happy. And mature—not in a wizened, solemn way but in a “Well now, that certainly made good sense” way.
I started running in the mid-1980s as a way to lose weight. Back then I was a single newspaper reporter living in the home of a friend and co-worker in High Point, North Carolina. I was a healthy eater, meaning not that I ate wisely but that I ate a lot. And I got pretty much no exercise. Even in a profession with notoriously low standards for fitness (and grooming, personal hygiene, and fashion), I’d come to really dislike what I saw when I looked in the mirror. Ultimately, I reached some tipping point and decided I needed to eat less and work off considerably more calories.
My friend and landlord Phil, an indescribably strange but gentle soul, dropped dead of a heart attack several years ago, when he was only in his 50s. But I repeated his enduring legacy to me just this morning in Northwest DC. It’s a short series of stretches I complete before and after every run. Leaning against his living room fireplace, he demonstrated those moves to me one day in the spring of 1986. Then he sent me on my way, huffing and puffing up and down the winding, hilly suburban streets of the self-proclaimed Furniture Capital of the World. There was a country club about a half-mile away, and in my early days I was walking by the time I reached it. I imagine the sight of me—sweaty, fat, dressed in an improvised clown version of running wear—heightened the ruling class’s sense that their private club offered blessed sanctuary from a very ugly outside world.
But somewhat to my own surprise, even incremental progress was enough to keep me going. I slowly increased my distances, and gradually the pounds came off. While this did not have the hoped-for effect of attracting the ladies—my grooming, hygiene and fashion challenges remained, and wouldn’t be definitively conquered until Lynn assumed stewardship of me—it did get my competitive juices flowing. By the late ’80s I was running in several races a year, at distances of up to 10 kilometers, or 6.2 miles. I never was a threat to win, by any stretch, but I didn’t embarrass myself, either. (I even was dressing a little better by then—wearing actual running shorts rather than work-in-the-yard shorts, and running shoes instead of uncushioned sneakers.) At first I finished in the far middle of the pack, but by the end of the decade I’d reached the near middle.
Never mind that the only aspect of running in races that I truly liked was drinking as much beer as I wanted afterward. Though it seems a tad counterintuitive in hindsight, in those days most race promoters thought it was a great idea to reward runners for their virtuous hard work with a bottomless tap. I’d bust my ass running the race, hating every minute of it and sorely tempted at every hilltop to join the quitters I saw walking at the side of the road, their heads bowed but their respiration unlabored. But then, as I approached the finish line I would see behind it, shining like a beacon in the wilderness, a huge inflated beer bottle bearing the logo of Budweiser or the Miller Brewing Company. What it signified, I knew, was that within minutes my agony would be replaced by a mellow, tingly buzz, as a body depleted of oxygen and not yet fed that day would be filled with refreshing, mood-enhancing booze. To this day, I savor the happiness of those early post-race celebrations, when I was high with relief and fermented hops.
Looking back, I should’ve retired from races when the taps ran dry, which had happened by the early ’90s as both the health craze and liability litigation exploded. But by then I’d gotten a little faster and liked seeing my name listed in the top third of finishers. One time I ran an 8K in sub-8-minute miles, which is nothing to elite runners and a joke to an Ethiopian or a Kenyan, but seemed miraculous to me. True, I did nearly pass out from exhaustion afterward, but had I died I might have had a smile on my face. Another time, in the mid ’90s, I ran a half-marathon course on Long Island that brushed the city limits of Bellmore, where my mom grew up. I felt somehow like I was doing my family proud.
By the 1990s I also was married, and I often dragged Lynn along to races in and around DC to buck me up, hold my wallet and cheer me on. She was great about it, but she quickly picked up on the fact that running in races was all in all a miserable experience for me before, during (especially) and after (if I didn’t meet my personal goal). She noted how I dreaded the upcoming event for weeks, was inconsolable with worry and nervousness the morning of the race, looked absolutely dreadful as I ran past her gasping like I’d been waterboarded, complained afterward about how awful it had been, and brightened only when I was later sipping coffee at Starbucks, the whole nightmare behind me.
Still, I kept competing in several races a year through the ’90s and into this century. At a certain point, though, I stopped requiring that Lynn come along to endure my whining and be bored as hell while she waited for her masochistic husband to approach from the distance looking like he’d soon be making her a widow.
Actually, the beginning of the end of Lynn’s participation came around 10 years ago, when I finally took the step of entering a marathon—the Marine Corps event in October. I thought I’d trained sufficiently for it, but it was a miserable experience. I started losing steam about halfway through and then somehow injured my foot, such that I had to walk the final 8 or so miles. It was the first time I’d ever walked as much as an inch in a race, and every runner who subsequently passed me seemed to add mockery to my pain and horror. I’d imagined beforehand that I’d finish in 4 hours at the very worst, but as I trudged toward the finish line at the Iwo Jima Memorial that day I couldn’t so much as manage the short sprint necessary to beat 5 hours. A few years before, Oprah Winfrey had finished the same course a good half hour earlier. Oprah!
I was hurting and humiliated when Lynn and I found each other through the throngs of finishers and well-wishers. She’d sacrificed a huge chunk of her Sunday for the privilege of waving to me at mile 15 or so as I shouted dejectedly “I’m fading!” It got hot that day, and she’s never been a fan of either heat or crowds. As she drove her mopey mess of a spouse back home, I started rethinking this please-share-my-hell-with-me thing. A few months later there was a St Patrick’s Day 10K downtown on a 25-degree morning. I advised Lynn to sleep in, and took Metro by myself.
In recent years I’d cut way down on the number of races I entered. For one thing, age had caught up to me and my times were getting slower and slower, which was annoying. For another thing, my friend Maryann, who’s six years younger than me and a much more serious and dedicated runner than I’ve ever been, went in just a few years from novice to kicking my ass in races, and that bummed me out. And for a third thing, Lynn’s mantra over the years that I didn’t have to punish myself in this way was finally starting to register. Recreational runs at my own pace were fine. But races were a bitch.
Finally, only a pair of 5K races each year—a Thanksgiving Day “turkey trot” and the Poplar Spring event—stood between me and competitive-running retirement. The former went by the wayside last November, when, rather than killing myself to try to run 3.1 miles in under 26 minutes in DC, I worked off a larger number of pre-holiday feast calories running for an hour at a slower pace on my own in my neighborhood. Then, earlier this month, I walked and talked with friends while the Poplar Spring runners sweated their way toward the finish line on the partially shared course. And I felt good about it. Smart, even.
You know how you did stupid things as a kid and your mom said, “If I told you you had to do that, you’d shout, ‘Not a chance!’”? You understood her logic, but you kept doing whatever idiotic thing you were doing because it pleased you in some juvenile way.
Well, for the nearly 20 years of our marriage Lynn had been taking the more-direct approach of telling me I didn’t have to run in races and it was dumb to keep doing something I didn’t enjoy. I don’t know why I kept doing it anyway for so long, but maybe the fact that I’ve finally stopped now means I grew up at long last.
Anyway, it’s not like I’ve abandoned all running goals. Those goals just aren’t about times and distances anymore. I keep track of all the US states in which I’ve run at least once for an uninterrupted hour. I’ve been stuck on 30 for a few years, since the winter day I road-tripped to Kentucky specifically to jog around a sleepy coal town (and incidentally dined at the worst Chinese restaurant in existence). But I have every intention of adding each of the 20 remaining states to my list.
I may be slow achieving it, and I’ll definitely be slow running it. But when I reach that particular finish line, it’ll feel really good. I might even have some morning beers for old times’ sake.
Thursday, May 19, 2011
Everybody’s Smirkin’ for the Weekend
An 89-year-old American preacher named Harold Camping with an infinitesimal following but an attention-getting message has predicted—based on some arcane mathematical equation he clearly deems far more reliable than his failed doomsday forecast of 1994—that the Rapture will occur this Saturday. That means, according to Camping, that about 200 million Christians will ascend to heaven and the rest of us—atheists, Muslims, Jews, Hindus and non-select Christians alike—will be subjected to five months of increasing horror (presumably even worse than the drip-drip of new Schwarzenegger progeny revelations and the threat of more serial blowhardery from a continuing Newt Gingrich candidacy), until the world finally explodes.
Meanwhile, the British physicist Stephen Hawking has come under fire from people of faith for recently telling the newspaper The Guardian the following: “I regard the brain as a computer which will stop working when its components fail. There is no heaven or afterlife for broken-down computers; that is a fairy story for people afraid of the dark.” Previously, in his 2010 book The Grand Design, Hawking had asserted, “Science predicts that many different kinds of universe will be spontaneously created out of nothing”—meaning that as Hawking sees it, in the Beginning, there was nothing but the laws of science—no God in sight.
Finally, earlier this spring Rob Bell, pastor of Mars Hill Bible Church, a 10,000-member congregation in Grand Rapids, Michigan, provoked the ire of evangelicals by suggesting in his book Love Wins that God’s love is more radical than we mortals possibly can imagine, allowing for the redemption of everybody—Christians and nonbelievers, do-gooders and murderers, saints and monsters. In Bell’s theology, hell isn’t some big barbeque pit where sinners eternally twist on skewers, but, rather, is the stuff of such earthly pestilences as violence and war. Whether, conversely, Bell envisions a celestial cocktail party at which Gandhi sings Bavarian beer hall songs with his pal Hitler while smiling Holocaust victims hum along, I’m not sure. I haven’t read the book. But the idea that one needn’t submit to Jesus Christ to achieve salvation similarly offends devout Christians who believe you can’t ascend the stairway to heaven without the Son of God authorizing the climb.
Not even evangelicals seem to think much of Harold Camping’s doomsday forecast, citing the biblical passage in which Jesus says humans won’t know the day or hour of final judgment. I’m also going to go out on a limb and assume Stephen Hawking isn’t canceling whatever Sunday-morning plans he’s made in lieu of church this weekend. (Although you’d think he might at least cut Camping a little slack for basing his apocalyptic vision on math, even if it’s convoluted and arguably insane math.) Predictably, there’s news on Yahoo! today that various groups around the country have smirkingly scheduled “Rapture parties” this Saturday, at which one assumes they’ll cavort as if there’s no tomorrow.
The reason I call myself an agnostic and not an atheist is that, while I have trouble imagining the existence of a supreme being and consider the Bible (Koran, Torah, insert name of divine text) no more holy and sacrosanct than any other book(s) written by human beings, I frankly Just Don’t Know how this whole existence thing got started, or when and how it ultimately will end. My pea brain can’t conceive of a “time before time,” or of a God who’s Always Been but one day got bored with nothingness and decided to amuse Him-/Her-/Itself with a gangbusters laser-light show. For all my respect for Stephen Hawking—who’s not only a mental giant compared with me but awesomely has parried with Homer on The Simpsons—I have just as much trouble wrapping my feeble mind around the idea that science somehow Always Was and at some point spontaneously created the universe.
There’s another reason I prefer to describe myself as an agnostic. I’m seriously hedging my bets. While I’m quite prone to believe that this world is all there is—as attractive as I find Albert Brooks’s vision, in the film Defending Your Life, of an afterlife in which you can eat everything you want without gaining a single ounce—there’s a tiny part of me that is scared absolutely chicken-shit of pissing off some Big Guy (or Big Girl or Big Entity) that might, just might, exist after all and hate nothing so much as an absolute denier. My thinking is, maybe if I keep an open mind—even if it’s open just the teeny-tiniest crack—and then it turns out that there really IS something to eternity other than the Big Sleep, I just might sneak in. Or, maybe there’s a hell, and I’m at least spared that fate—because I declined to put a damning exclamation point on my doubt.
For me, too, agnosticism bespeaks a certain humility. The thing I despise about religion is that it’s all ultimately rooted in the conviction that I’m Right and You’re Wrong. It’s an us-against-them belief system. In my view, people who trumpet their atheism are equally self-righteous. In their absolute certainty, they’re saying they have the answer, the final word, and that people who think differently are naïve at best and idiots at worst. While I personally don’t think people of faith—whatever faith—are right, I can’t say with 100-percent conviction they’re wrong. With 95 percent conviction, maybe. But that leaves just enough wiggle room that I don’t feel completely arrogant—and that I might still, somehow, squeeze my way into a heaven that almost surely, in my view, doesn’t exist.
So, to recap, I’m thinking that 200 million people aren’t suddenly going to vanish two days from now, that this old world will keep spinnin’ ’round long past October, and that you can’t spell Camping without “camp.” But that doesn’t mean I’ll be knocking back beers at any Rapture party come Saturday. That would be tempting fate. Of that, I’m strangely certain.
Sunday, May 8, 2011
Laden, with Meaning?
The lead photo on the front page of this morning’s Sunday New York Times shows a diminished-looking Osama bin Laden—draped in an old blanket and with a wool cap on his head—sitting in a dingy room, pointing a remote control device at a television set. Beside the TV and snaking down to a power strip on the floor is a jumble of cords that looks more like a fire hazard than the anatomy of a fierce propaganda operation.
Much already has been made of this image, released by the US government in what analysts are saying is an attempt to add insult to injury, in effect, by showing the world and its jihadists that the late al-Qaeda mastermind was far from the mythical figure he’d been built up to be. It’s a visual twist on “the emperor has no clothes”—the emperor has to dress in unflattering layers because he can’t afford to turn the heat up. (And by the way, even Mike Nesmith of the Monkees rocked a wool cap much better than you, ya grizzled old coot.)
On all-news WTOP radio this morning, the guy who leads the “talkback line” discussions solicited listener reactions to the photo. His own take was that it reminded him of the scene in The Wizard of Oz when the mighty Wiz was revealed to be just a bland guy behind a curtain using gadgetry to look tough. I didn’t listen long enough to hear any of the talkback responses, but I expect that a lot of “Guess you weren’t so tough after all, asshole!” gloating ensued. As I played those imagined sound bites in my head I recalled the TV images in the immediate aftermath of the successful US strike on bin Laden: Jubilant Americans gathering in front of the White House and at other prominent spots across the country to celebrate, nay, revel in the fusillade of lead that ripped through the hated Saudi’s brain. “USA! USA! USA!” vast crowds of people shouted. (Unless I heard it wrong and it actually was “NRA! NRA! NRA!” which on such testosterone-choked occasions essentially means the same thing.)
There is so much that’s sad, self-serving, hypocritical and just plain wrong in all of this that I hardly know where to start addressing it.
Well, let’s take it chronologically. The raid on bin Laden’s heavily fortified compound in Abbottabad, Pakistan (no relation to Lynn or her Jewish family, by the way), was carried out on May 2. The initial reports were that bin Laden was armed and had used one of his wives as a human shield before being blown away by Navy SEALs. Of course, it later turned out that neither detail was true, but the errors served the initial purpose of painting bin Laden as a coward and suggesting that the good guys had been quicker on the draw—an image America loves so much that it’s the theme of thousands of old Westerns.
Next were those scenes of celebration, as hundreds of people in each location—many of them too young to have taken up arms against al-Qaeda in the years right after the events of 9/11 in 2001—whooped and hollered and held aloft signs adorned with that iconic American symbol of the early 21st century—crosshairs. My personal favorite was the account of the chest-thumping at Philadelphia’s Citizens Bank Park, as Neanderthal Phillies fans (apologies to potential Lassitude readers Greg and Betsy) took time out from a baseball game against the Mets to strike up the “USA!” chant after hearing or reading the news on their phones. (Philadelphia sports fans being what they are, I expect more than a few of them also screamed, “Osama, you suck!” and perhaps shouted some vulgar things, as well, about his mother, his religion and that “towel” on his head.)
Then, finally, sobriety set in over the past week with the realizations that 1) al-Qaeda didn’t die with bin Laden, 2) the “shootout” had been the killing of an unarmed man, and 3) the “lavish” estate within which presumably harems of babes had been feeding bin Laden grapes poolside while his disciples shivered in caves in Afghanistan actually is pretty primitive and crappy-looking (“cushy” only in the sense that it presumably has dependable electricity and indoor plumbing). So, on cue, the US government released photos from the stormed compound designed to humiliate bin Laden, and in the process pump up our flagging nationalistic mojo. Of that photo in this morning’s Times, the reporter of the accompanying story wrote that bin Laden had been watching himself on TV “like an aging actor imagining a comeback.” Other confiscated videos reportedly show the old ham “flubbing his lines.” And Osama had been sufficiently mindful of his image, the story noted, to have dyed his white beard black for videos that later were broadcast to the world.
There’s your chronology. Now let’s take a closer look at all this.
I first heard the news that bin Laden had been killed by US forces when I was getting ready for bed that night. I found the radio report interesting, surprising and gratifying, in that order. No “USA!” chant welled in my throat. It simply struck me as very belated good news. A manhunt that had begun nearly a decade earlier finally had concluded. The brains (and start-up funder) of an operation that had killed thousands of Americans, and many people in other countries, had been silenced. This was, no doubt, a good thing.
But there was no feeling of “the monster has been slain.” Because just as I’ve always loathed as self-serving and simplistic the Bush-era terms “terrorists” and “war on terror”—which reduce the enemy to bloodthirsty killers without a shred of legitimate grievance and exalt the United States as utterly untainted defender of democracy and freedom—so, too, have I always distrusted the government-produced and media-amplified portrait of bin Laden as the personification of evil.
To this day, nearly 10 years after the downing of the Twin Towers, most Americans have little tolerance for anyone who so much as suggests that America itself bears an iota of culpability for the sociopolitical climate that gives rise to jihadists. Which seems willfully blind at a time in history when the Egyptian people have risen up to overthrow Hosni Mubarak, one of the many, many autocrats and dictators worldwide the US has propped up monetarily, diplomatically and militarily over the decades, the will of their people be damned. Sure, in each case our government has had its reasons—often arguing (accurately or not) that the despot we back not only is better for US security than the alternatives, but that the alternatives would be worse for that country’s population, as well. Regardless, there are legitimate reasons that many people in many countries see the United States in a way that we seem incapable of seeing ourselves—as the bad guys.
What I’m trying to say is that my reactions to bin Laden’s death were complicated enough that I was in no mood for celebration. More than that, I was repulsed by the jingoistic displays of delight in a man’s—any man’s—death, however “evil” that individual might have been deemed. When I witnessed those celebratory scenes on TV, in the newspaper and on the Internet, they symbolized to me what’s worst about America: its violent, gun-crazy, vengeance-driven culture. I’m absolutely apoplectic about our toothless gun laws and the national mood that makes any meaningful reforms impossible. Don’t get me started! We may finally have a black president, but don’t expect to see a Second Amendment-doubting president anytime in the next couple of centuries. (Or an atheist, for that matter. Whereas I’d gladly throw my support behind a qualified candidate who makes no pretense of religious belief but who wholeheartedly believes private ownership of guns should have draconian limits.)
I have, however, found gratifying some reactions to the death celebrations. The Washington Post’s online religion section, “On Faith,” one day last week asked the question, “Is it moral to celebrate a person’s death, even if he is guilty of heinous crimes?” Many readers, including members of the clergy, firmly responded, “No.” I’ve seen similar questions asked, and the same answers received, on public radio and on local TV newscasts in the past few days. Then there was the 20-something blogger whose essay was published in the print edition of yesterday’s Washington Post. Alexandra Petri wrote amusingly and insightfully, under the headline “Osama’s Dead—Party On!”, that young revelers fist-pumping the announcement of bin Laden’s death may have seen the scary old Muslim fundamentalist as “our Voldemort.” For her generation—Americans who had been 12 or 14 on 9/11—Osama bin Laden had been “the face behind the random terror of the universe, the dragon we could slay and beat.” It seems to me there’s some truth in that. It makes those young partiers seem less vengeful and more naively goofy. Which helps.
I also was reassured by the reaction of my office friend Meghan, who’s in her 20s and told me without solicitation that she found celebrations of bin Laden’s death “creepy”—like something you’d see on the streets of countries where decades of oppression have provoked bloodlust, not in a country where generations of prosperity have produced a youth culture happily ensconced in its BlackBerrys and iPhones.
Anyway, where was I? Here’s something else that’s been bothering me this past week. Or rather, that always has bothered me, but that this past week’s events have highlighted: the tendency of the American government, the mainstream media and the public—both reflexive and calculated—to cast our enemies as weaklings and cowards, and to view America as always strong and noble. With Osama, it started when he coordinated the killing of thousands of people at the World Trade Center—the “cowardly” slaying of innocents. Then he ran and hid all those years, also cowardly. Finally, after we got him, we found videos in which he looked kind of, well, weak and pathetic. What a loser!
Only, in truth, the guy was a genius who brought America to its knees that awful September day, then eluded capture for nearly 10 years, despite having what we now know wasn’t exactly a state-of-the-art command center. And you can hate, as I do, bin Laden’s view that an intolerant, backward, misogynistic distortion of Islam constitutes heaven on Earth, yet still concede that the ambition and execution of his vision were big, bold and daring. Whatever you might think of him personally, the guy did a lot with a little. Of course, many causes backed by fanatical True Believers tend to show outsized results (right down to that overachieving Army of One who went by the professional moniker Unabomber). But it takes strength—of mind, will and, yes, character—to take a ragtag bunch of discontented radicals and mold them into a force that sends the Western world into a panic.
And I’ve got to ask: What kind of a coward goes out of his way to bring the wrath of the most powerful nation on Earth down upon his own head—even if that head ends up topped by a silly-looking knit cap? Not just in the case of bin Laden, but in many other instances around the world, it seems to me that much of the time we spend demonizing and belittling our putative enemies might be better spent trying to understand their totality, the history that has created them, and America’s overt and covert role in that history. I’m not saying that bin Laden was a good man. I mean, he thought everyone who didn’t completely share his beliefs was an infidel who should be executed—that, to me, goes several steps beyond tightly wound. The guy was nasty and scary. But—failed attempts to portray him as a luxury-loving wife-sacrificer aside—bin Laden unquestionably had a warped personal integrity.
Even those who’d be dancing on bin Laden’s grave right now, were he not buried at sea, will concede that the War on Terror—or, as I rather un-lyrically call it, the Battle Against Islamic Extremists Who Crave a World So Devoid of Laughter and Fun That You and I, Too, Would Beg for Suicide Missions—is far from over, and that it won’t definitively conclude in the foreseeable future, if ever. (Maybe al-Qaeda ultimately will disappear only after the melting polar ice caps flood its desert training compounds.) But perhaps there’s some takeaway from Osama bin Laden’s death that’s more useful than the realization that many Philadelphia Phillies fans are fat, hairy goombahs. (Which in fact we already knew.)
This is a good time to think about who bin Laden really was, how we (America and the West) contributed to his creation, and if there are heretofore untapped or under-tapped ways we might better understand and engage potential bin Ladens in the future.
Am I glad Osama bin Laden finally was caught? Definitely, although I’m not completely convinced he needed to be killed. Mostly, though, I would like his death to hold some deeper, longer-lasting meaning than simply, once again, proving the fantastic awesomeness of the USA, USA, USA.
Saturday, April 16, 2011
The Damage Done
I was listening to the local Boston and Journey—excuse me, “classic rock”—station on the radio the other day when I heard a promo for an upcoming Neil Young concert in Baltimore. Not that long ago, the prospect would’ve pricked up my ears. As it was, though, it just reminded me all over again what a prick Young had been when I’d paid a fortune to see him perform solo last May at DAR Constitution Hall.
I think I’ll quote from my own capsule review, then dissect it and elaborate.
The italicized material below is what I’d e-mailed the next day to my friend Karen, a single mother of three whom I, to my subsequent regret, had convinced to spend a significant chunk of change (though not as much as I did) to catch the legendary rocker on his first DC trip in years. I’d billed Young to her as a must-see performer. (Having—significantly, it would develop—never attended one of his shows.) She seemed nearly as excited as I was when I met her and her date in the lobby before the show. Her oldest son, after all, had chosen a Neil Young song with which to serenade his mom and her new husband when she remarried. (That union proved to be a disaster. Perhaps in retrospect that had been another sign.)
Just before Karen and I headed to our respective seats—mine on the floor and hers in the rafters—she told me the amusing story of her boss’s reaction to her concert plans. The boss—whose name I’ve changed below out of a superabundance of caution for Karen—was incensed that she’d line the pockets of a lefty radical who, in his patriotic opinion, has spent his career figuratively, perhaps literally, soaking our star-spangled banner in his salty Canadian spittle.
Anyway, enough back story. This is what I wrote to Karen the morning after the concert:
I hope you’re not cleaning out your desk as you read this because Harvey’s fired you for putting money in Neil “Enemy of America” Young’s wallet. Because I put the idea in your head, and my concert review is that last night totally wasn’t worth being fired over. I guess I’m still glad I was there, because I wouldn’t have wanted to have gone to my grave without seeing him perform live, but, for my money (all $193 of it), the iconoclasm that I so admire in Neil as an artist made for a terrible concert experience. I mean, it amuses me when Neil goes on binges where he records songs, and sometimes whole albums, of music he’s experimenting with or just feels like doing—tunefulness and commercial potential be damned. But then, I don’t have to—and believe me, I don’t—buy that stuff.
But it turns out that’s sort of what I bought—what we bought—last night. We paid for the privilege of attending a distortion-heavy, give-nothing-back-to-the-audience show in which even the potentially crowd-pleasing songs were oddly and off-puttingly arranged, and most of the obscure/new stuff (like that one treacly number at the piano about kids, and that wholly inexplicable encore) simply sucked. I do have to say, though, that I did get a rueful laugh or two out of watching one diehard woman try desperately to groove to that weird encore number through its various meanderings and false endings.
I’m no less enthusiastic this morning about NY and his music, but I doubt I’ll ever again buy a ticket to a show of his.
Now, let the dissection begin.
I hope you’re not cleaning out your desk as you read this because Harvey’s fired you for putting money in Neil “Enemy of America” Young’s wallet. Because I put the idea in your head, and my concert review is that last night totally wasn’t worth being fired over.
Karen graciously let me off the hook, but agreed with all aspects of my assessment. She added that she’d found the volume so “excruciating” that she’d spent part of the concert in the hallway outside the doors. When I read that, I felt bad for her, but also annoyed that our eardrums had been shattered for no good reason. I mean, I’ve been half-deaf after a Who concert yet sufficiently giddy to gladly have sacrificed the rest of my hearing for another long set. I hate to sound like my parents here, but Young’s rock ’n’ roll was just disagreeably noisy.
I guess I’m still glad I was there, because I wouldn’t have wanted to have gone to my grave without seeing him perform live, but, for my money (all $193 of it), the iconoclasm that I so admire in Neil as an artist made for a terrible concert experience. I mean, it amuses me when he goes on binges where he records songs, and sometimes whole albums, of music he’s experimenting with or just feels like doing—tunefulness and commercial potential be damned. But then, I don’t have to—and believe me, I don’t—buy that stuff.
That passage has special resonance for me at this moment because I’ll be traveling to Raleigh this coming week. A few years ago, as I was leaving that city to drive back home, North Carolina State University’s radio station played a number from Young’s then-new CD. I don’t remember its name, but the lyrics were incredibly trite, the instrumentation unimaginative and the duration eternal. When it finally ended, I guesstimated its length at 16 minutes. It seemed to last until I’d driven over the line into Virginia. It was absolutely execrable. And hilarious.
“Woo! You go, man!” I found myself exclaiming. What I’d always appreciated about Neil Young was that he was both a musical genius—writer and performer of so many incredible songs that I hesitate to name one here for the urge to list 25—and a guy who followed his muse, not trends. He’s made brilliant albums and wretched ones, has crafted sublime melodies and unlistenable garbage. I was, at that moment, hearing a hefty dose of the latter. And it tickled me. Young didn’t give a hoot what we wanted to hear. He knew what he wanted to record. (Though why he wanted to do so is anyone’s guess.) It bespoke a certain artistic integrity and was amusing to experience from a distance. It was as if we were watching a cranky worker approach the nasty old boss, Mr Faceless Recording Industry, and kick him squarely in the ass. Only the roguish malcontent couldn’t be fired because he was worth too damn much to the company.
But Raleigh was then, and the Constitution Hall show was now. That Neil Young didn’t give much of a crap what anyone thought of his musical decisions had been way cooler when I wasn’t sitting right in front of him, desperately wishing I had back my $193, the temporarily inoperative 50% of my hearing, and my precious weekday evening.
We paid for the privilege of attending a distortion-heavy, give-nothing-back-to-the-audience show in which even the potentially crowd-pleasing songs were oddly and off-puttingly arranged, and most of the obscure/new stuff (like that one treacly number at the piano about kids, and that wholly inexplicable encore) simply sucked.
David Malitz, who reviewed the concert for the Washington Post, gently signaled his agreement in his opening line, which was, “Neil Young’s never-ending desire to live in the present can be both his most fascinating and frustrating quality.” It was from Malitz that I learned that fully half of the 18 songs Young had performed that night weren’t just obscure—they were “brand-new, unreleased compositions that have been debuted on his current week-old tour.” None of those tunes “seem likely to enter the Young pantheon,” the reviewer wryly observed. And the encore—“Walk With Me,” per a setlist I found this week on the Internet—had been, Malitz agreed, a “head-scratching” choice.
While he was far more charitable than I in his assessment of the hits Young performed—“Helpless,” “Tell Me Why” and “Cinnamon Girl” among them—Malitz noted that the new numbers had been met with “questioning whispers and staggered bathroom runs.”
Malitz also observed that for much of the show (and I would include here the bizarre, choppy renditions of the hits) “no audience member was able to exercise his or her perceived $200-paid right to sing along.” Not that I feel any artist owes the audience a purely greatest-hits show, and not that I buy a concert ticket to hear my fellow patrons harmonize badly. But still. With great ticket price comes great responsibility. Or something like that.
I do have to say, though, that I did get a rueful laugh or two out of watching one diehard woman try desperately to groove to that weird encore number through its various meanderings and false endings.
That was kind of bitterly funny. You know how there’s a stupid song by some awful band that goes, “For those about to rock, we salute you”? (Or something like that?) I kind of wanted to salute that aging hippie. She was so determined to relive her own rock memories, and to show the resilience of her fandom for a 64-year-old rock god, that she was willing to make a complete idiot of herself. (And to no doubt infuriate people behind her who now couldn’t even see the source of their aural misery.)
I’m no less enthusiastic this morning about NY and his music, but I doubt I’ll ever again buy a ticket to a show of his.
As my utter lack of interest in Young’s upcoming Baltimore show suggests, the last part of that sentence is almost certainly true. Many’s the time in the months since the Constitution Hall show that I’ve felt the artist I’d really needed to see live before he stopped touring or died was the late James Brown—the crowd-pleasing, self-proclaimed hardest workin’ man in show business. Not the self-indulgent loner I’d chosen.
But what really makes me sad is that, as much as I wish it weren’t so, I am less enthusiastic about Neil Young and his music, post-concert. The experience left a sour taste that lingers even now. Young’s best material is as great as it ever was, and I want to enjoy it as much as I ever did. But to date, I simply can’t. Several of his classic CDs—After the Gold Rush, Harvest, Rust Never Sleeps, the lovely Comes a Time—are sitting on my shelf, just waiting to be popped into the boom box or the car player. But then my mind’s eye again envisions the artist standing stolidly on the stage, treating Constitution Hall as his own private laboratory. Barely deigning to address us at all, let alone make us happy. And it still pisses me off.
When I leave Raleigh this time, the radio purposely will be turned off. It’s still too soon. Maybe next trip. I hope. I really do. But I have my doubts.
I think I’ll quote from my own capsule review, then dissect it and elaborate.
The italicized material below is what I’d e-mailed the next day to my friend Karen, a single mother of three whom, to my subsequent regret, I had convinced to spend a significant chunk of change (though not as much as I did) to catch the legendary rocker on his first DC trip in years. I’d billed Young to her as a must-see performer. (Having—significantly, it would develop—never attended one of his shows.) She seemed nearly as excited as I was when I met her and her date in the lobby before the show. Her oldest son, after all, had chosen a Neil Young song with which to serenade his mom and her new husband when she remarried. (That union proved to be a disaster. Perhaps, in retrospect, that had been another sign.)
Just before Karen and I headed to our respective seats—mine on the floor and hers in the rafters—she told me the amusing story of her boss’s reaction to her concert plans. The boss—whose name I’ve changed below out of a superabundance of caution for Karen—was incensed that she’d line the pockets of a lefty radical who, in his patriotic opinion, had spent his career figuratively, perhaps literally, soaking our star-spangled banner in his salty Canadian spittle.
Anyway, enough back story. This is what I wrote to Karen the morning after the concert:
I hope you’re not cleaning out your desk as you read this because Harvey’s fired you for putting money in Neil “Enemy of America” Young’s wallet. Because I put the idea in your head, and my concert review is that last night totally wasn’t worth being fired over. I guess I’m still glad I was there, because I wouldn’t have wanted to have gone to my grave without seeing him perform live, but, for my money (all $193 of it), the iconoclasm that I so admire in Neil as an artist made for a terrible concert experience. I mean, it amuses me when Neil goes on binges where he records songs, and sometimes whole albums, of music he’s experimenting with or just feels like doing—tunefulness and commercial potential be damned. But then, I don’t have to—and believe me, I don’t—buy that stuff.
But it turns out that’s sort of what I bought—what we bought—last night. We paid for the privilege of attending a distortion-heavy, give-nothing-back-to-the-audience show in which even the potentially crowd-pleasing songs were oddly and off-puttingly arranged, and most of the obscure/new stuff (like that one treacly number at the piano about kids, and that wholly inexplicable encore) simply sucked. I do have to say, though, that I did get a rueful laugh or two out of watching one diehard woman try desperately to groove to that weird encore number through its various meanderings and false endings.
I’m no less enthusiastic this morning about NY and his music, but I doubt I’ll ever again buy a ticket to a show of his.
Now, let the dissection begin.
I hope you’re not cleaning out your desk as you read this because Harvey’s fired you for putting money in Neil “Enemy of America” Young’s wallet. Because I put the idea in your head, and my concert review is that last night totally wasn’t worth being fired over.
Karen graciously let me off the hook but agreed with all aspects of my assessment. She added that she’d found the volume so “excruciating” that she’d spent part of the concert in the hallway outside the doors. When I read that, I felt bad for her, but also annoyed that our eardrums had been shattered for no good reason. I mean, I’ve been half-deaf after a Who concert yet sufficiently giddy that I’d gladly have sacrificed the rest of my hearing for another long set. I hate to sound like my parents here, but Young’s rock ’n’ roll was just disagreeably noisy.
I guess I’m still glad I was there, because I wouldn’t have wanted to have gone to my grave without seeing him perform live, but, for my money (all $193 of it), the iconoclasm that I so admire in Neil as an artist made for a terrible concert experience. I mean, it amuses me when Neil goes on binges where he records songs, and sometimes whole albums, of music he’s experimenting with or just feels like doing—tunefulness and commercial potential be damned. But then, I don’t have to—and believe me, I don’t—buy that stuff.
That passage has special resonance for me at this moment because I’ll be traveling to Raleigh this coming week. A few years ago, as I was leaving that city to drive back home, North Carolina State University’s radio station played a number from Young’s then-new CD. I don’t remember its name, but the lyrics were incredibly trite, the instrumentation unimaginative and the duration eternal. When it finally ended, I guesstimated its length at 16 minutes. It seemed to last until I’d driven over the line into Virginia. It was absolutely execrable. And hilarious.
“Woo! You go, man!” I found myself exclaiming. What I’d always appreciated about Neil Young was that he was both a musical genius—writer and performer of so many incredible songs that I hesitate to name just one here, lest I give in to the urge to list 25—and a guy who followed his muse, not trends. He’s made brilliant albums and wretched ones, has crafted sublime melodies and unlistenable garbage. I was, at that moment, hearing a hefty dose of the latter. And it tickled me. Young didn’t give a hoot what we wanted to hear. He knew what he wanted to record. (Though why he wanted to do so is anyone’s guess.) It bespoke a certain artistic integrity and was amusing to experience from a distance. It was as if we were watching a cranky worker approach the nasty old boss, Mr. Faceless Recording Industry, and kick him squarely in the ass. Only the roguish malcontent couldn’t be fired because he was worth too damn much to the company.
But Raleigh was then, and the Constitution Hall show was now. That Neil Young didn’t give much of a crap what anyone thought of his musical decisions had been way cooler when I wasn’t sitting right in front of him, desperately wishing I had back my $193, the temporarily inoperative 50% of my hearing, and my precious weekday evening.
We paid for the privilege of attending a distortion-heavy, give-nothing-back-to-the-audience show in which even the potentially crowd-pleasing songs were oddly and off-puttingly arranged, and most of the obscure/new stuff (like that one treacly number at the piano about kids, and that wholly inexplicable encore) simply sucked.
David Malitz, who reviewed the concert for the Washington Post, gently signaled his agreement in his opening line: “Neil Young’s never-ending desire to live in the present can be both his most fascinating and frustrating quality.” It was from Malitz that I learned that fully half of the 18 songs Young had performed that night weren’t just obscure—they were “brand-new, unreleased compositions that have been debuted on his current week-old tour.” None of those tunes “seem likely to enter the Young pantheon,” the reviewer wryly observed. And the encore—“Walk With Me,” per a setlist I found this week on the Internet—had been, Malitz agreed, a “head-scratching” choice.
While he was far more charitable than I in his assessment of the hits Young performed—“Helpless,” “Tell Me Why” and “Cinnamon Girl” among them—Malitz noted that the new numbers had been met with “questioning whispers and staggered bathroom runs.”
He also noted that for much of the show (and I would include here the bizarre, choppy renditions of the hits) “no audience member was able to exercise his or her perceived $200-paid right to sing along.” Not that I feel any artist owes the audience a purely greatest-hits show, and not that I buy a concert ticket to hear my fellow patrons harmonize badly. But still. With great ticket price comes great responsibility. Or something like that.
I do have to say, though, that I did get a rueful laugh or two out of watching one diehard woman try desperately to groove to that weird encore number through its various meanderings and false endings.
That was kind of bitterly funny. You know how there’s a stupid song by some awful band that goes, “For those about to rock, we salute you”? (Or something like that?) I kind of wanted to salute that aging hippie. She was so determined to relive her own rock memories, and to show the resilience of her fandom for a 64-year-old rock god, that she was willing to make a complete idiot of herself. (And, no doubt, to infuriate the people behind her, who now couldn’t even see the source of their aural misery.)
I’m no less enthusiastic this morning about NY and his music, but I doubt I’ll ever again buy a ticket to a show of his.
As my utter lack of interest in Young’s upcoming Baltimore show suggests, the last part of that sentence is almost certainly true. Many’s the time in the months since the Constitution Hall show that I’ve felt the artist I’d really needed to see live before he stopped touring or died was the late James Brown—the crowd-pleasing, self-proclaimed hardest workin’ man in show business. Not the self-indulgent loner I’d chosen.
But what really makes me sad is that, as much as I wish it weren’t so, I am less enthusiastic about Neil Young and his music, post-concert. The experience left a sour taste that lingers even now. Young’s best material is as great as it ever was, and I want to enjoy it as much as I ever did. But to date, I simply can’t. Several of his classic CDs—After the Gold Rush, Harvest, Rust Never Sleeps, the lovely Comes a Time—are sitting on my shelf, just waiting to be popped into the boom box or the car player. But then my mind’s eye again envisions the artist standing stolidly on the stage, treating Constitution Hall as his own private laboratory, barely deigning to address us at all, let alone make us happy. And it still pisses me off.
When I leave Raleigh this time, the radio purposely will be turned off. It’s still too soon. Maybe next trip. I hope. I really do. But I have my doubts.