I was standing beside my car in a 7-Eleven parking lot one recent morning when I was stunned speechless by a bumper sticker on a pickup truck preparing to exit the lot.
What I mean is, the message struck me as so infuriatingly, nonsensically aggrieved that I wanted to walk over, pound on the cab window and scream “WTF?!”
I didn’t do that, but I did ask, “What do you have to be mad about?” loud enough that I startled the neighborhood not-quite-right guy, who was distractedly presiding over the parking lot. The driver gave no indication he’d heard me. The pickup pulled into traffic and began rolling down the road.
The bumper sticker, now disappearing from sight, read, “I’m a Bitter Gun Owner—And I Vote.”
This was my reaction. First, this is America, which is to any semblance of sane gun laws as Syria is to civil liberties or North Korea is to frowning on cult-of-personality dictatorships. While technically the United States is a Western nation, when it comes to gun availability and the resulting violence, fatalities and incarceration rates, we stand alone as an Old West nation. On these shores, a shootout’s possible at any given time.
Second, I was thinking, this is Virginia. (I was in Alexandria at the time, and the pickup bore Virginia plates.) The commonwealth of Virginia has lax gun-control laws even by American standards. I knew for a fact that, among other things, Virginians are legally permitted to openly carry a holstered handgun just about anywhere. I also knew that, on the Brady Campaign to Prevent Gun Violence’s national scorecard, Virginia typically brings up the rear in comparison with more progressive states. (“Progressive,” again, being a relative term.)
A quick bit of Internet research subsequently revealed that Virginia’s exact Brady Campaign score is 16 out of 100. “Virginia has weak gun laws,” the Brady summary reads, “that help feed the illegal gun market, allow the sale of guns without background checks and put children at risk.”
As it happened, within hours of my bumper sticker sighting, the Courts of Justice Committee of the Virginia State Senate approved a measure that, if passed by the full legislature, would eliminate the state’s (presumably tyrannous) one-gun-a-month restriction on handgun purchases.
So, when I read that bumper sticker, my visceral rejoinder was, “Bitter? Really?! About what, exactly?”
When I next got to a computer, I Googled the offending phrase, seeking any available insight into what might sour a Virginia gun owner on a political system that all but legalizes a matrimonial state between citizen and firearm.
Ah! The bumper sticker made better contextual sense, at least, when the search results popped up. The sentiment’s genesis had been a speech Barack Obama, then a candidate for president, had given in April 2008 to a private audience at a California fundraiser. In an unfortunate turn of phrase, Obama had expressed his view that some Americans who feel economically dispossessed and betrayed by their government may “get bitter” in response, and may “cling to their guns or religion.”
The comments sparked a huge controversy at the time and sent the Obama campaign temporarily on its heels. Even fellow Democrat Hillary Clinton, then duking it out with Obama for the party’s presidential nod, made political hay of what she described as Obama’s “elitist and divisive comments.”
But to my recollection the backlash from evangelicals had been the harshest; I’d kind of forgotten the “guns” part of the quote. At any rate, Obama ended up winning the election, and he’s been steadily demonized by the political Right in countless other ways in the years since. That explained why I hadn’t made the connection to the speech when I read the bumper sticker.
What the Web search also suggested, unsurprisingly, was that the bumper sticker may well have been manufactured and distributed by the National Rifle Association. A brief visit to the NRA’s Web site—a momentary touch-down that nevertheless left me feeling only slightly less soiled than I might have by visiting the Man-Boy Love Society’s membership page—turned up a T-shirt the NRA had marketed in response to the Obama quote that bore precisely the wording of the bumper sticker. Gun owners had been urged to wear the T-shirt proudly, in order to show their “support for the NRA and the Second Amendment,” and, in the process, to “send an unmistakable message to legislators.”
My reaction there was, “As if Congress and the vast majority of state legislatures aren’t already in the NRA’s back pocket, or cowed into submission by its lobbying power!” Because the one compliment, if you want to call it that, that I’ll pay to the NRA is that they’re awesome at their job. This, to me, is evidenced both at the macro level—where the Second Amendment has become so sanctified in the national psyche that even so-called liberals don’t question its modern-day applicability—and at the micro level, where I’d never think of displaying a bumper sticker stating, “I’m a Bitter Advocate of Strict Gun Control Laws—And I Vote.” Why wouldn’t I? Not because it isn’t true, but because I’m genuinely fearful that some card-carrying, firearm-toting NRA member would shoot my car windows out. And possibly save some lead for me, depending on what kind of day he (or she) had had.
(On a related note, my other favorite bumper sticker I’ll never commission or display, for similar fear of violent reprisal, would read: “It’s Clearly a Choice, but Only Debatably a Baby.”)
Do I sometimes wish for the courage of my convictions? Sure. I mean, I’ve donated money to the Brady Campaign in the past, and I’ve signed the odd gun-control petition over the years, but I’ve never done anything notably public or strident. For one thing, the NRA has managed to pretty much convince me that Resistance Is Futile—that is, that I’ll need to move to Canada or Europe to ever live under laws that strictly or even meaningfully limit private gun ownership. For another thing, per my earlier allusion, gun nuts—of which this country seemingly has too many to count—scare the shit out of me. To put it colloquially, Them People Be Crazy.
I’ve been thinking this week, too, about the resignation from Congress of Gabrielle Giffords. A deranged gunman wielding a legally purchased firearm nearly killed her, did slay six people, and injured an additional 13 individuals in that January 2011 shooting spree in Tucson, Arizona. Politicians from both sides of the political aisle wept a few days ago as they saluted the former congresswoman’s courage, marveled at her progress and wished her Godspeed on her continuing rehabilitative journey. For her part, Giffords vowed to return to elective office one day.
Nowhere in the Giffords exit story, however, was the need for sane gun laws mentioned. In the days immediately following the tragedy, that question got a nanosecond of airplay, but the predictable “Guns don’t kill people—people kill people” chorus quickly drowned out such talk. As it always does in this country. In a state like Arizona, any gun-control message never stood a chance, anyway. Arizona’s Brady Campaign score is zero, making Virginia look like a surrender monkey in the fight for a Well-Armed Militia Nation. To be sure, Giffords herself was very much a pro-gun Democrat during her time in Congress. While it would be interesting to know if her views have shifted since her cranial encounter with a bullet, any personal change of heart would be immaterial should she indeed ever seek her old congressional seat. In the Grand Canyon State, you can’t advocate for gun control and win. Candidate Giffords presumably would keep any such thoughts to herself.
I recently happened upon a great op-ed piece on the upshot (no pun intended) of the Tucson tragedy. It wasn’t published in a prominent journal of opinion and wasn’t linked to Giffords’ resignation, but that event clearly was the reason the satirical Web publication The Onion had resurrected and highlighted the link. The faux essay by a made-up commentator named Ellen Crawford-Price was dated May 24, 2011, and bore the headline, “Let’s Just Go Ahead and Assume We’ve Learned the Lessons of the Gabrielle Giffords Shooting.”
The writer noted that she hadn’t “heard so much as a word” in the three months since the shooting about such briefly raised issues as gun control, America’s treatment of its mentally ill (a reference to accused gunman Jared Loughner) or the “inflammatory political rhetoric” that renders impossible any meaningful discussion of such matters. But that being the case, “Crawford-Price” continued with a benefit-of-the-doubt shrug, “I’m going to go ahead and assume that at some point we thoroughly explored those complex issues, resolved them, and now are living our lives based on the lessons we learned from the in-depth conversations I assume we had.”
She felt safe in making those assumptions, she continued, “Because, after all, if we had just brushed aside the life-altering assassination attempt of a congresswoman, as well as the death of a federal judge and a 9-year-old girl without seizing the opportunity to address our nation's glaring problems, then all the shooting victims would have died in vain, and all 300 million of us would be irresponsible, superficial hypocrites with the attention spans of newborns.”
That made me laugh.
But of course, the laughter was … what’s the word I’m looking for?
Oh yeah: bitter.
(Editor’s note: For a related post, read “Song Remains the Same,” from July 30 of last year. Sometimes I’m only belatedly aware that I’ve repeated myself. So sue me. Just don’t shoot me.)
Saturday, January 28, 2012
Monday, January 16, 2012
Corps Values?
Like most Americans who aren’t idiots (more about Rick Perry shortly), I was appalled and saddened but hardly shocked when video recently emerged of four US Marines urinating on the bodies of several dead Taliban fighters in Afghanistan. Or, as I prefer to de-sanitize this morally indefensible desecration, pissing on corpses.
The reaction among US military and civilian officials was swift and appropriate. Secretary of State Hillary Clinton and Defense Secretary Leon Panetta immediately condemned the Marines’ actions, and the military promised an inquiry. Clinton called the incident “absolutely inconsistent with American values.”
But was it? As several commentaries I’ve read in the days since have pointed out, American values, particularly in the realm of war, don’t seem quite what they used to be. In case you missed it, I’d like to quote a few paragraphs from an op-ed piece published in yesterday’s Washington Post that summarized better than I ever could have the disconnect between our collective outrage and the messages we’ve been instilling in our fighting forces. The following words were written by Sebastian Junger, an author whose best-selling 2010 book War chronicled a year he’d spent embedded with US soldiers in the Korengal Valley of eastern Afghanistan, and a director whose film Restrepo, released the same year, also focused on that conflict. The excerpt below comes from an essay to which the Post affixed the headline “We’re All Guilty of Dehumanizing the Enemy.”
“There is a context for this act in which we are all responsible, all guilty. A 19-year-old Marine has a very hard time reconciling the fact that it’s okay to waterboard a live Taliban fighter but not okay to urinate on a dead one.
“When the war on terror started, the Marines in that video were probably 9 or 10 years old. As children they heard adults—and political leaders—talk about our enemies in the most inhuman terms. The Internet and the news media are filled with self-important men and women referring to our enemies as animals that deserve little legal or moral consideration. We have sent enemy fighters to countries like Syria and Libya to be tortured by the very regimes that we have recently condemned for engaging in war crimes and torture. They have been tortured into confessing their crimes and then locked up indefinitely without trial because their confessions—achieved through torture—will not stand up in court.
“For the past 10 years, American children have absorbed these moral contradictions, and now they are fighting our wars. The video doesn’t surprise me, but it makes me incredibly sad—not just for them, but also for us. We may prosecute these men for desecrating the dead while maintaining that it is okay to torture the living.”
That doesn’t by any stretch absolve the responsible parties. But I do think it’s important and instructive to contextualize their actions, as Junger so concisely and powerfully did in the preceding paragraphs. I feel obliged, too, to add that I readily concede that it’s easy for me, never having served in the military, to say that even the hell of war would never compel me to unzip my pants and splash yellow contempt on a lifeless adversary. As Junger pointed out elsewhere in his essay, our soldiers in various ways are encouraged to regard the enemy in dehumanized terms, because “otherwise they would have to face the enormous guilt and anguish of killing other human beings.” Is it such a leap to go from satisfaction in a job well done, in that context, to taking a sort of sick pleasure in it?
But I do think “sick” is the key word there. I was ruminating on all this in the shower yesterday when the name of a movie I’d nervously but eagerly rented from a video store many years ago popped into my head. In those pre-Internet days the title alone had earned the film considerable infamy: I Spit on Your Grave. Released in 1978, it's a revenge fantasy in which a young female novelist seeking a quiet place to write in rural New York state is savagely raped by a group of young men and then proceeds to murder them in increasingly brutal fashion: hanging one, leaving another to bleed to death in a tub after slicing off his penis, dismembering yet another with the blades of a speedboat.
I still remember that I was as self-conscious about renting it as I would’ve been had I placed on the counter a porn flick. But then, of course, I Spit on Your Grave was simply a different kind of porn. I’d rented it for all the reasons adolescent and arrested-adolescent guys (I’d have been in my 20s then) did and still do a lot of things: for the promise of exposed breasts, the expectation of lots of blood and the titillation of glimpses at the forbidden.
The movie delivered on all those counts. It was very graphic for its time and fully merited its word-of-mouth infamy. But it also had an effect on me I hadn’t at all expected: It ultimately grossed me out and really made me think.
Most of all, it made me think about the cancer of retribution and vengeance, even when it’s arguably merited and is dispensed in response to the worst sorts of atrocities. For a low-budget movie that was populated with no-name actors portraying prototypes more than recognizable people, the film did an excellent job of pressing my buttons. Having squirmed through the brutalizations of the victim, I initially delighted in the unconflicted sadism with which she turned avenging heroine and Got Even. But then a funny thing happened. Regret and shame gradually overtook me, to the point that when I unceremoniously dumped the video in the store’s overnight slot early the next morning, I felt sullied by the entire experience and mad for having allowed myself to be so thoroughly manipulated.
What I thought about as I stood in the shower yesterday were the similarities between the cinematic I Spit on Your Grave and the military I Piss on Your Corpse. Sure, one could argue—as the drunk-with-revenge Marines in that now-infamous video surely did—that those murdered Taliban fighters Had It Coming. That they, or their cronies, had killed American soldiers. That, given a Taliban culture in which beheadings and all other manner of brutality toward one’s perceived enemies are not uncommon, the deceased, had they survived, might well have lifted their robes in a similar situation and Done the Same Thing to Us. But does that make it right? Does that make it moral?
Here’s where Texas Gov Rick Perry comes in. Rather than condemning the uniformed urinators, he reserved his umbrage for the Obama administration—accusing it of “over-the-top rhetoric” and “disdain for the military.” While being careful not to outright praise the soldiers for a Texas-sized arrogance one sensed he admired—Perry does still have a terminal but not yet dead presidential campaign to consider, after all—he took a break from the stump yesterday to tell CNN that the corpse desecration had been, in fact, nothing more than a “stupid mistake” made by a quartet of guys who are, after all, “just kids.” There certainly was nothing criminal about their actions, Perry added—the Geneva Conventions be damned.
Just as I wasn’t surprised that some US soldiers would take pride in relieving themselves on dead men, I wasn’t surprised that Rick Perry felt compelled to defend such behavior. In fact, given that he has vigorously defended waterboarding during the Republican presidential debates, his comments about the Marines’ actions are more or less what I might have scripted him to say. But I will add this: His words make me want to spit in his eye.
Unlike Rick Perry, however, I would stop there.
Friday, January 6, 2012
Milestone for a Star of Science
The history of my time with Stephen Hawking was brief indeed.
I’d been sucked in (black hole allusion unintended but fortuitous) by all the hoopla surrounding the famed physicist’s best-selling book, A Brief History of Time. It had come out in 1988 and was an immediate sensation, promising readers an accessible and even entertaining guide to unlocking the universe’s mysteries. Having fulfilled my minimal science requirement in college only by writing a science fiction short story for extra credit (bless you, Dr Danforth, for taking pity on liberal arts majors), I was excited by the prospect of becoming a starry-eyed smartypants without having to expend much effort.
So, at some point I cracked open somebody’s copy of the book. I was crestfallen, however, to discover that the wheelchair-bound scientist’s purportedly breezy roll down Cosmos Lane was unadorned by the cartoon drawings and elementary captions I’d hopefully envisioned. What Hawking surely had deemed a significant dumbing-down of the material still left it several levels above my comprehension. But then I hit upon another possible path to meteoric expansion of my astronomical knowledge: I’d buy the audio book and listen in the car! Some upcoming weekend, I’d depart Savannah, Georgia, where I was living at the time, an interstellar ignoramus, but would arrive several hours later at my parents’ home in Greensboro, North Carolina, a newly minted Big Bang brainiac.
In retrospect, I’m not sure why I felt I’d easily comprehend and feel enlivened by the audio version of a book that had vexed and bored the hell out of me in the 10 minutes I’d devoted to the print version. Perhaps I thought it would be abridged to within 30 pages of its life, and/or alluringly voiced by a hot Hollywood actress. Possibly, having had no experience to that point with audio books, I thought the spoken version would be the true Idiot’s Guide I’d sought in the first place—in sort of the same way that marketers earlier had transformed the genius of Einstein into a T-shirt on which he goofily sticks out his tongue.
As you might have guessed, my brilliant plan didn’t work out too well. If the audiocassette version of A Brief History of Time had been abridged at all, it certainly hadn’t been purged of all the multi-syllabic science words and allegedly thought-provoking concepts that merely had provoked me to seek aspirin for an early-onset headache. And the reader certainly wasn’t some sultry actress who made deep space sound titillatingly like Deep Throat, or who lent relativity the ring of an after-party romp. I can’t remember who the narrator was, except that he was male and was not the author—whose computer-synthesized voice might at least have intrigued me for a time. As it was, by 20 minutes in I was violently banging my leg against the car door, trying desperately to keep myself awake and alert in highway traffic. About 10 minutes after that, half-asleep at the wheel and seeing the wrong kind of stars, I nearly drove off a bridge on I-95. It was at that point that I elected not to contribute my body to eternity in a failed attempt to better understand timelessness. Out popped A Brief History of Time, replaced by a music CD—probably something as cerebral as the Ramones—as I turned the page on my failed foray into cosmology.
The reason Stephen Hawking is on my mind this week is that I read yesterday that he’ll turn 70 this Sunday. Which is startling when you consider that the guy first started showing signs of Lou Gehrig’s Disease 50 years ago, when he was at university at Cambridge. (Although Hawking doesn’t call it Lou Gehrig’s Disease, just as we Yanks don’t call scoliosis David Beckham Syndrome, even though it leaves people bent. The physicist sometimes calls his condition by its widely familiar medical name, ALS—amyotrophic lateral sclerosis—but more often refers to it as “motor neurone disease.”)
I’m not proud of myself for this, but frankly, most times I’ve given Hawking any thought at all in the years since his book almost caused my death, it’s been not to ponder his work or to admire his amazing productivity in the face of his physical limitations, but to ask, either to myself or out loud, “Shouldn’t that guy be dead by now?” The news story about his pending 70th birthday reinforced my feeling that Hawking’s biggest accomplishment might simply be his continued existence. I mean, Lou Gehrig himself was dead within a few years of giving his tear-jerking “luckiest man on the face of the Earth” farewell speech at Yankee Stadium, and many of us, myself included, have at least peripherally known of people who’ve contracted ALS later in life and have died rather quickly and horrifyingly, becoming completely paralyzed and ultimately losing even their ability to breathe. So, how is it possible that Hawking has lived long enough to become such an icon that he’s been featured in a skit on Conan O’Brien’s show and has appeared in animated form on The Simpsons?
I’ve done a little reading on this subject in the past 24 hours, and the answer is that nobody—including smartest-guy-in-the-room Hawking himself—really knows why he’s still around. Apparently if you’re fated to get ALS, it’s better to do so at a younger age, as he did, because survival periods tend to be longer. And certainly the fame and wealth that Hawking’s fully functional brain has brought him have bankrolled the very best care possible. At this point in the disease’s progression, he has around-the-clock aid. Virginia Lee, a brain disease expert at the University of Pennsylvania School of Medicine, was quoted as saying, in one article I read, “The disease can sometimes stabilize, and then the kind of care delivered may be a factor in survival.” Remaining mentally alert, she added, “also is extremely important, and [Hawking] clearly has done that.” Still, only about 10 percent of people with ALS live longer than a decade. Most succumb to the disease within two to five years. I also read that Hawking’s DNA is being analyzed for possible clues to his longevity—yet another way in which he may yet contribute to science.
I found on Hawking’s Web site an essay he wrote, or dictated, titled, “Professor Stephen Hawking’s Disability Advice.” This is the opening paragraph:
“I am quite often asked, ‘What do you think about having ALS?’ The answer is, not a lot. I try to lead as normal a life as possible, and not to think about my condition, or regret the things it prevents me from doing. Which are not that many.”
Toward the end of the essay, Hawking notes that ALS “has not prevented me from having a very attractive family and being successful in my work.” Indeed, he’s been married twice and has three children, including a daughter with whom he’s written several children’s books on physics. He’s earned a shelf-full of science awards, been named a Commander of the British Empire and has been honored with this country’s Presidential Medal of Freedom. And he hasn’t been spending his senior years contentedly resting on his laurels, either. After years of coyly speaking of God in a vague metaphorical sense, the cosmologist threw diplomacy to the winds in 2010, telling The Guardian newspaper that he believes there is “no heaven or afterlife.” He called such a notion “a fairy story for people afraid of the dark.”
Clearly Stephen Hawking doesn’t fear the dark, and hasn’t for the extraordinary length and breadth of his surprising and inspiring life and career. He might have done his damnedest to kill me 20 years ago, but I nevertheless must salute the guy, wish him a happy birthday, and hope he makes it through another year—rather than assuming he won’t.
I’d been sucked in (black hole allusion unintended but fortuitous) by all the hoopla surrounding the famed physicist’s best-selling book, A Brief History of Time. It had come out in 1988 and was an immediate sensation, promising readers an accessible and even entertaining guide to unlocking the universe’s mysteries. Having fulfilled my minimal science requirement in college only by writing a science fiction short story for extra credit (bless you, Dr Danforth, for taking pity on liberal arts majors), I was excited by the prospect of becoming a starry-eyed smartypants without having to expend much effort.
So, at some point I cracked open somebody’s copy of the book. I was crestfallen, however, to discover that the wheelchair-bound scientist’s purportedly breezy roll down Cosmos Lane was unadorned by the cartoon drawings and elementary captions I’d hopefully envisioned. What Hawking surely had deemed a significant dumbing-down of the material still left it several levels above my comprehension. But then I hit upon another possible path to meteoric expansion of my astronomical knowledge: I’d buy the audio book and listen in the car! Some upcoming weekend, I’d depart Savannah, Georgia, where I was living at the time, an interstellar ignoramus, but would arrive several hours later at my parents’ home in Greensboro, North Carolina, a newly minted Big Bang brainiac.
In retrospect, I’m not sure why I felt I’d easily comprehend and feel enlivened by the audio version of a book that had vexed and bored the hell out of me in the 10 minutes I’d devoted to the print version. Perhaps I thought it would be abridged to within 30 pages of its life, and/or alluringly voiced by a hot Hollywood actress. Possibly, having had no experience to that point with audio books, I thought the spoken version would be the true Idiot’s Guide I’d sought in the first place—in sort of the same way that marketers earlier had transformed the genius of Einstein into a T-shirt on which he goofily sticks out his tongue.
As you might have guessed, my brilliant plan didn’t work out too well. If the audiocassette version of A Brief History of Time had been abridged at all, it certainly hadn’t been purged of all the multi-syllabic science words and allegedly thought-provoking concepts that merely had provoked me to seek aspirin for an early-onset headache. And the reader certainly wasn’t some sultry actress who made deep space sound titillatingly like Deep Throat, or who lent relativity the ring of an after-party romp. I can’t remember who the narrator was, except that he was male and was not the author—whose computer-synthesized voice might at least have intrigued me for a time. As it was, by 20 minutes in I was violently banging my leg against the car door, trying desperately to keep myself awake and alert in highway traffic. About 10 minutes after that, half-asleep at the wheel and seeing the wrong kind of stars, I nearly drove off a bridge on I-95. It was at that point that I elected not to contribute my body to eternity in a failed attempt to better understand timelessness. Out popped A Brief History of Time, replaced by a music CD—probably something as cerebral as the Ramones—as I turned the page on my failed foray into cosmology.
The reason Stephen Hawking is on my mind this week is that I read yesterday that he’ll turn 70 this Sunday. Which is startling when you consider that the guy first started showing signs of Lou Gehrig’s Disease 50 years ago, when he was at university at Cambridge. (Although Hawking doesn’t call it Lou Gehrig’s Disease, just as we Yanks don’t call scoliosis David Beckham Syndrome, even though it leaves people bent. The physicist sometimes calls his condition by its widely familiar medical name, ALS—amyotrophic lateral sclerosis—but more often refers to it as “motor neurone disease.”)
I’m not proud of myself for this, but frankly, most times I’ve given Hawking any thought at all in the years since his book almost caused my death, it’s been not to ponder his work or to admire his amazing productivity in the face of his physical limitations, but to ask, either to myself or out loud, “Shouldn’t that guy be dead by now?” The news story about his pending 70th birthday reinforced my feeling that Hawking’s biggest accomplishment might simply be his continued existence. I mean, Lou Gehrig himself was dead within a few years of giving his tear-jerking “luckiest man on the face of the Earth” farewell speech at Yankee Stadium, and many of us, myself included, have at least peripherally known of people who’ve contracted ALS later in life and have died rather quickly and horrifyingly, becoming completely paralyzed and ultimately losing even their ability to breathe. So, how is it possible that Hawking has lived long enough to become such an icon that he’s been featured in a skit on Conan O’Brien’s show and has appeared in animated form on The Simpsons?
I’ve done a little reading on this subject in the past 24 hours, and the answer is that nobody—including smartest-guy-in-the-room Hawking himself—really knows why he’s still around. Apparently if you’re fated to get ALS, it’s better to do so at a younger age, as he did, because survival periods tend to be longer. And certainly the fame and wealth that Hawking’s fully functional brain has brought him have bankrolled the very best care possible. At this point in the disease’s progression, he has around-the-clock aid. Virginia Lee, a brain disease expert at the University of Pennsylvania School of Medicine, was quoted as saying, in one article I read, “The disease can sometimes stabilize, and then the kind of care delivered may be a factor in survival.” Remaining mentally alert, she added, “also is extremely important, and [Hawking] clearly has done that.” Still, only about 10 percent of people with ALS live longer than a decade. Most succumb to the disease within two to five years. I also read that Hawking’s DNA is being analyzed for possible clues to his longevity—yet another way in which he may yet contribute to science.
I found on Hawking’s Web site an essay he wrote, or dictated, titled, “Professor Stephen Hawking’s Disability Advice.” This is the opening paragraph:
“I am quite often asked, ‘What do you think about having ALS?’ The answer is, not a lot. I try to lead as normal a life as possible, and not to think about my condition, or regret the things it prevents me from doing. Which are not that many.”
Toward the end of the essay, Hawking notes that ALS “has not prevented me from having a very attractive family and being successful in my work.” Indeed, he’s been married twice and has three children, including a daughter with whom he’s written several children’s books on physics. He’s earned a shelf full of science awards, been named a Commander of the Order of the British Empire and has been honored with this country’s Presidential Medal of Freedom. And he hasn’t been spending his senior years contentedly resting on his laurels, either. After years of coyly speaking of God in a vague metaphorical sense, the cosmologist threw diplomacy to the winds in 2010, telling The Guardian newspaper that he believes there is “no heaven or afterlife.” He called such a notion “a fairy story for people afraid of the dark.”
Clearly Stephen Hawking doesn’t fear the dark, and hasn’t for the extraordinary length and breadth of his surprising and inspiring life and career. He might have done his damnedest to kill me 20 years ago, but I nevertheless must salute the guy, wish him a happy birthday, and hope he makes it through another year—rather than assuming he won’t.