Friday, April 20, 2012

Skies: The Limit

When I was a kid, space travel was exciting.

I’ll never forget sitting in the Claremont Theater in Montclair, New Jersey, in 1968, when I was 10 years old and watching 2001: A Space Odyssey on a towering screen. Of course, I had no idea what director Stanley Kubrick’s film treatment of Arthur C Clarke’s novel was supposed to mean—those shiny black monoliths materializing on prehistoric Earth and the modern-day moon; that soft-spoken computer calmly murdering those poor astronauts; the lone survivor’s spectacular rebirth as a star, or part of the space-time continuum, or some damn thing.

I’m still not sure what it all meant, despite having read various interpretations over the years. I’m scarcely more of a Deep Thinker now than I was at 10, when I hooted over Mad magazine’s parody of the highbrow epic, which was titled 2001: A Space Idiocy. I remember Fred Flintstone standing among the apes that were stoning the monolith. His take on the mysterious structure, as I recall, was a spirited “Yabba dabba doo!”

But man, what a day that was! And not just because Stephen Wasserman’s dad had driven several of us neighborhood kids to one of the Garden State’s fanciest cinemas in celebration of his son’s birthday, and had paid our admission, and afterward had bought each of us a premium sundae at the adjoining ice cream parlor.

The film’s images were so vivid and intoxicating. A space station where beautiful people sipped cocktails, living the Mad Men life far beyond Earth’s orbit. A ship hurtling quietly through the cosmos on autopilot, its crew free to enjoy the infinite vistas beyond the windows. Until, that was, the robot HAL wigged out, advancing his own homicidal agenda in an unnervingly mild voice. Then, finally, there was that crescendo of imploding, strobe-like images, heralding Dave the Astronaut’s reincarnation as the Big Baby of the Universe. (I can’t remember what Mad did with that denouement. I picture a wailing brat in a diaper as big as Jupiter. With wavy stink lines rising from it, suggesting a BM for the ages.)

Then, the following summer, Apollo 11 landed on the moon—for real!—in what was a huge, televised global event. The lunar footage was no less awe-inspiring to me for its primitive black-and-white graininess. Nor to just about anyone else tuning in that day—the Vietnam War, civil unrest and everything else that was happening be damned. It was amazing to think that human beings—Americans, no less—had actually touched down on a distant orb in our night sky. Not only that, but the moon landing fulfilled a seemingly unachievable government promise in an era of “don’t trust anyone over 30” cynicism. It came just eight years after President John F Kennedy had vowed that the US would land a man on the moon by decade’s end. Many years hence, the satiric Onion would perfectly capture the exhilaration of that feat in the hilariously profane but scarcely hyperbolic headline “Holy Shit! Man Walks on Fucking Moon!”

Seriously! People who didn’t live through those times have no idea how motivating and unifying Kennedy’s Cold War bet on Yankee knowhow had been, whatever the sociopolitical maelstrom swirling around it. When Woody Allen joked in his stand-up act around 1965 that a woman had refused to sleep with him “even if it would help the space program,” the comprehensiveness of his rejection was devastatingly clear.

At the dawn of the 1970s, it seemed anything was possible when it came to space exploration. After all, hadn’t man just walked on the fucking moon? (Woman surely would be next, right? Watch her roar!) Back then, it seemed to me—it seemed to most people, I think—that we at least would approach, if not outright achieve, Arthur C Clarke’s vision by 2001. (Ideally without the homicidal computer or the metaphysical mumbo-jumbo, though.) It was easy for me to imagine US astronauts literally being in position, by the turn of the 21st century, to know if Martians wore silver spacesuits and resembled actor Ray Walston, as they had on CBS.

But that wasn’t the course history would take. By 1972 the moon landings were over, as costs mounted, political will waned and earthly priorities shifted. In the decades since, the US, the Soviet Union/Russia, other nations and private industry have launched or partnered in a variety of celestial meet-up stations, rockets, shuttles, rovers and telescopes. We’ve gotten great pictures back from Mars, Jupiter and beyond. Probes and cameras have found evidence of water and other building blocks of life “out there.” Scientists now believe that, given the vastness of space and the statistical likelihood of countless Earth doppelgangers, we’re about as likely to be Alone in the Universe as today’s wired teenager is to sit unplugged and alone with his or her thoughts for more than two minutes.

It’s intriguing stuff. But it’s not the same as placing human bodies on celestial ones. Don’t get me wrong. I understand all the reasons why putting an American on Mars or even back on the moon isn’t in the works. We’re dealing, after all, with skyrocketing (no pun intended) budget deficits. And anyway, most pertinent information about space can be obtained by onboard technology on unmanned flights. Also, it’s clear by now that our beleaguered planet will be toast long before we can ever colonize a new one—what with global warming, the coming water and food wars, the multitude of unsafeguarded nuclear weapons lying around, et cetera. So where’s the urgency?

Piggybacking on that last point, it seems to me that our hope for the future has taken a huge hit in the 40 years since the last moon landing. So, we beat the Soviet Union to the moon. Big deal! That entity later disappeared altogether, along with our assumption that winning the Cold War would usher in a new order of peace and prosperity. Where we once looked to the skies with great anticipation, we now see ominous holes in the ozone layer. We envisioned continued public support for real-life star treks, but every last one of the TV iterations ultimately died for lack of audience.

Speaking of piggybacking, what prompted these thoughts was Tuesday’s piggyback flight, atop a Boeing 747, of the space shuttle Discovery to its retirement home at the Smithsonian Air & Space Museum’s Udvar-Hazy Center near Dulles Airport. The flight pattern brought Discovery very near my office building, which sits along the Potomac River in Alexandria. This prompted my boss to send out an e-mail encouraging us to step outside and watch. Crowds gathered on the National Mall, too, to witness what was billed as a historic event. Discovery’s fly-by was all over the news.

I did not move from my desk, though. I could not gin up the enthusiasm others felt. One of my co-workers returned from outside to exclaim that the shuttle sighting had moved her to tears. Whether from admiration, pride, sadness at the end of an era, or a combination of emotions, I didn’t ask. She did pronounce herself a “space geek,” however, and observed that my lack of interest suggests I’m not.

I guess that’s right. If space geekery means being awed by Discovery’s history of manned flights to the International Space Station, I’m underwhelmed. If it means seeing majesty in a double-decker flying object winging its way to the DC suburbs, my glasses do lack a rose-colored tint.

What I said to Lynn when I got home Tuesday was, “It’s not like Discovery went to Mars or something.” Earlier, in an e-mail to a friend who’d watched the fly-by with rapt enthusiasm, I’d written, “To me, the space shuttle is just a fancypants vehicle that hasn’t even been to the moon, and its retirement is a sad reminder that our once-vaunted space program is now earthbound and reliant on private industry. There’s no romance in any of that for me.”

Again, I get it. Successfully transporting astronauts to and from the space station several times ain’t chopped liver. Discovery did boldly go where few humans have gone. It’s an amazing piece of engineering that parents now can share with their children and grandchildren, even as they regale them with the tale of having witnessed Discovery’s final flight. Tuesday’s fly-by was history. It was a singular moment—the conclusion of one chapter of America’s space story and the beginning of an unclear and uncertain future.

I do not, however, regret in the slightest having stayed at my desk while Discovery neared its final destination. I mean, c’mon. My formative astronomical experience was seeing man walk on the fucking moon! The headline this time, to me, was “Hitchhiking Space Taxi Arrives for Museum Debut.” Little mystery or majesty there.

Saturday, April 14, 2012

Lingua Stanca

The New York Times helpfully pointed out recently that my hapless monolingualism not only makes me stupid now, but also may hasten the onset of Alzheimer’s in my fast-approaching dotage.

The op-ed piece was headlined “Why Bilinguals Are Smarter,” and its author was Yudhijit Bhattacharjee, a staff writer at Science. I felt hugely defensive from the get-go, and not just because I’d already been implicitly outed as “dumber.” With a name like Bhattacharjee, I had to assume the author is fluent in Hindi or Urdu in addition to English—making him one of the countless millions of world citizens who are smarter than me.

Then I got to wondering, is “Yudhijit” a guy’s name or a woman’s name? How do you even pronounce it? I felt more stupid by the second.

The word Science gave me pause, also. I mean, was there any subject at which I’d sucked worse when I was in school? Mathematics, maybe. But that was about it.

I began reading the piece—a decision I quickly regretted. This was the lead paragraph:

“Speaking two languages rather than just one has obvious practical benefits in an increasingly globalized world. But in recent years, scientists have begun to show that the advantages of bilingualism are even more fundamental than being able to converse with a wider range of people. Being bilingual, it turns out, makes you smarter. It can have a profound effect on your brain, improving cognitive skills not related to language and even shielding against dementia in old age.”

That got me to wondering if my lifelong ineptitude at bilingualism means, conversely, that my cognitive skills are actually deficient. That’s a depressing thought, given that even though I took French from junior high through my freshman year in college, the best I can do now is pick out the odd word here and there. When I overhear Gallic conversations on the streets of DC, they translate in my head as roughly “Blah blah blah TODAY. Something something RED something NOW. Yadda yadda yadda MARCEL MARCEAU.”

Not that I was anywhere close to fluent even back in the day. Or back in the jour, as I like to say. At the height of my “powers” (and no, I don’t know the French word for that), I could conjugate a few major verbs and utter a smattering of complete sentences—in the present tense, anyway. Even there, I often mixed up masculine and feminine words. My pronunciation always was atrocious, and I could read at perhaps the comic book level. (Not even the graphic novel level, though graphic novels weren’t yet a thing—a chose!—back then.)

For the life of me, I don’t know how I ever passed a French course. I do have a recollection of linguistic binging and purging—stockpiling French words and phrases for tests, spewing them onto the page at exam time, then walking away empty and spent. Is it possible that I simply retook Beginner’s French every time? It’s a mystere.

What I’m saying is, bilingualism hasn’t even really been a “use it or lose it” thing for me. It’s more that I never had it in the first place. True, I seldom even hear French in my daily life, let alone have occasion to speak it. The point, though, is that I had precious little aptitude for the language even when I was regularly hitting the books. Hitting the … romans? No, wait, that means “novels.” Hitting the … livres! (OK, busted—I just Googled that.)

Lynn took French in school, too, and she’s about as fluent as I am. We joke that we could reside in France for the rest of our lives, hearing nothing but French spoken every day, yet still live to a ripe old age as clueless monolinguals—eking out an existence by pointing pitifully to eau and pain and prevailing upon the kindness of etrangers. Except that it’s not quite a joke. At least in my case. Lynn might eventually pick up the language. I’m pretty sure I wouldn’t, though, even if an entire village were to adopt me as its community service-project idiot.

I’m always amazed by stories of non-English speakers who became fluent in our exceedingly difficult tongue simply by watching American TV. I can envision myself sitting intently through entire seasons of old French sitcoms, yet picking up only the nonverbal stuff—like the Romanian fan of Happy Days who arrives at long last in the city of Milwaukee and longs for someone to ape the Fonz’s “Aaaaay!” because everything else sounds like gibberish.

Several years ago, suckered by the propaganda one hears that Spanish is easy to learn, Lynn and I enrolled in a night class offered by our county government. For several weeks, we drove on a weeknight to a middle school for instruction from a Castro-hating Cuban matron who thought the best way to teach complete novices her native language was via total immersion. Frankly, that lady was insane. But anyway, ask me what Spanish I remember from those weeks. I’ll tell you: Nada. And I didn’t even learn that word there.

Here’s another passage from the New York Times op-ed piece:

“The collective evidence from a number of studies suggests that the bilingual experience improves the brain’s so-called executive function—a command system that directs the attention processes that we use for planning, solving problems and performing various other mentally demanding tasks. These processes include ignoring distractions to stay focused, switching attention willfully from one thing to another and holding information in mind—like remembering a sequence of directions while driving.”

Great! I’m easily distracted, and I remember pretty much nothing that I don’t write down. In fact, I’ve missed more than one meeting at work after blocking out the time on a desk calendar that I then neglected to consult. So, I’ve got to figure that if I weren’t moron-lingual, I’d in many ways be in much better shape. Talk about that “executive function”! What my brain’s got is more like untrained administrative assistant function. (No offense to secretaries and other office aides.)

This, finally, was the New York Times piece’s penultimate paragraph:

“Bilingualism’s effects also extend into the twilight years. In a recent study of 44 elderly Spanish-English bilinguals, scientists led by the neuropsychologist Tamar Gollan of the University of California, San Diego, found that individuals with a higher degree of bilingualism—measured through a comparative evaluation of proficiency in each language—were more resistant than others to the onset of dementia and other symptoms of Alzheimer’s disease: the higher the degree of bilingualism, the later the age of onset.”

Again, I’m wondering if my complete failure in the area of bilingualism conversely portends a particularly dismal fate. Perhaps the fact that I couldn’t beat a French toddler in a debate means I’m due any day now to start losing even the tiny store of mental acuity I currently possess. This thought is very upsetting. Why, I’ll even go so far as to exclaim, “Sacrebleu!”

But damned if Wikipedia doesn’t advise that sacrebleu—described as an “old French profanity meant as a cry of surprise or anger”—no longer is in widespread use in the major French-speaking countries. Zut alors! Now my working vocabulary of French words may not even surpass double digits.

I guess the one positive in all this, if you want to call it that, is that given the Alzheimer’s thing, I soon may forget all about my complete inability to transcend the English language. Along with everything else.