Dumbledore: Don’t Ask, Don’t Tell


I was all set to write about J. K. Rowling’s announcement that Albus Dumbledore, headmaster of Hogwarts, was gay, but Jason Mittell over at JustTV beat me to it. Rather than reiterating his excellent post, I’ll just point you to it with this link.

Here’s a segment of the comment I left on Jason’s blog, highlighting what I see as a particularly odd aspect of the whole event:

On a structural level, it’s interesting to note that Rowling is commenting on and characterizing an absence in her text, a profound lacuna. It’s not just that Dumbledore’s queerness is there between the lines if you know to read for it (though with one stroke, JKR has ensured that future readers will do so, and probably quite convincingly!). No, his being gay is so completely offstage that it’s tantamount to not existing at all, and hence, within the terms of the text, is completely irrelevant. It’s as though she said, “By the way, during the final battle with Voldemort, Harry was wearing socks that didn’t match” or “I didn’t mention it at the time, but one of the Hogwarts restrooms has a faucet that leaked continuously throughout the events of the seven books.” Of course, the omission is far more troubling than that, because it involves the (in)visibility of a marginalized identity: it’s more as though she chose to reveal that a certain character had black skin, though she never thought to mention it before. While the move seems on the surface to validate color-blindness, or queer-blindness, with its blithe carelessness, the ultimate message is a form of “stay hidden”; “sweep it under the rug”; and of course, “Don’t ask, don’t tell.”

We’ve got two more movies coming out, so of course it will be interesting to see how the screenwriters, directors, production designers, etc. — not to mention Michael Gambon — choose to incorporate the news about Dumbledore into the ongoing mega-experiment in cinematic visualization. My strong sense is that it will change things not at all: the filmmakers will become, if anything, scrupulously, rabidly conscientious about adapting the written material “as is.”

But I disagree, Jason, with your contention that Rowling’s statement is not canonical. Come on, she’s the only voice on earth with the power to make and unmake the Potter reality! She could tell us that the whole story happened in the head of an autistic child, à la St. Elsewhere, and we’d have to believe it, whether we liked it or not — unless of course it could be demonstrated that JKR was herself suffering from some mental impairment, a case of one law (medical) canceling out another (literary).

For better or worse, she’s the Author — and if that concept might be unraveling in the current mediascape, all the more reason that people will cling to it, a lifejacket keeping us afloat amid a stormy sea of interpretation.

One Nation Under Stephen


I felt a delicious chill as I read the news that Stephen Colbert is running for President. (He made his announcement on Tuesday’s edition of The Colbert Report, the half-hour news and interview program he hosts on Comedy Central.) Why a chill? For all that I enjoy and respect Colbert, he has always prompted in me a faint feeling of vertigo. Watching his comedy is like staring into a deep well or over the side of a tall building: you get the itchy feeling in your legs of wanting to jump, to give yourself up to gravity and the abyss, obliterating yourself and all that you hold dear. Colbert’s impersonation of a rabidly right-wing, plummily egotistical media pundit is so polished and impenetrable that it stops being a joke and moves into more uncannily undecidable territory: simulation, automaton, a doll that has come to life. Unlike Jon Stewart’s on The Daily Show, Colbert’s satire doesn’t have a target, but becomes the target, triggering a collapse of categories, an implosion, a joke that eats itself and leaves its audience less thrilled than simply unsure (cf. Colbert’s performance at the 2006 White House Correspondents Dinner, at which he mapped uneasy smiles and half-frowns across a roomful of Republican faces).

Judging from Colbert’s offstage discussion of his work, like his recent interview with Terry Gross of Fresh Air, he’s a modest, sensible, reflective guy, able to view his Report persona with wit and detachment even as he delights in using it to generate ever more extreme, Dada-like interventions in popular and political culture — his Wikipedia mischief being only one instance. My half-serious worry is that with his latest move, he’s unleashed something far bigger than he knows or can control. The decision to place himself on the 2008 Presidential ballot, even if only in South Carolina, has been received by the mainstream media primarily as another ironic turn of the comedy-imitates-reality-imitates-art cycle, noting the echo of Robin Williams’s Man of the Year (2006) and comedian Pat Paulsen’s bid for the White House in 1968. But I think the more accurate and alarming comparison might be Larry “Lonesome” Rhodes, the character played by Andy Griffith in Elia Kazan’s A Face in the Crowd (1957). In that film, Rhodes goes from being a bumpkinish caricature on a television variety show to a populist demagogue, drunk on his own power and finally revealed as a hollow shell, a moral vacuum. The unsubtle message of Kazan’s film is that TV’s pervasive influence makes it a tool for our most destructive collective tendencies — a nation of viewers whose appetite for entertainment leads them to eagerly embrace fascism.


I’d be lying — or at least being flippant — if I claimed to believe that Colbert could be another “Lonesome” Rhodes. I’m neither that cynical about our culture nor that paranoid about the power of media. But given that we live in an era when the opportunities for self-organizing social movements have multiplied profoundly through the agency of the internet, who is to say whether Colbert’s campaign comedy might mutate smoothly into something more genuine? Maybe he is, at this moment in history, the perfect protest candidate, smoother and more telegenic than Nader and Perot by orders of magnitude. He just might win South Carolina. And if that happens … what next?

Movie-a-Day: August 2007

Nineteen titles this time around, reflecting the fact that, around the third week of August, the oncoming fall semester was looming as large as the giant alien saucers that shadow the world’s capitals in Independence Day. Getting my courses up and running finally killed poor Movie-a-Day, bringing to an end the blissful ritual that structured my summer … except that Movie-a-Day isn’t dead; in fact, since September 29th, I’ve been back on the plan. Turns out I’m hooked. Hooked not simply on movies – I’ve always loved them – but on the practice of watching them daily, of unequivocally setting aside two-plus hours to lock the door, kick back in my armchair, and do nothing but watch (with occasional note-taking). With the semester underway, it’s a lot harder to find the time, and I’ve loosened the rules a bit (it’s now OK to watch movies I’ve seen before, and films screened for class count). But, like my friend and former employer Sy Safransky at The Sun Magazine, who wakes up every day at 4 a.m. to meditate and write, I’ve learned that the commitment itself – the claim one stakes on each new day – is the real treasure, and not one to be traded away lightly.

As always, I’ve starred the titles that made the greatest impression on me, positive, negative, or any other flavor. More stars than usual in August – either I meandered into some good choices, or I’m becoming easier to impress. (Not that I’m complaining: impressionability is precisely what I’m trying to cultivate.) Also a few more animated titles; recent releases; my first silent film of the summer (unless one counts Sherlock, Jr.); and films from outside the U.S.

Butch Cassidy and the Sundance Kid (George Roy Hill, 1969)
Paths of Glory (Stanley Kubrick, 1957)*
Key Largo (John Huston, 1948)*
Waitress (Adrienne Shelly, 2007)*
Ninja Scroll (Yoshiaki Kawajiri, 1993)*
Ghost in the Shell 2: Innocence (Mamoru Oshii, 2004)
The Passion of Joan of Arc (Carl Theodor Dreyer, 1928)*
A Scanner Darkly (Richard Linklater, 2006)
Fantastic Planet (René Laloux, 1973)
Avalon (Mamoru Oshii, 2001)
The Shanghai Gesture (Josef von Sternberg, 1941)*
The Holiday (Nancy Meyers, 2006)
Wicked City (Yoshiaki Kawajiri, 1987)
Sink or Swim (Su Friedrich, 1990)*
A Face in the Crowd (Elia Kazan, 1957)*
One Missed Call (Takashi Miike, 2004)
Medium Cool (Haskell Wexler, 1969)*
Soylent Green (Richard Fleischer, 1973)
Children of Men (Alfonso Cuarón, 2006)*

Better, Stronger, Faster (TM)


Spoiler Alert!

I’ll let you in on a little secret regarding the new NBC series Bionic Woman: they’re all bionic on that show, every last one of them. Sure, the premise centers primarily on one technologically augmented body, that of Jaime Sommers (Michelle Ryan), a bartender injured so severely in a car crash that her boyfriend — an imaginative combination of your standard TV hot guy and your standard mad scientist; think McBrainy — promptly replaces both of Jaime’s legs, one arm, one eye, and one ear, with $50 million worth of bionic machinery, making her about 65% superhuman. The show, a remake or, I suppose, reboot of the 1976-1978 series that starred Lindsay Wagner in the title role, does go one step further by providing a nemesis/doppelganger in the form of Sarah Corvus (Katee Sackhoff), a previous experiment in bionic integration who, either through bad character or having been “hacked,” has become a murderous tormenter of the nameless paragovernmental organization where Jaime now works. (Corvus is also sultry and talks like a film noir femme fatale, but it’s unclear to what degree these traits preceded her upgrade.)

But the truth, as I said before, is that everyone on the show is bionic, from Jaime’s boss Jonas Bledsoe (Miguel Ferrer) to her little sister Becca (Lucy Hale) to the extras that populate the backgrounds. This greater degree of bionicization reflects the enormous strides that have occurred in the field since the late 1970s; see Eric Freedman’s excellent Flow article for a recap. Nowadays, instead of simply tacking on a robotic limb or improved sensory organ here and there, bodies can be implanted with structuring quantities of generic and intertextual material, resulting in characters whose every look, gesture, and word of dialogue issues from another source. The cast of Bionic Woman has literally been stitched together from other TV shows, movies, and comic books — reconstituted like chicken beaks and hog parts into shiny pink hot dogs, repurposed like ground-up car tires into bouncy playground equipment.

And it doesn’t stop there. Internal memoranda leaked to me from showrunner David Eick’s office reveal the deeper layers of bionicization that make up the new series. The settings, while profilmically real enough in their own right, were all first used on other shows — as were the scripts, storylines, character arcs, action setpieces, and cliffhangers. In actuality, Bionic Woman is a postmodern breakthrough, the cutting edge in mashup culture. It exists purely as a composite of borrowed and recycled material, a house mix of C-level storytelling retrofitted from the efflux of the SciFi Channel, USA, and Spike, chopped and reedited into apparently new creative “product.”

My sources inform me that, while the pilot episode was assembled under human supervision, the duty of outputting new weekly 42-minute swatches of text has now been handed over entirely to a computerized system which uses sophisticated pattern recognition to dovetail one unrelated shot or scene to another in semi-plausible continuity. (A related algorithm keeps the editing lightning-fast, ensuring that any mismatches will flash by unnoticed.) There are still a few glitches in the system, evidenced by the second episode’s kludgy splice of two fundamentally incompatible stereotypes into one character (the teenage Becca): as those of us whose organic bodies passed through adolescence know, no one who gets in trouble for smoking pot in the dressing room could simultaneously burn with the dream of performing in her high-school talent show’s number from Annie Get Your Gun. It’s not simply a logical impossibility, but a paradoxical, reality-ripping snarl in the fabric of fictive spacetime. NBC troubleshooters have traced the problem to a dunderheaded subroutine that mistakenly blended Rory Gilmore (Alexis Bledel) from Gilmore Girls with Angela Chase (Claire Danes) from My So-Called Life. The techs say it shouldn’t happen again, but aren’t making any promises.

In the meantime, Bionic Woman will continue to unspool, following its own logic of recombination in blissful automaticity. I find the show more watchable than just about any of the other new offerings of the season, except for Kitchen Nightmares, which quaintly and cannily embeds at least one real person within its own bionic grammar — kind of an inside-out TV cyborg. Certainly Bionic Woman passes the test that Chuck didn’t or couldn’t, drawing me back for a second look. I encourage you to check out Bionic Woman, especially if you’re a fan, as I am, of the sorts of mesmerizingly random patterns that emerge from nonlinear, chaotic flow, as in lava lamps and screen savers.

Beep … Beep … Beep …


The Soviet satellite Sputnik, launched fifty years ago today, is stitched into my family history in an odd way. A faded Polaroid photograph from that year, 1957, shows my older siblings gathered in the living room in my family’s old house. The brothers and sisters I would come to know as noisily lumbering teenage creatures who alternately vied for my attention and pounded me into the ground are, in the image, blond toddlers messing around with toys. There also happens to be a newspaper in the frame. On its front page is the announcement of Sputnik’s launch.

Whatever the oblique and contingent quality of this captured moment — one time-stopping medium (newsprint) preserved within another (the photograph) — I’ve always been struck by how it layers together so many kinds of lost realities, realities whose nature and content I dwell upon even though, or because, I never knew them personally. Sputnik’s rhythmically beeping trajectory through orbital space echoes another, more idiomatic “outer space,” the house where my family lived in Ann Arbor before I was born (in the early 1960s, my parents moved across town to a new location, the one that I would eventually come to know as home). These spaces are not simply lost to me, but denied to me, because they existed before I was born.

Which is OK. Several billion years fall into that category, and I don’t resent them for predating me, any more than I pity the billions to come that will never have the pleasure of hosting my existence. (I will admit that the only time I’ve really felt the oceanic impact of my own inevitable death was when I realized how many movies [namely all of them] I won’t get to see after I die.) If I’m envious of anything about that family in the picture from fall 1957, it’s that they got to be part of all the conversations and headlines and newspaper commentaries and jokes and TV references and whatnot — the ceaseless susurration of humanity’s collective processing — that accompanied the little beeping Russian ball as it sliced across the sky.

As a fan of the U.S. space program, I didn’t think I really cared that much about Sputnik until I caught a story from NPR on today’s Morning Edition, which profiled the satellite’s designer, Sergei Korolev. One of Korolev’s contemporaries, Boris Chertok, relates how Sputnik’s shape “was meant to capture people’s imagination by symbolizing a celestial body.” It was the first time, to be honest, I’d thought about satellites being designed as opposed to engineered — shaped by forces of fashion and signification rather than the exigencies of physics, chemistry, and ballistics. One of the reasons I’ve always liked probes and satellites such as the Surveyor moon probe, the Viking Martian explorer, the classic Voyager, and my personal favorite, the Lunar Orbiter 1 (pictured here), is that their look seemed entirely dictated by function.


Free of extras like tailfins and raccoon tails, flashing lights and corporate logos, our loyal emissaries to remoteness like the Mariner or Galileo satellites possessed their own oblivious style, made up of solar panels and jutting antennae, battery packs and circuit boxes, the mouths of reaction-control thrusters and the rotating faces of telemetry dishes. Even the vehicles built for human occupancy — Mercury, Gemini, and Apollo capsules — I found beautiful, or in the case of Skylab or the Apollo missions’ Lunar Module, beautifully ugly, with their crinkled reflective gold foil, insectoid angles, and crustacean asymmetries. My reverence for these spacefaring robots wasn’t limited to NASA’s work, either: when the Apollo-Soyuz docking took place in 1975 (I was nine years old then; the docking and the Sputnik launch bracketed my 1966 birth at equal distances), it was like two creatures from the deep sea getting it on — literally bumping uglies.


So the notion that Sputnik’s shape was supposed to suggest something, “symbolizing a celestial body,” took me at first by surprise. But I quickly came to embrace the idea. After all, the many fictional space- and starships that have obsessed me from childhood — the Valley Forge in Silent Running, the rebel X-Wings and Millennium Falcon from Star Wars, the mothership from Close Encounters of the Third Kind, the Eagles from Space: 1999, and of course the U.S.S. Enterprise from Star Trek — are, to a one, the product of artistic over technical sensibilities, no matter how the modelmakers might have kit-bashed them into verisimilitude. And if Sputnik’s silver globe and backswept antennae carried something of the ’50s zeitgeist about them, that’s but a minuscule reflection of the satellite’s much larger, indeed global, signification: the galvanizing first move in a game of orbital chess, the pistol shot that started the space race, the announcement — through the unbearably lovely, essentially passive gesture of free fall, a metal ball dropping endlessly toward an earth that swept itself smoothly out of the way — that the skies were now open for warfare, or business, or play, as humankind chooses.

Happy birthday, Sputnik!



I like the tired, almost fatalistic tone of Charles Herold’s New York Times review of Halo 3; it’s an unusually self-reflexive piece of videogame criticism. “It doesn’t really matter what reviewers say,” Herold writes with more than a hint of a cynical sigh. “Halo 3 is not just a game: it is a phenomenon fueled by obsessed fans, slick advertising and excessive press coverage (of which I find myself a part).”

Wisely, Herold approaches this newest version of Bungie’s blockbuster series less as a chapter or sequel than as an upgrade. The gist of his review — a line repeated twice, like a mantra — is that “Halo 3 is Halo 2 with somewhat better graphics.” The game’s strengths, he asserts, are in its enhancements to the multiplayer experience, an experience that I consider indistinguishable from the “obsessed fans” and “excessive press coverage” that Herold cites. That is to say, Halo 3 is as much a social game, in its way, as World of Warcraft or Second Life.

Before fans of those elaborate MMORPGs object, let me stipulate that Halo and other shooters involve very different aesthetics of play and intellectual engagement. Movement and communication in deathmatch are channeled and intensified by tactical exigence; interactions are of necessity fast and brutal, and the only economies one deals in are those of weapons and ammo. Foundational dynamics of avatarial immersion and what I have elsewhere called “spectatorial play” are, of course, present in Halo, as they are in any other videogame. But the MMORPG and the FPS deathmatch remain two distinct branches of ludic evolution.

What’s interesting to me about Halo 3‘s huge, immediate, predictable success is that it casts into sharp relief a vast preexisting social base of gamers who sit ready with their Xbox 360s to spend hours, days, weeks, and months hunting and blasting each other in the virtual arenas provided by this latest upgrade: a package of new spatialities to explore and master. This base is as loyal as the most devout religious faith, the most engaged political party. (Indeed, I suspect that today’s online gaming audiences, which merge the pragmatics of commercial technology with the mysticism of avatarial transubstantiation, will be looked back upon by future historians as the first true hybridizations of the secular and religious communities.)

It’s too easy to say that Halo differs from something like World of Warcraft in its bloodiness and speed, its apparent simplicity (or primitiveness). I think the more profound distinction lies in the fact that Halo has colonized social spaces beyond those of the MMORPG, something that became clear to me a couple of years ago when I taught a course in videogame ethnography at Indiana University. Many of my students played only Halo and sports games like Madden; these players would never go near EverQuest, for example, because only “hardcore” gamers — the real geeks — played that. Halo, in other words, succeeds as a game because it has gone mainstream, become something one can mention without embarrassment. It nestles much more closely and comfortably into the crenellations and capillaries of real-world social dynamics; it is, in this sense, the norm.