Conventional Wisdom

Ooooh, the next two weeks have me tingling with anticipation: it’s time again for the Democratic National Convention and its bearded-Spock alternate-universe doppelganger, the Republican National Convention. I intend to watch from my cushy couch throne, which magisterially oversees a widescreen high-def window into the mass ornament of our country’s competing electoral carnivals.

Strangely, the Olympics didn’t hold me at all (beyond the short-lived controversy of their shameless simulationism), even though they served up night after night of HD spectacle. It wasn’t until I drove into the city last week to take in a Phillies game that I realized how hungry I am to immerse myself in that weird, disembodied space of the arena, where folks to the right and left of you are real enough, but rapidly fall away into a brightly-colored pointillist ocean, a rasterized mosaic that is, simply, the crowd, banked in rows that rise to the skyline, a bowl of enthusiastic spectatorial specks training their collective gaze on each other as well as inward on a central proscenium of action. At the baseball game I was in a state of happy distraction, dividing my attention among the actual business of balls, strikes, and runs; the fireworky HUDs of jumbotrons, scoreboards, and advertising banners, some of which were static billboards and others smartly marching graphics; the giant kielbasa (or “Bull Dog”) smothered with horseradish and barbecue sauce clutched in my left hand, while in my right rested a cold bottle of beer; and people, people everywhere, filling the horizon. I leaned over to my wife and said, “This is better than HD — but just barely.”

Our warring political parties’ conventions are another matter. I don’t want to be anywhere near Denver or Minneapolis/St. Paul in any physical, embodied sense. I just want to be there as a set of eyes and ears, embedded amid the speechmakers and flagwavers through orbital crosscurrents of satellite-bounced and fiber-optics-delivered information flow. I’ll watch every second, and what I don’t watch I’ll DVR, and what I don’t DVR I’ll collect later through the discursive lint filters of commentary on NPR, CNN, MSNBC, and of course Comedy Central.

The main pleasure in my virtual presence, though, will be jumping around from place to place inside the convention centers. I remember when this joyous phenomenon first hit me. It was in 1996, when Bill Clinton was running against Bob Dole, and my TV/remote setup was several iterations of Moore’s Law more primitive than what I wield now. Still, I had the major network feeds and public broadcasting, and as I flicked among CBS, NBC, ABC, and PBS (while the radio piped All Things Considered into the background), I experienced, for the first time, teleportation. Depending on which camera I was looking through, which microphone I was listening through, my virtual position jumped from point to point, now rubbing shoulders with the audience, now up on stage with the speaker, now at the back of the hall with some talking head blocking my view of the space far in the distance where I’d been an instant previously. It was not the same as Classical Hollywood bouncing me around inside a space through careful continuity editing; nor was it like sitting in front of a bank of monitors, like a mall security guard or the Architect in The Matrix Reloaded. No, this was multilocation, teletravel, a technological hopscotch in increments of a dozen, a hundred feet. I can’t wait to find out what all this will be like in the media environment of 2008.

As for the politics of it all, I’m sure I’ll be moved around just as readily by the flow of rhetoric and analysis, working an entirely different (though no less deterministic) register of ideological positioning. Film theory teaches us that perceptual pleasure, so closely allied with perceptual power, starts with the optical and aural — in a word, the graphic — and proceeds downward and outward from there, iceberg-like, into the deepest layers of self-recognition and subjectivity. I’ll work through all of that eventually — at least by November 4! In the meantime, though, the TV is warming up. And the kielbasa’s going on the grill.

Technologies of Disappearance

My title is a lift from Alan N. Shapiro’s interesting and frustrating book on Star Trek as hyperreality, but what motivates me to write today are three items bobbing around in the news: two from the world of global image culture, the other from the world of science and technology.

Like Dan North, who blogs smartly on special effects and other cool things at Spectacular Attractions, I’m not deeply into the Olympics (either as spectator or commentator), but my attention was caught by news of what took place at last week’s opening ceremonies in Beijing. In the first case, Lin Miaoke, a little girl who sang the revolutionary hymn “Ode to the Motherland,” was, it turns out, lip-synching to the voice of another child, Yang Peiyi, who was found by the Communist Party politburo to be insufficiently attractive for broadcast. And in the second case, a digitally-assisted shot of fireworks exploding in the nighttime sky was used in place of the evidently less-impressive real thing.

To expound on the Baudrillardian intricacies at play here hardly seems necessary: the two incidents were tied together instantly by the world press and packaged in headlines like “Fakery at the Olympics.” As often happens, the Mass Media Mind — churning blindly away like something from John Searle’s Chinese room thought experiment — has stumbled upon a rhetorical algorithm that tidily condenses several discourses: our simultaneous awe and dread of the powers of technological simulation; the sense that the Olympics embodies an omnivorous spectacularity threatening to consume and amplify beyond recognition all that is homely and human in scale; and good ol’ fashioned Orientalism, here resurrected as suspicion of the Chinese government’s tendency toward manipulation and disguise. (Another “happy” metaphorical alignment: the visibility-cloaking smog over Beijing, so ironically photogenic as a contrast to the crisp and colorful mass ornament of the crowded, beflagged arenas.)

If anything, this image-bite of twinned acts of deception functions, itself, as another and trickier device of substitution. Judging the chicanery, we move within what Adorno called the closed circle of ideology, smugly wielding criticism while failing to escape the orbit of readymade meanings to question more fundamental issues at stake. We enjoy, that is, our own sense of scandal, thinking it premised on a sure grasp of what is true and indexical — the real singer, the unaltered skies — and thus reinscribe a belief that the world can be easily sorted into what is real and what is fake.

Of course it’s all mediated, fake and real at the same time, calibrated as cunningly as the Olympics themselves. Real bodies on bright display in extremes of exertion unimaginable by this couch potato: the images on my high-def screen have rarely been so viscerally indexical in import, every grimace and bead of sweat a profane counterpoint to sacred ballistics of muscled motion. But I fool myself if I believe that the reality of the event is being delivered to me whole. Catching glimpses of the ongoing games as I shuffle through surrounding channels of televisual flow is like seeing a city in flickers from a speeding train: the experience julienned by commercials and camera cuts, embroidered by thickets of helpful HUD graphics and advertisers’ eager logos. Submerged along another axis entirely is the vanished reality of the athletes’ training: eternities of drilling and repetition, an endless dull disciplining at profound odds with the compacted, adrenalized, all-or-nothing showstoppers of physical prowess.

Maybe the collective fascination of the lip-synching stems from our uncomfortable awareness that we’re engaged in a vicarious kind of performance theft, sitting back and dining on the visual feast of borrowed bodily labor. And maybe the sick appeal of the CG fireworks is our guilty knowledge that human beings are functioning as special effects themselves, there to elicit oohs and ahs. All I know is that the defense offered up by the guy I heard on BBC World News this morning seems to radically miss the point. Madonna and Britney lip-synch, he said: why is this any different? As for the digital fireworks, did we really expect helicopters to fly close to the airborne pyrotechnics? The cynicism of the first position, that talent is always a manufactured artifact, is matched by the blasé assumption of the second, permuting what we might call the logic of the stuntman: if an exploit is too dangerous for a lead actor to perform, sub in a body worth a little less. In the old days, filmmakers did it with people whose names appeared only in the end credits (and then not among the cast). Nowadays, filmmakers hand the risk over to technological stand-ins. In either case, visualization has trumped representation, the map preceding the territory.

But I see I’ve fallen into the trap I outlined earlier, dressing up in windy simulationist rhetoric a more basic dismay. Simply put, I’m sad to think of Yang Peiyi’s rejection as unready for global prime time, based on a chubby face and some crooked teeth (features, let me add, now unspooling freely across the world’s screens — anyone else wondering how she’ll feel at age twenty about having been enshrined as the Ugly Duckling?). Prepping my Intro to Film course for the fall, I thought about showing Singin’ in the Rain — beneath its happy musical face a parable of insult in which pretty but untalented people hijack the vocal performances of skilled but unglamorous backstage workers. Hey, I was kind of a chubby-faced, snaggle-toothed kid too, but at least I got to sing my own part (Frank Butler) in Annie Get Your Gun.

In other disappearance news: scientists are on their way to developing invisibility. Of this I have little to say, except that I’m relieved the news is getting covered at all. There’s more than one kind of disappearance, and if attention to events at Berkeley and Beijing is reassuring in any measure, it’s in the “making visible” of cosmetic technologies that, in their amnesiac emissions and omissions, would otherwise sand off the rough, unpretty edges of the world.

Singing Along with Dr. Horrible

Duration, when it comes to media, is a funny thing. Dr. Horrible’s Sing-Along Blog, the web-distributed musical (official site here; Wiki here), runs a tad over 42 minutes in all, or about the length of an average hour-long block of TV entertainment with commercials properly interspersed. But my actual experience of it was trisected into chunks of approximately 15 minutes, for like your standard block of TV programming (at least in the advertising-supported format favored in the U.S.), Dr. Horrible is subdivided into acts, an exigence which shapes the ebb and flow of its dramatic humours while doing double service as a natural place to pause and reflect on what you’ve seen — or to cut yourself another slice of ice-cream cake left over from your Dairy-Queen-loving relatives’ visit.

That last would be a blatantly frivolous digression, except in this key sense: working my way through the three acts of Dr. Horrible was much like consuming thick slices of icy sweetness: each individual slab almost sickeningly creamy and indulgent, yet laced throughout with a tantalizingly bitter layer of crisp chocolate waferage. Like the cake, each segment of the show left me a little swoony, even nauseated, aesthetic sugar cascading through my affective relays. After each gap, however, I found myself hungry for more. Now, in the wake of the total experience, I find myself contemplating (alongside the concentrated coolness of the show itself) the changing nature of TV in a digital moment in which the forces of media evolution — and more properly convergence — have begun to produce fascinating cryptids: crossbred entities in which the parent influences, harmoniously combined though they might be, remain distinct. Sweet cream, bitter fudge: before everything melts together to become the soupy unremarkable norm, a few observations.

Ultimately, it took me more than two weeks to finish Dr. Horrible. I watched the first two acts over two nights with my wife, then finished up on my own late last week. (For her part, Katie was content to have the ending spoiled by an online forum she frequents: a modern Cliffs Notes for media-surfers in a hurry to catch the next wave.) So another durative axis enters the picture — the runtime of idiosyncratic viewing schedules interlacing the runtime of actual content, further macerated by multiple pausings and rewindings of the iPod Touch that was the primary platform, the postage-stamp proscenium, for my download’s unspooling. Superstring theorists think they have things hard with their 10, 11, or 26 dimensions!

As such, Horrible’s cup of video sherbet was the perfect palate-cleanser between rounds of my other summer viewing mission — all five seasons of The Wire. I’m racing to get that series watched before the school year (another arbitrary temporal framework) resumes in three weeks; enough of my students are Wireheads that I want to be able to join in their conversations, or at least not have to fake my knowing nods or shush the conversation before endings can be ruined. On that note, two advisories about the suspense of watching The Wire. First, be careful on IMDb. Hunting down members of the exceptionally large and splendid cast risks exposing you to their characters’ lifespans: finding out that such-and-such exits the series after 10, 11, or 26 episodes is a pretty sure clue as to when they’ll take a bullet in the head or OD on a needle. Second, and relatedly, it’s not lost on this lily-white softy of an academic that I would not last two frickin’ seconds on the streets of Baltimore — fighting on either side of the drug wars.

Back to Dr. Horrible. Though other creators hold a somewhat higher place in my Pantheon of Showrunners (Ronald D. Moore with Battlestar Galactica, Matt Weiner with Mad Men, and above them all, of course, Trek’s Great Bird of the Galaxy, Gene Roddenberry), Joss Whedon gets mad props for everything from Buffy the Vampire Slayer to Firefly/Serenity and for fighting his way Dante Alighieri-like through the development hell of Alien Resurrection. I was only so-so about the turn toward musical comedy Whedon demonstrated in “Once More, with Feeling,” the BtVS episode in which a spell forced everyone to sing their parts; I always preferred Buffy when the beating of its heart of darkness drowned out its antic, cuddly lullabies.

But Dr. Horrible, in a parallel but separate universe of its own, is free to mix its ugliness and frills in a fresh ratio, and the (re)combination of pathos and hummable tunes works just fine for me. Something of an inversion of High School Musical, Dr. Horrible is one for all the kids who didn’t grow up pretty and popular. Moreover, its rather lonesome confidence in superweaponry and cave lairs suggests a masculine sensibility: Goth Guy rather than Gossip Girl. Its characters are presented as grownups, but they’re teenagers at the core, and the genius is in the indeterminacy of their true identities; think of Superman wearing both his blue tights and Clark Kent’s blue business suit and still not winning Lois Lane’s heart. My own preteen crush on Sabrina Duncan (Kate Jackson) of Charlie’s Angels notwithstanding, I first fell truly in love in high school, and it’s gratifying to see Dr. Horrible follow the arc of unrequited love, with laser-guided precision, to its accurate, apocalyptically heartbreaking conclusion.

What of the show as a media object, which is to say, a packet-switched quantum of graphic data in which culture and technology mingle undecidably like wave and particle? NPR hailed it as the first genuine flowering of TV in a digital milieu, and perhaps they’re right; the show looks and acts like an episode of something larger, yet it’s sui generis, a serial devoid of seriality. It may remain a kind of mule, giving rise to nothing beyond the incident of itself, or it may reproduce wildly within the corporate cradle of Whedon’s Mutant Enemy and in the slower, rhizomatic breeding beds of fanfic and fanvids. It’s exciting to envision a coming world in which garage- and basement-based production studios generate in plenty their own Dr. Horribles for grassroots dissemination; among the villains who make up the Evil League of Evil, foot-to-hoof with Bad Horse, there must surely stand an Auteur of Doom or two.

In the mise-en-abyme of digital networks, long tails, and the endlessly generative matrix of comic books and musical comedy, perhaps we will all one day turn out to be mad scientists, conquering the world only to find we have sacrificed the happy dreams that started it all.

No Stopping the Terminator’s Serial Return

I’m enjoying Terminator: The Sarah Connor Chronicles in the same three-quarters, semi-committed way I enjoyed Star Trek: Voyager (the other show I’m catching up on via iPod during the writers’ strike): it ain’t Shakespeare, but it’ll do. The FOX network’s new program reinvents the venerable Terminator franchise by serializing it, retrofitting the premise into yet another of television’s narrative deli slicers. Now, instead of giant cinematic slabs of Termination every decade or so (The Terminator in 1984; Terminator 2: Judgment Day in 1991; Terminator 3: Rise of the Machines in 2003), we get wafer-thin weekly slices, peeled from the block of a setup almost mathematically pristine in its triangulation of force and counterforce: Sarah Connor, mother of a future leader of a rebellion against sentient, world-ending AI; John Connor, her teenaged son who will eventually grow into that man; and an endless procession of Terminators, silvery machine skeletons cloaked in human flesh, whose variations on the theme of lethally-unstoppable-yet-childishly-innocent are as simultaneously charming, pathetic, and horrifying as Boris Karloff’s Frankenstein monster.

The fact that these cyborgs are sent back in time to our present day to menace the Connors a priori is what lends the story its relentless urgency — it’s an almost perfect chase structure — as well as allowing it to encompass nearly any conceivable narrative permutation. Science fiction’s most productive conceits (at least in terms of storytelling), time travel and its even zanier offshoot, parallel universes, grant drama a toolkit of feints, substitutions, and surprises otherwise acceptable only in avant-garde experimentation (the cryptic as art) and tales of pure subjectivity (it was all a dream). When characters are bopping back and forth along the timespace continuum, in other words, it’s possible to stretch continuity to the breaking point, risking the storyworld’s implosion into absurdity, only to save it at the last minute by revealing each seeming reversal of cause and effect to be part of a larger logic of temporal shenanigans.

Hence the concept of the narrative reset button — a staple of Star Trek’s many dips into the time-travel well — and the freedom of the Chronicles to recast its leads in a trendy, demographic-friendly makeover. Lena Headey takes over the role of Sarah from the movies’ Linda Hamilton; John, played in T2 by Edward Furlong and in T3 by Nick Stahl, is played here by Thomas Dekker, Claire’s nerdy videographer friend in the first season of Heroes. It kind of all makes sense if you squint, turn your head sideways, and tell yourself that maybe this is all some parallel reality splintered off from the one James Cameron created (as indeed it is, industrially). More galvanizing is the recasting of the key Terminator — a “good” one, we presume, though its origin and nature are among the series’ close-kept secrets — as a beautiful young woman, approximately John’s age.

As Cameron (get it?), Summer Glau plays the Terminator with a sort of detached glaze; think Zooey Deschanel with the power to bend steel and survive an eight-story fall. Though her ability to convincingly mimic human social behavior fluctuates alarmingly, the character is great, and her presence at one node of the Connor triangle remaps gender and sexual relationships in a way that is both jarring and absolutely plausible. In T2 the “good” Terminator (played by Arnold Schwarzenegger) had the square, inert reliability of John Wayne’s taxidermied corpse, and the absence of romantic chemistry between him and Hamilton’s Sarah seemed natural. (If he was a Dad-figure, it was a ’50s sitcom Dad — patriarchal but neutered.) Things are very different between Cameron and John, at least in subtext, and for that matter between Cameron and Sarah. If only because it seems such a ripe matrix for fannish invention, Chronicles marks the first time in a while I’ve been curious to seek out some slash.

As far as plotting goes, the new series seems primed to run for a while, if it can find its audience. The time-travel motif has already enabled our three protagonists to fast-forward from 1999 to 2007, introducing a fun fish-out-of-water counterpoint to the predictable (but very satisfying) setpieces of superpowerful beings throwing each other through walls, firing automatic weapons into each other, and getting mowed down by SUVs. I’m sure if things get dull, or when Sweeps loom, we’ll be treated to glimpses of the future (when the human-Skynet war is at its visual-FX-budget-busting peak) or the distant past (anyone for a Civil-War-era Terminator hunt?).

Overall I’m pleased with how gracefully this franchise has found a fresh point of generation for new content — how felicitously the premise has fitted itself to the domain of serial TV, with its unique grammar of cliffhangers, season-arcs, and long-simmering mysteries of origins, destinations, and desires. If they last, the Chronicles promise to be far more rewarding than the projected live-action Star Wars series (an experience I expect to be like having my soul strip-mined in between commercial interludes). Notwithstanding the cosmic expense of the second and third feature films, there’s always been something visceral and pleasingly cheap about the Terminator fantasy, remnant of its shoestring-budget 1984 origins; Terminator’s simplified structure of feeling seems appropriate to the spiritual dimensions of televised SF. Like those robots that keep coming back to dog our steps, the franchise has found an ideal way to insinuate itself into the timeline of our weekly viewing.

American Idolatry

It’s back, somehow seeming simultaneously like manna from heaven and a plague of fire descending to destroy us all.

My wife and I readied ourselves for last night’s premiere of American Idol’s seventh season by cooking a delicious dinner and timing it so that we sat down to eat just as Ryan Seacrest appeared onscreen, accompanied by the chords of that theme song — so much like the revving of an enormous engine. That this engine has churned to life six times previously does not at all diminish its impact: on the contrary, its sweet electronic throb was a song of salvation, delivering us from our long trek through the dramaless desert of the writers’ strike.

OK, maybe that’s a bit strong: the last several months have certainly not been devoid of drama, whether in the kooky kaleidoscope of the presidential race or (ironically, tautologically) the writers’ strike itself, which has provided us all an opportunity to catch up on our viewing backlog as well as to reflect on what it means to have writers writing television at all. (Random observation: since new commercials keep coming out, does this mean that the creative content of advertising and marketing isn’t considered writing?) And as always, the concentric circles of collective fascination in the United States, with TV at the center of the conceptual bullseye, stop well short of encompassing the large and very real dramas experienced by the rest of the world; in other words, we should take a moment to remember that not everybody on Earth cares so passionately about what Simon, Paula, and Randy think, or about who will succeed Jordin Sparks.

For my wife and me, the excitement was multiplied by the fact that Idol’s first round of auditions took place in Philadelphia, the city we now live a shocking 15 minutes away from. As the camera panned across the familiar skyline, it was hard not to succumb to Idol’s implicit ideological address: I see you! Louis Althusser defined interpellation as the moment when, as you walk down a street, a cop calls out “Hey you!” and you turn around, believing yourself (mistakenly) to be the subject of the law’s address. For me, it’s all summed up in Ryan Seacrest’s direct gaze into the camera.

And then, of course, there are the crowds of hopefuls. Finally I see the point of high-def TV; the mass ornament of naked ambition, in all its variegated poppy-field glory, never looked so good as when rendered in HD. And at the other pole of specular mastery, the bad hair, flop sweat, and glazed eyeballs of the doomed auditioners have never seemed more squirm-inducingly intimate. Yes, the opening rounds of the new Idol promise to be as relentlessly mean as the previous seasons; nowhere is the long take used with more calculated cruelty than in the expertly extended coverage of contestants’ caterwauling, or the deadtime of waiting for the judges to drop the ax. Indeed, part of the pleasure of the coming months is knowing that out of this primordial, cannibalistic muck, we will ritualistically pull ourselves onto the shores of civilization, narrowing the field to the truly “worthy” singers, casting out the misfits until we arrive at the golden calf, er, One True Talent. On American Idol, cultural ontogeny recapitulates phylogeny; we replay, each season, the millennial-scale process by which our species learned to stop eating each other and got around to more important things, like popularity contests. (Come to think of it, Idol is also a lot like high school.)

The other nice thing about Idol is that there’s so freaking much of it; the two hours shown last night were but a skimming of the hundreds of hours of footage shot by the production. Assuming the brand continues its juggernaut profitability (and hey, now that TV is all reality n’ reruns, what’s to stop it?), we may someday see an entire channel devoted to nothing but Idol. That said, it was with something of a pop-culture hangover that I awoke this morning and realized that tonight brings yet more auditions, this time from Dallas.

My wife and I will be right there/here, watching. Will you?

Going with the Flow

FlowTV’s new issue is out (or, given its online nature, up): a special edition on Battlestar Galactica, guest-edited by Lynne Joyrich and Julie Levin Russo with the help of FlowTV’s editorial liaison Jean Anne Lauer. My own contribution, “Downloads, Copies, and Reboots: Battlestar Galactica and the Changing Terms of TV Genre,” uses Galactica’s storied evolution — its many iterations and reinventions — as a springboard for thinking about how industrial replication structures TV as well as ways of talking about TV: in particular, the emergence of terms like reboot and showrunner, which seem to me laden with implications about how TV is being reconfigured in the popular (and industrial) imaginary.

Here’s an excerpt:

Ronald D. Moore’s Battlestar Galactica is, of course, a remake or — his preferred term — “reimagining” of Glen A. Larson’s Battlestar Galactica, which ran from 1978 to 1979. Even in that first, Carter-era incarnation, the show occupied an undecidable space between copy and original; it was judged by many, including George Lucas and 20th Century Fox, to be a bald steal of Star Wars (1977). (Evidence of thievery was not merely textual; two of Lucas’s key behind-the-scenes talents, conceptual artist Ralph McQuarrie and visual-effects guru John Dykstra, defected to the Galactica team.) And its first cancellation by ABC was followed by the much-loathed “relaunch,” Galactica 1980, which ran just ten episodes before dying on the Nielsen vine.

The irony is not just that the 1978-1980 versions of Battlestar Galactica have now come to be seen as canonical by a subset of fans who reject Moore’s version as being GINO (“Galactica In Name Only”). Popular culture, especially from the 1950s onward, is marked by an alchemical process of nostalgia by which even the most derivative texts (Star Wars being the chief example) grow a callus of originality simply through continual shoulder-bumping with the ripoffs, sequels, and series that follow. Such is the nature of the successful media franchise, doomed to plow forward under the ever-increasing inertia of its own fecund replication.

No, what’s striking about the many iterations of Galactica is how cleanly the coordinates of its fantasy lure have flipped over time, illustrating the ability of genre myths to reconfigure themselves around new cultural priorities. Larson’s Battlestar Galactica, even in its heyday, was pure cheese, a disco-hued mélange of droning chrome robots, scrappy space cowboys, a cute mechanical dog, and endless space battles (whose repetitive nature can be attributed to the exigencies of weekly production; as with the first Star Trek, pricey optical effects were recycled to amortize their cost). Back then, it was fun to fantasize planetary diaspora as effervescent escape; the prospect of being chased from our homeworld by cyclopean robots with a mirror finish seemed, by the late seventies, as giddily implausible as Ronald Reagan moving into the White House.

But nowadays, the dream embodied in Battlestar Galactica has inverted frictionlessly into nightmare. The shift in tone is reflected in a new design scheme of drably militaristic grays and browns, brutal drumbeats on the soundtrack, and jittery camerawork on both actors and spaceships — thanks to the digital-effects house Zoic, whose signature visuals lend zoomy, handheld verisimilitude to the combat scenes. It all comes inescapably together to suggest a very different mindset: hunted, paranoid, and starkly conscious of the possibility of spiritual, if not physical, annihilation.

What I do see Battlestar Galactica bringing to the table with fresh force is the useful concept of the reboot as a strategy for dealing with franchise fatigue. A liberating alternative to the depressingly commercial and linear “sequel,” the reboot signals a profound shift in how we perceive and receive serial media. We are coming to see serial dramas as generative systems, more about ground rules and conditions of possibility than events or outcomes. (And I would argue that the only sane serial aesthetic is one that allows for occasional misfires; one bad episode does not a series invalidate.) Like the terms canon and retcon, the reboot borrows from brethren like comic books and print lit. Like the term game-changer, it characterizes TV production in computational terms, as ludic algorithm. And like the term showrunner, it signals our growing comfort with the notion of series as industrial product, indeed, as series: a potentially unending churn of a diegetic engine rather than a standalone text.

Other articles include Anne Kustritz on fans and producers; Melanie E. S. Kohnen on history and technology; Sarah Toton on fan-generated databases; and a conversation with Galactica star Mary McDonnell.

Razor’s Edge

Tonight I had the privilege of attending an advance screening of “Razor,” the Battlestar Galactica telefilm that will be broadcast on the SciFi Channel on November 24. Fresh from the experience, I want to tell you a bit about it. I’ll keep the spoilers light – that said, read on with caution, especially if, like me, you want to remain pure and unsullied prior to first exposure.

Along with several colleagues from Swarthmore College, I drove into Philadelphia a couple of hours before the 7 p.m. showing, fearing that more tickets had been issued than there were seats; this turned out not to be a problem, but it was fun nevertheless – a throwback to my teenage days in Ann Arbor when I stood in line for midnight premieres of Return of the Jedi and Indiana Jones and the Temple of Doom – to kill time with a group of friends, all of us atingle with anticipation, eyeing the strangers around us with a mingled air of social fascination (are we as nerdy as they are?) and prefab amity (hail, fellow travelers, well met!).

The event itself was interesting on several levels, some of them purely visual: We knew we’d be watching a video screener blown up onto a movie-sized screen, and true to expectation, the image had the washed-out, slightly grainy quality that I’m coming to recognize now that I’m getting used to a high-def TV display. (Things overall are starting to look very good in the comfort of my living room.) There was also the odd juxtaposition of completely computer-generated science-fiction imagery in the plentiful ads for Xbox 360 titles such as Mass Effect and the new online Battlestar Galactica game (yes, more tingling at this one) with the actual show content – the space battles especially were in one sense hard to distinguish from their Xbox counterparts.

But at the same time, the entire program served as a reminder of what makes narratively-integrated visual effects sequences more compelling (in a certain sense) than their videogame equivalents. “Razor”’s battle scenes, of which there are – what’s the technical term? – puh-lenty, carry the dramatic weight of documentary footage or at least historical reenactments, by comparison to which the explosive combat of Mass Effect and the BSG game was received by audiences with the amused condescension of parents politely applauding an elementary-school play starring somebody else’s kids. Disposable entertainment, in a word, paling beside the high-stakes offering of “real” Galactica – and not just any Galactica, but the backstory of one of BSG’s most nightmarish and searing storylines, that of the “lost” Battlestar Pegasus and her ruthlessly hardline commander, Admiral Helena Cain (Michelle Forbes).

(I’ll get to the meat of the story in a moment, but one last thought on the blatantly branded evening of Microsoft-sponsored fun: does anyone really own, or use, or enjoy their Zune? The ad we watched [twice] went to great lengths to portray the Zune as better than an iPod – without ever mentioning iPods, of course – but the net effect was to remind me that a device intended to put portable personal media on a collective footing is as useless as a prehensile toe if no one around you actually owns the thing. “Welcome to the Social,” indeed.)

On to “Razor” itself. Was it any good? In my opinion, it was fantastic; it did everything I wanted it to do, including

  • Lots of space battles
  • Hard military SF action, namely a sequence highly reminiscent of the Space Marine combat staged to perfection by James Cameron in Aliens
  • A few heart-tugging moments, including several exchanges between Bill Adama (Edward James Olmos) and his son Lee (Jamie Bamber) of a type that never fails to bring tears to my eyes
  • Scary, Gigerish biomedical horror
  • Aaaaand the requisite Halloween-candy sampler of “revelations” regarding BSG’s series arc, which I won’t go into here except to note that they do advance the story, and suitably whet my appetite for season four (assuming the writers’ strike doesn’t postpone it until 2019).

A better title, then, might be “Razor: Fanservice,” for this long-awaited installment returns to the foreground many of the elements that made BSG such a potent reinvigoration of televised SF when it premiered in the U.S. at the end of 2004. Since then, Galactica has flagged in ways that I detail in an essay for an upcoming issue of Flow devoted to the series; but judging from “Razor,” showrunner Ronald D. Moore, like Heroes’s Tim Kring, has heard the fans and decided to give them what they want.

For me, the season-two Pegasus arc marked a kind of horizon of possibility for Galactica’s bold and risky game of rendering real-world political realities – namely, government-sponsored torture questionably and conveniently justified by the “war on terror” – in SF metaphor. With the exception of the New Caprica arc that ended season two and began season three, the show has never since quite lived up to the queasy promise of the Pegasus storyline, in which a darkly militarized mirror-version of the valiant Galactica crew plunged itself with unapologetic resolve into Abu Ghraib-like sexual abuse and humiliation of prisoners.

What “Razor” does so engrossingly is revisit this primal scene of Galactica’s complex political remapping to both rationalize it – by giving us a few more glimpses of Admiral Cain’s pre- and post-apocalypse behavior and inner turmoil – and deepen its essential and inescapable repugnance. We’re given a framework, in other words, for the unforgivable misdeeds of Pegasus’s command structure and its obedient functionaries; the additional material both explains and underscores what went wrong and why it should never happen again.

Perhaps most strikingly, “Razor” provides a fantasy substitute for George W. Bush — a substitute who, despite her profoundly evil actions, is reassuring precisely because she seems aware of what she has wrought. In the film’s crucial scene, Cain instructs her chief torturer, Lieutenant Thorne (Fulvio Cecere), to make Six (Tricia Helfer)’s interrogation a humiliating, shameful experience. “Be creative,” Cain commands, and the fadeout that follows is more chilling than any clinically pornographic rendering of the subsequent violence could ever be. Precisely because I cannot imagine the cowardly powers-that-be, from Bush, Dick Cheney, and Alberto Gonzales on down to Lynndie England and Charles Graner, ever taking responsibility in the straightforward way that Cain does, this scene strikes me as one of the most powerful and eloquent portrayals of the contemporary U.S./Iraqi tragedy that TV has generated.

Admiral Cain is the real frog in SF’s imaginary garden. Moreover, her brief return in “Razor” suggests our ongoing need – a psychic wound in need of a good antisepsis and bandage – for a real leader, one with the courage not just to do the unthinkable on our behalf, but to embrace his role in it, and ride that particular horse all the way to his inevitable destruction and damnation.

Thoughts on the Writers’ Strike

The decision by the Writers Guild of America to go on strike this week, bringing production of scripted media content in the U.S. to a halt, triggered a couple of different reactions in me.

1. Thank god for the strike. I say this not because I believe in the essential rightness of unionized labor (though I do), or because I believe writers deserve far more monetary benefits from their work than they are currently getting (though I also do). No, I’m grateful for the strike because there is just too much new content out there, and with the scribes picketing, we now have a chance to recover — to catch up. The launch of the fall TV season has been stressful for me because I’m sharply aware of how many shows are vying for my attention; the good ones (Heroes, House, 30 Rock) demand a weekly commitment, but even the bad or unproven ones (Journeyman, Bionic Woman, Pushing Daisies) deserve at least a glance. And while being a media scholar has its benefits, the downside is that it casts a “work aura” over every leisure activity; it’s nearly impossible to just watch anything anymore, without some portion of my brain working busily away on ideas for essays, blog entries, or material to use in the classroom. Hooray for the stoppage, then: it means more time to catch up on the “old” content spooled up and patiently waiting on DVD, hard drive, and videotape, and more mental energy to spare on each. To live in a time of media plenitude and infinite access is great in its way. But having so much, all the time, also risks reducing the act of engaging with it to dreary automaticity — a forced march.

2. It’s fascinating to watch the differential impact of the script drought diffusing through the media ecosystem. First to go dead are the daily installments of comedy predicated on quick response to current events: nightly talk shows, The Daily Show, and The Colbert Report. Next to fold will be the half-hour sitcoms and hour-long dramas currently in production. Some series, like 24, may not get their seasons off the ground at all. And somewhere far down the line, if the strike continues long enough, even the mighty buffers of Hollywood will run dry. Seeing the various media zones blink out one at a time is like watching the spread of a radioactive tracer through the body’s organs, reminding us not only of the organic, massively systemic, interconnected nature of the mediascape, but of the way in which our media renew themselves at different rates, deriving their particular relevance and role in our lives from their degree of “greenness” on the one hand and polish on the other.

One Nation Under Stephen

I felt a delicious chill as I read the news that Stephen Colbert is running for President. (He made his announcement on Tuesday’s edition of The Colbert Report, the half-hour news and interview program he hosts on Comedy Central.) Why a chill? For all that I enjoy and respect Colbert, he has always prompted in me a faint feeling of vertigo. Watching his comedy is like staring into a deep well or over the side of a tall building: you get the itchy feeling in your legs of wanting to jump, to give yourself up to gravity and the abyss, obliterating yourself and all that you hold dear. Colbert’s impersonation of a rabidly right-wing, plummily egotistical media pundit is so polished and impenetrable that it stops being a joke and moves into more uncannily undecidable territory: simulation, automaton, a doll that has come to life. Unlike Jon Stewart’s on The Daily Show, Colbert’s satire doesn’t have a target, but becomes the target, triggering a collapse of categories, an implosion, a joke that eats itself and leaves its audience less thrilled than simply unsure (cf. Colbert’s performance at the 2006 White House Correspondents’ Dinner, at which he mapped uneasy smiles and half-frowns across a roomful of Republican faces).

Judging from Colbert’s offstage discussion of his work, like his recent interview with Terry Gross of Fresh Air, he’s a modest, sensible, reflective guy, able to view his Report persona with wit and detachment even as he delights in using it to generate ever more extreme, Dada-like interventions in popular and political culture — his Wikipedia mischief being only one instance. My half-serious worry is that with his latest move, he’s unleashed something far bigger than he knows or can control. The decision to place himself on the 2008 Presidential ballot, even if only in South Carolina, has been received by the mainstream media primarily as another ironic turn of the comedy-imitates-reality-imitates-art cycle, with commentators noting the echo of Robin Williams’s Man of the Year (2006) and comedian Pat Paulsen’s bid for the White House in 1968. But I think the more accurate and alarming comparison might be Larry “Lonesome” Rhodes, the character played by Andy Griffith in Elia Kazan’s A Face in the Crowd (1957). In that film, Rhodes goes from being a bumpkinish caricature on a television variety show to a populist demagogue, drunk on his own power and finally revealed as a hollow shell, a moral vacuum. The unsubtle message of Kazan’s film is that TV’s pervasive influence makes it a tool for our most destructive collective tendencies — a nation of viewers whose appetite for entertainment leads them to eagerly embrace fascism.

I’d be lying — or at least being flippant — if I claimed to believe that Colbert could be another “Lonesome” Rhodes. I’m neither that cynical about our culture nor that paranoid about the power of media. But given that we live in an era when the opportunities for self-organizing social movements have multiplied profoundly through the agency of the internet, who is to say when Colbert’s campaign comedy might mutate smoothly into something more genuine? Maybe he is, at this moment in history, the perfect protest candidate, smoother and more telegenic than Nader and Perot by orders of magnitude. He just might win South Carolina. And if that happens … what next?

Better, Stronger, Faster (TM)

Spoiler Alert!

I’ll let you in on a little secret regarding the new NBC series Bionic Woman: they’re all bionic on that show, every last one of them. Sure, the premise centers primarily on one technologically-augmented body, that of Jaime Sommers (Michelle Ryan), a bartender injured so severely in a car crash that her boyfriend — an imaginative combination of your standard TV hot guy and your standard mad scientist; think McBrainy — promptly replaces both of Jaime’s legs, one arm, one eye, and one ear, with $50 million worth of bionic machinery, making her about 65% superhuman. The show, a remake or, I suppose, reboot of the 1976-1978 series that starred Lindsay Wagner in the title role, does go one step further by providing a nemesis/doppelganger in the form of Sarah Corvus (Katee Sackhoff), a previous experiment in bionic integration who, either through bad character or having been “hacked,” has become a murderous tormentor of the nameless paragovernmental organization where Jaime now works. (Corvus is also sultry and talks like a film noir femme fatale, but it’s unclear to what degree these traits preceded her upgrade.)

But the truth, as I said before, is that everyone on the show is bionic, from Jaime’s boss Jonas Bledsoe (Miguel Ferrer) to her little sister Becca (Lucy Hale) to the extras that populate the backgrounds. This greater degree of bionicization reflects the enormous strides that have occurred in the field since the late 1970s; see Eric Freedman’s excellent Flow article for a recap. Nowadays, instead of simply tacking on a robotic limb or improved sensory organ here and there, bodies can be implanted with structuring quantities of generic and intertextual material, resulting in characters whose every look, gesture, and word of dialogue issues from another source. The cast of Bionic Woman has literally been stitched together from other TV shows, movies, and comic books — reconstituted like chicken beaks and hog parts into shiny pink hot dogs, repurposed like ground-up car tires into bouncy playground equipment.

And it doesn’t stop there. Internal memoranda leaked to me from showrunner David Eick’s office reveal the deeper layers of bionicization that make up the new series. The settings, while profilmically real enough in their own right, were all first used on other shows — as were the scripts, storylines, character arcs, action setpieces, and cliffhangers. In actuality, Bionic Woman is a postmodern breakthrough, the cutting edge in mashup culture. It exists purely as a composite of borrowed and recycled material, a house mix of C-level storytelling retrofitted from the efflux of the SciFi Channel, USA, and Spike, chopped and reedited into apparently new creative “product.”

My sources inform me that, while the pilot episode was assembled under human supervision, the duty of outputting new weekly 42-minute swatches of text has now been handed over entirely to a computerized system which uses sophisticated pattern recognition to dovetail one unrelated shot or scene to another in semi-plausible continuity. (A related algorithm keeps the editing lightning-fast, ensuring that any mismatches will flash by unnoticed.) There are still a few glitches in the system, evidenced by the second episode’s kludgy splice of two fundamentally incompatible stereotypes into one character (the teenage Becca): as those of us whose organic bodies passed through adolescence know, no one who gets in trouble for smoking pot in the dressing room could simultaneously burn with the dream of performing in her high-school talent show’s number from Annie Get Your Gun. It’s not simply a logical impossibility, but a paradoxical, reality-ripping snarl in the fabric of fictive spacetime. NBC troubleshooters have traced the problem to a dunderheaded subroutine that mistakenly blended Rory Gilmore (Alexis Bledel) from Gilmore Girls with Angela Chase (Claire Danes) from My So-Called Life. The techs say it shouldn’t happen again, but aren’t making any promises.

In the meantime, Bionic Woman will continue to unspool, following its own logic of recombination in blissful automaticity. I find the show more watchable than just about any of the other new offerings of the season, except for Kitchen Nightmares, which quaintly and cannily embeds at least one real person within its own bionic grammar — kind of an inside-out TV cyborg. Certainly Bionic Woman passes the test that Chuck didn’t or couldn’t, drawing me back for a second look. I encourage you to check out Bionic Woman, especially if you’re a fan, as I am, of the sorts of mesmerizingly random patterns that emerge from nonlinear, chaotic flow, as in lava lamps and screen savers.