Indiana Jones and the Unattainable FX Past

This isn’t a review, as I haven’t yet made it to the theater to see Indiana Jones and the Kingdom of the Crystal Skull (portal to the transmedia world of Dr. Jones here; typically focused and informative Wiki entry here). What I have been doing — breaking my normal rule about keeping spoiler-free — is poring over fan commentaries on the new movie, swimming within the cometary aura of its street-level paratexts, working my way into the core theatrical experience from the outside in. This wasn’t anything intentional, more the crumbling of an internet wall that sprang one informational leak after another, until finally the wave of words washed over me like, well, one of the death traps in an Indiana Jones movie.

Usually I’m loath to take this approach, finding the twists and turns of, say, Battlestar Galactica and Lost far more compelling when they clobber me unexpectedly (and let me add, both shows have been rocking out hard with their last couple of episodes). But it seemed like the right approach here. Over the years, the whole concept of Indiana Jones has become a diffuse map, gas rather than solid, ocean rather than island. Indy 4 is a media object whose very essence — its cultural significance as well as its literal signification, the decoding of its concatenated signage — depends on impacted, recursive, almost inbred layers of cinematic history.

On one level, the codes and conventions of pulp adventure genres, 1930s serials and their ilk, have been structured into the series film by film, much like the rampant borrowings of the Star Wars texts (also masterminded by George Lucas, whose magpie appropriations of predecessor art are cannily and shamelessly redressed, in his techno-auteur house style, as timelessly mythic resonance). But by now, 27 years after the release of Raiders of the Lost Ark, the Indy series must contend with a second level of history: its own. The logic of pop-culture migration has given way to the logic of the sequel chain, the franchise network, the transmedia system; we assess each new installment by comparing it not to “outside” films and novels but to other extensions of the Indiana Jones trademark. Indy 4, in other words, cannot be read intertextually; it must be read intratextually, within the established terms of its brand. And here the franchise’s history becomes indistinguishable from our own, since it is only through the activity of audiences — our collective memory, our layered conversations, the ongoing do-si-do of celebration, critique, and comparison — that the Indy texts sustain any sense of meaning above and beyond their cold commodity form.

All of this is to say that there’s no way Indiana Jones and the Kingdom of the Crystal Skull could really succeed, facing as it does the impossible task of simultaneously returning to and building upon a shared and cherished moment in film history. While professional critics have received the new film with varying degrees of delight and disappointment, the talkbacks at Ain’t It Cool News (still my go-to site for rude and raucous fan discourse) are far more scornful, even outraged, in their assessment. Their chorused rejection of Indy 4 hits the predictable points: weak plotting, flimsy attempts at comic relief, and in the movie’s blunt infusion of science-fiction iconography, a generic splicing so misjudged / misplayed that the film seems to be at war with its own identity, a body rejecting a transplanted organ.

But running throughout the talkback is another, more symptomatic complaint, centering on the new film’s overuse of CG visual effects. The first three movies — Raiders, Temple of Doom, and Last Crusade — covered a span from 1981 to 1989, an era which can now be retroactively characterized as the last hurrah of pre-digital effects work. All three feature lots of practical effects — stuntwork, pyrotechnics, and the on-set “wrangling” of everything from cobras to cockroaches. But more subtly, all make use of postproduction optical effects based on non-digital methods: matte paintings, bluescreen compositing, a touch of cel animation here, a cloud tank there. Both practical and optical effects have since been augmented if not colonized outright by CG, a shift apparently unmissable in Indy 4. And that has longtime fans in an uproar, their antidigital invective aimed variously at Lucas’s influence, the loss of verisimilitude, and the growing family resemblance of one medium (film) to another (videogames):

The Alien shit didnt bother me at all, it was just soulless and empty as someone earlier said.. And the CGI made it not feel like an Indy flick in some parts.. I walked out of the theater thinking the old PC game Fate of Atlantis gave me more Indiana joy than this piece of big budget shit.

My biggest gripe? Too much FUCKING CGI. The action lacked tension in crucial places. And there were too many parts (more than from the past films) where Looney Tunes physics kept coming into play. By the end, when the characters endure 3 certain deaths, you begin to think “Okay, the filmmakers are just fucking around, lean back in your seat and take in the silliness.” No thanks. That’s not what makes Indiana Jones movies fun.

This film was AVP, The Mummy Returns and Pirates of the Fucking Carribean put together, a CGI shitfest. A long time ago in a galaxy far far away, Lucas said “A special effect is a tool, a means to telling astory, a special effect without a story is a pretty boring thing.” Take your own advice Lucas, you suck!!!

The entire movie is shot on a stage. What happened to the locations of the past? The entire movie is CG. What a disappointment. I really, REALLY wanted to enjoy it.

Interestingly, this tension seems to have been anticipated by the filmmakers, who loudly claimed that the new film would feature traditional stuntwork, with CGI used only for subtleties such as wire removal. But the slope toward new technologies of image production proves to be slippery: according to Wikipedia, CG matte paintings dominate the film, and while Steven Spielberg allegedly wanted the digital paintings to include visible brushstrokes — as a kind of retro shout-out to the FX artists of the past — the result was neither nostalgically justifiable nor convincingly indexical.

Of course, I’m basing all this on a flimsy foundation: Wiki entries, the grousing of a vocal subcommunity of fans, and a movie I haven’t even watched yet. I’m sure I will get out to see Indy 4 soon, but this expedition into the jungle of paratexts has definitely diluted my enthusiasm somewhat. I’ll encounter the new movie all too conscious of how “new” and “old” — those basic, seemingly obvious temporal coordinates — exceed our ability to construct and control them, no matter how hard the filmmakers may try, no matter how hard we audiences may hope.

A Marvel of Engineering

The opening act of the summer movie season, Iron Man, is much like the machine armor worn by Tony Stark (Robert Downey Jr.): a potent blend of advanced technology, sleek style, and glowing energy. The fetishism of the super-suit has rarely been quite so explicitly rendered, or embraced with such pornographic shamelessness, in comic-book cinema. Sure, movies and television have given us plenty of heroes whose iconic power resides in the costume (whether caped or capeless): Christopher Reeve’s Superman, Tobey Maguire’s Spider-Man, the leather-overcoat-and-sunglasses combo of Wesley Snipes in Blade, the patriotic bustier worn by Lynda Carter as Wonder Woman. Often these sartorial choices become flashpoints of controversy with fans: think of Bryan Singer’s X-Men adaptation, which did away with the classic yellow costumes of the comic series, or the many nippled and sculpted variations of the Batsuit worn by the Batactors playing a series of Batmen in the Batfranchise.

In Iron Man, the situation is different, for Iron Man is his suit, the “secret identity” encasing a figure both morally and physically compromised. Over the course of the story, Stark’s transformation from a hard-partying weapons magnate to a passionately peace-committed and (mostly) teetotaling cyborg is made concrete — made metal, really — through the metaphor of the successively more sophisticated armor shells in which he encases himself. The first is a kitbashed monstrosity the color of corroded tin cans, crisscrossed with scars of solder. Like a rustbucket car, it survives just long enough to convey Stark to version two, a more compact silver exoskeleton reminiscent of Ginsu knives and Brookstone gadgets. It’s quite satisfying when Stark arrives at the canonical configuration, a red-and-gold chassis of interlocking plates, purring hydraulics, and HUD graphics that answers the question “What would it be like to wear a Lamborghini?”

What’s clever is how these upgrades express Stark’s ethical evolution while recapitulating decades of shifting design in the Marvel comic series from which this movie sprang. (It’s kind of like a He’s Not There version of James Bond in which the lead is played over the course of the film by Sean Connery, then Roger Moore, followed by Timothy Dalton, Pierce Brosnan, and Daniel Craig — with a quick dream interlude, of course, starring George Lazenby.) Iron Man, in other words, manages to honor superhero history rather than pillaging it, and this — along with the film’s smart screenplay and glossy digital mise-en-scene — wins it a sure place in future best-of lists when it comes to the spotty genre of comic-book adaptations.

Another weapon in the film’s arsenal, riding within its narrative delivery system as Stark pilots his mechanical costume, is Robert Downey Jr., who seems to have arrived at a point of perfect intertextual harmony with this turn in his career. His performance as Stark is properly lived-in and mischievous (indeed, the actor’s persona is yet another kind of suit, this one built of bad publicity) yet alive with disarmingly sincere warmth. In one of the film’s facile yet pitch-perfect tropes, Stark must wear a pulsing blue-and-white generator on his chest, a kind of electromagnetic pacemaker that doubles as an energy source for his armor and trebles as a signifier of the character’s humanity. Downey Jr. is himself a kind of power source that propels the vehicle of Iron Man forward while lending it, in stray moments, genuine moral weight. His portrayal reminds us that, while a superhero’s technological or organic essence is important, the larger ingredient is something harder to quantify and calibrate — in Stark’s case, a kludge of intellect, imagination, and compassion that easily trades one outfit for another.

That’s not all that’s going on under the hood: the fight scenes, not to mention flight scenes, are awesome, and Jeff Bridges is remarkably menacing with all his hair relocated from his head to his chin. There’s some nimble ideological shadowboxing around the military-industrial complex and the terrible allure of “shock and awe” (there’s your real pornography). And while Stan Lee makes his trademarked cameo, branding this as yet another item in Marvel’s transmedia catalog, a quirky counterpoint is sounded when a villain taunts Stark by asking “Did you really think that just because you had an idea, it belongs to you?” For comic fans, it’s hard not to hear this and think of Marvel’s infamous screwing of Jack Kirby. At the same time, we should count our blessings that concepts like Iron Man travel so fluidly through our mediascape, rephrasing themselves in the transformational grammar of convergence cinema. Too often, the result is an ugly and lifeless thing — a strangled fragment like Daredevil, a run-on sentence like Van Helsing. But once in a lucky while, you get a perfect little poem like Iron Man.

No Stopping the Terminator’s Serial Return


I’m enjoying Terminator: The Sarah Connor Chronicles in the same three-quarters, semi-committed way I enjoyed Star Trek: Voyager (the other show I’m catching up on via iPod during the writers’ strike): it ain’t Shakespeare, but it’ll do. The FOX network’s new program reinvents the venerable Terminator franchise by serializing it, retrofitting the premise into yet another of television’s narrative deli slicers. Now, instead of giant cinematic slabs of Termination every decade or so (The Terminator in 1984; Terminator 2: Judgment Day in 1991; Terminator 3: Rise of the Machines in 2003), we get wafer-thin weekly slices, peeled from the block of a setup almost mathematically pristine in its triangulation of force and counterforce: Sarah Connor, mother of a future leader of a rebellion against sentient, world-ending AI; John Connor, her teenaged son who will eventually grow into that man; and an endless procession of Terminators, silvery machine skeletons cloaked in human flesh, whose variations on the theme of lethally-unstoppable-yet-childishly-innocent are as simultaneously charming, pathetic, and horrifying as Boris Karloff’s Frankenstein monster.


The fact that these cyborgs are sent back in time to our present day to menace the Connors a priori is what lends the story its relentless urgency — it’s an almost perfect chase structure — as well as allowing it to encompass nearly any conceivable narrative permutation. Science fiction’s most productive conceit (at least in terms of storytelling), time travel and its even zanier offshoot, parallel universes, grant drama a toolkit of feints, substitutions, and surprises otherwise acceptable only in avant-garde experimentation (the cryptic as art) and tales of pure subjectivity (it was all a dream). When characters are bopping back and forth along the timespace continuum, in other words, it’s possible to stretch continuity to the breaking point, risking the storyworld’s implosion into absurdity, only to save it at the last minute by revealing each seeming reversal of cause and effect to be part of a larger logic of temporal shenanigans.

Hence the concept of the narrative reset button — a staple of Star Trek’s many dips into the time-travel well — and the freedom of the Chronicles to recast its leads in a trendy, demographic-friendly makeover. Lena Headey takes over the role of Sarah from the movies’ Linda Hamilton; John, played in T2 by Edward Furlong and in T3 by Nick Stahl, is played here by Thomas Dekker, Claire’s nerdy videographer friend in the first season of Heroes. It kind of all makes sense if you squint, turn your head sideways, and tell yourself that maybe this is all some parallel reality splintered off from the one James Cameron created (as indeed it is, industrially). More galvanizing is the recasting of the key Terminator — a “good” one, we presume, though its origin and nature are one of the series’ close-kept secrets — as a beautiful young woman, approximately John’s age.


As Cameron (get it?), Summer Glau plays the Terminator with a sort of detached glaze; think Zooey Deschanel with the power to bend steel and survive an eight-story fall. Though her ability to convincingly mimic human social behavior fluctuates alarmingly, the character is great, and her presence at one node of the Connor triangle remaps gender and sexual relationships in a way that is both jarring and absolutely plausible. In T2 the “good” Terminator (played by Arnold Schwarzenegger) had the square, inert reliability of John Wayne’s taxidermied corpse, and the absence of romantic chemistry between him and Hamilton’s Sarah seemed natural. (If he was a Dad-figure, it was a 50’s sitcom Dad — patriarchal but neutered.) Things are very different between Cameron and John, at least in subtext, and for that matter between Cameron and Sarah. If only because it seems such a ripe matrix for fannish invention, Chronicles marks the first time in a while I’ve been curious to seek out some slash.

As far as plotting goes, the new series seems primed to run for a while, if it can find its audience. The time-travel motif has already enabled our three protagonists to fast-forward from 1999 to 2007, introducing a fun fish-out-of-water counterpoint to the predictable (but very satisfying) setpieces of superpowerful beings throwing each other through walls, firing automatic weapons into each other, and getting mowed down by SUVs. I’m sure if things get dull, or when Sweeps loom, we’ll be treated to glimpses of the future (when the human-Skynet war is at its visual-FX-budget-busting peak) or the distant past (anyone for a Civil-War-era Terminator hunt?).

Overall I’m pleased with how gracefully this franchise has found a fresh point of generation for new content — how felicitously the premise has fitted itself to the domain of serial TV, with its unique grammar of cliffhangers, season-arcs, and long-simmering mysteries of origins, destinations, and desires. If they last, the Chronicles promise to be far more rewarding than the projected live-action Star Wars series (an experience I expect to be like having my soul strip-mined in between commercial interludes). Notwithstanding the cosmic expense of the second and third feature films, there’s always been something visceral and pleasingly cheap about the Terminator fantasy, remnant of its shoestring-budget 1984 origins; Terminator’s simplified structure of feeling seems appropriate to the spiritual dimensions of televised SF. Like those robots that keep coming back to dog our steps, the franchise has found an ideal way to insinuate itself into the timeline of our weekly viewing.

Four-Leaf “Clover”


Like the Manhattan-demolishing leviathan at its center — sometimes only a distant, crashing presence, sometimes terrifyingly close and looming — Cloverfield is an enigma built of striking contrasts. At once epic and intimate, the film seems utterly familiar in some ways and breathtakingly new in others. At its best (and there is a lot of “best” in its 84-minute running time), Cloverfield takes an almost unbearably cliched monster-movie premise and reinvents it whole, deftly stripping away the audience’s ability to anticipate what will happen next — even if, moments later, we realize that we saw the twists and shocks coming a mile away.

In this sense, the new film from producer and concept author J. J. Abrams, screenwriter Drew Goddard, and director Matt Reeves accomplishes what any good movie must: find a new, temporarily convincing way to obey the established rules of its genre and yet package them in a manner that seems fresh and original. I say “temporarily” because, of course, it’s a zero-sum game: assuming Cloverfield is the box-office phenomenon its makers and marketers clearly expect it to be, we’re in for any number of B, C, and D-grade knockoffs. We’ll quickly tire of the Cloverfield effect, just as we tired of the Matrix’s bullet time, CG films featuring wisecracking animals in an urban setting, or — next on the block for burnout — the recent boomlet of pregnancy comedies like Knocked Up and Juno.

For the moment, though, we’re in the sweet spot. Cloverfield works beautifully as a lean, scary, and occasionally awe-inspiring fusion of science fiction and horror. Its impact seems inseparable from the promotional campaign leading up to its release, though what strikes me in retrospect (now that the quantum function of collective anticipation has collapsed, the wave of our wanting condensed into a hard particle of finished film) is how trickily non-promotional the publicity turned out to be. From its first teaser onwards, Cloverfield was sold to us more on the basis of what we didn’t know than what we did.

By the old logic of movie marketing, the more we were fed about an upcoming film, the better. Even in cases where a structuring piece of narrative information was withheld, as in The Crying Game, the absence itself became a lure, with reviewers falling all over themselves not to give away the Secret So Shocking You Won’t Believe Your Eyes! Not so Cloverfield, whose central mystery — the monster’s nature and appearance — became an object of extended forensic investigation by fans and, for many, the primary reason to turn up on opening day to see the film. Speculative images like the one at the top of this article (not, let me add, an accurate representation) abounded as fans scoured Quicktime files frame-by-frame and read clues Rorschach-like into promotional artwork. This was accompanied by much skepticism about the prospect of our ever actually seeing the monster; many felt we were in for another bait-and-switch of the Blair Witch variety.

It’s probably no spoiler at this point to announce that there is a monster, and a very satisfying one at that. What’s great, though, is how our fear and fascination toward the thing are mostly generated through the human activity around it, in particular the reactions of the quartet of young actors whom we follow throughout the movie. None is a well-known performer, for obvious reasons. Encountering a familiar movie face amid the frenzy and pathos of Cloverfield would destroy the film’s precarious conceit of being “real” footage captured by “real” people as the attack “really” happens.

The filmic pursuit of realism has a long and storied history — almost as long as the list of ways that Hollywood has put that realism to cynical use to sell its fictions. In staying within the boundaries of its metaphor, Cloverfield is endlessly gimmicky, finding ways to frame traditional dramatic setpieces and character beats while entirely avoiding artful compositions or anything resembling continuity editing. (As a side note, the visual effects are particularly impressive for the way in which digital elements have been added to jouncing camera work; the production’s match-movers deserve a special technical Oscar of their own.)

For Cloverfield’s interwoven illusions — not just the spectacle of invented monsters, but affective phantasms like suspense and empathy — to work, everything must seem unplanned, contingent, or (my favorite word from graduate school) aleatory. That term means “dependent on chance or luck,” and it’s entirely appropriate in this context. Abrams and company have stumbled upon a way to put an electrifying new spin on a comfortable old story, and as fans of the genre, we are lucky indeed.

American Idolatry


It’s back, somehow seeming simultaneously like manna from heaven and a plague of fire descending to destroy us all.

My wife and I readied ourselves for last night’s premiere of American Idol’s seventh season by cooking a delicious dinner and timing it so that we sat down to eat just as Ryan Seacrest appeared onscreen, accompanied by the chords of that theme song — so much like the revving of an enormous engine. That this engine has churned to life six times previously does not at all diminish its impact: on the contrary, its sweet electronic throb was a song of salvation, delivering us from our long trek through the dramaless desert of the writers’ strike.

OK, maybe that’s a bit strong: the last several months have certainly not been devoid of drama, whether in the kooky kaleidoscope of the presidential race or (ironically, tautologically) the writers’ strike itself, which has provided us all an opportunity to catch up on our viewing backlog as well as to reflect on what it means to have writers writing television at all. (Random observation: since new commercials keep coming out, does this mean that the creative content of advertising and marketing isn’t considered writing?) And as always, the concentric circles of collective fascination in the United States, with TV at the center of the conceptual bullseye, stop well short of encompassing the large and very real dramas experienced by the rest of the world; in other words, we should take a moment to remember that not everybody on Earth cares so passionately about what Simon, Paula, and Randy think, or about who will succeed Jordin Sparks.

For my wife and me, the excitement was multiplied by the fact that Idol’s first round of auditions took place in Philadelphia, the city we now live a shocking 15 minutes away from. As the camera panned across the familiar skyline, it was hard not to succumb to Idol’s implicit ideological address: I see you! Louis Althusser defined interpellation as the moment when, walking down a street, a cop calls out “Hey you!” and you turn around, believing yourself (mistakenly) to be the subject of the law’s address. For me, it’s all summed up in Ryan Seacrest’s direct gaze into the camera.


And then, of course, there are the crowds of hopefuls. Finally I see the point of high-def TV; the mass ornament of naked ambition, in all its variegated poppy-field glory, never looked so good as when rendered in HD. And at the other pole of specular mastery, the bad hair, flop sweat, and glazed eyeballs of the doomed auditioners have never seemed more squirm-inducingly intimate. Yes, the opening rounds of the new Idol promise to be as relentlessly mean as the previous seasons; nowhere is the long take used with more calculated cruelty than in the expertly extended coverage of contestants’ caterwauling, or the deadtime of waiting for the judges to drop the ax. Indeed, part of the pleasure of the coming months is knowing that out of this primordial, cannibalistic muck, we will ritualistically pull ourselves onto the shores of civilization, narrowing the field to the truly “worthy” singers, casting out the misfits until we arrive at the golden calf, er, One True Talent. On American Idol, cultural ontology recapitulates phylogeny; we replay, each season, the millennial-scale process by which our species learned to stop eating each other and got around to more important things, like popularity contests. (Come to think of it, Idol is also a lot like high school.)

The other nice thing about Idol is that there’s so freaking much of it; the two hours shown last night were but a skimming of the hundreds of hours of footage shot by the production. Assuming the brand continues its juggernaut profitability (and hey, now that TV is all reality n’ reruns, what’s to stop it?), we may someday see an entire channel devoted to nothing but Idol. That said, it was with something of a pop-culture hangover that I awoke this morning and realized that tonight brings yet more auditions, this time from Dallas.

My wife and I will be right there/here, watching. Will you?

Razor’s Edge


Tonight I had the privilege of attending an advance screening of “Razor,” the Battlestar Galactica telefilm that will be broadcast on the SciFi Channel on November 24. Fresh from the experience, I want to tell you a bit about it. I’ll keep the spoilers light – that said, however, read on with caution, especially if, like me, you want to remain pure and unsullied prior to first exposure.

Along with several colleagues from Swarthmore College, I drove into Philadelphia a couple of hours before the 7 p.m. showing, fearing that more tickets had been issued than there were seats; this turned out not to be a problem, but it was fun nevertheless – a throwback to my teenage days in Ann Arbor when I stood in line for midnight premieres of Return of the Jedi and Indiana Jones and the Temple of Doom – to kill time with a group of friends, all of us atingle with anticipation, eyeing the strangers around us with a mingled air of social fascination (are we as nerdy as they are?) and prefab amity (hail, fellow travelers, well met!).

The event itself was interesting on several levels, some of them purely visual: We knew we’d be watching a video screener blown up onto a movie-sized screen, and true to expectation, the image had the washed-out, slightly grainy quality that I’m coming to recognize now that I’m getting used to a high-def TV display. (Things overall are starting to look very good in the comfort of my living room.) There was also the odd juxtaposition of completely computer-generated science-fiction imagery in the plentiful ads for Xbox 360 titles such as Mass Effect and the new online Battlestar Galactica game (yes, more tingling at this one) with the actual show content – the space battles especially were in one sense hard to distinguish from their Xbox counterparts.

But at the same time, the entire program served as a reminder of what makes narratively-integrated visual effects sequences more compelling (in a certain sense) than their videogame equivalents. “Razor”’s battle scenes, of which there are – what’s the technical term? – puh-lenty, carry the dramatic weight of documentary footage or at least historical reenactments, by comparison to which the explosive combat of Mass Effect and the BSG game was received by audiences with the amused condescension of parents politely applauding an elementary-school play starring somebody else’s kids. Disposable entertainment, in a word, paling beside the high-stakes offering of “real” Galactica – and not just any Galactica, but the backstory of one of BSG’s most nightmarish and searing storylines, that of the “lost” Battlestar Pegasus and her ruthlessly hardline commander, Admiral Helena Cain (Michelle Forbes).

(I’ll get to the meat of the story in a moment, but one last thought on the blatantly branded evening of Microsoft-sponsored fun: does anyone really own, or use, or enjoy their Zune? The ad we watched [twice] went to great lengths to portray the Zune as better than an iPod – without ever mentioning iPods, of course – but the net effect was to remind me that a device intended to put portable personal media on a collective footing is as useless as a prehensile toe if no one around you actually owns the thing. “Welcome to the Social,” indeed.)

On to “Razor” itself. Was it any good? In my opinion, it was fantastic; it did everything I wanted it to do, including

  • Lots of space battles
  • Hard military SF action, namely a sequence highly reminiscent of the Space Marine combat staged to perfection by James Cameron in Aliens
  • A few heart-tugging moments, including several exchanges between Bill Adama (Edward James Olmos) and his son Lee (Jamie Bamber) of a type that never fails to bring tears to my eyes
  • Scary, Gigerish biomedical horror
  • Aaaaand the requisite Halloween-candy sampler of “revelations” regarding BSG’s series arc, which I won’t go into here except to note that they do advance the story, and suitably whet my appetite for season four (assuming the writers’ strike doesn’t postpone it until 2019).

A better title, then, might be “Razor: Fanservice,” for this long-awaited installment returns to the foreground many of the elements that made BSG such a potent reinvigoration of televised SF when it premiered in the U.S. at the end of 2004. Since then, Galactica has flagged in ways that I detail in an essay for an upcoming issue of Flow devoted to the series; but judging from “Razor,” showrunner Ronald D. Moore, like Heroes’s Tim Kring, has heard the fans and decided to give them what they want.

For me, the season-two Pegasus arc marked a kind of horizon of possibility for Galactica’s bold and risky game of mapping the bleakest of real-world political realities – namely government-sponsored torture questionably and conveniently justified by the “war on terror” – in SF metaphor. With the exception of the New Caprica arc that ended season two and began season three, the show has never since quite lived up to the queasy promise of the Pegasus storyline, in which a darkly militarized mirror-version of the valiant Galactica crew plunged itself with unapologetic resolve into Abu Ghraib-like sexual abuse and humiliation of prisoners.

What “Razor” does so engrossingly is revisit this primal scene of Galactica’s complex political remapping to both rationalize it – by giving us a few more glimpses of Admiral Cain’s pre- and post-apocalypse behavior and inner turmoil – and deepen its essential and inescapable repugnance. We’re given a framework, in other words, for the unforgivable misdeeds of Pegasus’s command structure and its obedient functionaries; the additional material both explains and underscores what went wrong and why it should never happen again.

Perhaps most strikingly, “Razor” provides a fantasy substitute for George W. Bush — a substitute who, despite her profoundly evil actions, is reassuring precisely because she seems aware of what she has wrought. In the film’s crucial scene, Cain instructs her chief torturer, Lieutenant Thorne (Fulvio Cecere), to make Six (Tricia Helfer)’s interrogation a humiliating, shameful experience. “Be creative,” Cain commands, and the fadeout that follows is more chilling than any clinically pornographic rendering of the subsequent violence could ever be. Precisely because I cannot imagine the cowardly powers-that-be, from Bush, Dick Cheney, and Alberto Gonzales on down to Lynndie England and Charles Graner, ever taking responsibility in the straightforward way that Cain does, this scene strikes me as one of the most powerful and eloquent portrayals of the contemporary U.S./Iraqi tragedy that TV has generated.

Admiral Cain is the real frog in SF’s imaginary garden. Moreover, her brief return in “Razor” suggests our ongoing need – a psychic wound in need of a good antiseptic and bandage – for a real leader, one with the courage not just to do the unthinkable on our behalf, but to embrace their role in it, and ride that particular horse all the way to their inevitable destruction and damnation.

Better, Stronger, Faster (TM)


Spoiler Alert!

I’ll let you in on a little secret regarding the new NBC series Bionic Woman: they’re all bionic on that show, every last one of them. Sure, the premise centers primarily on one technologically augmented body, that of Jaime Sommers (Michelle Ryan), a bartender injured so severely in a car crash that her boyfriend — an imaginative combination of your standard TV hot guy and your standard mad scientist; think McBrainy — promptly replaces both of Jaime’s legs, one arm, one eye, and one ear with $50 million worth of bionic machinery, making her about 65% superhuman. The show, a remake or, I suppose, reboot of the 1976-1978 series that starred Lindsay Wagner in the title role, does go one step further by providing a nemesis/doppelganger in the form of Sarah Corvus (Katee Sackhoff), a previous experiment in bionic integration who, either through bad character or through having been “hacked,” has become a murderous tormentor of the nameless paragovernmental organization where Jaime now works. (Corvus is also sultry and talks like a film noir femme fatale, but it’s unclear to what degree these traits preceded her upgrade.)

But the truth, as I said before, is that everyone on the show is bionic, from Jaime’s boss Jonas Bledsoe (Miguel Ferrer) to her little sister Becca (Lucy Hale) to the extras that populate the backgrounds. This greater degree of bionicization reflects the enormous strides that have occurred in the field since the late 1970s; see Eric Freedman’s excellent Flow article for a recap. Nowadays, instead of simply tacking on a robotic limb or improved sensory organ here and there, bodies can be implanted with structuring quantities of generic and intertextual material, resulting in characters whose every look, gesture, and word of dialogue issues from another source. The cast of Bionic Woman has literally been stitched together from other TV shows, movies, and comic books — reconstituted like chicken beaks and hog parts into shiny pink hot dogs, repurposed like ground-up car tires into bouncy playground equipment.

And it doesn’t stop there. Internal memoranda leaked to me from showrunner David Eick’s office reveal the deeper layers of bionicization that make up the new series. The settings, while profilmically real enough in their own right, were all first used on other shows — as were the scripts, storylines, character arcs, action setpieces, and cliffhangers. In actuality, Bionic Woman is a postmodern breakthrough, the cutting edge in mashup culture. It exists purely as a composite of borrowed and recycled material, a house mix of C-level storytelling retrofitted from the efflux of the SciFi Channel, USA, and Spike, chopped and reedited into apparently new creative “product.”

My sources inform me that, while the pilot episode was assembled under human supervision, the duty of outputting new weekly 42-minute swatches of text has now been handed over entirely to a computerized system which uses sophisticated pattern recognition to dovetail one unrelated shot or scene to another in semi-plausible continuity. (A related algorithm keeps the editing lightning-fast, ensuring that any mismatches will flash by unnoticed.) There are still a few glitches in the system, evidenced by the second episode’s kludgy splice of two fundamentally incompatible stereotypes into one character (the teenage Becca): as those of us whose organic bodies passed through adolescence know, no one who gets in trouble for smoking pot in the dressing room could simultaneously burn with the dream of performing in her high-school talent show’s number from Annie Get Your Gun. It’s not simply a logical impossibility, but a paradoxical, reality-ripping snarl in the fabric of fictive spacetime. NBC troubleshooters have traced the problem to a dunderheaded subroutine that mistakenly blended Rory Gilmore (Alexis Bledel) from Gilmore Girls with Angela Chase (Claire Danes) from My So-Called Life. The techs say it shouldn’t happen again, but aren’t making any promises.

In the meantime, Bionic Woman will continue to unspool, following its own logic of recombination in blissful automaticity. I find the show more watchable than just about any of the other new offerings of the season, except for Kitchen Nightmares, which quaintly and cannily embeds at least one real person within its own bionic grammar — kind of an inside-out TV cyborg. Certainly Bionic Woman passes the test that Chuck didn’t or couldn’t, drawing me back for a second look. I encourage you to check out Bionic Woman, especially if you’re a fan, as I am, of the sorts of mesmerizingly random patterns that emerge from nonlinear, chaotic flow, as in lava lamps and screen savers.

Herding the Nerds


This cheerful fellow is Chuck Bartowski (Zachary Levi), the lead character on NBC’s new series Chuck. Notable not just for its privileged placement as the lead-in to Heroes on Monday nights, Chuck stands out to me as a particularly significant shift in how primetime network TV audiences are conceptualized and spoken to — or to use a fancy but endlessly useful term from Louis Althusser’s work, interpellated. Like Heroes, Chuck offers viewers a user-friendly form of complex, mildly fantastical narrative: call it science-fiction lite, albeit several shades “liter” than Heroes‘s chase-structured saga of anxious, insecure superheroes or Lost‘s island of brooding mindgamers. In fact, as I think back over the family tree from which Chuck seems to have sprouted, I detect a steady march toward the innocuous: consider the progression from Twin Peaks to The X-Files to Buffy the Vampire Slayer to Lost and its current offspring. Dark stuff, yes, but delivered in ever more buoyantly funny and sexy packages. (Only Battlestar Galactica seems determined to honor its essentially traumatic and decentering premise.)

But maybe I’m putting Chuck in the wrong box. The show is a cross between Alias and Ed, with a little sprinkle of Moonlighting and a generous glaze of rapid, self-conscious dialogue in the style of Gilmore Girls and Grey’s Anatomy. The main character works at a big electronics chain (think The 40-Year-Old Virgin, though here the brand milieu has not even been disguised to the degree it was in that movie — Chuck is a member of the “Nerd Herd,” a smug shoutout to the corporately-manufactured tech cred of Best Buy’s Geek Squad). The conceit on which the show gambles the audience’s disbelief is that Chuck has acquired a vast trove of classified “intel” through exposure to an image-filled email (think the cursed videotape of The Ring crossed with the brainwashing montage of The Parallax View) and now finds himself at the intersecting foci of several large, conspiratorial, deadly government organizations. Not a setup that necessarily oozes laughs and romance. But laughs and romance is what, in this case, we get. Like Levi’s unassuming good looks, Chuck‘s puppydog appeal seems destined to win over audiences — though handicapping shows this early is a fool’s game.

Monday’s pilot episode efficiently established that Chuck will be aided in his navigation of the dangers ahead not just by a beautiful female secret agent (Yvonne Strahovski), but by his nebbishy sidekick at “Buy More.” In case we were skeptical of Chuck’s pedigree as a leading man, that is, the scenario helpfully supplies a socially-stunted homunculus in the form of his buddy Morgan Grimes (Joshua Gomez), shifting the burden of comic relief from Chuck and thereby moving him a crucial rung up the status ladder. While Chuck responds to his sudden acquisition of epistemological ordnance with deer-in-the-headlights cluelessness, Morgan takes to the larger game right away, advising Chuck on sexual tactics and the correct way to defeat a ninja with a sureness of touch derived equally, one presumes, from excessive masturbation and too many hours playing Splinter Cell and Metal Gear Solid.

What impresses me is that Chuck not only gives us a scenario geared toward geek fetishes, but embeds within itself a passel of geeks as decoding agents and centers of action. In this regard, it’s something like a virtual-reality program running on Star Trek‘s holodeck: a choose-your-own-adventure game whose ideological lure consists of nakedly mirroring its player/viewer in the form of a central character who is not, at least at first glance, a fantasy ideal, a performative mask, a second skin. It impresses me, but also alarms me. It is a storytelling tactic that all too serendipitously echoes the larger strategies of that most expert and ephemeral of modern commodities, the TV serial, which seeks to win from us one simple thing: our ongoing commitment. Chuck acknowledges that its audience is made up of prefab fans, eager for new affiliations, no matter how machine-lathed and focus-group-tested those engines of imaginary engagement may be. (In fact, it congratulates the audience for “getting” its nested inside jokes, chief among which is the fact of their own commodification; the ideal viewer of Chuck is precisely the illegal torrenter of media that NBC hopes to convert with its downloadable content.) Plotwise, with what I’m confident will be a spiraling shell-game of reveals and cliffhangers building toward some season-ending epiphany, Chuck will surely feed the jouissance of genre-based prediction and diegetic decryption that Tim Burke has labelled “nerd hermeneutics.”

I may or may not watch Chuck, depending on my taste for my own taste for cotton candy. By comparison, Heroes, which opened its second season on Monday night, was reassuringly byzantine and willing to frustrate. Perhaps that’s the value of new TV shows, which seem so offputtingly transparent in the way they play on our pleasures — revealing to us how easily we are taken in (and taken aboard), again and again. New shows come along and retroactively establish the authenticity of their predecessors, just as Chuck, measured against next year’s new offerings, will surely ripen from empty copy to cherished original.

Harry Potter 3D

I very much planned today’s expedition to see Harry Potter and the Order of the Phoenix as an experiential field trip. I’m a fan of the books, have been speeding my way through #7 (Deathly Hallows), and would have seen the film of #5 in any case. But it happened to be playing near me on an Imax screen in 3D. The last movie I saw at an Imax theater was The Matrix Reloaded (2003), and though I noted last year’s release of Superman Returns in Imax 3D, the general awfulness of the 2D, economy-sized version of Bryan Singer’s weird waxwork tribute was enough to keep me from going. But Order of the Phoenix is getting strong reviews, and I wanted to sample this latest in high-tech visual splendor.

Order of the Phoenix poster

I already knew that, as with Superman Returns, only a portion of Phoenix would be 3D. What surprised me was how explicitly this was made clear to spectators, both as a matter of publicity and in ad-hoc fashion. Warnings were taped on the ticket window: ONLY THE LAST 20 MINUTES OF HARRY POTTER ARE IN 3D. The man who tore my ticket told me the same thing, in a rote voice, as he handed me the yellow plastic glasses. As the lights went down, a recorded announcement reiterated the point a third time, except in a tone of awe and promise: “When you see the flashing icon at the bottom of the screen, put on your glasses, and prepare to enter the spectacular world of Harry Potter in an amazing action climax” was the gist of it.

All this tutoring, not just in the timing of the glasses, but the proper level of anticipation! Calibrating the audience’s reactions, indeed their perceptions, stoking the excitement while warning us not to get too excited. It went hand-in-hand with the promos for the Imax format itself, playing before the film and describing the awesome fidelity and sensory intensification we were about to experience. It seemed odd that we needed such schooling; aren’t 3D and giant-screen technologies about removing layers of mediation?

But of course that’s naïve; the most basic theory of cinematic spectacle reminds us that special effects (and Imax 3D, like sound, color, widescreen, and other threshold bumps, is a kind of meta-special-effect, an envelope or delivery system for smaller, more textually specific clusters of effects) function both as enhancements of illusion’s power and as a reminder of the technology involved in bringing the illusion to us. At the movies, we’re perfectly capable of believing in what we see while also believing in (and celebrating) its constructed nature; this is as true of special effects as it is of the editing that strings together a story, or our perception of Albus Dumbledore as being simultaneously the headmaster of Hogwarts and a performance (of subtle strength, in this case) by Michael Gambon.

And we do need the calibration of expectation, the technological tutoring, that frames the Imax 3D experience, as demonstrated by the woman buying tickets in front of me who asked, “What’s Knocked Up? Is that 3D? Do you have a list of which movies are 3D and which aren’t?” I’m not mocking her: in our imaginations, all movies are 3D, in the sense of possessing a life beyond the play of light and shadow, in the sense that all media “realities” are to some extent already “virtual.” The interesting thing right now is that the technological assist provided by 3D is so prohibitively expensive and labor-intensive that only short sequences – those corresponding, as in Order of the Phoenix, to climactic stretches of action – enjoy it. For the moment, 3D is a sharply bordered land within the larger imaginary domain of movies. Like any border crossing, there must be signposts to let us know when we are entering and leaving the territory.

What of Phoenix’s 3D itself? I sure got a kick out of it – those final 20 minutes are fantastic in dramatic terms, and as critics have pointed out, they’re designed perfectly for spatial amplification. I had tears in my eyes by the end of it, but partly this was due to the strange nostalgia the 3D experience brought out – nostalgia for all the promises movies make, whether in 2006, 1956, or 1906: each era offering its technological tricks so eagerly, convinced of its charms but taking time to warn us not to get too excited, lest we be disappointed and never return.