The zen of model kits

I speak often and with great satisfaction of my Man Cave, our house’s finished basement where my nerdish technophilia is allowed free rein. My PC tower and its domino-line backdrop of external hard drives; my big, flat TV atop its nest of audio components and cables; a small museum of video-game consoles; and the nonelectronic pleasures of my John D. MacDonald paperbacks (inherited from my father, who, freshly arrived from Czechoslovakia in the 1950s, used detective and spy fiction to hone his English-language skills), white cardboard longboxes of unexamined comics which with every passing year come more to resemble stacked sarcophagi, a dusty Millennium Falcon playset packed with Star Wars action figures in various stages of dismemberment (the latter a gift from my brother-in-law).

As this inventory suggests, the contents of the Man Cave embody not just arrested development but a certain ongoing regression: a march in reverse through the stages and artifacts of the enthusiasms that made me what I am today. For that reason, it’s fitting that I have opened a new wing whose title might be “Boy Cave”: a model-kit-building station in a side workroom where the heating-oil tank and cat-litter box vie with paint thinner and acrylic glue for the prize of most fascinatingly noxious scent.

Currently on the workbench is Polar Lights’s Robby the Robot, a kit I’ve been dabbling with for more than a year, but which a few nights ago I decided to buckle down and finish. (Pictured above, the 1/12-scale figure is still missing an ornamental arrangement of gyroscopes on top of its head, and over that a clear dome that seals its brain circuitry in place.) Model kits based on science fiction and fantasy have become a central preoccupation in my scholarship, and I guess in some ways I have returned to kit-building in order to (re)gain firsthand experience of this strange subculture of artifactual play and constructivist leisure — its material investments as well as its surrounding discursive community (see, for example, the reviews and build-guides here, here, and here).

But I’m also realizing a simple and zenlike truth, which is that to build a kit you must build it; it won’t finish itself. And the difference between dreaming and doing, which has so often constituted an agonizing contrapunto to my publishing life, is like the difference between the unassembled plastic parts still on their sprue and the built, painted, finalized thing: a matter of making. If I can fit the pieces of Robby together in stray minutes (and it turns out that the rhythms of model-kit assembly fit nicely into the scattered but semipredictable intervals of parenting), what else might I accomplish, simply by opting to complete — rather than just contemplate — the process?

Don’t Be Afraid of the Dark

It’s hard to pinpoint the primal potency of the original Don’t Be Afraid of the Dark, the 1973 telefilm about a woman stalked by hideous little troll monsters in the shadowy old house where she lives with her husband. The story itself, a wisp of a thing, has the unexplained purity of a nightmare piped directly from a fevered mind: both circuitously mazelike and stiflingly linear, it’s like watching someone drown in a room slowly filling with water. As with contemporaries Rosemary’s Baby and The Stepford Wives, it’s a parable of domestic disempowerment built around a woman whose isolation and vulnerability grow in nastily direct proportion to her suspicion that she is being hunted by dark forces. All three movies conclude in acts of spiritual (if not quite physical) devouring and rebirth: housewives eaten by houses. To the boy I was then, Don’t Be Afraid of the Dark provoked a delicious, vertiginous sliding of identification, repulsion, and desire: the doomed protagonist, played by Kim Darby, merged the cute girl from the Star Trek episode “Miri” with the figure of my own mother, whose return to full-time work as a public-school librarian, I see now, had triggered tectonic shifts in my parents’ relationship and the continents of authority and affection on which I lived out my childhood. These half-repressed terrors came together in the beautiful, grotesque design of the telefilm’s creatures: prunelike, whispery-voiced gnomes creeping behind walls and reaching from cupboards to slice with razors and switch off the lights that are their only weakness.

The 2011 remake, which has nothing of the original’s power, is nevertheless valuable as a lesson in the danger of upgrading, expanding, complicating, and detailing a text whose low-budget crudeness in fact constitutes televisual poetry. Produced by Guillermo del Toro, the movie reminds me of the dreadful watering-down that Steven Spielberg experienced when he shifted from directing to producing in the 1980s, draining the life from his own brand (and is there not a symmetry between this industrial outsourcing of artistry and the narrative’s concern with soul-sucking?). The story has been tampered with disastrously, introducing a little girl to whom the monsters (now framed as vaguely simian “tooth fairies”) are drawn; the wife, played by a bloodless Katie Holmes, still succumbs in the end to the house’s demonic sprites, but the addition of a maternal function forces us to read her demise as noble sacrifice rather than nihilistic defeat, and when husband (Guy Pearce) walks off with daughter in the closing beat, it comes unforgivably close to a happy ending. As for the monsters, now more infestation than insidiousness, they skitter and leap in weightless CGI balletics, demonstrating that, as with zombies, faster does not equal more frightening. But for all its evacuation of purpose and punch, the remake is useful in locating a certain indigestible blockage in Hollywood’s autocannibalistic churn, enshrining and immortalizing — through its very failure to reproduce it — the accidental artwork of the grainy, blunt, wholly sublime original.

Boy story

Nostalgia time: I’ve spent the last few days in my home town of Ann Arbor, where the streets of my old neighborhood and the spaces of my parents’ house have about them a strangely denuded look — less the cratered remains of a bombed-out city than the blankly spartan truth of a theater stage once the sets have been struck and the house lights turned on. My visits here as an adult are riddled with little eruptions of personal history, the hot magma of memory oozing orangely through cracks in the sidewalks.

This morning I was driving my parents to breakfast, and the topic came up of a boy who used to live across the street from us. Ricky Clark (a pseudonym) was a little older than me, and in the mid- to late-seventies we were friends. Not a close, confide-in-each-other friendship, but a friendship based around our mutual appreciation of comic books and horror movies; Ricky had a ton of the former, arranged in neat stacks in his cool basement bedroom, and we stayed up late to watch the latter on late-night creature features, also in his basement.

Mostly, our friendship was a kind of partnership and collaboration in building cool things and pulling off stunts. We made Super 8 movies together, glued together model kits, launched model rockets and chased down their windblown nose cones adangle from red-and-white-checked plastic parachutes. We camped out in a tent in Ricky’s back yard to watch a lunar eclipse (his mom brought hamburgers out to us at the unprecedentedly late hour of 10 p.m.) and on August nights stayed up to watch the Perseid meteor shower.

I think our parents appreciated and approved of our friendship, because each of us supplied something that was missing in the other. I was a chubby, loquacious nerd who would rather stay inside reading than play outside. I was the intellectual, neurotically charming counterpart to Ricky, a compact blond kid with a toughness about him that had nothing to do with beating other kids up and everything to do with surviving dirtbike wipeouts and falls from his own roof.

Our most elaborate joint ventures invariably centered on risk and danger. Ricky built a go-cart powered by a lawnmower engine, and the perpetual smell of gasoline in his garage bay was a giddy miasma of peril and possibility. We raced the cart down the longest, steepest street in our neighborhood and filmed it using the slow-motion button on my dad’s movie camera. We launched bottle rockets from our own hands, our pink and unprotected fingers clutching the wooden stabilizing rod until a hissing shock of sparks carried the rocket away on its whistling trajectory, ending with a bang. We doused model kits in gasoline and ignited them on camera, squirting gas from a spray bottle to lift the flames into clouds of glowing glitter. We poured substances from one test tube to another, mapping the phase space of the chemistry set for colorful, smelly, or pyrotechnic reactions. Once we applied horror-movie makeup and tried to scare our mothers by pretending we’d been in gruesome, face-shredding bike accidents.

Last summer, on another of my visits, Mom called me outside to meet someone, a trim middle-aged man with a friendly smile and a brisk handshake. Irritated, I had no idea who he was. But of course it was Ricky Clark (now “Rick”), grown up like me, our experimental past buried under thirty-odd years of time.

Looking back on our friendship, I see that Ricky and I comprised two polarities of boy culture: the rough-and-tumble daredevil and the creative daydreamer. He built gadgets in his garage while I sketched in my notebook, and when, occasionally, our goals aligned, the results were vital and naive, stupid and clever at the same time. I’m glad we knew each other.

Super 8: The Past Through Tomorrow

Ordinarily I’d start this with a spoiler warning, but under our current state of summer siege — one blockbuster after another, each week a mega-event (or three), movies of enormous proportion piling up at the box office like the train-car derailment that is Super 8’s justly lauded setpiece of spectacle — the half-life of secrecy decays quickly. If you haven’t watched Super 8 and wish to experience its neato if limited surprises as purely as possible, read no further.

This seems an especially important point to make in relation to J. J. Abrams, who has demonstrated himself a master of, if not precisely authorship as predigital literary theory would recognize it, then a kind of transmedia choreography, coaxing a miscellany of texts, images, influences, and brands into pleasing commercial alignment with sufficient regularity to earn himself a personal brand as auteur. As I noted a few years back in a pair of posts before and after seeing Cloverfield, the truism expounded by Thomas Elsaesser, Jonathan Gray, and others — that in an era of continuous marketing and rambunctiously recirculated information, we see most blockbusters before we see them — has evolved under Abrams into an artful game of hide and seek, building anticipation by highlighting in advance what we don’t know, first kindling then selling back to us our own sense of lack. More important than the movies and TV shows he creates are the blackouts and eclipses he engineers around them, dark-matter veins of that dwindling popular-culture resource: genuine excitement for a chance to encounter the truly unknown. The deeper paradox of Abrams’s craft registered on me the other night when, in an interview with Charlie Rose, he explained his insistence on secrecy while his films are in production not in terms of savvy marketing but as a peace offering to a data-drowned audience, a merciful respite from the Age of Wikipedia and TMZ. No less clever for their apparent lack of guile, Abrams’s feats of paratextual prestidigitation mingle the pleasures of nostalgia with paranoia for the present, allying his sunny simulations of a pop past with the bilious mutterings of information-overload critics. (I refuse to use Bing until they change their “too much information makes you stupid” campaign, whose head-in-the-sand logic seems so like that of Creationism.)

The other caveat to get out of the way is that Abrams and his work have proved uniquely challenging for me. I’ve never watched Felicity or Alias apart from the bits and pieces that circulated around them, but I was a fan of LOST (at least through the start of season four), and enjoyed Mission: Impossible III — in particular one extended showoff shot revolving around Tom Cruise as his visage is rebuilt into that of Philip Seymour Hoffman, of which no better metaphor for Cruise’s lifelong pursuit of acting cred can be conceived. But when Star Trek came out in 2009, it sort of short-circuited my critical faculties. (It was around that time I began taking long breaks from this blog.) At once a perfectly made pop artifact and a wholesale desecration of my childhood, Abrams’s Trek did uninvited surgery upon my soul, an amputation no less traumatic for being so lovingly performed. My refusal to countenance Abrams’s deft reboot of Gene Roddenberry’s vision is surely related to my inability to grasp my own death — an intimation of mortality, yes, but also of generationality, the stabbing realization that something which for so many years defined me as stable subject and member of a collective, my herd identity, had been reassigned to the cohort behind me: a cohort whose arrival, by calling into existence “young” as a group to which I no longer belonged, made me — in a word — old. Just as if I had found Roddenberry in bed with another lover, I must encounter the post-Trek Abrams from within the defended lands of the ego, a continent whose troubled topography was sculpted not by physical law but by tectonics of desire, drive, and discourse, and whose Lewis and Clark were Freud and Lacan. (Why I’m so touchy about border disputes is better left for therapy than blogging.)

Given my inability to see Abrams’s work through anything other than a Bob-shaped lens, I thought I would find Super 8 unbearable, since, like Abrams, I was born in 1966 (our birthdays are only five days apart!), and, like Abrams, I spent much of my adolescence making monster movies and worshipping Steven Spielberg. So much of my kid-life is mirrored in Super 8, in fact, that at times it was hard to distinguish it from my family’s 8mm home movies, which I recently digitized and turned into DVDs. That gaudy Kodachrome imagery, adance with grain, is like peering through the wrong end of a telescope into a postage-stamp Mad Men universe where it is still 1962, 1965, 1967: my mother and father younger than I am today, my brothers and sisters a blond gaggle of grade-schoolers, me a cheerful, big-headed blob (who evidently loved two things above all else: food and attention) showing up in the final reels to toddle hesitantly around the back yard.

Fast-forward to the end of the 70s, and I could still be found in our back yard (as well as our basement, our garage, and the weedy field behind the houses across the street), making movies with friends at age twelve or thirteen. A fifty-foot cartridge of film was a block of black plastic that held about three minutes of reality-capturing substrate. As with the Lumière cinématographe, the running time imposed formal restraints on the stories one could tell; unless or until you made the Griffithian breakthrough to continuity editing, scenarios were envisioned and executed based on what was achievable in-camera. (In amateur cinema as in embryology, ontogeny recapitulates phylogeny.) For us, this meant movies built around the most straightforward of special effects — spaceship models hung from thread against the sky or wobbled past a painted starfield, animated cotillions of Oreo cookies that stacked and unstacked themselves, alien invaders made from friends wrapped in winter parkas with charcoal shadows under the eyes and (for some reason) blood dripping from their mouths — and titles that seemed original at the time but now announce how occupied our processors were with the source code of TV and movie screens: Star Cops, Attack of the Killer Leaves, No Time to Die (a spy thriller containing a stunt in which my daredevil buddy did a somersault off his own roof).

Reconstructing even that much history reveals nothing so much as how trivial it all was, this scaling-down of genre tropes and camera tricks. And if cultural studies says never to equate the trivial with the insignificant or the unremarkable, a similar rule could be said to guide production of contemporary summer blockbusters, which mine the detritus of childhood for minutiae to magnify into $150 million franchises. Compared to the comic-book superheroes who have been strutting their Nietzschean catwalk through the multiplex this season (Thor, Green Lantern, Captain America et-uber-al), Super 8 mounts its considerable charm offensive simply by embracing the simple, giving screen time to the small.

I’m referring here less to the protagonists than to the media technologies around which their lives center, humble instruments of recording and playback which, as A. O. Scott points out, contrast oddly with the high-tech filmmaking of Super 8 itself as well as with the more general experience of media in 2011. I’m not sure what ideology informs this particular retelling of modern Genesis, in which the Sony Walkman begat the iPod and the VCR begat the DVD; neither Super 8’s screenplay nor its editing develops the idea into anything like a commentary, ironic or otherwise, leaving us with only the echo to mull over in search of meaning.

A lot of the movie is like that: traces and glimmers that rely on us to fill in the gaps. Its backdrop of dead mothers, emotionally-checked-out fathers, and Area 51 conspiracies is as economical in its gestures as the overdetermined iconography of its Drew Struzan poster (below), an array of signifiers like a subway map to my generation’s collective unconscious. Poster and film alike are composed and lit to summon a linkage of memories some thirty years long, all of which arrive at their noisy destination — there’s that train derailment again — in Super 8.

I don’t mind the sketchiness of Super 8’s plot any more than I mind its appropriation of 1970s cinematography, which trades the endlessly trembling camerawork of Cloverfield and Star Trek for the multiplane composition, shallow focus, and cozily cluttered frames of Close Encounters of the Third Kind. (Abrams’s film is more intent on remediating CE3K’s rec-room mise-en-scene than its Douglas Trumbull lightshows.) To accuse Super 8 of vampirizing the past is about as productive as dropping by the castle of that weird count from Transylvania after dark: if the full moon and howling wolves haven’t clued you in, you deserve whatever happens, and to bring anything other than a Spielberg-trained sensibility to a screening of Super 8 is like complaining when Pop Rocks make your mouth feel funny.

What’s exceptional, in fact, about Super 8 is the way it intertextualizes two layers of history at once: Spielberg films and the experience of watching Spielberg films. It’s not quite Las Meninas, but it does get you thinking about the inextricable codependence between a text and its reader, or in the case of Spielberg and Abrams, a mainstream “classic” and the reverential audience that is its condition of possibility. With its ingenious hook of embedding kid moviemakers in a movie their own creative efforts would have been inspired by (and ripped off of), Super 8 threatens to transcend itself, simply through the squaring and cubing of realities: meta for the masses.

Threatens, but never quite delivers. I agree with Scott’s assessment that the film’s second half fails to measure up to its first — “The machinery of genre … so ingeniously kept to a low background hum for so long, comes roaring to life, and the movie enacts its own loss of innocence” — and blame this largely on the alien, a digital MacGuffin all too similar to Cloverfield’s monster and, now that I think about it, the thing that tried to eat Kirk on the ice planet. Would that Super 8’s filmmakers had had the chutzpah to build their ode to creature features around a true evocation of 70s and 80s special effects, recreating the animatronics of Carlo Rambaldi or Stan Winston, the goo and latex of Rick Baker or Rob Bottin, the luminous optical-printing of Trumbull or Robert Abel, even the flickery stop motion of Jim Danforth and early James Cameron! It might have granted Super 8’s Blue-Fairy wish with the transmutation the film seems so desperately to desire — that of becoming a Spielberg joint from the early 1980s (or at least time-traveling to climb into its then-youthful body like Sam Beckett or Marty McFly).

Had Super 8’s closely guarded secret turned out to be our real analog past instead of a CGI cosmetification of it, the movie would be profound where it is merely pretty. Super 8 opens with a wake; sour old man that I am, I wish it had had the guts to actually disinter the corpse of a dead cinema, instead of just reminiscing pleasantly beside the grave.



Reminded by the media that today marks the 25th anniversary of the Challenger disaster, I think not of that event directly, but of how even in the moment of its occurrence in 1986, I registered the horror only distantly, as a background image on TV that followed me throughout the rest of that cold January day. It was not simply my first media event — that useful term coined by Daniel Dayan and Elihu Katz to describe those moments, planned and unplanned, when the screens of the country or the world fixate on a singular happening of shared cultural, historical, or political significance — but the first time I recognized the importance of something alongside my failure to connect with it the way I suspected I should.

I was 19 years old at the time, struggling to find my place at Eastern Michigan University, where I was fitfully taking classes in the Theater Department. (The focusing discipline of English, with a “penance minor” of History, was a year or two in my future.) Shocking events on the national stage had penetrated my consciousness before — I remember whispers leaping like sparks among the nested semicircles of our orchestra chairs the day John Hinckley tried to kill Ronald Reagan in 1981, and when John Lennon died the year before, I stayed up late with the TV, waiting for details to emerge. I even remember, vaguely, being shushed by my family the night in 1978 that news of the Jonestown massacre broke. But whatever organs of emotion were developing inside me had not yet matured enough to apprehend the vastness of collective trauma, to sound along the single string of my soul a thrum of sympathetic resonance with some larger chorus of lament. That would not come for years, until after the profound illness of a close friend and the death of my older brother had broken down and rebuilt my heart in more human form.

But other factors were behind the cold inertia of my feelings the day we lost Challenger. I had drifted from a childhood infatuation with the space program, one fueled by family visits to the National Air and Space Museum and long hours spent poring over books written by and about NASA astronauts. Though I found the Mercury flights too early and primitive to hold my interest, the cool capsule design and paired teamwork of the Gemini program (like going into space with your best friend!) fired my imagination, and the Apollo moon landings were like repeated assaults on Mount Olympus, bold conquests by super-dads wrapped in science-fiction armor. Skylab, under whose orbits I aged from 7 to 12, was like a funky rec room in the sky, a padded cylinder where playful scientists in zero-G did somersaults and wobbled water globes for the cameras, which relayed their bearded grins to us.

By contrast with the alternately goofy and glorious NASA missions of the 60s and early 70s, the Space Shuttle program seemed a retreat into something more pedestrian and timid. The orbiter with its aerodynamic surfaces struck me as being too much like an airplane, a standard streamlined creature of the atmosphere instead of a spiky, boxy, bedished emissary to the stars. Those external fuel tanks turned the coolest part of the show, the rockets, into mere vestigial workhorses, to be dropped away like a shameful secret, disavowed by the pristine delta wing as it did its boring ballet turns — never going anywhere, just cautiously circling the earth, expecting applause each time it landed, though the goal seemed to be to make launches and returns as common and unexceptional as elevator rides. The whole concept, in short, was a triumph of the disposable and interchangeable over the unique and dramatic.

So I stopped paying attention to the shuttle missions, until January 28, 1986, when the predictable, in the space of a few seconds, mutated into the unique and dramatic.

That strangely horned explosion, a forking fireball marking the moment at which a precisely calibrated flightpath dissolved into the chaotic trajectories of system failure, was at first glance reminiscent of the impressive explosions Hollywood had rigged for my awed pleasure — the sabotage of the Death Star, the electrical rapture of the opened Ark of the Covenant, the Nostromo’s timed destruction — and it took much repetition and analysis for me to begin to grasp the semiotics of this particular combustion. Challenger, for me and maybe for a lot of people, was the beginning of an education in explosions, and over the decades that followed, I thought back to its incendiary lessons: Waco in 1993, Oklahoma City in 1995. The USS Cole in 2000, Columbia over Texas in 2003. And that master class in the forensics of fiery disaster, September 11, 2001.

I don’t think I’m building to any profound point here; indeed, writing about such moments makes me unhappily aware of how provincial and shallow my thinking about spectacle can be. Scenes of death, wrapped in the double abstractions of physical laws and media presentation, are still hard for me to feel, though — scored into my optics — they are never hard to remember.

Tron: Legacy

This review is dedicated to my friends David Surman and Will Brooker.

Part One: We Have Never Been Digital


If Avatar was in fact the “gamechanger” its proselytizers claimed, then it’s fitting that the first film to surpass it is itself about games, gamers, and gaming. Arriving in theaters nearly a year to the day after Cameron’s florid epic, Tron: Legacy delivers on the promise of an expanded blockbuster cinema while paradoxically returning it to its origins.

Those origins, of course, date back to 1982, when the first Tron — brainchild of Steven Lisberger, who more and more appears to be the Harper Lee of pop SF, responsible for a single inspired act of creation whose continued cultural resonance probably doomed any hope of a career — showed us what CGI was really about. I refer not to the actual computer-generated content in that film, whose 96-minute running time contains only 15-20 minutes of CG animation (the majority of the footage was achieved through live-action plates shot in high contrast, heavily rotoscoped, and backlit to insert glowing circuit paths into the environment and costumes), but instead to the discursive aura of the digital frontier it emits: another sexy, if equally illusory, glow. Tron was the first narrative feature film to serve up “the digital” as a governing design aesthetic as well as a marketing gimmick. Audiences, sold a high-tech entertainment event, accepted Lisberger’s folly as precisely that: a time capsule from the future, coming attraction as main event. Tron taught us, in short, to mistake a hodgepodge of experiment and tradition for a more sweeping change in cinematic ontology, a spell we remain under to this day.

But the state of the art has always been a makeshift pact between industry and audience, a happy trance of “I know, but even so …” For all that it hinges on a powerful impression of newness, the self-applied declaration of vanguard status is, ironically, old hat in filmmaking, especially when it comes to the periodic eruptions of epic spectacle that punctuate cinema’s more-of-the-same equilibrium. The mutations of style and technology that mark film’s evolutionary leaps are impossible to miss, given how insistently they are promoted: go to YouTube and look at any given Cecil B. DeMille trailer if you don’t believe me. “Like nothing you’ve ever seen!” may be an irresistible hook (at least to advertisers), but it’s rarely true, if only because trailers, commercials, and other advance paratexts ensure we’ve looked at, or at least heard about, the breakthrough long before we purchase our tickets.

In the case of past breakthroughs, the situation becomes even more vexed. What do you do with a film like Tron, which certainly was cutting-edge at the time of its release, but which, over the intervening twenty-eight years, has taken on an altogether different veneer? I was 16 when I first saw it, and have frequently shown its most famous setpiece — the lightcycle chase — in courses I teach on animation and videogames. As a teenager, I found the film dreadfully inert and obvious, and rewatching it to prepare for Tron: Legacy, I braced myself for a similarly graceless experience. What I found instead was that a magical transformation had occurred. Sure, the storytelling was as clumsy as before, with exposition that somehow managed to be both overwritten and underexplained, and performances that were probably half-decent before an editor diced them into novocained amateurism. The visuals, however, had aged into something rather beautiful.

Not the CG scenes — I’d looked at those often enough to stay in touch with their primitive retrogame charm. I’m referring to the live-action scenes, or rather, the suturing of live action and animation that stands in for computer space whenever the camera moves close enough to resolve human features. In these shots, the faces of Flynn (Jeff Bridges), Tron (Bruce Boxleitner), Sark (David Warner), and the film’s other digital denizens are ovals of flickering black-and-white grain, their moving lips and darting eyes hauntingly human amid the neon cartoonage.

Peering through their windows of backlit animation, Tron’s closeups resemble those in Dreyer’s Passion of Joan of Arc — inspiration for early film theorist Béla Balázs’s lyrical musings on “The Face of Man” — but are closer in spirit to the winking magicians of George Méliès’s trick films, embedded in their phantasmagoria of painted backdrops, double exposures, and superimpositions. Like Lisberger, who would intercut shots of human-scaled action with tanks, lightcycles, and staple-shaped “Recognizers,” Méliès alternated his stagebound performers with vistas of pure artifice, such as animated artwork of trains leaving their tracks to shoot into space. Although Tom Gunning argues convincingly that the early cinema of attractions operated by a distinctive logic in which audiences sought not the closed verisimilar storyworlds of classical Hollywood but the heightened, knowing presentation of magical illusions, narrative frameworks are the sauce that makes the taste of spectacle come alive. Our most successful special effects have always been the ones that — in an act of bistable perception — do double duty as story.

In 1982, the buzzed-about newcomer in our fantasy neighborhoods was CGI, and at least one film that year — Star Trek II: The Wrath of Khan — featured a couple of minutes of computer animation that worked precisely because they were set off from the rest of the movie, as a special documentary interlude. Other genre entries in that banner year for SF, like John Carpenter’s remake of The Thing and Steven Spielberg’s one-two punch of E.T. and Poltergeist (the latter as producer and crypto-director), were content to push the limits of traditional effects methods: matte paintings, creature animatronics, gross-out makeup, even a touch of stop-motion animation. Blade Runner’s effects were so masterfully smoggy that we didn’t know what to make of them — or of the movie, for that matter — but we seemed to agree that they too were old school, no matter how many microprocessors may have played their own crypto-role in the production.

“Old school,” however, is another deceptively relative term, and back then we still thought of special effects as dividing neatly into categories of the practical/profilmic (which really took place in front of the camera) and the optical/postproduction (which were inserted later through various forms of manipulation). That all special effects — and all cinematic “truths” — are at heart manipulation was largely ignored; even further from consciousness was the notion that soon we would redefine every “predigital” effect, optical or otherwise, as possessing an indexical authenticity that digital effects, well, don’t. (When, in 1997, George Lucas replaced some of the special-effects shots in his original Star Wars trilogy with CG do-overs, the outrage of many fans suggested that even the “fakest” products of ’70s-era filmmaking had become, like the Velveteen Rabbit, cherished realities over time.)

Tron was our first real inkling that a “new school” was around the corner — a school whose presence and implications became more visible with every much-publicized advance in digital imaging. Ron Cobb’s pristine spaceships in The Last Starfighter (1984); the stained-glass knight in Young Sherlock Holmes (1985); the watery pseudopod in The Abyss (1989); each in its own way raised the bar, until one day — somewhere around the time of Independence Day (1996), according to Michele Pierson — it simply stopped mattering whether a given special effect was digital or analog. In the same way that slang catches on, everything overnight became “CGI.” That newcomer to the neighborhood, the one who had people peering nervously through their drapes at the moving truck, had moved in and changed the suburb completely. Special-effects cinema now operated under a technological form of the one-drop rule: all it took was a dab of CGI to turn the whole thing into a “digital effects movie.” (Certain film scholars regularly use this term to refer to both Titanic [1997] and The Matrix [1999], neither of which employs more than a handful of digitally assisted shots — many of these involving intricate handoffs from practical miniatures or composited live-action elements.)

Inscribed in each frame of Tron is the idea, if not the actual presence, of the digital; it was the first full-length rehearsal of a special-effects story we’ve been telling ourselves ever since. Viewed today, what stands out about the first film is what an antique and human artifact — an analog artifact — it truly is. The arrival of Tron: Legacy, simultaneously a sequel, update, and reimagining of the original, gives us a chance to engage again with that long-ago state of the art; to appreciate the treadmill evolution of blockbuster cinema, so devoted to change yet so fixed in its aims; and to experience a fresh and vastly more potent vision of what’s around the corner. The unique lure (and trap) of our sophisticated cinematic engines is that they never quite turn that corner, never do more than freeze for an instant, in the guise of its realization, a fantasy of film’s future. In this sense — to rephrase Bruno Latour — we have never been digital.

Part Two: 2,415 Times Smarter


In getting a hold on what Tron: Legacy (hereafter T:L) both is and isn’t, I find myself thinking about a line from its predecessor. Ed Dillinger (David Warner), figurative and literal avatar of the evil corporation Encom, sits in his office — all silver slabs and glass surfaces overlooking the city’s nighttime gridglow, in the cleverest and most sustained of the thematic conceits that run throughout both films: the paralleling, to the point of indistinguishability, of our “real” architectural spaces and the electronic world inside the computer. (Two years ahead of Neuromancer and a full decade before Snow Crash, Tron invented cyberspace.)

Typing on a desk-sized touchscreen keyboard that neatly predates the iPad, Dillinger confers with the Master Control Program or MCP, a growling monitorial application devoted to locking down misbehavior in the electronic world as it extends its own reach ever outward. (The notion of fascist algorithm, policing internal imperfection while growing like a malignancy, is remapped in T:L onto CLU — another once-humble program omnivorously metastasized.) MCP complains that its plans to infiltrate the Pentagon and General Motors will be endangered by the presence of a new and independent security watchdog program, Tron. “This is what I get for using humans,” grumbles MCP, which in terms of human psychology we might well rename OCD with a touch of NPD. “Now wait a minute,” Dillinger counters, “I wrote you.” MCP replies coldly, “I’ve gotten 2,415 times smarter since then.”

The notion that software — synecdoche for the larger bugaboo of technology “itself” — could become smarter on its own, exceeding human intelligence and transcending the petty imperatives of organic morality, is of course the battery that powers any number of science-fiction doomsday scenarios. Over the years, fictionalizations of the emergent cybernetic predator have evolved from single mainframe computers (Colossus: The Forbin Project [1970], WarGames [1983]) to networks and metal monsters (Skynet and its time-traveling assassins in the Terminator franchise) to graphic simulations that run on our own neural wetware, seducing us through our senses (the Matrix series [1999-2003]). The electronic world established in Tron mixes elements of all three stages, adding a dash of alternative storybook reality à la Oz, Neverland … or Disney World.

Out here in the real world, however, what runs beneath these visions of mechanical apocalypse is something closer to the Technological Singularity warned of by Ray Kurzweil and Vernor Vinge, as our movie-making machinery — in particular, the special-effects industry — approaches a point where its powers of simulation merge with its custom-designed, mass-produced dreams and nightmares. That is to say: our technologies of visualization may incubate the very futures we fear, so intimately tied to the futures we desire that it’s impossible to sort one from the other, much less to dictate which outcome we will eventually achieve.

In terms of its graphical sophistication as well as the extended forms of cultural and economic control that have come to constitute a well-engineered blockbuster, Tron: Legacy is at least 2,415 times “smarter” than its 1982 parent, and whatever else we may think of it — whatever interpretive tricks we use to reduce it to and contain it as “just a movie” — it should not escape our attention that the human/machine fusion, not to mention the runaway AI, at play in its narrative is a surface manifestation of much vaster and more far-reaching transformations: a deep structure of technological evolution whose implications only start with the idea that celluloid art has been taken over by digital spectacle.

The lightning rod for much of the anxiety over the replacement of one medium by another, the myth of film’s imminent extinction, is the synthespian or photorealistic virtual actor, which, following the logic of the preceding paragraphs, is one of Tron: Legacy’s chief selling points. Its star, Jeff Bridges, plays two roles — the first as Flynn, onetime hotshot hacker, and the second as CLU, his creation and nemesis in the electronic world. Doppelgangers originally, the two have since diverged: Flynn has aged while CLU remains unchanged, the spitting image of Flynn/Bridges circa 1982.

Except that this image doesn’t really “spit.” It stares, simmers, and smirks; occasionally shouts; knocks things off tables; and does some mean acrobatic stunts. But CLU’s fascinating weirdness is just as evident in stillness as in motion (see the top of this post), for it’s clearly not Jeff Bridges we’re looking at, but a creepy near-miss. Let’s pause for a moment on this question: why a miss at all? Why couldn’t the filmmakers have conjured up a closer approximation, erasing the line between actor and digital double? Nearly ten years after Final Fantasy: The Spirits Within, it seems that CGI should have come farther. After all, the makers of T:L weren’t bound by the aesthetic obstructions that Robert Zemeckis imposed on his recent films, a string of CG waxworks (The Polar Express [2004], Beowulf [2007], A Christmas Carol [2009], and soon — shudder — a Yellow Submarine remake) in which the inescapable wrongness of the motion-captured performances is evidently a conscious embrace of stylization rather than a failed attempt at organic verisimilitude. And if CLU were really intended to convince us, he could have been achieved through the traditional repertoire of doubling effects: split-frame mattes, body doubles in clever shot-reverse-shot arrangements, or the combination of these with motion-control cinematography, as in the masterful composites of Back to the Future 2, which, made in 1989, is only seven years younger than the first Tron.

The answer to the apparent conundrum is this: CLU is supposed to look that way; we are supposed to notice the difference, because the effect wouldn’t be special if we didn’t. The thesis of Dan North’s excellent book Performing Illusions is that no special effect is ever perfect — we can always spot the joins, and the excitement of effects lies in their ceaseless toying with our faculties of suspicion and detection, the interpretation of high-tech dreams. Updating the argument for synthespian performances like CLU’s, we might profitably dispose of the notion that the Uncanny Valley is something to be crossed. Instead, smart special effects set up residence smack-dab in the middle.

Consider by analogy the use of Botox. Is the point of such cosmetic procedures to absolutely disguise the signs of age? Or are they meant to remain forever fractionally detectable as multivalent signifiers — of privilege and wealth, of confident consumption, of caring enough about flaws in appearance to (pretend to) hide them? Here too is evidence of Tron: Legacy’s amplified intelligence, or at least its subtle cleverness: dangling before us a CLU that doesn’t quite pass the visual Turing Test, it simultaneously sells us the diegetically crucial idea of a computer program in the shape of a human (which, in fact, it is) and in its apparent failure lulls us into overconfident susceptibility to the film’s larger tapestry of tricks. 2,415 times smarter indeed!

Part Three: The Sea of Simulation


Doubles, of course, have always abounded in the works that constitute the Tron franchise. In the first film, both protagonist (Flynn/Tron) and antagonist (Sark/MCP) exist as pairs, and are duplicated yet again in the diegetic dualism of real world/electronic world. (Interestingly, only MCP seems to lack a human manifestation — though it could be argued that Encom itself fulfills that function, since corporations are legally recognized as people.) And the hall of mirrors keeps on going. Along the axis of time, Tron and Tron: Legacy are like reflections of each other in their structural symmetry. Along the axis of media, Jeff Bridges dominates the winter movie season with performances in both T:L and True Grit, a kind of intertextual cloning. (The Dude doesn’t just abide — he multiplies.)

Amid this rapture of echoes, what matters originality? The critical disdain for Tron: Legacy seems to hinge on three accusations: its incoherent storytelling; its dependence on special effects; and the fact that it’s largely a retread of Tron ’82. I’ll deal with the first two claims below, but on the third count, T:L must surely plead “not guilty by reason of nostalgia.” The Tron ur-text is a tale about entering a world that exists alongside and within our own — indeed, that subtends and structures our reality. Less a narrative of exploration than of introspection, its metaphysics spiral inward to feed off themselves. Given these ouroboros-like dynamics, the sequel inevitably repeats the pattern laid down in the first, carrying viewers back to another embedded experience — that of encountering the first Tron — and inviting us to contrast the two, just as we enjoy comparing Flynn and CLU.

But what about those who, for reasons of age or taste, never saw the first Tron? Certainly Disney made no effort to share the original with us; their decision not to put out a Blu-ray version, or even rerelease the handsome two-disc 20th anniversary DVD, has led to conspiratorial muttering in the blogosphere about the studio’s cover-up of an outdated original, whose visual effects now read as ridiculously primitive. Perhaps this is so. But then again, Disney has fine-tuned the business of selectively withholding their archive, creating rarity and hence demand for even their flimsiest products. It wouldn’t at all surprise me if the strategy of “disappearing” Tron pre-Tron: Legacy were in fact an inspired marketing move, one aimed less at monetary profit than at building discursive capital. What, after all, do fans, cineastes, academics, and other guardians of taste enjoy more than a privileged “I’ve seen it and you haven’t” relationship to a treasured text? Comic-Con has become the modern agora, where the value of geek entertainment items is set for the masses, and carefully coordinated buzz transmutes subcultural fetish into pop-culture hit.

It’s maddeningly circular, I know, to insist that it takes an appreciation of Tron to appreciate Tron: Legacy. But maybe the apparent tautology resolves if we substitute terms of evaluation that don’t have to do with blockbuster cinema. Does it take appreciation of Ozu (or Tarkovsky or Haneke or [insert name here]) to appreciate other films by the same director? Tron: Legacy is not in any classical sense an auteurist work — I couldn’t tell you who directed it without checking IMDb — but who says the brand itself can’t function as an auteur, in the sense that a sensitive reading of it depends on familiarity with tics and tropes specific to the larger body of work? Alternatively, we might think of Tron as a sub-brand of a larger industrial genre, the blockbuster, whose outward accessibility belies the increasingly bizarre contours of its experience. With its diffuse boundaries (where does a blockbuster begin and end? — surely not within the running time of a single feature-length movie) and baroque textual patterns (from the convoluted commitments of transmedia continuity to the rapid-fire editing and slangy shorthands of action pacing), the contemporary blockbuster possesses its own exotic aesthetic, one requiring its own protocols of interpretation, its own kind of training, to properly engage. High concept does not necessarily mean non-complex.

Certainly, watching Tron: Legacy, I realized it must look like visual-effects salad to an eye untrained in sensory overwhelm. I don’t claim to enjoy everything made this way: Speed Racer made me queasy, and Revenge of the Fallen put me into an even deeper sleep than did the first Transformers. T:L, however, is much calmer in its way, perhaps because its governing look — blue, silver, and orange neon against black — keeps the frame-cramming to a minimum. (The post-1983 George Lucas committed no greater sin than deciding to pack every square inch of screen with nattering detail.) Here the sequel’s emulation of Tron’s graphics is an accidental boon: limited memory and storage led in the original to a reliance on black to fill in screen space, a restriction reinvented in T:L as strikingly distinctive design. Our mad blockbusters may indeed be getting harder to watch and follow. But perhaps we shouldn’t see this as proof of commercially driven intellectual bankruptcy and inept execution, but as the emergence of a new — and in its way, wonderfully difficult and challenging — mode of popular art.

T:L works for me as a movie not because its screenplay is particularly clever or original, but because it smoothly superimposes two different orders of technological performance. The first layer, contained within the film text, is the synthesis of live action and computer animation that in its intricate layering succeeds in creating a genuinely alternate reality: action-adventure seen through the kino-eye. Avatar attempted this as well, but compared to T:L, Cameron’s fantasia strikes me as disingenuous in its simulationist strategy. The lush green jungles of Pandora and glittering blue skin of the Na’vi are the most organic of surfaces in which CGI could cloak itself: a rendering challenge to be sure, but as deceptively sentimental in its way as a Thomas Kinkade painting. Avatar is the digital performing in “greenface,” sneakily dissembling about its technological core. Tron: Legacy, by contrast, takes as its representational mission simulation itself. Its tapestry of visual effects is thematically and ontologically coterminous with the world of its narrative; it is, for us and for its characters, a sea of simulation.

Many critics have missed this point, insisting that the electronic world the film portrays should have reflected the networked environment of the modern internet. But what T:L enshrines is not cyberspace as the shared social web it has lately become, but the solipsistic arena of first-person combat as we knew it in videogames of the late 1970s. As its plotting makes clear, T:L is at heart about the arcade: an ethos of rastered pyrotechnics and three-lives-for-a-quarter. The adrenaline of its faster scenes and the trances of its slower moments (many of them cued by the silver-haired Flynn’s zen koans) perfectly capture the affective dialectics of cabinet contests like Tempest or Missile Command: at once blazing with fever and stoned on flow.

The second technological performance superimposed on Tron: Legacy is, of course, the exhibition apparatus of IMAX and 3D, inscribed in the film’s planning and execution even for those who catch the print in lesser formats. In this sense, too, T:L advances the milestone planted by Avatar, beacon of an emerging mode of megafilm engineering. It seems that every year will see one such standout instance of expanded blockbuster cinema — an event built in equal parts from visual effects and pop-culture archetypes, impossible to predict but plain in retrospect. I like to imagine that these exemplars will tend to appear not in the summer season but at year’s end, as part of our annual rituals of rest and renewal: the passing of the old, the welcoming of the new. Tron: Legacy manages to be about both temporal polarities, the past and the future, at once. That it weaves such a sublime pattern on the loom of razzle-dazzle science fiction is a funny and remarkable thing.


To those who have read to the end of this essay, it’s probably clear that I dug Tron: Legacy, but it may be less clear — in the sense of “twelve words or less” — exactly why. I confess I’m not sure myself; that’s what I’ve tried to work out by writing this. I suppose in summary I would boil it down to this: watching T:L, I felt transported in a way that’s become increasingly rare as I grow older, and the list of movies I’ve seen and re-seen grows ever longer. Once upon a time, this act of transport happened automatically, without my even trying; I stumbled into the rabbit-holes of film fantasy with the ease of … well, I’ll let Laurie Anderson have the final words.

I wanted you. And I was looking for you.
But I couldn’t find you.
I wanted you. And I was looking for you all day.
But I couldn’t find you. I couldn’t find you.

You’re walking. And you don’t always realize it,
but you’re always falling.
With each step you fall forward slightly.
And then catch yourself from falling.
Over and over, you’re falling.
And then catching yourself from falling.
And this is how you can be walking and falling
at the same time.

Back to Back to the Future

Revisiting Back to the Future on Blu-ray, it’s hard not to get sucked into an infinite regress the likes of which would probably have pleased screenwriters Robert Zemeckis and Bob Gale: the 1985 original, viewed 25 years later, has acquired layers of unintended convolution, discovery, and loss.

As with any smartly executed time travel story (see: La Jetée, Primer, Timecrimes, and “City on the Edge of Forever”), the plot gets you thinking in loops, attentive to major and minor patterns of similarity and difference, playing textual detective long before Christopher Nolan came along with the narrative games that are his auteurist signature. And BTTF is nothing if not a well-constructed mousetrap, full of cleverly self-referential production design, dropping hints and clues from its very first frames about the saga that’s about to unfold. (Panning across Doc Brown’s Rube-Goldberg-esque jumble of a laboratory, the camera lingers on a miniature clock from whose hands a figure hangs comically, in a shout-out both to Harold Lloyd in Safety Last! [1923] and to BTTF’s own climax.) In this sense, it’s a film designed for multiple viewings, though the valence of that repetition has changed over the quarter-century since the film’s first release: in the mid-eighties, BTTF was a quintessential summer blockbuster, its commercial life predicated on repeat business and the just-emerging market of home rentals. Nowadays, the fuguelike structure of BTTF lends itself perfectly to the digitalized echo chamber of what Barbara Klinger terms replay culture and the encrusted supplementation of making-of materials sparked by the random-access format of DVDs and fanned into a blaze by the enormously expanded storage of Blu-rays. (Learning to navigate the interlocked and labyrinthine documentaries on the new Alien set is like being lost on the Nostromo.)

But in 1985, all this was yet to come, and BTTF’s bubbly yet machine-lathed feat of comic SF is all the more remarkable for maintaining its innocence across the gulf of the changing technological and economic contexts that define modern blockbuster culture. It still feels fresh, crisp, alive with possibilities. If anything, it’s somehow gotten younger and more graceful: expository setups that seemed leaden now trip off the actors’ tongues like inspired wordplay, while action setpieces that seemed unnecessarily prolonged — in particular, the elaborate climax in which a time-traveling DeLorean must intersect a lightning-bolt-fueled surge of energy while traveling at exactly 88 miles per hour — now unspool with the precision of a Fred Astaire dance routine. Perhaps the inspired lightness of BTTF is simply a matter of contrast with our current blockbusters, which have grown so dense and heavy with production design and ubiquitous visual effects, and whose chapterized storyworlds are so entangled with encyclopedic continuity, that engaging with them feels like subscribing to a magazine — or a gym membership.

BTTF, of course, is a guilty player in this evolution. Its two sequels were filmed back-to-back, an early instance of the “threequelization” that would later result in such elephantine letdowns as the Matrix and Pirates of the Caribbean followups. But just as the story of BTTF involves traveling to an unblemished past, when sin was a tantalizing temptation to be toyed with rather than a buried regret, the reappearance of the film in 2010 allows us to revisit a lost moment of cinema, newly pure-looking in relation to the rote and tired wonders of today. For a special-effects comedy, there are surprisingly few visual tricks in evidence: ILM’s work is confined to some animated electrical discharges, a handful of tidy and unshowy matte shots, and a motion-controlled flying-car takeoff at the end that pretty much sums up the predigital state of the art. Far in the future is Robert Zemeckis’s unfortunate obsession with CGI waxworks à la The Polar Express, Beowulf, A Christmas Carol, and soon — shudder — Yellow Submarine.

As for the Blu-ray presentation, the prosthetic wattles of old-age makeup stand out as sharply as the heartbreakingly unlined and untremored features of Michael J. Fox, then in his early 20s and a paragon of comic timing. It makes me think of how I’ve changed in the twenty-five years since I first saw the movie, at the age of 19. And somewhere in this circuit of private recollection and public viewing, I get lost again, with both sadness and joy, in the temporal play of popular culture that defines so much of my imagination and experience. The originating cleverness of BTTF’s high concept has been mirrored and folded in on itself as much by the passage of time as by Universal’s marketing strategies, so that in 2010 — once an inconceivably far-flung punchline destination for Doc Brown in the tag that closes the film and sets up its continuation in BTTF 2 — we encounter our own future-as-past, past-as-future: time travel of an equally profound, if more reversible, kind.

British Invasion


Ordinarily I’d start my post with a by-now-boilerplate apology for lagging behind the news, but in this case I will leave aside the ritual lament (“I’m just so busy this semester!”) and instead make proud boast of my lateness, boldly owning up to the fact that, although it was forty years ago last week that Monty Python’s Flying Circus had its first broadcast, I’m just getting around to remarking on it today. Seems only (il)logical to do so, given that one of Python’s most fundamental and lasting alterations to the cultural landscape in which I grew up was to validate the non sequitur as an acceptable conversational — and often behavioral — gambit.

Let me explain. For me and my friends in grade school, the early-to-mid-seventies were a logarithmically increasing series of social revelations, sometimes depressingly gradual, other times bruisingly abrupt, that we were “weird.” Our weirdness went by several aliases. The labels bestowed by forgiving parents and teachers were things like “smart,” “bright,” “eccentric,” “unusual,” and “creative.” Whereas the ones that arrived not from above but laterally, hurled like snowballs in the schoolyard or graffitied in ball-point across our notebooks, were more brutally and colorfully direct, and thus of course more convincing: “freak,” “spaz,” and — for me in particular, since it vaguely rhymes with Rehak — “retard.”

I see now that almost all of these phrases had their grain of truth, their icy core, their scored ink-line. In our weirdness we were smart and unusual and creative; we were also undeniably freakish, and as our emotional gyroscopes whirled wildly in search of some stable configuration, we were, by turns, spastically overenthusiastic and retardedly slow to adapt. We were book and comic readers, TV watchers, play actors, cartoon artists, model builders, rock collectors. We were boys. We liked science fiction and fantasy. Our skills and deficits were misdistributed and extreme: vastly vocabularied but garbled by braces and retainers; carefully observant but blindered by thick glasses; handsome heroes in our hearts, chubby or skinny buffoons in person. Many of us were good at science and math, others at art and theater. None of us did particularly well on the athletic field, though we did provide workouts for the kids who chased us.

Me, I made model kits of monsters like the Mummy, the Wolfman, and the Creature from the Black Lagoon — all supplied by the great company Aurora, with the last mile from hobby store to home facilitated by my indulgent parents — painted them in garish and inappropriate colors, situated them behind cardboard drum kits and guitars on yarn neckstraps, and pretended they were a rock supergroup while blasting the Monkees and the Archies from my record player. (I am not making this up.)

I was also a media addict, even back then, and when Monty Python episodes began airing over our local PBS station, I was instantly and utterly devoted to the show. Which is not to say I liked everything I saw — a nascent fan, I quickly began drawing distinctions between the unquestionably great, the merely good, the tolerably adequate, and the terminally lame paroles that constituted the show’s langue, learning connections between these variations in quality and the industrial microepochs that gave rise to them: early, middle, and late Python. I had my favorite bits (Terry Gilliam’s animations, anything ever done or said by John Cleese) and my “mehs” (Terry Gilliam’s acting and the episode devoted to hot-air ballooning). Although (or because) I was stranded somewhere in the long latency separating my phallic and genital stages, I found every mention of sex and every glimpse of boob a fascinating magma of hypothetical desire and unearned shame. And, of course, it was all hysterically, tear-squirting, stomach-cramp-inducing funny.

The downside of Monty Python’s funniness was the same as its upside: it gave all of us weirdos a shared social circuit. The show’s peculiar and specific argot of slapstick and transgression, dada and doo-doo, spread overnight to recess and classroom, connecting by a kind of dedicated party line any schlub who could memorize and repeat lines and skits from the show. In short, Monty Python colonized us, or more accurately it lit up like a discursive barium trace the preexisting nerd colony that theretofore had hidden underground in a nervous relay of quick glances, buried smiles, and raised eyebrows. Suddenly outed by a humor system from across the sea, we pint-sized Python fans stood revealed as a brotherhood of nudge-nudge-wink-wink, a schoolyard samizdat.

A good thing, but also a bad thing. The New York Times gets it exactly wrong when describing the “couple of guys in your dorm (usually physics majors, for some reason, and otherwise not known for their wit) who could recite every sketch”; according to Ben Brantley, “They could be pretty funny, those guys, especially if you hadn’t seen the real thing.” Nope — people who recite every Monty Python sketch are by definition not funny, or rather are funny only within an extremely bounded circle of folks who (A) already know the jokes and (B) accept said recitation as legal tender in their subcultural social capital. In my experience, there was no surer date-killer, no quicker way to get people to edge away from you at parties than by launching into such bona fide gems of genius as the Cheese Shoppe or the Argument Clinic. Yet we went on tagging each other as geek untouchables, comedy as contagion, as helpless before Pythonism’s viral spread as we would be, a few years on, before the replicating errata of Middle Earth and the United Federation of Planets.

Monty Python was merely the first infusion of obsessive-compulsive nerd scholarship into which I and my friends were forced by a series of cultural imports from Britain: grand stuff like The Fall and Rise of Reginald Perrin, The Hitchhiker’s Guide to the Galaxy, Alan Moore, and the computer game Elite. The three movies I like to name as my favorites of all time each have substantial UK components: Star Wars (1977) was filmed partly at Elstree Studios, Superman (1978) at Pinewood and Shepperton Studios, and Alien (1979), with Ridley Scott at the helm, at Shepperton and Bray Studios. And the trend continues right up to the present day: my favorite band is Genesis, I can’t get enough of Robbie Coltrane’s Cracker, and the science-fiction masterpiece of the summer was not District 9 (which gets high marks nevertheless) but the superb Children of Earth.

I sometimes wonder what to call this collection of British art and entertainment, this odd cultural constellation that seems to obey no organizing principle except its origins in England and its relevance to my development. How do you draw a boundary around a miscellany of so much that is good and essential about imaginary lives and their real social extrusions? Maybe I’m seeking a word like supergenre or metagenre, but those seem too big; try idiogenre, some way of systematizing a group of texts whose common element is their locus in a particular, historically-shaped subjectivity (my own) that is simultaneously a shared condition. The comic tragedy of the nerd, a figure both stranded on the social periphery yet crowded by his peers, lonely yet overfriended, renegade frontiersman and communal sheep, a silly-walking man with an entire Ministry of Silly Walks looming behind him.

I blame, and thank, England.


Counting Down Galactica (4 of 4)

[This is the last of four posts counting down the final episodes of Battlestar Galactica. To see the others, click here.]

I’d meant to write my final entry in the “Counting Down Galactica” series before the airing of the finale on Friday night; a power outage in my neighborhood prevented me from doing so. Hence everything I’m about to say is colored by having seen the two-hour-and-eleven-minute conclusion, and spoilers lie in wait.

On the topic of spoilers, I know of a few ambitious souls (hi, Suzanne!) who are holding the finale in reserve, planning to watch it next week. Let me note how sympathetic I am toward, and how dubious I am about, their or anyone’s chances of navigating the days ahead without having the ending spoiled. I haven’t even dared to visit Facebook yet, for fear of destabilizing my own still-coalescing thoughts on the experience; similarly, I won’t go near the various blogs I read. When I got up this morning, I turned on NPR’s Weekend Edition, only to find myself smack-dab in the middle of a postmortem with Mary McDonnell. It was like coming out of hyperspace into an asteroid field, or — a more somber echo — waking on the morning of 9/11 to a puzzled voice on the radio saying, in perhaps our last moment of innocence, that pilot error seemed to be behind a plane’s freak collision with the World Trade Center.

Comparing BSG’s wrapup to the events of 9/11 might seem the nadir of taste, except that Galactica probably did more in its four seasons than any other media artifact besides 24 — I’m discounting Oliver Stone movies and the Sarah Silverman show — to process through pop culture the terrorist attacks and their corrosive aftereffects on American psychology and policy. It became, in fact, an easy truism about the show, to the point where I’d roll my eyes when yet another commentator assured me that BSG was about serious things like torture and human rights. But then I shouldn’t let cynicism blind me to the good that stories and metaphors can do; I myself publicly opined that the season-two Pegasus arc marked a “prolapse of the national myth,” a moment at which BSG “strode right over the line of allegory to hold up a mirror in which the United States could no longer misrecognize its practices of dehumanization and torture.” And who am I to argue with the United Nations, anyway?

But maybe the more fitting connection is local rather than global, for losing power yesterday reminded me how absolutely dependent the current state of my life is on technology: the uninterrupted flow of internet, television, radio. My wife and I were able to brew coffee by plugging the pot into one remaining active outlet, and our cell phones enabled us to maintain contact with the outside world (until their batteries died). After that, it was leave the house and brave the bright outdoors and actual, face-to-face conversation with other human beings.

I bring this up because, in its final hours, BSG plainly announced itself as concerned, more than anything else, with the relationship between nature and technology — between humans and their creations. In retrospect, this dialectic is so obvious that I’m embarrassed to admit it never quite came into focus for me when the series was running. Sure, the initiating incident was a sneak attack by Cylons, a race of human-built machines who got all uppity and sentient on us. (Or maybe it’s the case that the rebellious Cylons descended from some other, ancient caste of Cylons — I’m not entirely clear on this aspect of the mythology, and consider it the show’s failing for not explaining it more clearly. But more about that in a moment.) Even in that first, fateful moment of aggression, though, the lines between us and them were blurred; in “reimagining” the 1970s series that was its precursor, Ronald D. Moore’s smartest decision — apart from scuffing up the mise-en-scène — was to posit Cylons who look like us; who think, feel, and believe like us. As the series wore on, this relationship became ever more intimate, incestuous, and uncomfortable, so that finally it seemed neither species could imagine itself outside of the other. It was différance, supplement, and probably several other French words, operationalized in the tropes of science fiction.

A more detailed textual analysis than I have the patience to attempt here would likely find in “Daybreak” an eloquent mapping of these tense territories of interdependent meanings. One obvious starting point would be the opposition between Cavil’s Cylon colony, a spidery, Gigeresque encrustation perched in a maelstrom of toxic-looking space debris, and the plains of Africa, evoked so emphatically in the finale’s closing third hour that I began to wonder if the story’s logic could admit the existence of any sites on Earth (or pseudo-Earth, as the story cutely frames it) that aren’t sunny, hospitable, and friendly. In this blunt binary I finally saw BSG’s reactionary (one might say Luddite) ethos emerge in full flower: a decision on the undecidable, a brake on the sliding of signifiers. For all the show’s interest in hybrids of every imaginable flavor, it did finally come down to a rejection of technology, signaled most starkly in Lee Adama’s call to “break the cycle” by not building more cities — and the sailing of Galactica and her fleet into the sun. Even as humans and Cylons decide to live together (and, it’s suggested in the coda, provide the seed from which contemporary civilization sprouted), it seems to me the metaphor has been settled in humanity’s favor.

That’s fine; at least the show had the courage to finally call heads or tails on its endless coin-flipping. Interesting, though, that the basic division over which the narrative obsessed was reflected formally in the series’ technical construction and audience reception. I refer here to a dialectic that emerged late in the show’s run, between visual effects and everything else — between space porn and character moments. Reading fan forums, I lost count of the number of times BSG was castigated by some for abandoning action sequences and space battles, only to be countered by another group tut-tutting along the lines of This show has never been about action; it’s about the people. For what it’s worth, I’m firmly in the first camp (as my post last week demonstrates): the best episodes of Galactica were those that featured lots of space-set action (the Hugo-winning “33”; “The Hand of God”; most of the first season, for that matter, and bright moments sprinkled throughout the rest of the series). Among the worst were those that confined themselves exclusively to character interaction, such as “Black Market,” “Unfinished Business,” and most of the latter half of season four.

It’s not that the show was ever poorly written, or the characters uninteresting. But it did seem for long stretches to develop an allergy to action, with the result a bifurcated structure that drove some fans crazy. Much like the pointless squabbles around Lost, whose flashback structure still provokes some to shout “Filler episode!” where others cry “Character development!”, debate on the merits of BSG too often devolved into untenable assertions about the antithetical relationship between spectacle and narrative, with space-porn fans lampooned as short-attention-span stimulus junkies and character-development fans mocked as pretentious blowhards. Speaking as a stimulus junkie and pretentious blowhard, I feel safe in pointing out the obvious: it’s hard to pull off compelling science-fiction characters without some expertly integrated shiny-things-go-boom, while spaceships and ‘splosions by themselves get you nowhere. You need, in short, both — which is why BSG’s industrial dimension neatly homologized its thematic concerns.

I’m relieved that last night’s conclusion managed to reconcile the show’s many competing elements, and that it did so stirringly, dramatically, and movingly. I expected nothing less than a solid sendoff from RDM, one-half of the writing team behind perhaps the greatest series finale ever, Star Trek: The Next Generation’s “All Good Things …” — but that’s not to say he couldn’t have screwed it up in the final instance. Indeed, if there is a worm in the apple, it’s my sneaking suspicion that the game was fixed: the four episodes leading up to “Daybreak” were a maddening mix of turgid soap opera and force-fed exposition, indulgent overacting and unearned emotion. It’s almost as though they wanted to lower our expectations, then stun us with a masterpiece.

I don’t know yet if “Daybreak” deserves that particular label, but we’ll see. In any case, there is something magical about so optimistic an ending to such a downbeat series. If the tortured soul of this generation’s Battlestar Galactica was indeed forged in the flames of 9/11 and the collective neurotic reaction spearheaded by the Bush administration, perhaps its happy ending reflects a national movement toward something better: the unexpected last-minute emergence, through parting clouds, of hope.

Requiem for a Craptop

Today I said goodbye to the MacBook that served me and my wife for almost three years — served us tirelessly, loyally, without ever judging the uses to which we put it. It was part of our household and our daily routines, funneling reams of virtual paper past our eyeballs, taking our email dictation, connecting us with friends through Facebook and family through Skype. (Many was the Sunday afternoon I’d walk the MacBook around our house to show my parents the place; I faced into its camera as the bedrooms and staircases and kitchens scrolled behind me like a mutated first-person shooter or a Kubrickian Steadicam.) We called it, affectionately, the Craptop; but there was nothing crappy about its animal purity.

It’s odd, I know, to speak this way about a machine, but then again it isn’t: I’m far too respectful of the lessons of science fiction (not to mention those of Foucault, Latour, and Haraway) to draw confident and watertight distinctions between our technologies and ourselves. My sadness about the Craptop’s departure is in part a sadness about my own limitations, including, of course, the ultimate limit: mortality. Even on a more mundane scale, the clock of days, I was unworthy of the Craptop’s unquestioning service, as I am unworthy of all the machines that surround and support me, starting up at the press of a button, the turn of a key.

The Craptop was not just a machine for the home, but for work: purchased by Swarthmore to assist me in teaching, it played many a movie clip and PowerPoint presentation to my students, flew many miles by airplane, and rode in the back seat of many a car. It passes from my world now because the generous College has bought me a new unit, aluminum-cased and free of the little glitches and slownesses that were starting to make the Craptop unusable. It’s a mystery to me why and how machines grow old and unreliable — but no more, I suppose, than the mystery of why we do.

What happens to the Craptop now? Swarthmore’s an enlightened place, and so, the brand assures me, is Apple: I assume a recycling program exists to deconstruct the Craptop into ecologically neutral components or repurpose its parts into new devices. In his article “Out with the Trash: On the Future of New Media” (in Residual Media, ed. Charles R. Acland, University of Minnesota Press, 2007), Jonathan Sterne writes eloquently and sardonically of the phenomenon of obsolete computer junk, and curious readers are well advised to seek out his words. For my part, I’ll just note my gratitude to the humble Craptop, and try not to resent the newer model on which, ironically, I write its elegy: soon enough, for it and for all of us, the end will come, so let us celebrate the devices of here and now.