Review: Masters of Doom

Masters of Doom: How Two Guys Created An Empire and Transformed Pop Culture
David Kushner
New York: Random House, 2003
Review originally published in January 2004 at the Resource Center for Cyberculture Studies (RCCS)


David Kushner’s Masters of Doom, a spirited, winningly naive history of the personal-computer boom and the offbeat souls who design and play videogames, reminded me a lot of Steven Levy’s Hackers. Published in 1984, Levy’s sprawling book was an early and influential attempt to condense thirty years of computer culture into a form fit for mainstream readers. Between the 1950s and the 1980s, computers went from private to public, from mini to micro. Room-sized ENIACs housed on college campuses and guarded by “priesthoods” of elite technicians were reinvented for the home market as funky desktop boxes, built from kits or purchased from companies like Tandy, Commodore, and Apple. Though these machines possessed little in the way of storage capacity or graphics capability, their meager 8K and 16K memories would nonetheless be jammed with programs written in BASIC by home users eager to code a better mousetrap—or the next big game.

Hackers, which took its title from the term for virtuoso teenage programmers increasingly visible on cultural radar through films like Tron (1982) and WarGames (1983), appealed on multiple mythic fronts: it was a tale of ingenuity (most hackers were self-taught), of nonconformity (many were socially maladjusted nerds), and of profitability (a famous few, like Bill Gates, turned their programs into products, making millions and founding empires). Hackers, in short, told a pleasingly American story, despite its high-tech trappings. And one of the book’s incidental but not insignificant functions was to cast in homely, unthreatening terms the single largest upheaval in society and commerce since the Industrial Revolution. Computers aren’t scary; they’re fun! Just look at all the nutty kids who’ve fallen in love with them!

Masters of Doom picks up where Hackers leaves off, breathlessly detailing the rise to fame (and fall from grace) of two such nutty kids in the late 80s and 90s, John Carmack and John Romero. These “superstars”—if we accept Kushner’s rather streamlined hagiography—dreamed up, produced, and marketed to a ravenous community of joystick jockeys the first-person shooter or FPS. Offhandedly defined by Kushner as “paintball-like contests played from a first-person point of view” (x), FPSs are notoriously violent, fast-moving games in which players peer over the barrels of large weapons (grasped in their own avatarial “hands”) while racing about mazelike levels, locked in combat with opponents dreamed up by the computer or virtually present through local-area networks (LANs) or an Internet connection. The latter play mode, called “deathmatch,” is the defining function of FPSs such as standard-bearers Quake III Arena and Unreal Tournament 2003. (Ironically, Carmack and Romero created deathmatch almost as an afterthought, an extra feature on their seminal 1993 Doom.) Deathmatch is also, of course, the (anti)social practice that brings FPSs in for condemnation by the likes of U.S. Senator Joseph Lieberman, who argues in high-moral-panic mode that violent videogames produce violent kids—epitomized in school shootings such as the 1999 Columbine High murders. (Dylan Klebold and Eric Harris, who orchestrated the killings, happened to be Doom fans.)

Thankfully, rather than focusing on the FPS’s controversial political status or debating the cultural effects of videogames, Kushner pitches Masters more intimately. Carmack and Romero’s collision of talents and personalities led to interpersonal firefights at least as entertaining as the digital melees engineered by their company, id Software. Tracing the duo’s evolution from colleagues to friends, then from competitors to enemies, and finally to battle-weary but apparently reconciled veterans of the game industry’s incandescent 90s, Kushner characterizes the team in language reminiscent of the Lennon-McCartney partnership: Carmack is the intense, driven genius (the “rocket scientist”), while Romero is the flamboyant, crowd-pleasing publicity hound (the “rock star”).

These traits are made clear—perhaps reductively so—in the initial chapters, which join Romero at the age of 11 in 1979, toting up high scores on all the Asteroids arcade consoles in Rocklin, California. Romero went on to write his own games for the Apple II, publishing them in game magazines and eventually landing a job in the industry. Carmack, born in 1970, followed a similar path, though with a technological emphasis. Grasping the promise inherent in the primitive games of the early 80s, a time when arcades crumbled economically and the home-computer and console market had yet to take off, Carmack wanted to push the envelope and create immersive experiences to rival the holodeck on Star Trek: The Next Generation.

This theme runs throughout Masters of Doom: the sense that early gamers such as “the two Johns” were always looking toward the next great thing, whether it be a sequel to a prizewinning game or a world of networked subjectivity. Refreshingly, the book doesn’t bother to defend the importance of videogames; instead, it takes for granted that games are forerunners of virtual reality, fantasy spaces realized within computer hardware and populated by human beings in avatarial masquerade. Describing Romero’s love affair with the role-playing game Ultima, for example, Kushner writes confidently that “gamers overlooked the crudeness for what the games implied: a novelistic and participatory experience, a world” (11). Later on, he assesses an arcade game whose side-scrolling graphics broke new ground: “Compared with the other games in the arcade, Defender felt big, as if the player was living and breathing in a more expansive virtual space” (46).

And against the backdrop of the Internet and World Wide Web, Kushner—presumably emulating Carmack and Romero’s own fascination with the medium’s possibilities—repeatedly invokes the science-fiction constructs of William Gibson’s (1984) cyberspace and Neal Stephenson’s (1992) metaverse. Such lofty allusions have the effect of elevating Romero and Carmack’s “mission” while inoculating Masters against dismissal by a nation tired of get-rich-quick stories. The equation of videogames and cyberspace implies that game designers are, in fact, engineering future society (a claim also put forth by John Gaeta, special-effects supervisor of the Matrix films [1]), and that Carmack and Romero were the visionaries who laid the groundwork for this online world.

If you buy that philosophy, you’ll enjoy the book. Even if you don’t, you will get something out of the insider’s perspective on the game industry, which Kushner portrays as an analog of videogames themselves: colorful, loud, profane, cheerfully violent. The bulk of the book centers on id’s profitable series of FPSs: Wolfenstein 3D, Doom, and Quake. Each game, along with its sequels and expansion packs, is presented as a risky undertaking, a dream of a product that might or might not find its audience. Each game, of course, turns out to be enormously popular, lending Masters the feel of a Broadway musical about underdog victories and overnight successes. Come on, kids, let’s put on a show!

Some will find this picture of software development disingenuous—hackers can’t all be scrappy outsiders—but it works well enough to fill 300 pages of fast-moving prose. And amid the exuberance of all-night programming sessions, endless pizza deliveries, and the fleet of Ferraris so fetishized by Carmack and Romero, Masters of Doom casually outlines the emergence of a new business paradigm, one keyed to the breakneck rhythms of Moore’s Law. That maxim, first set down by Gordon Moore in 1965 to describe the doubling of transistor counts on a chip, is popularly glossed as a doubling of computing power roughly every eighteen months. For id, this meant that not just the outward design of games, but their underlying architecture, had to undergo constant reinvention. While Romero hyped each upcoming release (in terms that often landed him in trouble with the avid but skeptical gaming community), Carmack labored to produce software “engines” capable of rendering 3D graphics with unprecedented speed and realism. And each new version of the FPS encouraged, or forced, gamers to upgrade their computers in order to run the new software. At the same time, id’s practice of releasing its games as shareware (downloadable files whose first levels could be played for free, with the full version requiring purchase) cut distributors out of the circuit, amplifying the profits of developers.
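A quick back-of-the-envelope gloss of what that doubling meant for id (my arithmetic, not Kushner’s): treating the popular eighteen-month figure as a simple exponential,

    P(t) = P_0 \cdot 2^{t/1.5}, \qquad \frac{P(3)}{P(0)} = 2^{3/1.5} = 4

so the machines available at the end of a typical three-year engine cycle were roughly four times as powerful as those the previous engine had been written for, which is why each new engine had to be rethought from the ground up rather than merely patched.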

The end result is that id’s games pushed the industry in specific directions. Cyberspace may not yet be here, but according to Kushner, the world of computing in 2003 would be radically different (and a lot less fun) if not for Carmack and Romero.

  1. In a Wired cover story on the Matrix sequels, Gaeta warned of the dangers posed by advanced image-manipulation techniques. “You have these paranoid films about the Matrix depicting how people are put in a mental prison by misusing this technology . . . maybe our technology will become the actual Matrix, and we have inadvertently spilled the vial of green shit out onto the planet.”

Gibson, W. (1984). Neuromancer. New York: Ace Books.
Levy, S. (1984; 1994). Hackers: Heroes of the Computer Revolution. New York and London: Penguin.
Silberman, S. (2003). “Matrix2.” Wired 11.05.
Stephenson, N. (1992). Snow Crash. New York: Bantam Books.

Jurassic World


Driving home from last night’s screening of Jurassic World, I kept thinking back to a similarly humid summer evening in 1997, when I locked horns with a friend over the merits of an earlier film in the franchise, The Lost World: Jurassic Park. We were lingering outside the theater before heading to our cars, digesting the experience of the movie we had just watched together. He’d disliked The Lost World, which (if I remember his stance accurately) he saw as an empty commercial grab, a cynical attempt by Steven Spielberg to repeat the success of Jurassic Park (1993)—a film we agreed was a masterpiece of blockbuster storytelling and spectacle—while possessing none of the spark and snap of the original. As for my position, all I can reconstruct these 18 years later is that I appreciated The Lost World’s promotion of Ian Malcolm to primary protagonist—after The Fly (1986), any movie that put Jeff Goldblum in the driver’s seat was golden in my book—as well as its audacious final sequence in which a Tyrannosaurus rex runs rampant through the streets and suburbs of San Diego.

Really what it came down to, though, were two psychological factors, facets of a subjectivity forged in the fantastic fictions and film frames of science-fiction media, my own version of Scott Bukatman’s “terminal identity.” The first, which I associate with my fandom, was that in those days I never backed down from a good disagreement, whether on picayune questions like Kirk versus Picard or loftier matters such as the meaning of the last twenty minutes of 2001: A Space Odyssey (1968). (Hmm, both of those links go to WhatCulture.com—maybe I should add it to my reading list.)

The second reason for my defense of The Lost World was simply my overriding wish that it not be a piece of crap: that it be, in fact, great. Because even then I could feel the tug of loyalty tying me to an emergent franchise, the sense that I’d signed on for a serial experience that might well stretch into decades, demanding fealty no matter how debased its future installments might become. I hadn’t yet read the work of Jacques Lacan—that would come a couple of years later, when I became a graduate student—but I see now that I was thinking in the futur antérieur, peering back on myself from a fantasized, later point of view: “I will have done …” It’s a twisty trick of temporality, and if I no longer stress about contradictions in my viewing practice the way I once did (following the pleasurable trauma of Abrams’s reboot, I have accepted the death of the Star Trek I grew up with), I am still haunted by a certain anxiety of the unarrived regarding my scholarly predilections and predictions (I’d hate to be the kind of academic whom future historians tsk-ingly agree got things wrong).

But the tidal pull of the franchise commitment persists, which is why I’m having a hard time deciding whether Jurassic World succeeded or sucked. Objectively I’m pretty sure the film is a muddle, certainly worse in its money-grabbing motivations and listless construction than either The Lost World or its follow-up, the copy-of-a-copy Jurassic Park III (2001). Anthony Lane in the New Yorker correctly bisects Jurassic World: “doggedly dull for the first hour and beefy with basic thrills for most of the second,” to which I’d add that most of that first hour is rushed, graceless, and elliptical to the point of incoherence. One of my favorite movie critics, Walter Chaw of Film Freak Central, damns that ramshackle execution with faint praise, writing: “Jurassic World is Dada. It is anti-art, anti-sense—willfully, defiantly, some would say exuberantly, meaningless. In its feckless anarchy, find mute rebellion against narrative convention. You didn’t come for the story, it says, you came for the set-ups and pay-offs.”

Perhaps Chaw is right, but seeing the preview for this fall’s Bridge of Spies reminded me what an effortless composer of the frame Spielberg can be–an elegance absent from Jurassic World save for one shot, which I’ll get to in a minute–and rewatching the opening minutes of Jurassic Park before tackling this review reminded me how gifted the man is at pacing. In particular, at putting a story’s elements in place: think of the careful build of the first twenty minutes of Close Encounters of the Third Kind (1977), laying out its globe-jumping puzzle pieces; or of Jaws (1975), as the sleepy beach town of Amity slowly wakes to the horror prowling its waters. Credit too the involvement of the late, great Michael Crichton. His early technothrillers–especially 1969’s The Andromeda Strain and 1973’s Westworld, both of which feed directly into Jurassic Park–might be remembered for their high concepts (and derided for their thin characterizations), but what made him such a perfect collaborator for Spielberg was the clear pleasure he took in building narrative mousetraps, one brief chapter at a time. (Nowadays someone like Dan Brown is probably seen as Crichton’s heir apparent, though I vastly prefer the superb half-SF novels of Daniel Suarez.)

I delve into these influences and inheritances because ancestry and lineage seem to be much on the mind of Jurassic World. DNA and its Pandora’s box of wonders/perils have always been a fascination of the Jurassic franchise. In the first film it was mosquitoes in amber; in Jurassic World it’s the 1993 movie that’s being resurrected. Something old, something new, something borrowed, something blue: in front of the camera you’ve got B. D. Wong and a crapload of production design tying the humble beginnings of Isla Nublar to its modern, Disneyesque metastasis, while “behind the scenes” (a phrase I put in scare quotes because, let’s face it, if the material were really meant to stay behind the scenes, we wouldn’t be discussing it), videos like this one work to reassure us of a meaningful connection between the original and its copy:

The “Classic Jurassic Crew” profiled here might seem a little reachy (lead greensman? boom operator?), but the key names are obviously Jack Horner and Phil Tippett, “Paleontology Consultant” and “Dinosaur Consultant” respectively: the former conferring scientific legitimacy upon the proceedings, the latter marking a tie to traditions of special effects that predate the digital. Tippett, of course, made his name in stop-motion animation–the Imperial Walkers in The Empire Strikes Back’s Hoth battle are largely his handiwork–and at first was approached by Spielberg to provide similar animation for Jurassic Park. But when Tippett’s “go motion” technique was superseded by the computer-generated dinosaurs being developed by Dennis Muren, Tippett became a crucial figure, both technologically and rhetorically, in the transition from analog to digital eras. In the words of his Wikipedia entry:

Far from being extinct, Tippett evolved as stop motion animation gave way to Computer-generated imagery or CGI, and because of Phil’s background and understanding of animal movement and behavior, Spielberg kept Tippett on to supervise the animation on 50 dinosaur shots for Jurassic Park.

Tippett’s presence in the film’s promotional field of play thus betrays World’s interest in establishing a certain “real” at the core of its operations, inoculating itself against the argument that it is merely a simulacrum of what came before. It’s a challenge faced by every “reboot event” within the ramifying textualities of a long-running blockbuster franchise, forced by marketplace demands to periodically reinvent itself while (and this is the trick) preserving the recognizable essence that made its forerunner(s) successful. In the case of Jurassic World, that pressure surfaces symptomatically in the discourse around the movie’s visual effects–albeit in a fashion that ironically inverts the test Jurassic Park met and mastered all those years ago. Park’s dinosaurs were sold as breakthroughs in CGI, notwithstanding the brevity of their actual appearance: of the movie’s 127-minute running time, only six minutes contained digital elements, with the rest of the creature performances supplied by Tippett’s animation and Stan Winston’s animatronics. Those old-school techniques were largely elided in the attention given to Park’s cutting-edge computer graphics.

Jurassic World, by contrast, arrives long after the end of what Michele Pierson has called CGI’s “wonder years”; inured to Hollywood’s production of digital spectacle by its sheer superfluity, audiences now seek the opposite lure, the promise of something solid, profilmic, touchable. This explains the final person featured in the video: John Rosengrant of the fittingly named Legacy Effects, seen operating a dying apatosaurus in one of Jurassic World’s few languid moments. The dinosaur in that scene is a token of the analog era, offered up as an emblem of a practical-effects integrity the filmmakers hope will generalize to their entire project. It’s an increasingly common move among makers of fantastic media, one that critics, like this writer for Grantland, are all too happy to reinforce:

J. J. Abrams got a big cheer at the recent Star Wars Celebration Anaheim when he said he was committed to practical effects. George Miller won plaudits for sticking real trucks in the desert in Mad Max: Fury Road. Similarly, [Jurassic World director Colin] Trevorrow gestured to the Precambrian world of special effects by filling his movie with rubber dinos, an old View-Master, and going to the mat to force Universal to pony up for an animatronic apatosaurus. Chris Pratt’s Owen Grady tenderly ministers to the old girl before her death — a symbolic death of the practical effect under the rampage of CGI.

I hope to say more about this phenomenon, in which the digital camouflages itself in a lost analog authenticity, in a future post. For now I will simply note that “Chris Pratt’s Owen Grady” might be the most real thing within Jurassic World. Pratt’s performance here is strikingly different from the jokester he played so winningly in Guardians of the Galaxy: he’s tougher, harsher, more brutish. Spielberg is rumored to want him to play Indiana Jones, and I can see how that would work: like the young Harrison Ford, Pratt can convincingly anchor a fantasy scenario without seeming like he’s playing dress-up. But the actor Pratt reminds me of even more than Harrison Ford is John Wayne: his first shot in Jurassic World, dazzlingly silhouetted against sunlight, recalls that introductory dolly in Stagecoach (1939) when the camera rushes up to Wayne’s face as though helpless to resist his primordial photogénie.

As for the rest of Jurassic World, I enjoyed some of it and endured most of it, borne along by the movie’s fitful but generally successful invocations of the 1993 original. “Assets” are one of the screenplay’s keywords, and they apply not only to the dinosaurs that are Isla Nublar’s central attractions but to the swarm of intellectual property that constitutes the franchise’s brand: the logos, the score, the velociraptors’ screeches and the T-Rex’s roar. (Sound libraries, like genetically-engineered dinosaurs, are constructed and owned things too.) Anthony Lane jeers that there is “something craven and constricting in the attitude of the new film to the old,” but I found the opposite: it’s when World is most clearly conscious of Park that it works the best, which is probably why I enjoyed its final half hour–built, as in Park, as an escalating series of action beats, culminating in a satisfying final showdown–the most.

But that might just be my franchise loyalty talking again. As with The Lost World in that 1997 argument outside the theater, I may be talking myself into liking something not because of its actual qualities, but because it’s part of my history–my identity.

It’s in my DNA.

Making Mine Marvel



Reading Sean Howe’s Marvel Comics: The Untold Story (New York: Harper Perennial, 2013) I am learning all sorts of things. Or rather, some things I am learning and some things I am relearning, as Marvel’s publications are woven into my life as intimately as are Star Trek and Star Wars: other franchises of the fantastic whose fecundity — the sheer volume of media they’ve spawned over the years — means that at any given stage of my development they have been present in some form. Our biographies overlap; even when I wasn’t actively reading or watching them, they served at least as a backdrop. I would rather forget that The Phantom Menace or Enterprise happened, but I know precisely where I was in my life when they did.

Star Wars, of course, dates back to 1977, which means my first eleven years were unmarked by George Lucas’s galvanic territorialization of the pop-culture imaginary. Trek, on the other hand, went on the air in 1966, the same year I was born. Save for a three-month gap between my birthday in June and the series premiere in September, Kirk, Spock and the universe(s) they inhabit have been as fundamental and eternal as my own parents. Marvel predates both of them, coming into existence in 1961 as the descendant of Timely and Atlas. This makes it about as old as James Bond (at least in his movie incarnation) and slightly older than Doctor Who, arriving via TARDIS, er, TV in 1963.

My chronological preamble is in part an attempt to explain why so much of Howe’s book feels familiar even as it keeps surprising me by crystallizing things about Marvel I kind of already knew, because Marvel itself — avatarized in editor/writer Stan Lee — was such an omnipresent engine of discourse, a flow of interested language not just through dialogue bubbles and panel captions but also through the nondiegetic artists’ credits and editorial inserts (“See Tales of Suspense #53! — Ed.”) as well as paratextual spaces like the Bullpen Bulletins and Stan’s Soapbox. Marvel in the 1960s, its first decade of stardom, was very, very good not just at putting out comic books but at inventing itself as a place and even a kind of person — a corporate character — spending time with whom was always the unspoken emotional framework supporting my issue-by-issue excursions into the subworlds of Spider-Man, the Fantastic Four, and Dr. Strange.

Credit Howe, then, with taking all of Marvel’s familiar faces, fictional and otherwise, and casting each in its own subtly new light: Stan Lee as a liberal, workaholic jack-in-the-box in his 40s rather than the wrinkled avuncular cameo-fixture of recent Marvel movies; Jack Kirby as a father of four, turning out pages at breakneck speed at home in his basement studio with a black-and-white TV for company; Steve Ditko as — and this genuinely took me by surprise — a follower of Ayn Rand who increasingly infused his signature title, The Amazing Spider-Man, with Objectivist philosophy.

It’s also interesting to see Marvel’s transmedial tendencies already present in embryo as Lee, Kirby, and Ditko shared their superhero assets across books: Howe writes, “Everything was absorbed into the snowballing Marvel Universe, which expanded to become the most intricate fictional narrative in the history of the world: thousands upon thousands of interlocking characters and episodes. For generations of readers, Marvel was the great mythology of the modern world.” (Loc 125 — reading it on my Kindle app). Of course, as with any mythology of sufficient popular mass, it becomes impossible to read history as anything but a teleologically overdetermined origin story, so perhaps Howe overstates the case. Still, it’s hard to resist the lure of reading marketing decisions as prescient acts of worldbuilding: “It was canny cross-promotion, sure, but more important, it had narrative effects that would become a Marvel Comics touchstone: the idea that these characters shared a world, that the actions of each had repercussions on the others, and that each comic was merely a thread of one Marvel-wide mega-story.” (Loc 769)

I like too the way Untold Story paints comic-book fandom in the 1960s as a movement of adults, or at least teenagers and college students, rather than the children so often caricatured as typical comic readers; Howe notes July 27, 1964 as the date of “the first comic convention” at which “a group of fans rented out a meeting hall near Union Square and invited writers, artists, and collectors (and one dealer) of old comic books to meet.” (Loc 876) The company’s self-created fan club, the Merry Marvel Marching Society or M.M.M.S., was in Howe’s words

an immediate smash; chapters opened at Princeton, Oxford, and Cambridge. … The mania wasn’t confined to the mail, either — teenage fans started calling the office, wanting to have long telephone conversations with Fabulous Flo Steinberg, the pretty young lady who’d answered their mail so kindly and whose lovely picture they’d seen in the comics. Before long, they were showing up in the dimly lit hallways of 625 Madison, wanting to meet Stan and Jack and Steve and Flo and the others. (Loc 920)

A forcefully engaged and exploratory fandom, then, already making its media pilgrimages to the hallowed sites of production, which Lee had so skillfully established in the fannish imaginary as coextensive with, or at least intersecting, the fictional overlay of Manhattan through which Spider-Man swung and the Fantastic Four piloted their Fantasticar. In this way the book’s first several chapters offhandedly map the genesis of contemporary, serialized, franchised worldbuilding and the emergent modern fandoms that were both those worlds’ matrix and their ideal sustaining receivers.

Howe is attentive to these resonances without overstating them: Lee, Kirby and others are allowed to be superheroes (flawed and bickering in true Marvel fashion) while still retaining their earthbound reality. And through his book, so far, I am reexperiencing my own past in heightened, colorful terms, remembering how the media to which I was exposed when young mutated me, gamma-radiation-like, into the man I am now.

Tilt-Shifting Pacific Rim


Two-thirds of the way through Pacific Rim — just after an epic battle in, around, and ultimately over Hong Kong that’s one of the best-choreographed setpieces of cinematic SF mayhem I have ever witnessed — I took advantage of a lull in the storytelling to run to the restroom. In the air-conditioned chill of the multiplex, the lobby, with its concession counters and videogames, seemed weirdly cramped and claustrophobic, a doll’s-house version of itself I’d entered after accidentally stumbling into the path of a shrink ray, and I realized for the first time that Guillermo del Toro’s movie had done a phenomenological number on me, retuning my senses to the scale of the very, very big and rendering the world outside the theater, by contrast, quaintly toylike.

I suspect that much of PR’s power, not to mention its puppy-dog, puppy-dumb charm, lies in just this scalar play. The cinematography has a way of making you crane your gaze upwards even in shots that don’t feature those lumbering, looming mechas and kaiju. The movie recalls the pleasures of playing with LEGO, model kits, action figures, even plain old Matchbox cars, taking pieces of the real (or made-up) world and shrinking them down to something you can hold in your hand — and, just as importantly, look up at. As the father of a two-year-old, I often find myself lying on the floor, my eyes situated inches off the carpet and so near the plastic dump trucks, excavators, and fire engines in my son’s fleet that I have to take my glasses off to properly focus on them. At this proximity, toys regain some of their large-scale referent’s visual impact without ever quite giving up their smallness: the effect is a superimposition of slightly dissonant realities, or in the words of my friend Randy (with whom I saw Pacific Rim) a “sized” version of the uncanny valley.

This scalar unheimlich is clearly on the culture’s mind lately, literalized — iconized? — in tilt-shift photography, which takes full-sized scenes and optically transforms them into images that look like dioramas or models. A subset of the larger (no pun intended) practice of miniature faking, tilt-shift updates Walter Benjamin’s concept of the optical unconscious for the networked antheap of contemporary digital and social media, in which nothing remains unconscious (or unspoken or unexplored) for long but instead swims to prominence through an endless churn of collective creation, commentary, and sharing. Within the ramifying viralities of Facebook, Twitter, Tumblr, Reddit, and 4chan, in which memes boil reality into existence like so much quantum foam, the fusion of lens-perception and human vision — what the formalist Soviet pioneers called the kino-eye — becomes just another Instagram filter:


The giant robots fighting giant monsters in Pacific Rim, of course, are toyetic in a more traditional sense: where tilt-shift turns the world into a miniature, PR uses miniatures to make a world, because that is what cinematic special effects do. The story’s flimsy romance, between Raleigh Beckett (Charlie Hunnam) and Mako Mori (Rinko Kikuchi), makes more sense when viewed as a symptomatic expression of the national and generic tropes the movie is attempting to marry: the mind-meldly “drift” at the production level fuses traditions of Japanese rubber-monster movies like Gojira and anime like Neon Genesis Evangelion with a visual-effects infrastructure which, while a global enterprise, draws its guiding spirit (the human essence inside its mechanical body, if you will) from Industrial Light and Magic and the decades of American fantasy and SF filmmaking that led to our current era of Brobdingnagian blockbusters.

Pacific Rim succeeds handsomely in channeling those historical and generic traces, paying homage to the late great Ray Harryhausen along the way, but evidently its mission of magnifying ’50s-era monster movies into 21st-century technospectacle was, for U.S. audiences at least, an indulgence of giantizing impulses; in its opening weekend, PR was trounced by Grown Ups 2 and Despicable Me 2, comedies offering membership in a franchise where PR could offer only membership in a family. The dismay of fans, who rightly recognize Pacific Rim as among the best of the summer season and likely deserving of a place in the pantheon of revered SF films with long ancillary afterlives, should remind us of other scalar disjunctions in play: for all their power and reach (see: the just-concluded San Diego Comic Con), fans remain a subculture, their beloved visions, no matter how expansive, dwarfed by the relentless output of a mainstream-oriented culture industry.

Hugo

Count me among those few unfortunates who didn’t “get” Hugo. Not that I failed to see what Martin Scorsese’s opulent adaptation of Brian Selznick’s graphic novel was trying to accomplish; but for me the movie’s attempt to channel and amplify a certain magic succeeded merely in bringing it leadenly to earth, burying a fragile tapestry of early-cinema memories beneath a giant cathedral (or in this case, a train station) of digital visual effects and overly literal storytelling. Some of my disaffection was likely due to the viewing situation: I watched it flat and small, at home on my (HD)TV, rather than big and deep, and the lack of 3D glasses or IMAX overwhelm surely contributed to the lackluster experience. Yet I resist this bullying-by-format, and find it at odds with what Scorsese, through his marvelous machinery, appeared to be promoting: the idea that “movie magic” can blossom on the squarest of screens, the grainiest and most monochromatic of film stocks, not even needing synchronized sound to work its wonders on an audience.

I watched Hugo for a class I am co-teaching this term on the varieties of realism and “reality” deployed across multiple media, starting with early cinema’s dazzling blend of spectacularized “actuality” and suddenly plentiful illusion. Often reduced to a binary opposition between the work of Louis and Auguste Lumiere on the one hand and Georges Melies on the other, the era was more like the first femtoseconds following the Big Bang, in which cinema’s principal forces had not yet unfolded themselves from the superheated plasma of possibility. Hugo tries to take us back to that primordial celluloid soup, or more precisely to a fantasized moment — itself still hauntingly young — when the medium of film began to be aware of its own past, that is, to recognize itself as a medium and not, as one Louis Lumiere quote would have it, “an invention without a future.” Through an overwrought relay of intermediaries that includes a young boy and girl, a nascent cinema scholar, a dead father, a semifunctional automaton, and a devoted wife, that past is resurrected in the form of Melies himself, cunningly impersonated by Ben Kingsley, and it is the old magician’s backstory, dominating the final third of the film, that jolted both me and my students back to full, charmed attention.

I suppose there is something smart and reflexive about the way Hugo buries its lead, staging the delights of scholarly and aesthetic discovery through a turgid narrative that only slowly unveils its embedded wonders. But for me, too much of the movie felt like a windup toy, a rusty clockwork, a train with a coal-burning engine in dire need of stoking.

The Grey

It’s always interesting to witness the birth of an articulation between actor and genre — the moment when Hollywood’s endlessly spinning tumblers click into place with a tight and seemingly inevitable fit, fusing star and vehicle into a single, satisfying commercial package. For Liam Neeson, that grave and manly icon, the slot machine hit its combinatorial jackpot with 2008’s Taken, a thriller whose narrative economy was as ruthlessly single-minded as its protagonist; Neeson played a former CIA agent whose mission to rescue his kidnapped daughter lent the bloody body count a moral justification, like a perfectly balanced throwing knife. Next came Unknown, less successful in leaping its logical gaps, but anchored nonetheless by Neeson’s morose, careworn toughness.

The Grey gives us Neeson as yet another hardened but sensitive man, another action hero whose uncompromising competence in the face of disaster is saved from implausible superhumanity by virtue of the fact that he seems so reluctant to be going through it all; you sense that Neeson’s characters really wish they were in another kind of movie. That reluctance makes him just right for this material, in which a plane full of grizzled and boastful oil-drilling workers crashes in remote Alaskan territory, on what turns out to be the hunting grounds and den territory of a pack of wolves. None of these guys wants to be there, especially after a crash scene so stroboscopically wrenching and whiplashy that it made my wife leave the room. Shortly after Neeson’s character makes his way to the smoking debris and discovers a handful of survivors, there is an exquisite scene in which he coaches a mortally wounded man through the final seconds of his life. “You’re going to die,” Neeson gently and compassionately growls, as around him the other tough guys weep and turn away.

Nothing else that happens in the story quite matches that moment, and since the rest of the film is essentially one long death scene, I found myself wishing that someone like Neeson could help put the movie out of its misery. Not that being hunted by wolves isn’t gripping — but as the group’s numbers dwindle and it becomes clearer that no one is going to survive this thing, the tone becomes so meditative that I found myself numbing out, as though frostbitten. I’ve written before about annihilation narratives, and I continue to respond to the mordant pleasures of their zero-sum games, which appeal, I suspect, to the same part of me that likes washing every dish in the kitchen and then hosing the sink clean while the garbage disposal whirrs. (Scott Smith’s book The Ruins is perhaps my favorite recent instance of the tale from which no one escapes.) The Grey’s slow trudge to its concluding frames induced a trance more than a chill, but the experience stayed with me for days afterward.

Don’t Be Afraid of the Dark

It’s hard to pinpoint the primal potency of the original Don’t Be Afraid of the Dark, the 1973 telefilm about a woman stalked by hideous little troll monsters in the shadowy old house where she lives with her husband. The story itself, a wisp of a thing, has the unexplained purity of a nightmare piped directly from a fevered mind: both circuitously mazelike and stiflingly linear, it’s like watching someone drown in a room slowly filling with water. As with contemporaries Rosemary’s Baby and The Stepford Wives, it’s a parable of domestic disempowerment built around a woman whose isolation and vulnerability grow in nastily direct proportion to her suspicion that she is being hunted by dark forces. All three movies conclude in acts of spiritual (if not quite physical) devouring and rebirth: housewives eaten by houses. For the boy I was then, Don’t Be Afraid of the Dark provoked a delicious, vertiginous sliding of identification, repulsion, and desire: the doomed protagonist, played by Kim Darby, merged the cute girl from the Star Trek episode “Miri” with the figure of my own mother, whose return to full-time work as a public-school librarian, I see now, had triggered tectonic shifts in my parents’ relationship and the continents of authority and affection on which I lived out my childhood. These half-repressed terrors came together in the beautiful, grotesque design of the telefilm’s creatures: prunelike, whispery-voiced gnomes creeping behind walls and reaching from cupboards to slice with razors and switch off the lights that are their only weakness.

The 2011 remake, which has nothing of the original’s power, is nevertheless valuable as a lesson in the danger of upgrading, expanding, complicating, and detailing a text whose low-budget crudeness in fact constitutes televisual poetry. Produced by Guillermo del Toro, the movie reminds me of the dreadful watering-down that Steven Spielberg experienced when he shifted from directing to producing in the 1980s, draining the life from his own brand (and is there not a symmetry between this industrial outsourcing of artistry and the narrative’s concern with soul-sucking?). The story has been tampered with disastrously, introducing a little girl to whom the monsters (now framed as vaguely simian “tooth fairies”) are drawn; the wife, played by a bloodless Katie Holmes, still succumbs in the end to the house’s demonic sprites, but the addition of a maternal function forces us to read her demise as noble sacrifice rather than nihilistic defeat, and when the husband (Guy Pearce) walks off with the daughter in the closing beat, it comes unforgivably close to a happy ending. As for the monsters, now more infestation than insidiousness, they skitter and leap in weightless CGI balletics, demonstrating that, as with zombies, faster does not equal more frightening. But for all its evacuation of purpose and punch, the remake is useful in locating a certain indigestible blockage in Hollywood’s autocannibalistic churn, enshrining and immortalizing — through its very failure to reproduce it — the accidental artwork of the grainy, blunt, wholly sublime original.

The Ides of March

Ryan Gosling is something of a Rorschach blot for me; he makes an entirely different impression from film to film, skeevy-charming in Lars and the Real Girl, dopey and dense in Crazy, Stupid, Love, stonily heroic in Drive (where he brings to mind both Gary Cooper and that “axiom of the cinema” Charlton Heston), pathetic to the point of unwatchability in Blue Valentine. Good movies all — Drive is actually great — and I suppose good performances, but Gosling’s particular brand of method acting creates a hollow at their center.

The Ides of March, directed by George Clooney and adapted by Clooney, his frequent collaborator Grant Heslov, and Beau Willimon from Willimon’s play “Farragut North,” continues this trend, giving us Gosling as Stephen Meyers, a slick political operator working for Democratic governor Mike Morris (Clooney) in his bid for the presidential nomination. The story hinges on Meyers’s loss of innocence, a pivot from sincere belief in his candidate’s values and virtues to a cynical acceptance of the actual machinations of electoral campaigns. The movie channels the bleak paranoia of any number of 70s thrillers and satires, from All the President’s Men to The Candidate, but ultimately feels slight and underdeveloped; there are too few moving parts to the plot, and the series of betrayals and reversals can be spotted well in advance.

The largest problem, though, is that Gosling’s character barely seems to change, even as the narrative describes his fall from grace. Is this because of an overly economical script that follows too closely the limited staging of its theatrical source? Is it the film’s portrayal of a political scandal that seems almost quaint compared to the hateful and incoherent hunger games being played out by the current crop of candidates for the Republican presidential ticket? Or is it because Gosling himself, as the apotheosis of a certain kind of absent acting style, registers from the start as someone who’s merely going through the motions?

Super 8: The Past Through Tomorrow

Ordinarily I’d start this with a spoiler warning, but under our current state of summer siege — one blockbuster after another, each week a mega-event (or three), movies of enormous proportions piling up at the box office like the train-car derailment that is Super 8’s justly lauded setpiece of spectacle — the half-life of secrecy is a short one. If you haven’t watched Super 8 and wish to experience its neato if limited surprises as purely as possible, read no further.

This seems an especially important point to make in relation to J. J. Abrams, who has demonstrated himself a master of, if not precisely authorship as predigital literary theory would recognize it, then a kind of transmedia choreography, coaxing a miscellany of texts, images, influences, and brands into pleasing commercial alignment with sufficient regularity to earn himself his own personal brand as auteur. As I noted a few years back in a pair of posts before and after seeing Cloverfield, the truism expounded by Thomas Elsaesser, Jonathan Gray, and others — that in an era of continuous marketing and rambunctiously recirculated information, we see most blockbusters before we see them — has evolved under Abrams into an artful game of hide and seek, building anticipation by highlighting in advance what we don’t know, first kindling then selling back to us our own sense of lack. More important than the movie and TV shows he creates are the blackouts and eclipses he engineers around them, dark-matter veins of that dwindling popular-culture resource: genuine excitement for a chance to encounter the truly unknown. The deeper paradox of Abrams’s craft registered on me the other night when, in an interview with Charlie Rose, he explained his insistence on secrecy while his films are in production not in terms of savvy marketing but as a peace offering to a data-drowned audience, a merciful respite from the Age of Wikipedia and TMZ. No less clever for their apparent lack of guile, Abrams’s feats of paratextual prestidigitation mingle the pleasures of nostalgia with paranoia for the present, allying his sunny simulations of a pop past with the bilious mutterings of information-overload critics. (I refuse to use Bing until they change their “too much information makes you stupid” campaign, whose head-in-the-sand logic seems so like that of Creationism.)

The other caveat to get out of the way is that Abrams and his work have proved uniquely challenging for me. I’ve never watched Felicity or Alias apart from the bits and pieces that circulated around them, but I was a fan of LOST (at least through the start of season four), and enjoyed Mission: Impossible III — in particular one extended showoff shot revolving around Tom Cruise as his visage is rebuilt into that of Philip Seymour Hoffman; no better metaphor for Cruise’s lifelong pursuit of acting cred can be conceived. But when Star Trek came out in 2009, it sort of short-circuited my critical faculties. (It was around that time I began taking long breaks from this blog.) At once a perfectly made pop artifact and a wholesale desecration of my childhood, Abrams’s Trek did uninvited surgery upon my soul, an amputation no less traumatic for being so lovingly performed. My refusal to countenance Abrams’s deft reboot of Gene Roddenberry’s vision is surely related to my inability to grasp my own death — an intimation of mortality, yes, but also of generationality, the stabbing realization that something which had defined me for so many years as stable subject and member of a collective, my herd identity, had been reassigned to the cohort behind me: a cohort whose arrival, by calling into existence “young” as a group to which I no longer belonged, made me — in a word — old. Just as if I had found Roddenberry in bed with another lover, I must encounter the post-Trek Abrams from within the defended lands of the ego, a continent whose troubled topography was sculpted not by physical law but by tectonics of desire, drive, and discourse, and whose Lewis and Clark were Freud and Lacan. (Why I’m so touchy about border disputes is better left for therapy than blogging.)

Given my inability to see Abrams’s work through anything other than a Bob-shaped lens, I thought I would find Super 8 unbearable, since, like Abrams, I was born in 1966 (our birthdays are only five days apart!), and, like Abrams, I spent much of my adolescence making monster movies and worshipping Steven Spielberg. So much of my kid-life is mirrored in Super 8, in fact, that at times it was hard to distinguish it from my family’s 8mm home movies, which I recently digitized and turned into DVDs. That gaudy Kodachrome imagery, adance with grain, is like peering through the wrong end of a telescope into a postage-stamp Mad Men universe where it is still 1962, 1965, 1967: my mother and father younger than I am today, my brothers and sisters a blond gaggle of grade-schoolers, me a cheerful, big-headed blob (who evidently loved two things above all else: food and attention) showing up in the final reels to toddle hesitantly around the back yard.

Fast-forward to the end of the 70s, and I could still be found in our back yard (as well as our basement, our garage, and the weedy field behind the houses across the street), making movies with friends at age twelve or thirteen. A fifty-foot cartridge of film was a block of black plastic that held about three minutes of reality-capturing substrate. As with the Lumiere cinematographe, the running time imposed formal constraints on the stories one could tell; unless or until you made the Griffithian breakthrough to continuity editing, scenarios were envisioned and executed based on what was achievable in-camera. (In amateur cinema as in embryology, ontogeny recapitulates phylogeny.) For us, this meant movies built around the most straightforward of special effects — spaceship models hung from thread against the sky or wobbled past a painted starfield, animated cotillions of Oreo cookies that stacked and unstacked themselves, alien invaders made from friends wrapped in winter parkas with charcoal shadows under the eyes and (for some reason) blood dripping from their mouths — and titles that seemed original at the time but now announce how occupied our processors were with the source code of TV and movie screens: Star Cops, Attack of the Killer Leaves, No Time to Die (a spy thriller containing a stunt in which my daredevil buddy did a somersault off his own roof).

Reconstructing even that much history reveals nothing so much as how trivial it all was, this scaling-down of genre tropes and camera tricks. And if cultural studies says never to equate the trivial with the insignificant or the unremarkable, a similar rule could be said to guide production of contemporary summer blockbusters, which mine the detritus of childhood for minutiae to magnify into $150 million franchises. Compared to the comic-book superheroes who have been strutting their Nietzschean catwalk through the multiplex this season (Thor, Green Lantern, Captain America et-uber-al), Super 8 mounts its considerable charm offensive simply by embracing the simple, giving screen time to the small.

I’m referring here less to the protagonists than to the media technologies around which their lives center, humble instruments of recording and playback which, as A. O. Scott points out, contrast oddly with the high-tech filmmaking of Super 8 itself as well as with the more general experience of media in 2011. I’m not sure what ideology informs this particular retelling of modern Genesis, in which the Sony Walkman begat the iPod and the VCR begat the DVD; neither Super 8’s screenplay nor its editing develops the idea into anything like a commentary, ironic or otherwise, leaving us with only the echo to mull over in search of meaning.

A lot of the movie is like that: traces and glimmers that rely on us to fill in the gaps. Its backdrop of dead mothers, emotionally-checked-out fathers, and Area 51 conspiracies is as economical in its gestures as the overdetermined iconography of its Drew Struzan poster (below), an array of signifiers like a subway map to my generation’s collective unconscious. Poster and film alike are composed and lit to summon a linkage of memories some thirty years long, all of which arrive at their noisy destination — there’s that train derailment again — in Super 8.

I don’t mind the sketchiness of Super 8’s plot any more than I mind its appropriation of 1970s cinematography, which trades the endlessly trembling camerawork of Cloverfield and Star Trek for the multiplane composition, shallow focus, and cozily cluttered frames of Close Encounters of the Third Kind. (Abrams’s film is more intent on remediating CE3K’s rec-room mise-en-scene than its Douglas-Trumbull lightshows.) To accuse Super 8 of vampirizing the past is about as productive as dropping by the castle of that weird count from Transylvania after dark: if the full moon and howling wolves haven’t clued you in, you deserve whatever happens, and to bring anything other than a Spielberg-trained sensibility to a screening of Super 8 is like complaining when Pop Rocks make your mouth feel funny.

What’s exceptional, in fact, about Super 8 is the way it intertextualizes two layers of history at once: Spielberg films and the experience of watching Spielberg films. It’s not quite Las Meninas, but it does get you thinking about the inextricable codependence between a text and its reader, or in the case of Spielberg and Abrams, a mainstream “classic” and the reverential audience that is its condition of possibility. With its ingenious hook of embedding kid moviemakers in a movie their own creative efforts would have been inspired by (and ripped off of), Super 8 threatens to transcend itself, simply through the squaring and cubing of realities: meta for the masses.

Threatens, but never quite delivers. I agree with Scott’s assessment that the film’s second half fails to measure up to its first — “The machinery of genre … so ingeniously kept to a low background hum for so long, comes roaring to life, and the movie enacts its own loss of innocence” — and blame this largely on the alien, a digital McGuffin all too similar to Cloverfield’s monster and, now that I think about it, the thing that tried to eat Kirk on the ice planet. Would that Super 8’s filmmakers had had the chutzpah to build their ode to creature features around a true evocation of 70s and 80s special effects, recreating the animatronics of Carlo Rambaldi or Stan Winston, the goo and latex of Rick Baker or Rob Bottin, the luminous optical-printing of Trumbull or Robert Abel, even the flickery stop motion of Jim Danforth and early James Cameron! It might have granted Super 8’s Blue-Fairy wish with the transmutation the film seems so desperately to desire — that of becoming a Spielberg joint from the early 1980s (or at least time-traveling to climb into its then-youthful body like Sam Beckett or Marty McFly).

Had Super 8’s closely guarded secret turned out to be our real analog past instead of a CGI cosmetification of it, the movie would be profound where it is merely pretty. Super 8 opens with a wake; sour old man that I am, I wish it had had the guts to actually disinter the corpse of a dead cinema, instead of just reminiscing pleasantly beside the grave.

 

Paranormal Activity 2

I was surprised to find myself eager, even impatient, to watch Paranormal Activity 2, the followup to 2007’s no-budget breakthrough in surveillance horror. I wrote of the first movie that it delivered a satisfactory double-action of filmic and commercial engineering, chambering and firing its purchased scares in a way that felt requisitely unexpected and, at the same time, reassuringly predictable. The bonus of seeing it in a theater (accompanied by my mom, and my therapist would like to know why I chose to omit that fact from the post) was a happy reminder that crowds can improve, rather than degrade, the movie experience.

PA2 I took in at home after my wife had gone to bed. I lay on a couch in a living room dark except for the low light of a small lamp: a setting well-suited to a drama that takes place largely in domestic spaces at night. My complete lack of fear or even the faint neck-tickle of eeriness probably just proves the truism that some movies work best with an audience — but let’s not forget that cinema does many kinds of work, and offers many varieties of pleasure. This is perhaps especially true of horror, whose glandular address of our viscera places it among the body genres of porn and grossout comedy (1), and whose narratives of inevitable peril and failed protection offer a plurality of identifications where X marks the intersection of the boy-girl lines of gender and the killer-victim lines of predation (2).

I’m not sure what Carol Clover would make of Paranormal Activity 2 or its predecessor (though see here for a nice discussion), built as they are on the conceit of a gaze so disinterested it has congealed into the pure alienness of technology. Shuffled among the mute witnesses of a bank of home-security cameras, we are not in the heads of Alfred Hitchcock, Stanley Kubrick, or even Gaspar Noe, but instead in the sensorium — and sensibility — of HAL. A good part of the uncanniness, and hence the fun, comes from the way this setup eschews the typical constructions of cinematography: conventions of framing and phrasing that long ago (with the rise of classical film style) achieved their near-universal legibility at the cost of their willingness to truly disrupt and disturb. PA2 is grindhouse Dogme, wringing chills from its formal obstructions.

Rather than situating us securely in narrative space through establishing shots and analytic closeups, shot-reverse-shot, and point-of-view editing, PA2 either suspends us outside the action, hovering at the ceiling over kitchens and family rooms rendered vast as landscapes by a wide-angle lens, or throws us into the action in handheld turmoil that even in mundane and friendly moments feels violent. The visuals and their corresponding “spatials” position viewers as ghosts themselves, alternately watching from afar in building frustration and hunger, then taking possession of bodies for brief bouts of hot corporeality. Plotwise, we may remain fuzzy on what the spectral force in question (demon? haunted house? family curse?) finally wants, but on the level of spectatorial empathy, it is easy to grasp why it both hates and desires its victims.

Along with von Trier, other arty analogs for PA2 might be Chantal Akerman’s Jeanne Dielman or Laura Mulvey and Peter Wollen’s Riddles of the Sphinx, which similarly locate us both inside and outside domestic space to reveal how houses can be “haunted” by gender and power. They share, that is, a clinical interest in the social and ideological compartmentalization of women, though in the Paranormal Activity films the critique remains mostly dormant, waiting to be activated in the readings of brainy academics. (Certainly one could write a paper on PA2’s imbrication of marriage, maternity, and hysterical “madness,” or on the role of technological prophylaxis in protecting the white bourgeoisie from an Other coded not just as female but as ethnic.)

But for this brainy academic, what’s most interesting about PA2 is the way it weaves itself into the first film. Forming an alternate “flightpath” through the same set of events, the film establishes a tight set of linkages to the story of Micah and Katie, unfolding before, during, and after their own deadly misadventure of spirit photography gone wrong. It is simultaneously prequel, sequel, and expansion to Paranormal Activity, and hence an example — if a tightly circumscribed one — of transmedia storytelling, in which a fictional world, its characters, and events can be visited and revisited in multiple tellings. In comments on my post on 2008’s Cloverfield, Martyn Pedler pointed out that film’s transmedia characteristics, and I suggested at the time that “Rather than continuing the story of Cloverfield, future installments might choose to tell parallel or simultaneous stories, i.e. the experiences of other people in the city during the attack.”

Paranormal Activity 2 does precisely that for its tiny, spooky universe. It may not be the scariest movie I’ve seen lately, but for what it implies about the evolving strategies of genre cinema in an age of new media, it’s one of the more intriguing.

Works Cited

1. Linda Williams, “Film Bodies: Gender, Genre, and Excess,” Film Quarterly 44:4 (Summer 1991), 2-13.

2. Carol J. Clover, Men, Women, and Chain Saws: Gender in the Modern Horror Film (Princeton: Princeton University Press, 1992).