Beep … Beep … Beep …

sputnik-image-1.gif

The Soviet satellite Sputnik, launched fifty years ago today, is stitched into my family history in an odd way. A faded Polaroid photograph from that year, 1957, shows my older siblings gathered in the living room of my family’s old house. The brothers and sisters I would come to know as noisily lumbering teenage creatures who alternately vied for my attention and pounded me into the ground are, in the image, blond toddlers messing around with toys. There also happens to be a newspaper in the frame. On its front page is the announcement of Sputnik’s launch.

Whatever the oblique and contingent quality of this captured moment — one time-stopping medium (newsprint) preserved within another (the photograph) — I’ve always been struck by how it layers together so many kinds of lost realities, realities whose nature and content I dwell upon even though, or because, I never knew them personally. Sputnik’s rhythmically beeping trajectory through orbital space echoes another, more idiomatic “outer space,” the house where my family lived in Ann Arbor before I was born (in the early 1960s, my parents moved across town to a new location, the one that I would eventually come to know as home). These spaces are not simply lost to me, but denied to me, because they existed before I was born.

Which is OK. Several billion years fall into that category, and I don’t resent them for predating me, any more than I pity the billions to come that will never have the pleasure of hosting my existence. (I will admit that the only time I’ve really felt the oceanic impact of my own inevitable death was when I realized how many movies [namely all of them] I won’t get to see after I die.) If I’m envious of anything about that family in the picture from fall 1957, it’s that they got to be part of all the conversations and headlines and newspaper commentaries and jokes and TV references and whatnot — the ceaseless susurration of humanity’s collective processing — that accompanied the little beeping Russian ball as it sliced across the sky.

As a fan of the U.S. space program, I didn’t think I really cared that much about Sputnik until I caught a story from NPR on today’s Morning Edition, which profiled the satellite’s designer, Sergei Korolev. One of Korolev’s contemporaries, Boris Chertok, relates how Sputnik’s shape “was meant to capture people’s imagination by symbolizing a celestial body.” It was the first time, to be honest, I’d thought about satellites being designed as opposed to engineered — shaped by forces of fashion and signification rather than the exigencies of physics, chemistry, and ballistics. One of the reasons I’ve always liked probes and satellites such as the Surveyor moon probe, the Viking Martian explorer, the classic Voyager, and my personal favorite, the Lunar Orbiter 1 (pictured here), is that their look seemed entirely dictated by function.

lunar_orbiter.jpg

Free of extras like tailfins and raccoon tails, flashing lights and corporate logos, our loyal emissaries to remoteness like the Mariner or Galileo probes possessed their own oblivious style, made up of solar panels and jutting antennae, battery packs and circuit boxes, the mouths of reaction-control thrusters and the rotating faces of telemetry dishes. Even the vehicles built for human occupancy — Mercury, Gemini, and Apollo capsules — I found beautiful, or in the case of Skylab or the Apollo missions’ Lunar Module, beautifully ugly, with their crinkled reflective gold foil, insectoid angles, and crustacean asymmetries. My reverence for these spacefaring robots wasn’t limited to NASA’s work, either: when the Apollo-Soyuz docking took place in 1975 (I was nine years old then; the docking and Sputnik’s launch sit equidistant on either side of my 1966 birth), it was like two creatures from the deep sea getting it on — literally bumping uglies.

apollo-soyuz.jpg

So the notion that Sputnik’s shape was supposed to suggest something, “symbolizing a celestial body,” took me at first by surprise. But I quickly came to embrace the idea. After all, the many fictional space- and starships that have obsessed me from childhood — the Valley Forge in Silent Running, the rebel X-Wings and Millennium Falcon from Star Wars, the mothership from Close Encounters of the Third Kind, the Eagles from Space: 1999, and of course the U.S.S. Enterprise from Star Trek — are, to a one, the product of artistic over technical sensibilities, no matter how the modelmakers might have kit-bashed them into verisimilitude. And if Sputnik’s silver globe and backswept antennae carried something of the ’50s zeitgeist about them, it’s but a minuscule reflection of the satellite’s much larger, indeed global, signification: the galvanizing first move in a game of orbital chess, the pistol shot that started the space race, the announcement — through the unbearably lovely, essentially passive gesture of free fall, a metal ball dropping endlessly toward an earth that swept itself smoothly out of the way — that the skies were now open for warfare, or business, or play, as humankind chooses.

Happy birthday, Sputnik!

Upgrades

halo-3-1.jpg

I like the tired, almost fatalistic tone of Charles Herold’s New York Times review of Halo 3; it’s an unusually self-reflexive piece of videogame criticism. “It doesn’t really matter what reviewers say,” Herold writes with more than a hint of a cynical sigh. “Halo 3 is not just a game: it is a phenomenon fueled by obsessed fans, slick advertising and excessive press coverage (of which I find myself a part).”

Wisely, Herold approaches this newest version of Bungie’s blockbuster series less as a chapter or sequel than as an upgrade. The gist of his review — a line repeated twice, like a mantra — is that “Halo 3 is Halo 2 with somewhat better graphics.” The game’s strengths, he asserts, are in its enhancements to the multiplayer experience, an experience that I consider indistinguishable from the “obsessed fans” and “excessive press coverage” that Herold cites. That is to say, Halo 3 is as much a social game, in its way, as World of Warcraft or Second Life.

Before fans of those elaborate MMORPGs object, let me stipulate that Halo and other shooters involve very different aesthetics of play and intellectual engagement. Movement and communication in deathmatch are channeled and intensified by tactical exigence; interactions are of necessity fast and brutal, and the only economies one deals in are those of weapons and ammo. Foundational dynamics of avatarial immersion and what I have elsewhere called “spectatorial play” are, of course, present in Halo, as they are in any other videogame. But the MMORPG and the FPS deathmatch remain two distinct branches of ludic evolution.

What’s interesting to me about Halo 3‘s huge, immediate, predictable success is that it casts into sharp relief a vast preexisting social base of gamers who sit ready with their Xbox 360s to spend hours, days, weeks, and months hunting and blasting each other in the virtual arenas provided by this latest upgrade: a package of new spatialities to explore and master. This base is as loyal as the most devout religious faith, the most engaged political party. (Indeed, I suspect that today’s online gaming audiences, which merge the pragmatics of commercial technology with the mysticism of avatarial transubstantiation, will be looked back upon by future historians as the first true hybridizations of the secular and religious communities.)

It’s too easy to say that Halo differs from something like World of Warcraft in its bloodiness and speed, its apparent simplicity (or primitiveness). I think the more profound distinction lies in the fact that Halo has colonized social spaces beyond those of the MMORPG, something that became clear to me a couple of years ago when I taught a course in videogame ethnography at Indiana University. Many of my students played only Halo and sports games like Madden; these players would never go near EverQuest, for example, because only “hardcore” gamers — the real geeks — played that. Halo, in other words, succeeds as a game because it has gone mainstream, become something one can mention without embarrassment. It nestles much more closely and comfortably into the crenellations and capillaries of real-world social dynamics; it is, in this sense, the norm.

Herding the Nerds

chuck.jpg

This cheerful fellow is Chuck Bartowski (Zachary Levi), the lead character on NBC’s new series Chuck. Notable not just for its privileged placement as the lead-in to Heroes on Monday nights, Chuck stands out to me as a particularly significant shift in how primetime network TV audiences are conceptualized and spoken to — or to use a fancy but endlessly useful term from Louis Althusser’s work, interpellated. Like Heroes, Chuck offers viewers a user-friendly form of complex, mildly fantastical narrative: call it science-fiction lite, albeit several shades “liter” than Heroes‘s chase-structured saga of anxious, insecure superheroes or Lost‘s island of brooding mindgamers. In fact, as I think back over the family tree from which Chuck seems to have sprouted, I detect a steady march toward the innocuous: consider the progression from Twin Peaks to The X-Files to Buffy the Vampire Slayer to Lost and its current offspring. Dark stuff, yes, but delivered in ever more buoyantly funny and sexy packages. (Only Battlestar Galactica seems determined to honor its essentially traumatic and decentering premise.)

But maybe I’m putting Chuck in the wrong box. The show is a cross between Alias and Ed, with a little sprinkle of Moonlighting and a generous glaze of rapid, self-conscious dialogue in the style of Gilmore Girls and Grey’s Anatomy. The main character works at a big electronics chain (think The 40-Year-Old Virgin, though here the brand milieu has not even been disguised to the degree it was in that movie — Chuck is a member of the “Nerd Herd,” a smug shout-out to the corporately manufactured tech cred of Best Buy’s Geek Squad). The conceit on which the show wagers the audience’s suspension of disbelief is that Chuck has acquired a vast trove of classified “intel” through exposure to an image-filled email (think the cursed videotape of The Ring crossed with the brainwashing montage of The Parallax View) and now finds himself at the intersecting foci of several large, conspiratorial, deadly government organizations. Not a setup that necessarily oozes laughs and romance. But laughs and romance is what, in this case, we get. Like Levi’s unassuming good looks, Chuck‘s puppydog appeal seems destined to win over audiences — though handicapping shows this early is a fool’s game.

Monday’s pilot episode efficiently established that Chuck will be aided in his navigation of the dangers ahead not just by a beautiful female secret agent (Yvonne Strahovski) but also by his nebbishy sidekick at “Buy More.” In case we were skeptical of Chuck’s pedigree as a leading man, that is, the scenario helpfully supplies a socially stunted homunculus in the form of his buddy Morgan Grimes (Joshua Gomez), shifting the burden of comic relief from Chuck and thereby moving him a crucial rung up the status ladder. While Chuck responds to his sudden acquisition of epistemological ordnance with deer-in-the-headlights cluelessness, Morgan takes to the larger game right away, advising Chuck on sexual tactics and the correct way to defeat a ninja with a sureness of touch derived equally, one presumes, from excessive masturbation and too many hours playing Splinter Cell and Metal Gear Solid.

What impresses me is that Chuck not only gives us a scenario geared toward geek fetishes, but embeds within itself a passel of geeks as decoding agents and centers of action. In this regard, it’s something like a virtual-reality program running on Star Trek‘s holodeck: a choose-your-own-adventure game whose ideological lure consists of nakedly mirroring its player/viewer in the form of a central character who is not, at least at first glance, a fantasy ideal, a performative mask, a second skin. It impresses me, but also alarms me. It is a storytelling tactic that all too serendipitously echoes the larger strategies of that most expert and ephemeral of modern commodities, the TV serial, which seeks to win from us one simple thing: our ongoing commitment. Chuck acknowledges that its audience is made up of prefab fans, eager for new affiliations, no matter how machine-lathed and focus-group-tested those engines of imaginary engagement may be. (In fact, it congratulates the audience for “getting” its nested inside jokes, chief among which is the fact of their own commodification; the ideal viewer of Chuck is precisely the illegal torrenter of media that NBC hopes to convert with its downloadable content.) Plotwise, with what I’m confident will be a spiraling shell-game of reveals and cliffhangers building toward some season-ending epiphany, Chuck will surely feed the jouissance of genre-based prediction and diegetic decryption that Tim Burke has labelled “nerd hermeneutics.”

I may or may not watch Chuck, depending on my taste for my own taste for cotton candy. By comparison, Heroes, which opened its second season on Monday night, was reassuringly byzantine and willing to frustrate. Perhaps that’s the value of new TV shows, which seem so off-puttingly transparent in the way they play on our pleasures — revealing to us how easily we are taken in (and taken aboard), again and again. New shows come along and retroactively establish the authenticity of their predecessors, just as Chuck, measured against next year’s new offerings, will surely ripen from empty copy to cherished original.

NBC’s Heroic Measures

heroes_title_card.png

Though I’m sure spoilerrific information is out there — perhaps in the fall previewing going on at The Extratextuals, which I look forward to reading starting tomorrow — I’m as pure as the driven snow when it comes to tonight’s season premiere of Heroes. I’m something of a late adopter of this show, having dived into the series a third of the way through its first season. (I still remember the blissful November weekend I spent bingeing on the first six episodes.) Now, like much of the country, I’m feeling the crazy wound-up energy of settling in for a great roller coaster ride. I love being in the midst of an ongoing, expertly told story which is also a game of expectations: the audience saying Yeah, but can you top yourself? and the show saying (literally) Just watch me.

So I’m glad I know nothing of what’s going to happen on a narrative level. On the industrial side, I’m equally unsure. NBC’s decision to yank its programming from iTunes struck me as remarkably stupid, especially as I imagine that the Heroes audience trends toward (A) those who will happily pay $1.99 to download episodes and (B) those who will equally happily acquire the content through torrents, peer-to-peer, or other means if the first source dries up. (Acting out the teleplays with hand-puppets, perhaps?) But at the same time, I work on a daily basis with a very devoted sector of the Heroes demographic — college students — who, I learned while teaching a course on television and new media last spring, don’t particularly mind watching their TV on network websites like NBC’s. They even stick around for the embedded advertising, which is what drives me away from such options. Maybe these young men and women lack the grouchy hacker-derived ethic which still grumbles in my guts that Information should be free … of excessive branding. Whatever the reason for these students’ easy acceptance of NBC’s proprietary flow, I applaud them for it. Whatever gets you the TV you want to see.

So NBC’s gamble of offering episodes as free downloads with a one-week expiration date may work out after all. Me, I’ll be watching tonight’s Heroes the old-fashioned way: on tape, timeshifted by an hour so my wife can catch the premiere of The Bachelor.

On Fanification

A recent conversation on gender and fandom hosted at Henry Jenkins’s blog prompted me to hold forth on the “fanification” of current media — that is, my perception that mainstream television and movies are displaying ever more cultlike and niche-y tendencies even as they remain giant corporate juggernauts. Nothing particularly earthshaking in this claim; after all, the bifurcation and multiplication of TV channels in search of ever more specialized audiences is something that’s been with us since the hydra of cable lifted its many heads from the sea to do battle with the Big Three networks.

My point is that, after thirty-odd years of this endless subdivision and ramification, texts themselves are evolving to meet and fulfill the kinds of investments and proficiencies that — once upon a time — only the obsessive devotees of Star Trek and soap operas possessed. The robustness and internal density of serialized texts, whether in small-screen installments of Lost or big-screen chapters of Pirates of the Caribbean, anticipates the kind of scrutiny, dissection, and alternate-path-exploring appropriate to what Lizbeth Goodman has called “replay culture.” More troublingly, these textual attributes hide the mass-produced and -marketed commodity behind the artificially generated underdog status of the cult object: in a kind of adaptive mimicry, the center pretends that it is the fringe. And audiences, without knowing they are doing so, complete the ideological circuit by acting as fans, even though the very notion of “fan” becomes insupportable once it achieves mainstream status. (In other words, to paraphrase The Incredibles, if everyone’s a fan, then no one is.)

As evidence of the fanification of mainstream media, one need look no further than Alessandra Stanley’s piece in this Sunday’s New York Times. In her lead essay for a special section previewing the upcoming fall TV season, Stanley writes of numerous ways in which today’s TV viewer behaves, for all intents and purposes, like the renegade fans of yore — mapping, again, a minority onto a majority. Here are a few quotes:

… Viewers have become proprietary about their choices. Alliances are formed, and so are antipathies. Snobbery takes root. Preferences turn totemic. The mass audience splintered long ago; now viewers are divided into tribes with their own rituals and rites of passage.

A favorite show is a tip-off to personality, taste and sophistication the way music was before it became virtually free and consumed as much by individual song as artist. Dramas have become more complicated; many of the best are serialized and require time and sequential viewing. If anything, television has become closer to literature, inspiring something similar to those fellowships that form over which authors people say they would take to the proverbial desert island.

In this Balkanized media landscape, viewers seek and jealously guard their discoveries wherever they can find them.

Before the Internet, iPhones and flash drives, people jousted over who was into the Pixies when they were still a garage band or who could most lengthily argue the merits of Oasis versus Blur. Now, for all but hardcore rock aficionados, one-upmanship is more likely to center around a television series.

Stanley concludes her essay by suggesting that to not be a fan is to risk social censure — a striking inversion of the cultural coordinates by which geekiness was once measured (and, according to the values of the time, censured). “People who ignore [TV’s] pools and eddies of excellence do so at their own peril,” Stanley writes. “They are missing out on the main topic of conversation at their own table.” Her points are valid. I just wish they came with a little more sense of irony and even alarm. For me, fandom has always been about finding something authentic and wonderful amid the dross. Fandom is, among other things, a kind of reducing valve, a filter for what’s personal and exciting and offbeat. If mass media succeeds in de-massing itself, what alternative — what outside — is left?

Britney and Bush: The Comeback Kids

bush.jpg
britney.jpg

Last week saw an astonishing play of parallels across our media screens – a twinned spectacle of attempted resurrection whose halves, while occupying two very different sets of cultural coordinates, were perhaps not so distinct when examined closely. If it’s true that the personal is the political, then the popular must be political too; and it’s no great stretch to say that, in contemporary media culture as well as contemporary politics, the rituals of rejuvenation are more alike than dissimilar.

Britney Spears’s opening number at the MTV Video Music Awards on September 9 has by now been thoroughly masticated and absorbed by the fast-moving digestive system of blogosphere critics, with TMZ and Perez Hilton leading the way. I won’t belabor Britney’s performance here, except to note that when I, like much of the nation, succeeded in finding online video documentation the next day despite the best efforts of MTV and Viacom, it was just as fascinatingly surreal (or surreally fascinating) to watch as promised: a case where the hype, insofar as it was a product of derision rather than promotion, accurately described the event.

The other performance last week was, of course, George W. Bush’s September 13 address to the nation discussing the testimony of General David Petraeus before Congress. Again, there is little point in rehearsing here what Petraeus had to say about Iraq, or what Bush took away from it. For weeks, the mainstream media had been cynically predicting that nothing in the President’s position would change, and when nothing did, the outraged responses were as strangely, soporifically comforting as anything on A Prairie Home Companion. The jagged edges of our disillusion have long since been worn to the gentle contour of rosary beads, the dissonance of our angry voices merged into the sweet anonymous harmony of a mass chorus (or requiem).

What stands out to me is the perfect structural symmetry between Britney’s and Bush’s public offerings, not of themselves – I’m too much a fan of Jean Baudrillard to suppose there is any longer a “there” there, that the individual corporeal truths of Britney and Bush did not long ago transubstantiate into the empty cartoon of the hyperreal – but of their acts. Both were lip-synching (one literally, one figuratively), and if Bush’s mouth matched the prerecorded lyrics more closely than did Britney’s, I can only ascribe it to his many more years of practice. I’ve always considered him not so much a man as a mechanism, a vibrating baffle like you’d find in a stereo speaker, emitting whatever modulated frequencies of conservative power and petroleum capital he was wired to. Unlike his father or the avatars of evil (Rove, Cheney, Rumsfeld) that surround him, Bush has never struck me as sentient or even, really, alive – he gives off the eerie sense of a ventriloquist’s dummy or a chess-playing robot. (Admittedly, he did at the height of his post-9/11 potency manifest another mode, the petulant child with unlimited power, like Billy Mumy’s creepy kid in the Twilight Zone episode “It’s a Good Life.”)

Britney, by contrast, seems much less in synch with the soundtrack of her life, which is what makes her so hypnotically sad and wonderfully watchable. I flinch away when words emanate from Bush the way I flinch when the storm of bats flies out of the cave early in Raiders of the Lost Ark; Britney’s clumsy clomping and uncertain smile at the VMAs are more like a series of crime-scene photos, slow-motion film of car wrecks, or the collapsing towers of the World Trade Center. Like any genuine trauma footage, you can’t take your eyes off it.

Where Britney and Bush came together last week was in their touching allegiance to strategies that worked for them in the past: hitting their marks before the cameras and microphones, they struck the poses and mouthed the words that once charmed and convinced, terrified and titillated. The magic has long since fled – you can’t catch lightning in a bottle twice, whether in the form of a terrorist attack or a python around the shoulders – but you can give it the old college try, and even if we’re repulsed, we’re impressed. But is it stamina or something more coldly automatic? Do we praise the gear for turning, the guillotine blade for dropping, the car bomb for exploding?

Game Studies Position at IU

I was excited to see that my former academic home base, the Department of Communication and Culture at Indiana University in Bloomington, is searching for a faculty hire in game studies. Here’s the job description:

Assistant Professor in Digital Media Studies

The Department of Communication and Culture at Indiana University invites applications for a tenure-track assistant professor position in digital media studies to begin Fall 2008. We seek an individual with expertise in critical approaches to digital media to join an innovative, interdisciplinary program that includes media studies, ethnography and performance studies, and rhetoric and public culture. While we invite candidates from a wide range of disciplinary backgrounds, we encourage applicants involved in research on the cultural, political, and communicative aspects of online games and in the broader field of digital game studies. Research may involve the formal qualities of digital games, their social and political dimensions, as well as questions of genre, narrative, and history. Applicants should be prepared to discuss the role that digital media play in shaping perceptions of history and culture, in forging individual and collective identities, and in mediating social change. Applicants are expected to have a strong research agenda and a commitment to excellence in teaching. Preference will be given to candidates who have their Ph.D. in hand by the date of appointment. Applicants should send a letter of application, curriculum vitae, writing sample, and three letters of recommendation. Review of applications will begin on November 16, 2007. Address applications to: Christopher Anderson, Chair, Digital Media Studies Search, Department of Communication and Culture, 800 East Third St., Indiana University, Bloomington IN 47405.

This is great news, not just because Bloomington’s a wonderful town (I lived there for six years), but because CMCL is a fantastic department, full of energetic and friendly scholars at both the graduate and faculty level. In recent years they’ve hired several young and exciting academics, including Phaedra Pezzullo, Ted Striphas, Mary L. Gray, and Josh Malitsky, all of whom do imaginative, politically engaged, boundary-crossing research. Meanwhile, the department has drawn ever greater numbers of M.A. and Ph.D. students, planted firmly in the fast lane of digital and new media studies. Perhaps the best thing about CMCL, though, is that it also honors the disciplines of traditional film and media studies as well as rhetoric and pedagogy, making for a truly rich and interdisciplinary environment. I recommend the job, the department, the university, and the town to all interested applicants.

A Ping from the Blogosphere

I don’t know how wise it is to shout out to one’s own shout-out; all the cross-blogging and interlinking might prove too much for the ephemeral fabric of the cybertextual continuum, opening a raw singularity out of which droning fleets of gramophones, films, and typewriters will fly like the repressed of our lost predigital literacies. Still, that won’t stop me from thanking Henry Jenkins for the kind mention of Graphic Engine on his blog, Confessions of an Aca-Fan. His take on my postings here, in particular those regarding the Harry Potter series [1], [2], [3], is very generous (and a reminder that I have another post or two in the pipeline on this subject). I’m especially happy to be showcased on Confessions of an Aca-Fan because Henry’s blog, along with Jason Mittell’s and Tim Burke’s, was what inspired me to dive into blogging this summer.

For the last several months, Henry’s been hosting an ongoing conversation/debate about gender and fandom, pairing male and female “aca-fen” for public dialogues (here’s the inaugural entry). My turn is coming up in a few weeks, when I’ll be discussing fandom and gender in light of new media industries with Suzanne Scott, a doctoral student at USC. My initial conversations with Suzanne have been fun and enlightening, and I look forward to sharing our discourse when the spotlight falls on us later in September.

As for Henry Jenkins, well, he’s an up-and-coming scholar with a very bright future. I’d keep my eye on him.

Coming Soon: Videogame/Player/Text

This is a plug for a new collection on videogame theory, Videogame/Player/Text, that’s just about to be published by Manchester University Press (here’s the official announcement). The editors, Barry Atkins and Tanya Krzywinska, came up with a great idea: invite game scholars to contribute chapters in which they turn a videogame of their choice inside out and upside down, shaking it wildly to see what insights tumble out.

Videogame/Player/Text

For me, Videogame/Player/Text was an opportunity to return to the subject of first-person shooters, which have interested me both as a player and an academic since my early days in grad school. (My master’s thesis, a Lacanian reading of FPS history written at the University of North Carolina at Chapel Hill, later became a chapter called “Playing at Being: Psychoanalysis and the Avatar” in The Video Game Theory Reader [Routledge, 2003], edited by Mark J. P. Wolf and Bernard Perron — Amazon link here.)

For Tanya and Barry’s collection, I wanted to get away from the puzzle of retrofitting film theory to videogames (which is still, for many game scholars, anathema) and write in a more medium-specific manner. My focus in “Of Eye Candy and Id: The Terrors and Pleasures of Doom 3” is the evolution of graphic engines, the software component that renders 3D spaces from a subjective viewpoint and is an integral part — the kernel, really — of FPS experience. What I take on in my article for V/P/T is the question of when, exactly, graphic engines came into existence, both as a technical and discursive category; how graphics have generally been talked about in dialectical relation to gameplay; and how the evolution of 3D graphics relates to player embodiment, isolation, and solipsism. As a teaser, the opening paragraphs of my V/P/T essay are quoted here:

Let’s start with a claim often heard about Doom 3 (Activision/id Software, 2004): that it is “just” a remake of the 1993 original, the same stuff packaged in prettier graphics. That, although separated by eleven years and profound changes in the cultural, technological, and aesthetic dimensions of videogaming, Doom 3 – like all of Doom’s versions – boils down to a single conceit, recycled in the contemporary digital argot:

First, people are taken over, turned into cannibal Things. Then the real horror starts, the deformed monstrosities from Outside. … Soon, brave men drop like flies. You lose track of your friends, though sometimes you can hear them scream when they die, and the sounds of combat echo from deep within the starbase. Something hisses with rage from the steel tunnels ahead. They know you’re here. They have no pity, no mercy, take no quarter, and crave none. They’re the perfect enemy, in a way. No one’s left but you. You … and them.

Here the second-person voice does to readers what Doom so famously did to players, isolating them in a substitute self, an embattled, artificial you. The original Doom had its shareware release on December 10, 1993, marking the popular emergence of the first-person shooter or FPS. Less a game than a programming subgenre all its own, Doom’s brand of profane virtual reality was built around a set of graphical hacks – an “engine” of specialized rendering code – that portrayed navigable, volumetric environments from eye-level perspective. Players peered over shotgun barrels at fluidly animated courtyards and corridors, portals and powerups, and “deformed monstrosities” like the fireball-hurling Imp, the elephantine Mancubus, and the Cyberdemon (“a missile-launching skyscraper with goat legs”).

Technologically, Doom depended on advances in computer sound and imaging, themselves a result of newly affordable memory and speedy processors. Psychologically, the FPS stitched the human body into its gameworld double with unprecedented intimacy. Gone were the ant-farm displacements of third-person videogames, the god’s-eye steering of Pac-Man (Namco, 1980) and the sidescrolling tourism of Super Mario Brothers (Nintendo, 1985). Doom fully subjectivized the avatar – the player-controlled object around which action centers – turning it into a prison of presence whose embodied vulnerability (they’re coming for me!) deliciously complemented its violent agency (take that, you bastard!).

Shooters that followed – Unreal (GT Interactive/Epic, 1998), Half-Life (Sierra/Valve, 1998), Deus Ex (Eidos Interactive/Ion Storm, 2000), Halo (MS Game Studios/Bungie, 2001), and countless others – deepened the FPS formula with narrative and strategic refinements, not to mention improvements in multiplayer, artificial intelligence, and level design. But to judge by its latest iteration, the Doom series didn’t bother to evolve at all – except in terms of technical execution. …
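So much for the teaser. Since the essay hangs so much on what a graphic engine actually does, it may be worth sketching the core trick in miniature. The toy Python below is strictly my own illustration (it assumes nothing about id Software’s actual code, which relied on far subtler machinery: BSP trees, texture mapping, dynamic lighting). It captures the basic move of the early FPS engine, though: cast one ray per screen column through a flat 2D map, and let the distance to the nearest wall set how tall that column is drawn, so that flat data yields an eye-level, navigable view.

    import math

    # A toy, single-frame "engine" in the Doom mold (illustrative only).
    # One ray is cast per screen column; distance-to-wall sets column height.

    WORLD = [
        "########",
        "#......#",
        "#..##..#",
        "#......#",
        "########",
    ]

    def cast_ray(px, py, angle, step=0.01, max_dist=16.0):
        """March outward from (px, py) until the ray enters a wall cell."""
        dx, dy = math.cos(angle), math.sin(angle)
        dist = 0.0
        while dist < max_dist:
            x, y = px + dx * dist, py + dy * dist
            if not (0 <= int(y) < len(WORLD) and 0 <= int(x) < len(WORLD[0])):
                break  # ray escaped the map
            if WORLD[int(y)][int(x)] == "#":
                return dist  # hit a wall
            dist += step
        return max_dist

    def render(px, py, facing, fov=math.pi / 3, cols=60, rows=18):
        """Build one ASCII frame: nearer walls become taller columns."""
        screen = [[" "] * cols for _ in range(rows)]
        for c in range(cols):
            angle = facing - fov / 2 + fov * c / cols
            height = min(rows, int(rows / (cast_ray(px, py, angle) + 0.2)))
            for r in range((rows - height) // 2, (rows + height) // 2):
                screen[r][c] = "|"
        return "\n".join("".join(row) for row in screen)

    # The "player" stands inside the map, looking along the +x axis.
    print(render(px=4.0, py=1.5, facing=0.0))

Crude as it is, the loop already contains the subjectivized “you” the essay describes: the camera is nothing but a position and a facing angle, and everything the player sees is computed from that single embodied point.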

As for the rest of V/P/T‘s contents, they look fascinating, and I’m very much looking forward to reading them. A lot of friends among the contributors, and a lot of writers whose work I respect. Here’s the chapter lineup:

  • Introduction: Videogame, player, text – Barry Atkins and Tanya Krzywinska
  • Beyond Ludus: narrative, videogames and the split condition of digital textuality – Marie-Laure Ryan
  • All too urban: to live and die in SimCity – Matteo Bittanti
  • Play, modality and claims of realism in Full Spectrum Warrior – Geoff King
  • Why am I in Vietnam? – The history of a video game – Jon Dovey
  • ‘It’s Not Easy Being Green’: real-time game performance in Warcraft – Henry Lowood
  • Being a determined agent in (the) World of Warcraft: text/play/identity – Tanya Krzywinska
  • Female Quake players and the politics of identity – Helen W. Kennedy
  • Of eye candy and id: the terrors and pleasures of Doom 3 – Bob Rehak
  • Second Life: the game of virtual life – Alison McMahan
  • Playing to solve Savoir-Faire – Nick Montfort
  • Without a goal – on open and expressive games – Jesper Juul
  • Pleasure, spectacle and reward in Capcom’s Street Fighter series – David Surman
  • The trouble with Civilization – Diane Carr
  • Killing time: time past, time present and time future in Prince of Persia: The Sands of Time – Barry Atkins

Videogame/Player/Text should be out from Manchester University Press by the end of September. I invite you to check it out.

Remembered for His Monsters

William Tuttle

With sadness and a sweet sense of nostalgia I note the passing of one of the great effects technicians, William Tuttle, who died July 27. (The NY Times obituary is here; you may need to register with the site to view it.) Tuttle headed the makeup department of MGM and worked on many films I remember fondly from when I was a kid: The Fury (the 1978 Brian De Palma film that ends with the explosion of John Cassavetes); Logan’s Run (1976), in which he turned Roscoe Lee Browne into a silver-faced cyborg artist nutcase named Box; Young Frankenstein (1974), where Tuttle’s makeup for Peter Boyle both satirized and honored Jack Pierce’s artistry in the 1931 Frankenstein; and “The Night Strangler” (1973), one of the pair of telefilms that launched the TV series Kolchak: The Night Stalker. Tuttle also did standout work in 7 Faces of Dr. Lao (1964) and The Time Machine (1960), creating a morphing set of identities for Tony Randall in the former and the fearsome Morlocks in the latter.

Young Frankenstein
The Time Machine

If you recognize in this retrograde stroll through Tuttle’s filmography the archival trace of IMDb, you’re exactly right; I called up that website reflexively, using it, as I so often do, as a prosthetic augmentation of my mediagoing memories. What’s interesting in this case is how much of Tuttle’s work I was completely unaware of: all the non-monstrous, un-fantastic makeup jobs he did on Hollywood stars, making them look glamorous or rugged or merely screen-real instead of bizarre. Perhaps Tuttle’s best-known creation — in that it triggers for much of a certain TV-watching generation an avalanche of networked associations to black-and-white anthology dramas, smart SF & fantasy, and twist endings — played on just that split between the “normal” and the “hideous,” in the episode of The Twilight Zone entitled “Eye of the Beholder” (1960). Almost as unmistakably recognizable as Rod Serling’s wry face and voice is the twisted visage of the medical staff unveiled in “Beholder”’s wrenching final images. For all the faces that William Tuttle helped to look pretty or handsome, it’s the monsters we’ll remember him for.

Eye of the Beholder