Digital Day for Night

A quick followup to my recent post on the new Indiana Jones movie: I’ve seen it, and find myself agreeing with those who call it an enjoyable if silly film. Actually, it was the best couple of hours I’ve spent in a movie theater on a Saturday afternoon in quite a while, and seemed especially well suited to that particular timeframe: an old-fashioned matinee experience, a slightly cheaper ticket to enjoy something less than classic Hollywood art. Pulp at a bargain price.

But my interest in the disproportionately angry fan response to the movie continues. And to judge by articles popping up online, Indiana Jones and the Kingdom of the Crystal Skull is providing us, alongside its various pleasures (or lack thereof), a platform for thinking about that (ironically) age-old question, “How are movies changing?” — also known as “Where has the magic gone?” Here, for example, are three articles, one from Reuters, one from The Atlantic.com, and one from an MTV blog, each addressing the film’s heavy use of CGI.

I can see what they’re talking about, and I suppose if I were less casual in my fandom of the first three Indy movies, I’d be similarly livid. (I still can’t abide what’s been done to Star Wars.) At the same time, I suspect our cultural allergy to digital visual effects is a fleeting phenomenon — our collective eyes adjusting themselves to a new form of light. Some of the sequences in Crystal Skull, particularly those in the last half of the film, simply wouldn’t be possible without digital visual FX. CG’s ability to create large populations of swarming entities onscreen (as in the ant attack) or to stitch together complex virtual environments with real performers (as in the Peru jungle chase) was clearly a factor in the very conception of the movie, with the many iterations of the troubled screenplay passing spectacular “beats” back and forth like hot potatoes on the assumption that, should all else fail, at least the movie would feature some killer action.

Call it digital day for night, the latest version of the practice by which scenes shot in daylight “pass” for nighttime cinematography. It’s a workaround, a cheat, like all visual effects, in some sense nothing more than an upgraded cousin of the rear-projected backgrounds showing characters at seaside when they’re really sitting on a blanket on a soundstage. It’s the hallmark of an emerging mode of production, one that’s swiftly becoming the new standard. And our resistance to it is precisely the moment of enshrining a passing mode of production, one that used to seem “natural” (for all its own undeniable artificiality). By such means are movies made, but it’s also the way that the past itself is manufactured, memory and nostalgia forged through an ongoing dialectic of transparency and opacity that haunts our recreational technologies.

We’ll get used to the new way of doing things. And someday, movies that really do eschew CG in favor of older FX methodologies, as Spielberg and co. initially promised to do, will seem as odd in their way as performances of classical music that insist on period instruments. For the moment, we’re suspended between one mode of production and another, truly at home in neither, able only to look unhappily from one bank to the other as the river of progress carries us ever onward.

Indiana Jones and the Unattainable FX Past

This isn’t a review, as I haven’t yet made it to the theater to see Indiana Jones and the Kingdom of the Crystal Skull (portal to the transmedia world of Dr. Jones here; typically focused and informative Wiki entry here). What I have been doing — breaking my normal rule about keeping spoiler-free — is poring over fan commentaries on the new movie, swimming within the cometary aura of its street-level paratexts, working my way into the core theatrical experience from the outside in. This wasn’t anything intentional, more the crumbling of an internet dam that sprang one informational leak after another, until finally the wave of words washed over me like, well, one of the death traps in an Indiana Jones movie.

Usually I’m loath to take this approach, finding the twists and turns of, say, Battlestar Galactica and Lost far more compelling when they clobber me unexpectedly (and let me add, both shows have been rocking out hard with their last couple of episodes). But it seemed like the right approach here. Over the years, the whole concept of Indiana Jones has become a diffuse map, gas rather than solid, ocean rather than island. Indy 4 is a media object whose very essence — its cultural significance as well as its literal signification, the decoding of its concatenated signage — depends on impacted, recursive, almost inbred layers of cinematic history.

On one level, the codes and conventions of pulp adventure genres, 1930s serials and their ilk, have been structured into the series film by film, much like the rampant borrowings of the Star Wars texts (also masterminded by George Lucas, whose magpie appropriations of predecessor art are cannily and shamelessly redressed, in his techno-auteur house style, as timelessly mythic resonance). But by now, 27 years after the release of Raiders of the Lost Ark, the Indy series must contend with a second level of history: its own. The logic of pop-culture migration has given way to the logic of the sequel chain, the franchise network, the transmedia system; we assess each new installment by comparing it not to “outside” films and novels but to other extensions of the Indiana Jones trademark. Indy 4, in other words, cannot be read intertextually; it must be read intratextually, within the established terms of its brand. And here the franchise’s history becomes indistinguishable from our own, since it is only through the activity of audiences — our collective memory, our layered conversations, the ongoing do-si-do of celebration, critique, and comparison — that the Indy texts sustain any sense of meaning above and beyond their cold commodity form.

All of this is to say that there’s no way Indiana Jones and the Kingdom of the Crystal Skull could really succeed, facing as it does the impossible task of simultaneously returning to and building upon a shared and cherished moment in film history. While professional critics have received the new film with varying degrees of delight and disappointment, the talkbacks at Ain’t It Cool News (still my go-to site for rude and raucous fan discourse) are far more scornful, even outraged, in their assessment. Their chorused rejection of Indy 4 hits the predictable points: weak plotting, flimsy attempts at comic relief, and in the movie’s blunt infusion of science-fiction iconography, a generic splicing so misjudged / misplayed that the film seems to be at war with its own identity, a body rejecting a transplanted organ.

But running throughout the talkback is another, more symptomatic complaint, centering on the new film’s overuse of CG visual effects. The first three movies — Raiders, Temple of Doom, and Last Crusade — covered a span from 1981 to 1989, an era that can now be retroactively characterized as the last hurrah of pre-digital effects work. All three feature lots of practical effects — stuntwork, pyrotechnics, and the on-set “wrangling” of everything from cobras to cockroaches. But more subtly, all make use of postproduction optical effects based on non-digital methods: matte paintings, bluescreen compositing, a touch of cel animation here, a cloud tank there. Both practical and optical effects have since been augmented if not colonized outright by CG, a shift apparently unmissable in Indy 4. And that has longtime fans in an uproar, their antidigital invective trained variously on Lucas’s influence, the loss of verisimilitude, and the growing family resemblance of one medium (film) to another (videogames):

The Alien shit didnt bother me at all, it was just soulless and empty as someone earlier said.. And the CGI made it not feel like an Indy flick in some parts.. I walked out of the theater thinking the old PC game Fate of Atlantis gave me more Indiana joy than this piece of big budget shit.

My biggest gripe? Too much FUCKING CGI. The action lacked tension in crucial places. And there were too many parts (more than from the past films) where Looney Tunes physics kept coming into play. By the end, when the characters endure 3 certain deaths, you begin to think “Okay, the filmmakers are just fucking around, lean back in your seat and take in the silliness.” No thanks. That’s not what makes Indiana Jones movies fun.

This film was AVP, The Mummy Returns and Pirates of the Fucking Carribean put together, a CGI shitfest. A long time ago in a galaxy far far away, Lucas said “A special effect is a tool, a means to telling astory, a special effect without a story is a pretty boring thing.” Take your own advice Lucas, you suck!!!

The entire movie is shot on a stage. What happened to the locations of the past? The entire movie is CG. What a disappointment. I really, REALLY wanted to enjoy it.

Interestingly, this tension seems to have been anticipated by the filmmakers, who loudly claimed that the new film would feature traditional stuntwork, with CGI used only for subtleties such as wire removal. But the slope toward new technologies of image production proves to be slippery: according to Wikipedia, CG matte paintings dominate the film, and while Steven Spielberg allegedly wanted the digital paintings to include visible brushstrokes — as a kind of retro shout-out to the FX artists of the past — the result was neither nostalgically justifiable nor convincingly indexical.

Of course, I’m basing all this on a flimsy foundation: Wiki entries, the grousing of a vocal subcommunity of fans, and a movie I haven’t even watched yet. I’m sure I will get out to see Indy 4 soon, but this expedition into the jungle of paratexts has definitely diluted my enthusiasm somewhat. I’ll encounter the new movie all too conscious of how “new” and “old” — those basic, seemingly obvious temporal coordinates — exceed our ability to construct and control them, no matter how hard the filmmakers may try, no matter how hard we audiences may hope.

A Marvel of Engineering

The opening act of the summer movie season, Iron Man, is much like the machine armor worn by Tony Stark (Robert Downey Jr.): a potent blend of advanced technology, sleek style, and glowing energy. The fetishism of the super-suit has rarely been quite so explicitly rendered, or embraced with such pornographic shamelessness, in comic-book cinema. Sure, movies and television have given us plenty of heroes whose iconic power resides in the costume (whether caped or capeless): Christopher Reeve’s Superman, Tobey Maguire’s Spider-Man, the leather-overcoat-and-sunglasses combo of Wesley Snipes in Blade, the patriotic bustier worn by Lynda Carter as Wonder Woman. Often these sartorial choices become flashpoints of controversy with fans: think of Bryan Singer’s X-Men adaptation, which did away with the classic yellow costumes of the comic series, or the many nippled and sculpted variations of the Batsuit worn by the Batactors playing a series of Batmen in the Batfranchise.

In Iron Man, the situation is different, for Iron Man is his suit, the “secret identity” worn over a figure compromised both morally and physically. Over the course of the story, Stark’s transformation from a hard-partying weapons magnate to a passionately peace-committed and (mostly) teetotaling cyborg is made concrete — made metal, really — through the metaphor of the successively more sophisticated armor shells in which he encases himself. The first is a kitbashed monstrosity the color of corroded tin cans, crisscrossed with scars of solder. Like a rustbucket car, it survives just long enough to convey Stark to version two, a more compact silver exoskeleton reminiscent of Ginsu knives and Brookstone gadgets. It’s quite satisfying when Stark arrives at the canonical configuration, a red-and-gold chassis of interlocking plates, purring hydraulics, and HUD graphics that answers the question “What would it be like to wear a Lamborghini?”

What’s clever is how these upgrades express Stark’s ethical evolution while recapitulating decades of shifting design in the Marvel comic series from which this movie sprang. (It’s kind of like a He’s Not There version of James Bond in which the lead is played over the course of the film by Sean Connery, then Roger Moore, followed by Timothy Dalton, Pierce Brosnan, and Daniel Craig — with a quick dream interlude, of course, starring George Lazenby.) Iron Man, in other words, manages to honor superhero history rather than pillaging it, and this — along with the film’s smart screenplay and glossy digital mise-en-scene — wins it a sure place in future best-of lists when it comes to the spotty genre of comic-book adaptations.

Another weapon in the film’s arsenal, riding within its narrative delivery system the way Stark pilots his mechanical costume, is Robert Downey Jr., who seems to have arrived at a point of perfect intertextual harmony with this turn in his career. His performance as Stark is properly lived-in and mischievous (indeed, the actor’s persona is yet another kind of suit, this one built of bad publicity) yet alive with disarmingly sincere warmth. In one of the film’s facile yet pitch-perfect tropes, Stark must wear a pulsing blue-and-white generator on his chest, a kind of electromagnetic pacemaker that doubles as an energy source for his armor and trebles as a signifier of the character’s humanity. Downey Jr. is himself a kind of power source that propels the vehicle of Iron Man forward while lending it, in stray moments, genuine moral weight. His portrayal reminds us that, while a superhero’s technological or organic essence is important, the larger ingredient is something harder to quantify and calibrate — in Stark’s case, a kludge of intellect, imagination, and compassion that easily trades one outfit for another.

That’s not all that’s going on under the hood: the fight scenes, not to mention flight scenes, are awesome, and Jeff Bridges is remarkably menacing with all his hair relocated from his head to his chin. There’s some nimble ideological shadowboxing around the military-industrial complex and the terrible allure of “shock and awe” (there’s your real pornography). And while Stan Lee makes his trademarked cameo, branding this as yet another item in Marvel’s transmedia catalog, a quirky counterpoint is sounded when a villain taunts Stark by asking “Did you really think that just because you had an idea, it belongs to you?” For comic fans, it’s hard not to hear this and think of Marvel’s infamous screwing of Jack Kirby. At the same time, we should count our blessings that concepts like Iron Man travel so fluidly through our mediascape, rephrasing themselves in the transformational grammar of convergence cinema. Too often, the result is an ugly and lifeless thing — a strangled fragment like Daredevil, a run-on sentence like Van Helsing. But once in a lucky while, you get a perfect little poem like Iron Man.

The Id Machine

In one of those media events so global — perhaps solar-systemic? — in scope that you hardly need me to remind you about it, Grand Theft Auto IV launches today. Early word on Rockstar’s latest is everything it ought to be: game reviewers are enraptured, moral guardians enraged. Me, I’m just waiting to get my hands on the thing, which manifests in this world as a silver disk in a bright-green plastic case, but becomes in the space of the screen another totemic circle: a steering wheel. Maybe more than any other virtual-world franchise, GTA toggles its players smoothly between human and automotive avatars, encasing us in cars for such long stretches of gametime that, as Tycho at Penny Arcade writes, you might end up just “sitting in a parking lot listening to the radio.”

The webcomic associated with Tycho’s post makes another good point about Grand Theft Auto, namely that its possible pathways are so seemingly infinite in number that they risk numbing the player with the paralysis of “total freedom.” The opposite of the rail shooter to which I compared the Harry Potter novels a few posts back, GTA and its ilk are better described as sandbox games, which emphasize the open-endedness of play. Now, I admit to being skeptical of such neat distinctions, believing in my curmudgeonly way that the sense of unbounded possibility offered up by most “interactive” experiences is just that: a phantasmic structure of feeling, conveniently packaged and sold to us in the same way that advance hype about the summer movie season is more the actual commodity than the movies themselves. (That said, I’m looking forward, same as always, to things like Iron Man, Indiana Jones and the Kingdom of the Crystal Skull, The Dark Knight, and — mmmm yes — Speed Racer.) It’s a matter of perception, not pathways. The most scripted of videogames (if done well) can get my heart thumping with the sense that anything might happen next, while GTA, no matter how many gigabytes of gamedata and corresponding square mileage of explorable diegesis it may offer, can still wear out its welcome. Though I played both avidly, I never finished either Grand Theft Auto III or its sequel, Vice City.

That said, I never played anything as gripping, anarchic, and sensual, either. My friend Chris Dumas calls GTA the “id machine,” and he’s right. Like the Krell technology buried beneath the surface of Altair IV in Forbidden Planet, GTA is a visualization engine for the subconscious, pipelining our nastiest, bloodiest impulses into daylight, setting loose neon monsters we didn’t know we had in us. It’s insane fun to play and, like the best videogames, cinematically engrossing to observe. It’s also perversely Bazinian in form. As a grad student in 2002, I wrote a paper called “Grand Theft Auto 3 and the Interface of the Everyday,” arguing that GTA is at heart a simulation — not of mechanical or physical processes, but of urban experience. Here’s the intro:

With its heightened violence, black humor, and mise-en-scene reminiscent of blaxploitation and vigilante films from the 1970s and Quentin Tarantino’s postmodern recyclings in movies such as Reservoir Dogs (1992) and Pulp Fiction (1994), GTA3 seems to stand apart from the tradition of simulation games, so much so that its simulationist tendencies are perceptible only upon reflection; on first glance it is more likely to be put into the category of “shoot-em-up” games such as Doom, Quake, and other first-person shooters, or hand-to-hand fighting games such as Tekken and Mortal Kombat. I argue, however, that GTA3 actually represents the culmination, in a form so pure as to be almost unrecognizable, of a particular simulationist logic that has heretofore stayed comfortably submerged in videogames: the notion of urban realism. Or rather, a refracted and stylized realism whose excesses should not be allowed to obscure its essential goal: the representation of modern urban existence, complete with dead time, bad weather, traffic lights, blaring radio stations, law enforcement by turns oblivious and aggressive, and a totalizing motif of passage – endless motion through the city’s spaces on foot or (more often) behind the wheel of a car, from the vantage point of which Liberty City’s bridges and skyscrapers, storefronts and pedestrians, become spectacles simultaneously mundane and beautiful.

In this sense, GTA3 follows a logic of modernity articulated by Walter Benjamin and Siegfried Kracauer, and before them Baudelaire, whose epigrammatic summation of modernity as “the transient, the fleeting, the contingent” set the terms for a discourse of ephemera – the idea that the truth of contemporary existence, and perhaps the key to its revolutionary reform, resides not in the monumental and “historic” but in the unnoticed and ordinary. In this paper I shall explore the idea that GTA3 makes the everyday its object of simulation, interaction, and pleasure, enabling users to play within the environs of a stylized urban reality as a way of experiencing, and reflecting upon, their own place in the world and position in society. In the second half of the paper I move toward a consideration of videogames in general, setting them against a backdrop of twentieth-century technologies, in order to argue that videogames share a function that Benjamin identified as “subject[ing] the human sensorium to a complex kind of training” in which “perception in the form of shocks [is] established as a formal principle.” Under this view, the content of Grand Theft Auto (which concerns itself textually with an alternating rhythm of shocks and boredom) merges with the formal operations of videogames, which, consumed in unmarked leisure time, reflect changes wrought in consciousness by technology and industrialization, similar to Benjamin’s description of the Fun Fair whose Dodgem cars achieve “a taste of the drill to which the unskilled laborer is subjected in the factory.” I begin with a consideration of three main components of GTA3’s play: the city, the flâneur, and the car.

Looking at this argument today, it doesn’t seem too earthshaking; Gonzalo Frasca explores some similar ideas in his 2003 essay “Sim Sin City.” It may be that with the advancing tide of computer graphics, we’re less scandalized by the notion that videogames can stand in, even substitute for, the visual and auditory sensorium through which we filter and know reality. Games, that is, increasingly engage in a double simulation, first of our lived sensory existence and only secondarily of more ephemeral (but nonetheless meaningful) matters: ethics, aesthetics, class consciousness. In the case of GTA, the subjectivity tourism is that of a violent, animalistic, unforgiving struggle to survive on the streets, something that its player demographic will likely never confront. GTA provides in musical flashes a world we recognize as our own even as we comfortably disavow it through the technological trick of switching off the console: inverting the hypodermic needle’s injection, we anesthetize ourselves precisely by unplugging, retreating from the raw truth of the made-up game into the ongoing dream of our privileged, protected lives.

Gearing up for Santa Barbara

I leave in a few days for the Console-ing Passions conference in Santa Barbara. I’d be excited just because of the location (the conference concludes with a beach party, for gosh sakes) or the nature of the professional gathering itself, since I had a wonderful time at Console-ing Passions in New Orleans in 2004. But most of all I’m thrilled to be taking part in a workshop discussion that grew out of the gender-and-fandom debates hosted by Henry Jenkins last summer. My colleagues Julie Levin Russo (Brown University), Louisa Stein (San Diego State University), Sam Ford (Massachusetts Institute of Technology), and Suzanne Scott (University of Southern California) all participated in those male-female pairups, and we formulated the CP workshop as a space not just to present our own research, but also to engage in a dialogue about where that massive, months-long conversation has left us as fan scholars who confront issues of gender, power, privilege, and creativity.

The workshop, which takes place Friday morning, is entitled “Gendered Fan Labor in New Media and Old.” Each of us will speak briefly about a current research interest or project, based on a text or media artifact that raises questions about creative media fandom in both its historical and contemporary dimensions and that focuses on gendered labor as an axis intersecting multiple concerns: taxonomies of fan practice, shifting economic relations between consumers and producers, questions of legitimacy and legality, the impact of new technologies, and the increasing visibility in popular, industrial, and academic discourses of heretofore marginal(ized) fan communities. We also hope to perform a kind of post-mortem on the summer’s debates: highlighting certain recurring themes, tendencies, and absences that structured the discourse, unpacking problematic areas, and reflecting both on what went well or badly in the past and on where we might productively go in the future. Here are the others’ projects, full versions of which are viewable on LiveJournal’s fandebate (thanks to Kristina Busse):

  • Julie Levin Russo, “The L Word: Labors of Love”
  • Sam Ford, “Outside the Target Demographic: Surplus Audiences in Wrestling and Soaps”
  • Suzanne Scott, “From Filking to Wrocking: The Rock Star/Groupie Dialectic in Harry Potter Wizard Rock”
  • Louisa Stein, “Vidding as Cultural Narrative”

My own project, “Boys, Blueprints, and Boundaries: Star Trek’s Hardware Fandom,” examines a subset of Trek fandom that devotes itself to literally mapping Trek’s canonical universe and recreating its diegesis in material form, through activities such as the drafting of episode guides and concordances, the manufacture of costumes, props, and model kits, and the making of technical manuals and blueprints. The first paragraph is quoted below; you can also read the full (short) paper at LiveJournal. Comments on the project welcomed and appreciated!

The recent legal dispute between J. K. Rowling, author of the Harry Potter novels, and Steven Vander Ark, a Michigan librarian who has compiled an internet guide to the Harry Potter “universe,” raises many interesting questions about copyright, authorial power, and what might be called a double standard of contemporary media production in which potentially infringing online publication is tolerated, even welcomed, by copyright holders, while the equivalent publication in print form is energetically resisted. But viewed through the lenses of fandom and gender, the Rowling / Vander Ark case illuminates another and much older conundrum, consisting of a linked pair of problematic binaries. On one hand, there is the contrast between fan-produced materials which creatively transform an original work (like fanfic, slash, vidding, filksongs, and artwork) and those which “merely” document, map, or archive the original work (like concordances, episode guides, blueprints, and technical manuals). On the other hand, there is the apparent gender split between the traditionally female fans who produce work considered to be transformative, and male fans whose productivity tends instead toward the technical and archival. The relationship between male fans and what I will call “blueprint culture” is the subject of this short paper, in which I consider gendered fan labor as it is manifested in fantasy and history; ways of rethinking this labor as creative and transformative; and current trends that reflect the growing impact of blueprint culture in both industrial and academic domains.

Cartographers of (Fictional) Worlds, Unite!

J. K. Rowling’s appearance in a Manhattan courtroom this week to defend the fantasy backdrop of her Harry Potter novels is interesting to me for several reasons. It dovetails with a conversation I’ve been having in the Fan Culture class I’m teaching this semester, about the vast world-models that subtend many franchise fictions (e.g. the “future history” of Star Trek, the Middle-Earth setting of Lord of the Rings, the Expanded Universe of Star Wars, and so on). In his writing on subcreation, J. R. R. Tolkien calls these systematic networks of invented facts, events, characters, and languages “secondary worlds,” but more recently the phenomenon has been given other labels by media theorists: master text, hyperdiegesis. Henry Jenkins has put forth the most influential formulation with his concept of transmedia storytelling, which recasts franchise fictions like The Matrix as a kind of generative space — a langue capable of ceaseless acts of fictional parole — which can be accessed through any number of its “extensions” in disparate media.

One might say, in an excess of meta-thinking, that the notion of the storyworld itself floats suspended among these various theoretical invocations: a distributed ghost of a concept that feels increasingly “real.” As our media multiply, overlap, and converge in a spectacular mass ornament like a Busby Berkeley musical number, we witness a contrasting, even paradoxical, tendency toward stabilization, concreteness, and order in our fictional universes.

Key agents in this stabilization are the cataloging and indexing efforts of fans who keep track of sprawling storylines and giant mobs of dramatis personae, cross-referencing and codifying the rules of seriality’s endless play of meaning. Most recently, these labors have coalesced in communally maintained databases like Lostpedia, the Battlestar Wiki, and — yes — the Harry Potter Lexicon at the heart of the injunction that Rowling is seeking. The conflict is over a proposed book project based on the online Lexicon, a fan-crafted archive of facts and lore, characters and events, that make up the Harry Potter universe. Although Rowling has been sanguine about the Lexicon till now (even admitting that she draws upon it to keep her own facts straight), the crystallization of this database into a for-profit publication has her claiming territorial privilege. Harry, Hermione, and Ron — as well as Quidditch, Dementors, and Blast-Ended Skrewts — are emphatically Rowling’s world, and we’re not quite as welcome in it as we might have thought.

At issue is whether such indexing activities are protected by the concept of transformative value: an emerging legal consensus that upholds fan-produced texts as valid and original so long as they add something new — an interpretive twist, a fresh insight — to the materials they are reworking. (For more on this movement, check out the Organization for Transformative Works.) Rowling asserts that the Harry Potter Lexicon brings nothing to her fiction that wasn’t there already; it “merely” catalogs in astonishing detail the contents of the world as she has doled them out over the course of seven novels. And on the surface, her claim would seem to be true: after all, the Lexicon is not itself a work of fiction, a new story giving a new slant on Harry and his adventures. It is, in a sense, the opposite of fiction: a documentary concordance of a made-up world that treats invention as fact. Ideologically, it inverts the very logic of make-believe, but in a different way from behind-the-scenes paratexts like author interviews or making-of featurettes on DVDs. We might call what the Lexicon and other fan archives do tertiary creation — the extraction of a firm, navigable framework from a secondary, subcreated world.

But is Rowling’s case really so straightforward? It seems to me that what’s happening is a turf battle that may be rare now, but will become increasingly common as transmedia fictions proliferate. The Lexicon, whether in print or cybertext, does compete with Rowling’s work — if we take that “work” as being primarily about building a compelling, consistent world. The Lexicon marks itself as a functionally distinct entity by disarticulating the conventional narrative pleasures offered by Rowling’s primary text: what’s stripped away is her voice, the pacing and structure of her storytelling. By the same token, however, the Lexicon produces Rowling’s world as something separate from Rowling. And for those readers for whom that world was always more compelling than the specific trajectories with which Rowling took them through it (think of the concept of the rail shooter in videogames), the Lexicon might indeed seem like a direct competitor — especially now that it has migrated into a medium, print, that was formerly Rowling’s own.

The question is: what happens to secondary worlds once they have been created? What new forms of authority and legitimacy constellate around them? It may well be the case that the singular author who “births” a world must necessarily cede ownership to the specialized masses who then come to populate it, whether by writing fanfic, building model kits and action figures, cosplaying, roleplaying, or — in the Lexicon’s case — acting as archivists and cartographers.

Before the Internet, such maps were made on paper, sold and circulated among fans. One of my areas of interest is the “blueprint culture” that arose around Star Trek and other science-fiction franchises in the 1960s and 1970s. I’ll be speaking about this topic at the Console-ing Passions conference in Santa Barbara at the end of April, but Rowling’s lawsuit provides an interesting vantage point from which to blend contemporary and historical media processes.

Spring Break

You’ve probably noticed that I haven’t been posting as frequently in recent weeks. It’s not for lack of desire or interesting things to talk about; I’ve just found my to-do lists growing lengthier rather than shorter, and the accumulation of projects and obligations is taking its toll.

Hence, Graphic Engine will be going on temporary hiatus as I turn my attention to several areas of work that need attention before the end of spring semester. I plan to be back soon, however, so watch this space! And in the meantime, take care.

Man in the Suit


Sad news: Ben Chapman, who played the Creature from the Black Lagoon in the 1954 film of the same name, is dead.

Chapman’s death, while no less tragic, hits me a little differently than the passing of William Tuttle, whom I wrote about last August. While Tuttle contributed to hundreds of films, Chapman played just one role in one movie — and that uncredited at the time. While Tuttle worked behind the scenes, Chapman performed in front of the camera. And while Tuttle designed and applied makeup and prosthetics that others wore, Chapman was literally the man in the suit: a full-body sheath made of foam rubber, a headpiece fringed with pulsating gills, and two webbed gloves tipped with fearsome claws.

In this sense, we might think of Chapman as occupying a nodal point in the circuit of special effects manufacture precisely opposite that of the costume’s “creator.” Somebody else designed the thing; all Chapman did was inhabit it. Indeed, Chapman’s contribution subdivides and apparently dissipates the more closely we examine it, scattering into a shadowy network of elided labor and thwarted fame. He was not, for example, the only person to play the Creature. Ricou Browning wore the suit for underwater sequences, while Chapman did the bits on land. (Browning returned for the water scenes in the sequels Revenge of the Creature [1955] and The Creature Walks Among Us [1956]; in these films the Creature-on-land was played by Tom Hennesy and Don Megowan respectively.) Even the suit’s original design is in question: credited for many years to veteran makeup artist Bud Westmore, it has recently been recuperated as the work of Milicent Patrick.

Yet amid the thicket of Hollywood’s ramified pasts, Chapman and the suit he wore are fused in my memory as well as the collective memory of horror and science fiction fans. To some extent this is due to the first Creature‘s place at the overlap of several important genre histories. It was a cornerstone of the grand 1950s wave of cinematic SF that includes The Thing from Another World (1951), The Day the Earth Stood Still (1951), War of the Worlds (1953), Them! (1954), Invasion of the Body Snatchers (1956), Earth vs. the Flying Saucers (1956), The Blob (1958), and — a personal favorite and source of this blog’s signature image — Forbidden Planet (1956). Moreover, Creature was directed by Jack Arnold, who also helmed the classics It Came From Outer Space (1953) and The Incredible Shrinking Man (1957) and directed additional sequences, uncredited, for This Island Earth (1955).

Not all of these films are of equal caliber, certainly. They run the gamut from cerebral “message” films to drive-in shockers, a continuum on which Creature probably registers toward the window-mounted-speakers end. Befitting its status as an early Jaws, Creature was released in 3D. As a kid, I was lucky enough to see one of these ghostly red-and-green prints at a screening on the University of Michigan campus; the headache induced by those plastic glasses is inseparable from the excitement of seeing claws jutting out of a petrified wall in one of the film’s opening images.


But the fascination of Creature (the movie) and Creature (the monster) outlasted their tricked-up 3D and their genre boomlet, surviving as only an icon can through many replayings on TV, VCR, and DVD. Ben Chapman built a career out of his few minutes on screen, appearing at conventions, giving interviews, and running a website whose very title — www.the-reelgillman.com — insists on the singular authenticity of his performance. Like the suit he wore, a neglected piece of film flotsam rediscovered by a janitor and ultimately purchased by Forrest J. Ackerman of Famous Monsters, Chapman physically anchored a diffuse cloud of memories and fantasies, concretizing a point in time and space where Creature from the Black Lagoon “really happened.”

Not just an icon, then, but an index: evidentiary proof of a world existing simultaneously before the camera and within our imaginations, and hence a junction point between virtual and real, dream and daylight, forgotten and retrieved, submarine and dry land.


Movie-a-Day: November 2007


An unexpected benefit of charting my movie-watching habits through this series of posts is the perspective it offers on my work rhythms — the ebb and flow of teaching, grading, doing research, and attending daily to dozens of other details of the academic profession — and their impact on what I choose to watch. November was crazy-busy — not just with school but a trip back to Indiana for Thanksgiving with my and my wife’s families, followed by our first wedding anniversary. Looking back, I’m surprised that I got to see anything at all. But no: somehow I managed to squeeze in nineteen films in thirty days, or roughly (my compulsive calculator checking reveals) .63 movies per day. Reviewing the list, though, what jumps out at me is how many recent and new releases I gravitated toward: eleven of the nineteen, or (more calculator-tapping) 58% of the titles, are from 2006 and 2007.
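
For anyone who wants to check my calculator-tapping, the arithmetic amounts to two divisions; here is a minimal sketch in Python (the counts come from the list below, and the variable names are my own):

    # Back-of-the-envelope check of the viewing-rate figures above.
    films_watched = 19       # titles logged for November 2007
    days_in_month = 30
    recent_titles = 11       # releases dated 2006 or 2007 in the list below

    print(round(films_watched / days_in_month, 2))       # 0.63 movies per day
    print(round(100 * recent_titles / films_watched))    # 58 (percent recent releases)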

So what does this mean? Maybe older films are simply more taxing; good as my favorites were — the 1939 Hunchback of Notre Dame, Powell and Pressburger’s I Know Where I’m Going!, Bogdanovich’s Paper Moon — they demanded more in the watching than did the blissful, frictionless experience of newer pleasures like The Host, No Country for Old Men, and Meet the Robinsons. When I started this regimen last summer, I was able to sit through very old, very long movies in alert immersion — taking notes, no less. Most of November’s selections I watched flat on my back, relaxing into the TV screen or the laptop balanced on my chest, and I’m embarrassed to admit that I fell asleep once or twice during movies that really demand more respectful attention: I Walked With A Zombie, The Italian Job.

Ah well. I’ve learned to stop apologizing for watching what I watch, the way I watch it. (It’s pretty much a requirement if you commit to doing media studies.) If it surprises me now to learn that I saw The Ex — recorded in brainspace only by a fugue-patch of static — I’ll just remind myself that wasted time is usually good for the soul. But I will admit that the coming summer is starting to look very good to me: a long lush season of quiet during which I can finally get back to a real movie-a-day plan, digging into film history and moving outside my comfort zones.

As usual, I’ve placed stars next to the films that made an unusually strong impact on me. In one case, perhaps, more stars are deserved: Frank Darabont’s adaptation of Stephen King’s The Mist was the most wrenching horror film I’ve seen since The Descent, and in its way a thing of remarkable beauty.

Movie-A-Day: November 2007

Notes On A Scandal (Richard Eyre, 2006)
The Hunchback of Notre Dame (William Dieterle, 1939)
Spider-Man 3 (Sam Raimi, 2007)
Allegro Non Troppo (Bruno Bozzetto, 1976)
Harlan County USA (Barbara Kopple, 1976)
The Italian Job (Peter Collinson, 1969)
I Know Where I’m Going! (Michael Powell & Emeric Pressburger, 1945)*
The Host (Joon-ho Bong, 2006)*
No Country for Old Men (Joel & Ethan Coen, 2007)*
Battlestar Galactica: Razor (Félix Enríquez Alcalá & Wayne Rose, 2007)
La noire de … [Black Girl] (Ousmane Sembene, 1965)
I Walked With a Zombie (Jacques Tourneur, 1943)
Sherrybaby (Laurie Collyer, 2006)*
Inland Empire (David Lynch, 2006)
The Ex (Jesse Peretz, 2006)
Meet the Robinsons (Stephen J. Anderson, 2007)
Paper Moon (Peter Bogdanovich, 1973)*
Hairspray (Adam Shankman, 2007)
The Mist (Frank Darabont, 2007)*

Retrographics and Multiplayer avant la lettre


Let me start with a disclosure: although I own both a Nintendo Wii and an Xbox 360, I almost exclusively play the latter — and rarely play the former. I’ve agonized over this. Why does my peak Wii moment remain the mercenary achievement of tracking one down last summer? Why haven’t the Wii-mote and its associated embodied play style inspired me to spend even a fraction of the hours in front of the television that I’ve spent working through Beautiful Katamari, Valve’s Orange Box, Halo 3, and Need for Speed Carbon on the Xbox? The answer, it seems to me, comes down to graphics: Microsoft’s console simply pushes more pixels and throws more colors on my new HD TV, and I vanish into those neon geometries without looking back. I feel guilty about this, vaguely philistine, the same way I felt when I switched from Macintosh to PC. But there it is. Like Roy Neary (Richard Dreyfuss) in Close Encounters of the Third Kind, I go where the pretty lights lead me.

But that doesn’t make the phenomenon of the Wii any less fascinating, and the recent New York Times article on the top-selling console games of 2007 is compelling in its assertion that gamers are turning away from the kind of high-end techno-sublime represented by the Xbox 360 and the Playstation 3 and toward the simpler graphics and more accessible play style of the Wii. It makes sense that a dialectic would emerge in videogames between the superadvanced aesthetic and its primitive-by-comparison cousin; the binary of shiny-new and gnarly-old has structured everything from Quake‘s blend of futuristic cyborgs and medieval demons to Robert Zemeckis’s digital adaptation of the ancient Beowulf.  Anyone who’s discovered the joy of bringing old 8-bit games to life with emulators like MAME knows that the pleasure of play involves an oscillation between where we’ve been and where we’re going; between what passes for new now and what used to do so; between the sensory thrill of the state-of-the-art and the nostalgia of our first innocent encounters with the videogame medium in all its subjectivity-transforming power.

To put it less elaborately: the Wii represents, through its pared-down graphics, the return of a historical repressed, the enshrining of a certain simplicity that remains active at the medium’s heart but until now has not been packaged and sold back to us with quite such panache.

The other interesting claim in the article is that the top games (World of Warcraft, Guitar Hero) are not solitary, solipsistic shooters like Bioshock and Halo, but rich social experiences — you play them with other people, whether online or ranged around you in the dorm room. Seth Schiesel writes,

Ever since video games decamped from arcades and set up shop in the nation’s living rooms in the 1980s, they have been thought of as a pastime enjoyed mostly alone. The image of the antisocial, sunlight-deprived game geek is enshrined in the popular consciousness as deeply as any stereotype of recent decades.

The thing is, I can’t think of a time when the games I played as a child and teenager in the 1970s and 1980s weren’t social. I always consumed them with a sense of community, whether because my best friend Dan was with me, watching me play (or I watching him) and offering commentary, or because I talked about games endlessly with kids at school. Call it multiplayer avant la lettre; long before LANs and the internet made it possible to blast each other in mazes or admire each other’s avatarial stand-ins, we played our games together, making sense of them as a community — granted, a maligned subculture by the mainstream measure of high school, but a community nonetheless. As graphics get better and technologies more advanced, I hope that gamers don’t rewrite their pasts, forgetting the friendships forged in and around algorithmic culture.