Good Night, and Good Luck.

Once again, thanks to my TV & New Media course, it is time to watch Good Night, and Good Luck (George Clooney, 2005), and once again I am reminded what a beautifully intimate experience it is. On the manifest level of its narrative, the film details the crusade of Edward R. Murrow and the See It Now news team to take Senator Joseph McCarthy to task for his many transgressions against democracy, and it’s gripping stuff; but on the latent level of its mise-en-scene, the movie is all about the television studios, elevators, lobbies, and offices at CBS — pristine spaces rendered in crisp black-and-white cinematography (actually the result of shooting grayscale sets in color, then digitally timing them to a sublime monochrome) and redolent of technological and cultural power as only the broadcast TV era could embody it. In its period evocation it’s Mad Men played straight, and, unlike the AMC series, the total lack of exterior shots gives the whole thing the hermetic feel of a holodeck simulation.

When I first saw the film, the U.S. was gritting its teeth through George W. Bush’s second term, and its messages about the abuse of governmental power and patriotic ideology were impossible to read as anything other than statements about our post-9/11 world. Seven years later, the connotative corset has loosened, and exciting resonances with the passionately essayistic journalism of Rachel Maddow and the breathless pace of blogging and spreadable media (an electrical feeling of liveness and deadline I experience, if only in a small way, in my new daily posting regimen) tie Murrow’s moment to our own, inviting us to see the “old” in new media, and vice versa. I’m looking forward to discussing it with my students!

“The most intricate, beautiful map you could possibly imagine”

The first season of HBO’s Game of Thrones followed the inaugural novel of George R. R. Martin’s series so minutely that, despite its obvious excellence, I found it a bit redundant: like the early days of superhero comics in which the panel art basically just illustrated the captions, each episode had the feel of an overly faithful checklist, the impeccable casting and location work a handsome but inert frame for Martin’s baroque and sinister plotting. That’s one big reason why I’m eager to see the second season, which premieres tonight — I bogged down a hundred or so pages into book two, A Clash of Kings, and so apart from a few dribs and drabs that have leaked through my spoiler filter, I’m a fresh and untrammeled audience. (Given the sheer scale of A Song of Ice and Fire, at five fat volumes and counting, the whole concept of spoilers seems beside the point, rendered irrelevant by a level of structural complexity that forces synchronic rather than diachronic understandings; it’s hard enough at any given moment to keep track of the dense web of characters, alliances, and intrigues without worrying about where they’ll all be two or three thousand pages later.)

Another reason I’m looking forward to the series’ return is its arresting title sequence, a compact masterpiece of mannered visualization that establishes mood, momentum, and setting in the span of ninety seconds.

Here is how the website Art of the Title describes the governing concept:

A fiery astrolabe orbits high above a world not our own; its massive Cardanic structure sinuously coursing around a burning center, vividly recounting an unfamiliar history through a series of heraldic tableaus emblazoned upon it. An intricate map is brought into focus, as if viewed through some colossal looking glass by an unseen custodian. Cities and towns rise from the terrain, their mechanical growth driven by the gears of politics and the cogs of war.

From the spires of King’s Landing and the godswood of Winterfell, to the frozen heights of The Wall and windy plains across the Narrow Sea, Elastic’s thunderous cartographic flight through the Seven Kingdoms offers the uninitiated a sweeping education in all things Game of Thrones.

“Elastic,” of course, refers to the special-effects house that created the sequence, and Art of the Title’s interview with the company’s creative director, Angus Wall, is enormously enlightening. Facing the challenge of establishing the nonearthly world in which Game of Thrones takes place, Wall developed the idea of a bowl-shaped map packed with detail. In his words, “Imagine it’s in a medieval tower and monks are watching over it and it’s a living map and it’s shaped like a bowl that’s 30 feet in diameter and these guys watch over it, kind of like they would the Book of Kells or something… they’re the caretakers of this map.” Realizing the limitations of that topology, Elastic put two such bowls together to create a sphere, and, placing a sun in the center, arrived at the sequence’s strange and lyrical fusion of a pre-Copernican cosmos with a Dyson Sphere.

Yet even more interesting than the sequence’s conceit of taking us inside a medieval conception of the universe — a kind of cartographic imaginary — is its crystallization of a viewpoint best described as wargaming perspective: as it swoops from one kingdom to another, the camera describes a subjectivity somewhere between a god and a military general, the eternally comparing and assessing eye of the strategist. It’s an expository visual mode whose lineage lies less in classical narrative than in video-game cutscenes or the mouse-driven POV of an RTS. Its ultimate root, however, is not in digital simulation but in the tabletop wargames that preceded it — what Matthew Kirschenbaum has evocatively called “paper computers.” Kirschenbaum, a professor at the University of Maryland, blogs among other places at Zone of Influence, and his post there on the anatomy of wargames contains a passage that nicely captures the roving eye of GoT’s titles:

Hovering over the maps, the players occupy an implicit position in relation to the game world. They enjoy a kind of omniscience that would be the envy of any historical commander, their perspectives perhaps only beginning to be equaled by today’s real-time intelligence with the aid of GPS, battlefield LANs, and 21st century command and control systems.

Earlier I mentioned the sprawling complexity of A Song of Ice and Fire, a narrative we might spatialize as an unmanageably large territory — unmanageable, that is, without that other form of “paper computers”: the maps, histories, concordances, indices, and family trees that bring order to Martin’s endlessly elaborated diegesis. Their obvious digital counterpart would be something like this wiki, and it’s interesting to note (as does this New Yorker profile) that the author’s contentious, codependent relationship with his fan base is often battled out in such internet forums, where creative ownership of a textual property exists in tension with custodial privilege. Perhaps all maps are, in the words of Kurt Squire and Henry Jenkins, contested spaces. If so, then the tabletop maps on which wargames are fought provide an apt metaphor both for Game of Thrones’s narrative dynamics (driven as they are by the give-and-take among established powers and would-be usurpers) and for the franchise itself, whose fortunes have increasingly become distributed among many owners and interests.

All of this comes together in the laden semiotics of the show’s opening, which beckons to us not just as viewers but as players, inviting us to engage through this “television computer” with a narrative world and user experience drawn from both old and new forms of media, mapping the past and future of entertainment.

Jason Mittell’s Complex TV

Passing along this word from my friend Jason Mittell, Associate Professor of American Studies and Film and Media Culture at Middlebury College, whose exciting new publication project is now available for open access and “peer-to-peer review.” He is inviting feedback on the pre-published chapters of Complex TV: The Poetics of Contemporary Television Storytelling at http://mediacommons.futureofthebook.org/mcpress/complextelevision/. He shares this outline of the plan:

The book’s introduction and first chapter are posted now (as of Saturday evening, in conjunction with an SCMS workshop on digital publishing I Skyped in for from Germany). I plan on posting chapters every week or two over the next few months, serializing the release to allow time for people to read and comment (and me to finish writing). I hope that momentum will build and the conversation will flourish through this process, but this is obviously an experiment. I hope you can stop in and read the work in progress and offer feedback, and also help spread the word to others who might be interested in the topic. I would also love to hear any feedback about this unorthodox mode of publication and review.

Jason blogs at Just TV.

Easing back into Mad Men

I am a fan of Mad Men, which puts me in a tiny minority consisting of just about everyone I know, and most of those I don’t. Perhaps it’s a holdover from my recent post on 4chan’s reaction to The Hunger Games, but I can’t shake the sense that it’s getting harder to find points of obsession in the pop-culture-scape that haven’t already been thoroughly picked over. Credit AMC, a channel that has branded itself as a reliable producer of hit shows that play like cult favorites. I suspect that my desire to hold onto the illusion that I and I alone understand the greatness of Mad Men is the reason I saved its season-five premiere for a full 24 hours, sneaking upstairs to watch it in the darkness of our bedroom, the iPad’s glowing screen held inches from my eyes, the grown-up equivalent of reading under the covers with a flashlight. Let me be alone with my stories.

Somewhat hardening the hard-core of my fanboy credibility is the fact that I’ve followed the show religiously since it first aired in 2007; it jumped out at me immediately as a parable of male terror, a horror story dressed in impeccable business suits. That basic anxiety has stayed with the show, though it’s only one of its many pleasures. One aspect of the series that I once believed important, the 1960s setting presented in such alien terms as to turn the past into science fiction, turns out not to be so crucial, if the failure of broadcast-network analogs Pan Am and The Playboy Club is any indication.

As for the premiere, I enjoyed it well enough, but not nearly as much as I will enjoy the rest of the season. It always takes me a while to get back into Mad Men’s vibe, with the first episodes seeming stiff and obvious, even shallow, only later deepening into the profoundly entertaining, darkly funny, and ultimately heartbreaking drama I recognize the show to be. The slow thaw reminds me of how I react to seeing Shakespeare on stage or screen: it takes twenty minutes for the rarefied language to come into focus as something other than ostentation. Mad Men too initially distances by seeming a parody of itself, but I suspect that the time it takes for my eyes and ears to adapt to this hermetically sealed bubble universe has more to do with its precise evocation of a juncture in history when surfaces were all we had — when the culture industry, metonymized in Madison Avenue, found its brilliant stride as a generator of sizzle rather than a deliverer of steak.

The Walking Dead

How does the old joke go? “What a terrible restaurant — the food sucks, and such small portions!” That seems to be the way a lot of people feel about AMC’s The Walking Dead: it’s an endless source of disappointment as well as the best damn zombie show on television.

Not that there’s much competition. Contemporary TV horror is a small playing field, nothing like the heyday of the 1970s, when Night Gallery, Kolchak, Ghost Story, and telefilms like Don’t Be Afraid of the Dark fed home audiences a plentiful stream of dark and disturbing content, “channeling” a boom in horror cinema that began with demonic-possession blockbuster The Exorcist and morphed late in the decade, via Halloween and Friday the 13th, into the slasher genre. The only real competition for TWD is American Horror Story, a series whose unpleasantness is so expertly wrought that I couldn’t make it past the third episode. Apart from this and an endless supply of genre-debasing quasi-reality shows on Syfy a la Paranormal Witness, there’s simply not a lot to choose from, and for this reason alone, The Walking Dead is far, far better than it needs to be.

But it’s still a frustrating show: like its zombies, slow-moving and unsure of its goals. (The guys at Penny Arcade, it should be pointed out, hold the opposite interpretation.) Following a phenomenal pilot episode that ended on one of the best cliffhangers I’ve seen since the closing shot of Best of Both Worlds, Part 1, the first season burned through four tense episodes, only to close with an implausible, shoehorned finale set in a CDC control center. Season two, at twice the length, has moved at half the speed, and while I enjoyed the thoughtful pace of life at Hershel’s farm, I grew impatient — like many — with plots that seemed to circle compulsively around the same issues week after week, played out in arguments that reduced a formidable cast of characters (and likeable actors) to tiresomely broken records. (The death of Dale [Jeffrey DeMunn] in the antepenultimate episode came as a relief, signaling that the series was fed up with its own moral center.)

Too, there is simply a feeling that more should be happening on a show about the zombie apocalypse; events should play out on a larger scale, balancing the conflicts among characters with action sequences on the level of the firebombing of Atlanta that opens “Chupacabra.” Part of the problem, I suspect, is that the ZA has been visualized so thoroughly in the decades since George A. Romero’s Night of the Living Dead; books like Max Brooks’s Zombie Survival Guide and World War Z, not to mention the many sequels, remakes, and ripoffs of Romero’s 1968 breakthrough, have fleshed out the undead plague on a planetary scale. The blessing of this most fecund of horror genres (second only, perhaps, to vampires) is also its curse: too much has been said, too many bottoms of barrels scraped, too many expectations raised. When the Centers for Disease Control put out preparedness warnings, it’s a safe bet the ante has been upped.

Of course, the most proximate source of raised expectations is the comic book and graphic novel series that originated The Walking Dead; Robert Kirkman, Tony Moore, and Charlie Adlard captured lightning in a bottle with their brisk yet methodical storytelling, whose black-and-white panels powerfully recall Romero’s foundational film, and whose pacing — in monthly bites of thirty pages — lends itself to a measured unfolding that has so far eluded the TV version. I’m less interested in discrepancies between the comic and the show than in the formal (indeed, ontological) problems of adaptation they illustrate: like Zack Snyder’s Watchmen movie, some fundamental, insurmountable obstruction seems to exist between the two forms of visual storytelling that otherwise seem so suited to mutual transcoding.

On a surface level, what works in the comic — the mise-en-scene of an emptied world, a uniquely American literalization of existential crisis through the metaphor of reanimated, cannibalistic corpses — works beautifully on screen. And person by person, the show brings the characters of the page to life (an artful act of reanimation itself, I suppose). But what it hasn’t done, and maybe never can do, is recreate the comic’s particular style of punctuation, doling out panels that closely attend to nuances of expression and shifts in lighting, then interleaving those orderly moments of psychological observation with big, raw shocks of splash pages that bring home the sickening spectacle of existence as eventual prey.

I’ll tune in tonight for the finale, and without question I will be there to devour season three. Furthermore, I’ll defend The Walking Dead — in both its incarnations — as some of the best horror that’s currently out there. But I’ll be watching the show out of a certain duty to the genre, whereas the comic, which I’m saving up to read in blocks of 10 and 12 issues at a go, I’ll savor as such stories are meant to be savored: late at night, alone in the quiet house, by a lamp whose glow might as well be the last light left in a world gone dark.

NBC’s pleasant surprise

For fans of NBC sitcoms, the return of Community is surely the week’s big news — but thanks to a packed DVR and a baby pushing the boundaries of his bedtime following a daylight-savings shakeup, I haven’t yet watched it. Instead, another and somewhat lesser show caught me by happy surprise. Whitney, still in its first season, featured an episode in which one of the side characters comes out of the closet. Neal — played by Jack’s former assistant on the increasingly moribund 30 Rock — finds himself confusedly but unerringly attracted to another man, and as the news travels around his circle of friends, I braced for stupidity, or worse, a conservative return to the status quo by story’s end. But none of this happened; Neal was allowed his realization, and his pals, usually a reliable generator of quippy sarcasm, took the announcement in stride. Not the most earth-shaking development, perhaps, but a victory nonetheless: a small bit of sanity and compassion amid an overheated political season and a television programming block otherwise prone to doofusy dismissals of difference and a disdain for the messiness of actual life. Bravo, Whitney, for doing something right.

Face Off’s practical magic[ians]

With the finale of Face Off airing tonight, I wanted to quickly share my fondness for the Syfy series, which pits fledgling special-effects artists against each other in timed challenges to create fantastical make-ups. Now in its second season, the show is notable for the way it eschews (to the point of rarely acknowledging the existence of) digital effects, which within the industry increasingly augment and substitute for old-school prosthetics, blood and wound creation, and creature design. For many who grew up watching SF and fantasy film and television in what we now recognize as the analog era of special effects, there is an irreducible beauty to such practical magic, no matter how realistic or unrealistic such effects might currently appear; indeed, our celebration of the artistry involved arguably cannot flower outside the passage of time — and advances in technology — that render older special effects visible precisely as tricks, in turn demonstrating the bankruptcy and uselessness of standards for screen illusion that hinge solely on those illusions’ undetectability.

The deeper import of Face Off, running beneath its highly entertaining races to design, fabricate, apply, and paint prosthetic appliances, is that such processes preserve an individual, artisanal ethos that is vanishing from the contemporary effects industry; CGI takes more people, and more time, than the rigors of reality-show competitions allow, which is one reason why the digital era of visual effects has yet to produce an auteur on the level of Stan Winston, Dick Smith, or Ray Harryhausen. (Instead, that auteur function has reverted to the director, himself [so far always a “him”] a crossover between artist and technician a la James Cameron.)

The other thing I dig about Face Off is that it is one of the few reality competitions that don’t focus on beautiful people — what Brenda Weber calls the “afterbodies” of makeover TV — or on unbeautiful people as a problem in need of solving, as in The Biggest Loser. Instead, the contestants of Face Off, like the judges, are a wonderful miscellany of folks bearing the styles and decorations of subcultures that rarely receive air time except as oddities. The gender (and, between the lines, queer and transgender) identities are welcomely mixed, though the casts have so far been pretty uniformly white. It’s an almost accidental showcase of diversity that makes perfect sense given the communities of fandom that populate Face Off: a group whose self-conscious display of difference is, itself, a celebration of the modified and colorful body, encased in its cleverly bizarre social prosthetics.

FMST 84: TV and New Media

Course Description and Goals

This course explores the commercial, technological, and aesthetic dimensions of television, using this fundamentally “transient and unstable” medium (as William Uricchio has called it) as a springboard for larger discussions about cultural responses to media succession. At its birth, television disrupted and reworked the media around it (film, radio, and telephone); has itself been reshaped by VCRs, DVDs, and game consoles; and now faces further redefinition by smart phones, iPads, DVRs, streaming video on demand, social networking, and piracy. Amid all the excitement, our challenge as critical media scholars is to separate the revolutionary from the evolutionary, arriving at a comprehensive picture of how the contemporary mediascape – with its promises of total information access, on-demand entertainment, and democratic participation in content creation – both extends and breaks with tradition.

Our goals, by the end of the term, will be to (A) map the historical paths by which television has grown from a radically “new” medium to an everyday part of our social and ideological fabric; (B) explore the ways in which TV, as industry and entertainment form, incorporates and responds to emerging technologies, new media genres, and globalization; (C) analyze recurrent tropes in the cultural imagining of new media, such as interactivity, “liveness,” and tensions between mass and individual, fiction and reality; and finally (D) reflect critically on our own media practices – how we use media for pleasure and knowledge, and how media in turn shape us as consumers and citizens, as gendered and raced individuals.

Texts

  • Bennett, James and Niki Strange (eds). Television as Digital Media. Durham: Duke University Press, 2011. [TVDM]
  • Kackman, Michael, et al. (eds). Flow TV: Television in the Age of Media Convergence. New York: Routledge, 2011. [FTV]
  • Newman, Michael Z. and Elana Levine. Legitimating Television: Media Convergence and Cultural Status. New York: Routledge, 2012. [LT]
  • Links to and PDFs of additional readings on Moodle (https://moodle.swarthmore.edu/my/). Please print and bring all texts to class.

Graded Course Components

  • 10% – Participation
  • 10% – Podcast
  • 15% – Midterm
  • 20% – Journal
  • 20% – Blogging
  • 25% – Final Project

Participation

Includes regular attendance (if you must miss class, please email me with an explanation), preparation (read all materials in advance), and active, helpful contributions to discussion.

Podcast

You will sign up to record and post to Moodle a 5-minute podcast (audio or video) that responds critically to one of our readings. Podcasts must be posted by Monday night so everyone can review before class. Podcasts will begin in Week 3.

Midterm

Working in teams of two, you will find two media artifacts (clips of TV series, YouTube videos, etc.), one representing “old” and the other “new,” and bring them together in a post to the class wiki that explores their relationship and connects it to a question, theory, or author(s) we have covered. We will view and discuss these in class in Week 8.

Journal

Throughout the semester, you will keep a journal on Moodle in which you respond to prompts, track and discuss your own media habits, and analyze media content. Plan to journal once every two weeks, for a total of 6-8 substantive entries. As part of this assignment, watch several episodes of one of the TV series listed at the end of the syllabus, all of which are on reserve at McCabe.

Blogging

I will divide you into four teams of 4-6 people. Each team will take responsibility for posting to the class blog for one three-week term, while the rest of the class comments. Teams should plan to post at least every other day, for a total of 9-12 entries, with all members participating. Posts may be drawn from current news and events in media, historical materials, or responses to course topics and discussion, but should always be relevant and interesting. Note: assessment of this component will be based both on how your team performs and on how active each individual is in commenting when other teams are posting.

Final Project

Your final project, on a research question of your choice, will combine a wiki page with a 10-minute presentation and participation in a Q&A at our colloquium in Week 14.

Calendar

Readings, topics, and screenings are subject to change.

Week 1 (Jan 18) – Course Introduction

  • Screening: Network (Sidney Lumet, 1976)
  • Intros to LT, FTV, TVDM

Week 2 (Jan 25) – Broadcast TV: History, Forms, and Genres

  • Screening: Marty (Delbert Mann, 1953)
  • LT 2, “Another Golden Age?”
  • Anderson, “Television Networks and the Uses of Genre”
  • Williams, “Programming as Sequence or Flow”
  • Dayan and Katz, from Media Events: The Live Broadcasting of History
  • Ellis, from Visible Fictions: Cinema, Television, Video
  • § Team 1 blogs

Week 3 (Feb 1) – TV in the Age of the Web

  • TVDM Dawson, “Television’s Aesthetic of Efficiency”
  • TVDM Burgess, “User-Generated Content and Everyday Cultural Practice”
  • FTV Gurney, “It’s Just Like a Mini-Mall”
  • § Team 1 blogs
  • Podcasts begin

Week 4 (Feb 8) – Converging and Spreading

  • You are expected to attend Henry Jenkins events: lecture 2/9 at 7 p.m. in SCI 101; conversation with students 2/10 at 10 a.m., Scheuer Room
  • Excerpts from Convergence Culture, Spreadable Media
  • § Team 1 blogs

Week 5 (Feb 15) – Audiences, Agency, Authorship, Interpretation

  • Screening: Twin Peaks (David Lynch, 1991)
  • LT 3, “The Showrunner as Auteur”
  • FTV Gray, “The Reviews Are In”
  • FTV Stein, “Word of Mouth on Steroids”
  • § Team 2 blogs

Week 6 (Feb 22) – Spaces and Screens

  • LT 6, “The Television Image and Image of Television”
  • TVDM Boddy, “Is It TV Yet?”
  • FTV Chamberlain, “Media Interfaces”
  • § Team 2 blogs

Week 7 (Feb 29) – Race, Ethnicity, Identity

  • Screening: Color Adjustment (Marlon Riggs, 1992)
  • FTV Kim, “NASCAR Nation and Television: Race-ing Whiteness”
  • FTV Amaya, “Television/Televisión”
  • § Team 2 blogs

Spring Break

Week 8 (Mar 14) – Old and New

  • Present midterms in class
  • § Team 3 blogs

Week 9 (Mar 21) – Drama

  • Screening: TBA
  • LT 5, “Not A Soap Opera”
  • Seiter and Wilson, “Soap Opera Survival Tactics”
  • § Team 3 blogs

Week 10 (Mar 28) – Comedy

  • Screening: TBA
  • LT 4, “Upgrading the Situation Comedy”
  • Butsch, “Five Decades and Three Hundred Sitcoms About Class and Gender”
  • § Team 3 blogs

Week 11 (Apr 4) – Reality

  • Screening: TBA
  • Simon, “The Changing Face of Reality Television”
  • FTV Bratich, “Affective Convergence in Reality Television”
  • FTV Kavka, “Industry Convergence Shows”
  • § Team 4 blogs

Week 12 (Apr 11) – News and Politics

  • Screening: Good Night, and Good Luck (George Clooney, 2005)
  • FTV Freedman, “The Limits of the Cellular Imaginary”
  • FTV Tryon, “Representing the Presidency”
  • § Team 4 blogs

Week 13 (Apr 18) – Cult

  • Screening: Dr. Horrible’s Sing-Along Blog (Joss Whedon, Web, 2008); “Love and Monsters” (Doctor Who, BBC1, w. Russell T. Davies, d. Dan Zeff, 2006)
  • TVDM Pearson, “Cult Television as Digital Television’s Cutting Edge”
  • FTV Kompare, “Online Cult Television Authorship”
  • § Team 4 blogs

Week 14 (Apr 25) – Colloquium and Course Conclusion

  • Meet in SCI 101 during screening time to present final projects

There is no final exam in this course.

Don’t Be Afraid of the Dark

It’s hard to pinpoint the primal potency of the original Don’t Be Afraid of the Dark, the 1973 telefilm about a woman stalked by hideous little troll monsters in the shadowy old house where she lives with her husband. The story itself, a wisp of a thing, has the unexplained purity of a nightmare piped directly from a fevered mind: both circuitously mazelike and stiflingly linear, it’s like watching someone drown in a room slowly filling with water. As with contemporaries Rosemary’s Baby and The Stepford Wives, it’s a parable of domestic disempowerment built around a woman whose isolation and vulnerability grow in nastily direct proportion to her suspicion that she is being hunted by dark forces. All three movies conclude in acts of spiritual (if not quite physical) devouring and rebirth: housewives eaten by houses. To the boy I was then, Don’t Be Afraid of the Dark provoked a delicious, vertiginous sliding of identification, repulsion, and desire: the doomed protagonist, played by Kim Darby, merged the cute girl from the Star Trek episode “Miri” with the figure of my own mother, whose return to full-time work as a public-school librarian, I see now, had triggered tectonic shifts in my parents’ relationship and the continents of authority and affection on which I lived out my childhood. These half-repressed terrors came together in the beautiful, grotesque design of the telefilm’s creatures: prunelike, whispery-voiced gnomes creeping behind walls and reaching from cupboards to slice with razors and switch off the lights that are their only weakness.

The 2011 remake, which has nothing of the original’s power, is nevertheless valuable as a lesson in the danger of upgrading, expanding, complicating, and detailing a text whose low-budget crudeness in fact constitutes televisual poetry. Produced by Guillermo del Toro, the movie reminds me of the dreadful watering-down that Steven Spielberg experienced when he shifted from directing to producing in the 1980s, draining the life from his own brand (and is there not a symmetry between this industrial outsourcing of artistry and the narrative’s concern with soul-sucking?). The story has been tampered with disastrously, introducing a little girl to whom the monsters (now framed as vaguely simian “tooth fairies”) are drawn; the wife, played by a bloodless Katie Holmes, still succumbs in the end to the house’s demonic sprites, but the addition of a maternal function forces us to read her demise as noble sacrifice rather than nihilistic defeat, and when the husband (Guy Pearce) walks off with their daughter in the closing beat, it comes unforgivably close to a happy ending. As for the monsters, now more infestation than insidiousness, they skitter and leap in weightless CGI balletics, demonstrating that, as with zombies, faster does not equal more frightening. But for all its evacuation of purpose and punch, the remake is useful in locating a certain indigestible blockage in Hollywood’s autocannibalistic churn, enshrining and immortalizing — through its very failure to reproduce it — the accidental artwork of the grainy, blunt, wholly sublime original.

What is … Watson?

We have always loved making our computers perform. I don’t say “machines” — brute mechanization is too broad a category, our history with industrialization too long (and full of skeletons). Too many technological agents reside below the threshold of our consciousness: the dumb yet surgically precise robots of the assembly line, the scrolling tarmac of the grocery-store checkout counter that delivers our purchases to another unnoticed workhorse, the cash register. The comfortable trance of capitalism depends on labor’s invisibility, and if social protocols command the human beings on either side of transactions to at least minimally acknowledge each other — in polite quanta of eye contact, murmured pleasantries — we face no such obligation with the machines to whom we have delegated so much of the work of maintaining this modern age.

But computers have always been stars, and we their anxious stage parents. In 1961 an IBM 704 was taught to sing “Daisy Bell” (inspiring a surreal passage during HAL’s death scene in 2001: A Space Odyssey), and in 1975 Steve Dompier made his hand-built Altair 8800 do the same, buzzing tunes through a radio speaker at a meeting of the Homebrew Computer Club, an early collective of personal-computing enthusiasts. I was neither old enough nor skilled enough to take part in that initial storm surge of the microcomputer movement, but like many born in the late 1960s, was perfectly poised to catch the waves that crashed through our lives in the late 70s and early 80s: the TRS-80, Apple II, and Commodore PET; video arcades; consoles and cartridges for playing at home, hooked to the TV in a primitive convergence between established and emerging technologies, conjoined by their to-be-looked-at-ness.

Arcade cabinets are meant to be clustered around, joysticks passed among an appreciative couchbound audience. Videogames of any era show off the computer’s properties and power, brightly blipping messages whose content, reversing McLuhan, is new media, presenting an irresistible call both spectacular and interactive to any nerds within sensory range. MIT’s Spacewar worked both as game and graphics demo, proof of what the state of the art in 1962 could do: fifty years later, the flatscreens of Best Buy are wired to Wiis and PlayStation 3s, beckoning consumers in endless come-on (which might be one reason why the games in so many franchises have become advertisements for themselves).

But the popular allure of computers isn’t only in their graphics and zing. We desire from them not just explorable digital worlds but minds and souls themselves: another sentient presence here on earth, observing, asking questions, offering commentary. We want, in short, company.

Watson, the IBM artifact currently competing against champions Ken Jennings and Brad Rutter on Jeopardy, is the latest digital ingenue to be prodded into the spotlight by its earnest creators (a group that in reaction shots of the audience appears diverse, but whose public face in B-roll filler sums to the predictable type: white, bespectacled, bearded, male). Positioned between Jennings and Rutter, Watson is a black slab adorned with a cheerful logo, er, avatar, conveying through chance or design an uneasy blend of 2001‘s monolith and an iPad. In a nearby non-space hums the UNIVAC-recalling bulk of his actual corpus, affixed to a pushbutton whose humble solenoid — to ring in for answers — is both a cute nod to our own evolution-designed hardware and a sad reminder that we still need to even the playing field when fighting Frankenstein’s Monster.

There are two important things about Watson, and despite the technical clarifications provided by the informational segments that periodically and annoyingly interrupt the contest’s flow, I find it almost impossible to separate them in my mind. Watson knows a lot; and Watson talks. Yeats asked, “How can we know the dancer from the dance?” Watson makes me wonder how much of the Turing Test can be passed by a well-designed interface, like a good-looking kid in high school charming teachers into raising his grades. Certainly, it is easy to invest the AI with a basic identity and emotional range based on his voice, whose phonemes are supplied by audiobook narrator Jeff Woodman but whose particular, peculiar rhythms and mispronunciations — the foreign accent of speech synthesis, as quaint as my father’s Czech-inflected English — are the quirky epiphenomena of vast algorithmic contortions.

Another factor in the folksiness of Watson is that he sounds like a typical Jeopardy contestant — chirpy, nervous, a little full of himself — and so highlights the vaguely androidish quality of the human players. IBM has not just built a brain in a box; they’ve built a contestant on a TV game show, and it was an act of genius to embed this odd cybernetic celebrity, half quick-change artist, half data-mining savant, in the parasocial matrix of Alex Trebek and his chronotopic stage set: a reality already half-virtual. Though I doubt the marketing forces at IBM worried much about doomsday fears of runaway AIs, the most remarkable thing about Watson may be how benign he seems: an expert, and expertly unthreatening, system. (In this respect, it’s significant that the computer, though officially named for IBM founder Thomas J. Watson, evokes not the brilliant and erratic Sherlock Holmes but his perpetually one-step-behind assistant.)

Before the competition started, I hadn’t thought much about natural-language processing and its relationship to the strange syntactic microgenre that is the Jeopardy question. But as I watched Watson do his weird thing, mixing moronic stumbles with driving sprints of unstoppable accuracy, tears welled in my eyes at the beautiful simplicity of the breakthrough. Not, of course, the engineering part — which would take me several more Ph.D.s (and a whole lotta B-roll) to understand — but the idea of turning Watson into one of TV’s limited social beings, a plausible participant in established telerituals, an interlocutor I could imagine as a guest on Letterman, a relay in the quick-moving call-and-response of the one quiz show that has come to define, for a mass audience, high-level cognition, constituted through a discourse of cocky yet self-effacing brilliance.

Our vantage point on Watson’s problem-solving process (a window of text showing his top three answers and level of confidence in each) deromanticizes his abilities somewhat: he can seem less like a thinking agent than an overgrown search engine, a better-behaved version of those braying “search overload” victims in the very obnoxious Bing ads. (Tip to Microsoft: stop selling your products by reminding us how much it is possible to hate them.) But maybe that’s all we are, in the end: social interfaces to our own stores of internal information and experience, talkative filters customized over time (by constant interaction with other filters) to mistake ourselves for ensouled humans.

At the end of the first game on Tuesday night, Watson was ahead by a mile. We’ll see how he does in the concluding round tonight. For the life of me, I can’t say whether I want him to win or to lose.