Having sung the praises of Fridays and gloomed about Sundays, am I set on turning Mondays into self-reflexive meditations on method? Evidently so, at least until the spring semester ends and time opens up a little. Tonight I (gently) chastised my wife for skipping her picture-a-day project on Facebook,* but as usual when it comes to judging others, in truth I am just projecting my own anxiety — in this case, an almost superstitious dread of disrupting a chain of daily posts now six weeks long and counting. Keeping it up has meant sacrificing a number of things I once held dear: cherished notions of myself as a brilliant writer, lingeringly slow-cooked prose, a certain dignity and distance in my choice of topics. Laudable goals all, but maybe also a little hollow and egocentric, and often unconducive to productivity. Instead, I’m discovering the hard comfort of routine, the discipline of a writing practice, along with a new kind of notch to cut into the wall.
That said, I’m also feeling the need to start working these daily posts into something longer and more substantive — an actual, paper publication — so that will be the next horizon. It won’t happen without a deadline and some goalposts, so over the next several days I will begin mapping out an experiment in scholarship, an essay to be drafted, as it were, in public. I approach this with some trepidation but also excitement: as with the act of teaching, through which my body has evolved a new organ for converting anxiety to energy, writing this blog is helping to wear down the last vestiges of resistance to taking risks.
Once again, thanks to my TV & New Media course, it is time to watch Good Night, and Good Luck (George Clooney, 2005), and once again I am reminded what a beautifully intimate experience it is. On the manifest level of its narrative, the film details the crusade of Edward R. Murrow and the See It Now news team to take Senator Joseph McCarthy to task for his many transgressions against democracy, and it’s gripping stuff; but on the latent level of its mise-en-scene, the movie is all about the television studios, elevators, lobbies, and offices at CBS — pristine spaces rendered in crisp black-and-white cinematography (actually the result of shooting grayscale sets in color, then digitally timing them to a sublime monochrome) and redolent of technological and cultural power as only the broadcast TV era could embody it. In its period evocation it’s Mad Men played straight, and unlike the AMC series, the total lack of exterior shots gives the whole thing the hermetic feel of a holodeck simulation.
When I first saw the film, the U.S. was gritting its teeth through George W. Bush’s second term, and its messages about the abuse of governmental power and patriotic ideology were impossible to read as anything other than statements about our post-9/11 world. Seven years later, the connotative corset has loosened, and exciting resonances with the passionately essayistic journalism of Rachel Maddow and the breathless pace of blogging and spreadable media (an electrical feeling of liveness and deadline I experience, if only in a small way, in my new daily posting regimen) tie Murrow’s moment to our own, inviting us to see the “old” in new media, and vice versa. I’m looking forward to discussing it with my students!
It’s Easter weekend, according to the plastic eggs dangling from tree branches in our neighbor’s yard, and I am once again experiencing the odd non-sensation of my own long-lapsed Catholicism. I would like to say it’s something I still struggle with — indeed, struggling with things seemed to be the ur-lesson of most of the scriptural teachings to which I was exposed — but the truth is that I left the church as soon as I was doctrinally allowed to, following my Confirmation, and never looked back. Always suspicious of the soap-smelling classrooms of my Sunday school and grumblingly resistant to any commitment of a weekend morning (I remember complaining to my parents that I only got two mornings a week to sleep in, which made them laugh, and not in a nice way), I hit my breaking point when one of our teachers gently explained to me that no animals, including cats, could make it to the afterlife, since they had no souls. Maybe true and maybe not, but in any case, not a faith that fits or suits me.
Instead, I’ve spent most of my life engaging in the secular ritual of weekends, playing out my small personal drama fifty-two times a year, kicking off with the joyous arrival of Friday, bookended by the grim letdown of Sunday. At heart I will always be wired for weekends and summer vacations, patterns of leisure stamped into me by the school calendar, continued now in my career as a college professor. In the microcosm of the weekend, on Fridays I am young and just getting out of school; on Sundays, old, an adult preparing for the work of the coming week. Death and resurrection, not of the body but of the spirit.
Fridays lately have the added significance of being “family days,” devoted to Zach and Katie. You’d think that my role as a husband and father would mark the apogee of grown-up-ness, but in practice these days are about much more straightforward pleasures: putting aside schoolwork to experience the easy companionship of my wife’s love, the eternal and unflawed presentness of Z’s babyhood. Fridays remind me what my mind and heart used to be like before they got all kinked up and complicated, and I am as grateful for their simplicity as I am awed by their profundity.
I speak often and with great satisfaction of my Man Cave, our house’s finished basement where my nerdish technophilia is allowed free rein. My PC tower and its domino-line backdrop of external hard drives; my big, flat TV atop its nest of audio components and cables; a small museum of video-game consoles; and the nonelectronic pleasures of my John D. MacDonald paperbacks (inherited from my father, who, freshly arrived from Czechoslovakia in the 1950s, used detective and spy fiction to hone his English-language skills), white cardboard longboxes of unexamined comics which with every passing year come more to resemble stacked sarcophagi, a dusty Millennium Falcon playset packed with Star Wars action figures in various stages of dismemberment (the latter a gift from my brother-in-law).
As this inventory suggests, the contents of the Man Cave embody not just arrested development but a certain ongoing regression: a march in reverse through the stages and artifacts of the enthusiasms that made me what I am today. For that reason, it’s fitting that I have opened a new wing whose title might be “Boy Cave”: a model-kit-building station in a side workroom where the heating-oil tank and cat-litter box vie with paint thinner and acrylic glue for the prize of most fascinatingly noxious scent.
Currently on the workbench is Polar Lights’s Robby the Robot, a kit I’ve been dabbling with for more than a year, but which a few nights ago I decided to buckle down and finish. (Pictured above, the 1/12-scale figure is still missing an ornamental arrangement of gyroscopes on top of its head, and over that a clear dome that seals its brain circuitry in place.) Model kits based on science fiction and fantasy have become a central preoccupation in my scholarship, and I guess in some ways I have returned to kit-building in order to (re)gain firsthand experience of this strange subculture of artifactual play and constructivist leisure — its material investments as well as its surrounding discursive community (see, for example, the reviews and build-guides here, here, and here).
But I’m also realizing a simple and zenlike truth, which is that to build a kit you must build it; it won’t finish itself. And the difference between dreaming and doing, which has so often constituted an agonizing contrapunto to my publishing life, is like the difference between the unassembled plastic parts still on their sprue and the built, painted, finalized thing: a matter of making. If I can fit the pieces of Robby together in stray minutes (and it turns out that the rhythms of model-kit assembly fit nicely into the scattered but semipredictable intervals of parenting), what else might I accomplish, simply by opting to complete — rather than just contemplate — the process?
Longtime readers of this blog (and I take it on faith that there are one or two of you out there) will know that I am obsessed with the clumsy sublime, Laura Mulvey’s term for the accidental beauty of old-school special effects in classical Hollywood — studio tricks and machinations meant to pass unnoticed in their time, but which become visible and available for fresh appreciation as years go by and the state of the art evolves.
The same reader(s) will be familiar with my interest in virtual spaces and more specifically with the “pocket universes” of storyworlds and certain photographic experiments, such as the one featured in this post on the 2008 Republican National Convention. Well, now I’ve found a new toy to think with: this lovely virtualization of the courtyard in Rear Window (1954), digitally stitched together to recreate the large and elaborate studio set on which Alfred Hitchcock filmed the tale of L. B. Jefferies (James Stewart), whose immobilization by broken leg unlocks a visual mobility in which he scans through binoculars the social (and as it turns out, criminal) microcosm of the neighbors in his apartment complex. Here’s the video:
Within this composited space, part Holodeck, part advent calendar, the action of the movie unfolds with new seamlessness and unity, time’s passage marked by sunrises and sunsets, clouds rippling overhead, the moon rising on its nighttime trajectory as the small community bustles through an overlapping ballet of the quotidian. In the middle of it all, a harried husband builds to a murderous rage, disposes of a body, is investigated by Grace Kelly sneaking in from a fire escape.
One of Hitchcock’s undisputed masterpieces, Rear Window is also the go-to example in introductory film courses to illustrate voyeurism, scopophilia, and the cinematic apparatus — a raft of abstract yet efficacious concepts that come out of the “grand theory” tradition of film studies, themselves subject, perhaps, to their own form of clumsy sublimation. I wonder how we might update those ideas in an era of computational revisitation and transformation, in which the half-built, half-imagined territories of classical cinema can be unfolded into digital origami that simultaneously make them more “real” while rendering apparent their intricate artificiality, recoding cinema’s dreamspaces into simulacral form.
Watching Z pull himself eagerly along a shelf of books at our local library — thin tomes, the spine of each marked by a circular sticker whose color indicated the intended reading level — I allowed myself to hope that he will grow up bookish like me, drawn to the quiet hours one finds between a story’s pages, or the murmur of a parent’s voice reading aloud before bed. My mother read to me sometimes, but more consistently it was my oldest brother Paul, himself a highly intellectual, tense, and troubled young man with whom I got along not at all by day but who by night became my guide through lands of fantasy: The Hobbit, the Narnia books, A Wrinkle in Time and its remarkable, disquieting sequels. I would love to read these to Z, and more: Heinlein’s young adult novels, and Harry Potter, and E. Nesbit. But for now, I just whisper in the darkness of the nursery: shh shh shh, and it’s all right: waiting for the grape-flavored infant Advil to kick in and muffle his teething pain, our story a simpler, shared one of comfort in the night.
Here are preliminary notes for a brief guest lecture I’m giving tomorrow in Professor Maya Nadarni’s course “Anthropological Perspectives on Childhood and the Family.” The topic is Children at Play.
Introduction: my larger research project
fantastic-media objects, including model kits, collectible statues, wargaming figurines, replica props: unreal things with material form
these objects are an integral part of how fantastic transmedia franchises gain purchase culturally and commercially, as well as how they reproduce industrially
particularly complex objects in terms of signification and value, mediation of mass and private, principles of construction, and local subcultures (both fan and professional) where they are taken up in different ways
while these objects have been with us for decades, evolving within children’s culture, hobby cultures, gaming, media fandom, and special-effects practices, the advent of desktop fabrication (3D printing) paired with digital files portends a shift in the economies, ontologies, and regulation of fantastic-media objects
Jonathan Gray, Show Sold Separately: a counter-reading of toys and action figures
examines Star Wars toys and action figures as examples of paratexts shaping interpretation of “main text”
story of Lucas’s retention of licensing rights, considered risible at the time
graphic showing that toys and action figures account for more profits than films and video games combined
rescues “denigrated” category of licensed toys as “central to many fans’ and non-fans’ understandings of and engagements with the iconic text that is Star Wars. … Through play, the Star Wars toys allowed audiences past the barrier of spectatorship into the Star Wars universe.” (176)
licensed toys provide opportunities “to continue the story from a film or television program [and] to provide a space in which meanings can be worked through and refined, and in which questions and ambiguities in the film or program can be answered.” (178)
notes role of SW toys in sustaining audience interest during 1977-1983 period of original trilogy’s release
transgenerational appeal of franchise linked to toys as transitional objects, providing a sense of familiarity in young fans’ identities
current transmedia franchises include licensed objects as components of extended storyworlds
Case study in history: the objects of monster culture
1960s monster culture spoke to (mostly male and white) pre-teen and adolescent baby boomers
mediated through Famous Monsters of Filmland (1958-), especially advertising pages from “Captain Company”
Aurora model kits were key icons of this subculture: “plastic effigies”
Steven M. Gelber: popularization of plastic kits represented “the ultimate victory of the assembly line,” contrasting with an earlier era of authentic creativity in which amateur crafters “sought to preserve an appreciation for hand craftsmanship in the face of industrialization.” (262-263)
model kits provided young fans with prefab creativity, merging their own crafts with media templates; also opportunities for transformation (1964 model kit contest)
The first season of HBO’s Game of Thrones followed the inaugural novel of George R. R. Martin’s series so minutely that, despite its obvious excellence, I found it a bit redundant: like the early days of superhero comics in which the panel art basically just illustrated the captions, each episode had the feel of an overly faithful checklist, the impeccable casting and location work a handsome but inert frame for Martin’s baroque and sinister plotting. That’s one big reason why I’m eager to see the second season, which premieres tonight — I bogged down a hundred or so pages into book two, A Clash of Kings, and so apart from a few dribs and drabs that have leaked through my spoiler filter, I’m a fresh and untrammeled audience. (Given the sheer scale of A Song of Ice and Fire, at five fat volumes and counting, the whole concept of spoilers seems beside the point, rendered irrelevant by a level of structural complexity that forces synchronic rather than diachronic understandings; it’s hard enough at any given moment to keep track of the dense web of characters, alliances, and intrigues without worrying about where they’ll all be two or three thousand pages later.)
Another reason I’m looking forward to the series’ return is its arresting title sequence, a compact masterpiece of mannered visualization that establishes mood, momentum, and setting in the span of ninety seconds:
A fiery astrolabe orbits high above a world not our own; its massive Cardanic structure sinuously coursing around a burning center, vividly recounting an unfamiliar history through a series of heraldic tableaus emblazoned upon it. An intricate map is brought into focus, as if viewed through some colossal looking glass by an unseen custodian. Cities and towns rise from the terrain, their mechanical growth driven by the gears of politics and the cogs of war.
From the spires of King’s Landing and the godswood of Winterfell, to the frozen heights of The Wall and windy plains across the Narrow Sea, Elastic’s thunderous cartographic flight through the Seven Kingdoms offers the uninitiated a sweeping education in all things Game of Thrones.
“Elastic,” of course, refers to the special-effects house that created the sequence, and Art of the Title‘s interview with the company’s creative director, Angus Wall, is enormously enlightening. Facing the challenge of establishing the nonearthly world in which Game of Thrones takes place, Wall developed the idea of a bowl-shaped map packed with detail. In his words, “Imagine it’s in a medieval tower and monks are watching over it and it’s a living map and it’s shaped like a bowl that’s 30 feet in diameter and these guys watch over it, kind of like they would the Book of Kells or something… they’re the caretakers of this map.” Realizing the limitations of that topology, Elastic put two such bowls together to create a sphere, and placing a sun in the center, arrived at the sequence’s strange and lyrical fusion of a pre-Copernican cosmos with a Dyson Sphere.
Yet even more interesting than the sequence’s conceit of taking us inside a medieval conception of the universe — a kind of cartographic imaginary — is its crystallization of a viewpoint best described as wargaming perspective: as it swoops from one kingdom to another, the camera describes a subjectivity somewhere between a god and a military general, the eternally comparing and assessing eye of the strategist. It’s an expository visual mode whose lineage traces less to classical narrative than to video-game cutscenes or the mouse-driven POV in an RTS. Its ultimate root, however, is not in digital simulation but in the tabletop wargames that preceded it — what Matthew Kirschenbaum has evocatively called “paper computers.” Kirschenbaum, a professor at the University of Maryland, blogs among other places at Zone of Influence, and his post there on the anatomy of wargames contains a passage that nicely captures the roving eye of GoT‘s titles:
Hovering over the maps, the players occupy an implicit position in relation to the game world. They enjoy a kind of omniscience that would be the envy of any historical commander, their perspectives perhaps only beginning to be equaled by today’s real-time intelligence with the aid of GPS, battlefield LANs, and 21st century command and control systems.
Earlier I mentioned the sprawling complexity of A Song of Ice and Fire, a narrative we might spatialize as an unmanageably large territory — unmanageable, that is, without that other form of “paper computers”: the maps, histories, concordances, indices, and family trees that bring order to Martin’s endlessly elaborated diegesis. Their obvious digital counterpart would be something like this wiki, and it’s interesting to note (as does this New Yorker profile) that the author’s contentious, codependent relationship with his fan base is often battled out in such internet forums, where creative ownership of a textual property exists in tension with custodial privilege. Perhaps all maps are, in the words of Kurt Squire and Henry Jenkins, contested spaces. If so, then the tabletop maps on which wargames are fought provide an apt metaphor both for Game of Thrones‘s narrative dynamics (driven as they are by the give-and-take among established powers and would-be usurpers) and for the franchise itself, whose fortunes have increasingly become distributed among many owners and interests.
All of this comes together in the laden semiotics of the show’s opening, which beckons to us not just as viewers but as players, inviting us to engage through this “television computer” with a narrative world and user experience drawn from both old and new forms of media, mapping the past and future of entertainment.
Well, it finally happened: a friend showed me his new iPad, and I lived to tell the tale.
As I indicated in this post a few weeks ago, Apple’s recent refresh of its game-changing tablet computer struck me as something less than overwhelming — an incremental step rather than a quantum leap. Though I haven’t changed my position, I should clarify that I remain a committed iPad user and Apple enthusiast; I don’t expect miracles at every product announcement, any more than I expect every word out of my own mouth (or post on this blog) to be immortal wisdom. It’s OK for the company to tread water for a while, especially in the wake of losing Steve Jobs.
I’ve been reading Walter Isaacson’s biography of Jobs, flipping through that white slab of a book (obviously styled as a kind of print analog of an iPod) to my favorite chapters of Jobs’s history: his garage collaboration with Steve Wozniak in the mid-70s on the first Apple computer, in its cheeky wooden case; shepherding the first Macintosh to market in the early 1980s; his series of home runs twenty years later, helping to develop first iTunes and then the iPod as part of a rethinking of the personal computer as a “digital hub” for local ecosystems of smart, small, simple devices for capturing and playing back media.
It’s a modern mythology, with Jobs as an information-age Odysseus, somehow always seeing further than the people around him, taking us with him on his journey from one island of insight and inspiration to another. His death threatens to leave Apple rudderless, and the gently revised iPad seems to me an understandable response to the fear of drifting off course. Too dramatic a correction at this point might well strand the company or distract it into losing its way entirely, and for a business so predicated on its confident mapping of the future — its navigation of nextness — that outcome is unthinkable.
The flipside, of course, is stagnation through staying the course, death by a thousand cautious shortcuts. Apple’s solution to the dilemma is symptomatized in the new iPad’s Retina display, the highest-resolution screen ever created for a mobile device. It’s hard not to interpret this almost ridiculously advanced visualization surface as a metaphor for the company’s (and our own) neurotic desire to “see” its way forward, boosting pixels and GPU cycles in an effort to scry, on a more abstract level, the ineffable quality that Steve Jobs brought to just about everything he did, summarized in that tired yet resilient word vision. We stroke the oracular glass in hopes of resolving our own future.
As in Greek tragedy, of course, such prophetic magic rarely comes without costs. The new iPad’s demanding optics send fiery currents surging through its runic circuitry, raising the device’s heat to levels that some find uncomfortable, though as Daedalus learned, sometimes you have to fly close to the sun. Hungry for power, the iPad takes a long time to charge, and doesn’t always come clean about its appetites, but who is to say if this is a bug or a feature? Not us mere mortals.
What I most worried about was looking upon the iPad’s Retina display and being forever ruined — turned to stone by the gorgon’s gaze. It happened before with the introduction of DVD, which made videocassette imagery look like bad porn; with Blu-ray, which gave DVDs the sad glamour of fading starlets in soft-focus closeups; with HDTV, which in its dialectical coining of “standard def” (how condescendingly dismissive that phrase, the video equivalent of “she’s got a great personality”) gerrymandered my hundreds of channels into a good side and bad side of town. I was afraid that one glimpse of the new iPad would make the old iPad look sick.
But it didn’t, and for that I count myself lucky: spared for at least one more year, one more cycle of improvements. But already I can feel the pressure building, the groundwork being laid. As one by one my apps signal that they need updating to raise themselves to the level of the enhanced display’s capabilities, I imagine that I myself will one day awaken with a small circular push notification emblazoned on my forehead: ready to download an upgraded I.
I am a fan of Mad Men, which puts me in a tiny minority consisting of just about everyone I know, and most of those I don’t. Perhaps it’s a holdover from my recent post on 4chan’s reaction to The Hunger Games, but I can’t shake the sense that it’s getting harder to find points of obsession in the pop-culture-scape that haven’t already been thoroughly picked over. Credit AMC, a channel that has branded itself as a reliable producer of hit shows that play like cult favorites. I suspect that my desire to hold onto the illusion that I and I alone understand the greatness of Mad Men is the reason I saved its season-five premiere for a full 24 hours, sneaking upstairs to watch it in the darkness of our bedroom, the iPad’s glowing screen held inches from my eyes, the grown-up equivalent of reading under the covers with a flashlight. Let me be alone with my stories.
Somewhat hardening the hard-core of my fanboy credibility is the fact that I’ve followed the show religiously since it first aired in 2007; it jumped out at me immediately as a parable of male terror, a horror story dressed in impeccable business suits. That basic anxiety has stayed with the show, though it’s only one of its many pleasures. One aspect of the series that I once believed important, the 1960s setting presented in such alien terms as to turn the past into science fiction, turns out not to be so crucial, if the failures of broadcast-network analogs Pan Am and The Playboy Club are any indication.
As for the premiere, I enjoyed it well enough, but not nearly as much as I will enjoy the rest of the season. It always takes me a while to get back into Mad Men’s vibe, with the first episodes seeming stiff and obvious, even shallow, only later deepening into the profoundly entertaining, darkly funny, and ultimately heartbreaking drama I recognize the show to be. The slow thaw reminds me of how I react to seeing Shakespeare on stage or screen: it takes twenty minutes for the rarefied language to come into focus as something other than ostentation. Mad Men too initially distances by seeming a parody of itself, but I suspect that the time it takes for my eyes and ears to adapt to this hermetically sealed bubble universe has more to do with its precise evocation of a juncture in history when surfaces were all we had — when the culture industry, metonymized in Madison Avenue, found its brilliant stride as a generator of sizzle rather than a deliverer of steak.