Tuesday

Watching Z pull himself eagerly along a shelf of books at our local library — thin tomes, the spine of each marked by a circular sticker whose color indicated the intended reading level — I allowed myself to hope that he would grow up bookish like me, drawn to the quiet hours one finds between a story’s pages, or the murmur of a parent’s voice reading aloud before bed. My mother read to me sometimes, but more consistently it was my oldest brother Paul, himself a highly intellectual, tense, and troubled young man with whom I got along not at all by day but who by night became my guide through lands of fantasy: The Hobbit, the Narnia books, A Wrinkle in Time and its remarkable, disquieting sequels. I would love to read these to Z, and more: Heinlein’s young adult novels, and Harry Potter, and E. Nesbit. But for now, I just whisper in the darkness of the nursery: shh shh shh, and it’s all right, waiting for the grape-flavored infant Advil to kick in and muffle his teething pain, our story a simpler and shared one of comfort in the night.

Children at Play

Here are preliminary notes for a brief guest lecture I’m giving tomorrow in Professor Maya Nadkarni’s course “Anthropological Perspectives on Childhood and the Family.” The topic is Children at Play.

Introduction: my larger research project

  • fantastic-media objects, including model kits, collectible statues, wargaming figurines, and replica props: unreal things with material form
  • these objects are an integral part of how fantastic transmedia franchises gain purchase culturally and commercially, as well as how they reproduce industrially
  • particularly complex objects in terms of signification and value, mediation of mass and private, principles of construction, and local subcultures (both fan and professional) where they are taken up in different ways
  • while these objects have been with us for decades, evolving within children’s culture, hobby cultures, gaming, media fandom, and special-effects practices, the advent of desktop fabrication (3D printing) paired with digital files portends a shift in the economies, ontologies, and regulation of fantastic-media objects

Jonathan Gray, Show Sold Separately: a counter-reading of toys and action figures

  • examines Star Wars toys and action figures as examples of paratexts shaping interpretation of “main text”
  • story of Lucas’s retention of licensing rights, considered risible at the time
  • graphic showing that toys and action figures account for more profits than films and video games combined
  • rescues “denigrated” category of licensed toys as “central to many fans’ and non-fans’ understandings of and engagements with the iconic text that is Star Wars. … Through play, the Star Wars toys allowed audiences past the barrier of spectatorship into the Star Wars universe.” (176)
  • licensed toys provide opportunities “to continue the story from a film or television program [and] to provide a space in which meanings can be worked through and refined, and in which questions and ambiguities in the film or program can be answered.” (178)
  • notes role of SW toys in sustaining audience interest during the 1977-1983 period of the original trilogy’s release
  • transgenerational appeal of franchise linked to toys as transitional objects, providing a sense of familiarity in young fans’ identities
  • current transmedia franchises include licensed objects as components of extended storyworlds

Case study in history: the objects of monster culture

  • 1960s monster culture spoke to (mostly male and white) pre-teen and adolescent baby boomers
  • mediated through Famous Monsters of Filmland (1958-), especially advertising pages from “Captain Company”
  • Aurora model kits were key icons of this subculture: “plastic effigies”
  • Steven M. Gelber: popularization of plastic kits represented “the ultimate victory of the assembly line,” contrasting with an earlier era of authentic creativity in which amateur crafters “sought to preserve an appreciation for hand craftsmanship in the face of industrialization.” (262-263)
  • model kits provided young fans with prefab creativity, merging their own crafts with media templates; also opportunities for transformation (1964 model kit contest)

“The most intricate, beautiful map you could possibly imagine”

The first season of HBO’s Game of Thrones followed the inaugural novel of George R. R. Martin’s series so minutely that, despite its obvious excellence, I found it a bit redundant: like the early days of superhero comics, in which the panel art basically just illustrated the captions, each episode had the feel of an overly faithful checklist, the impeccable casting and location work a handsome but inert frame for Martin’s baroque and sinister plotting. That’s one big reason why I’m eager to see the second season, which premieres tonight — I got bogged down a hundred or so pages into book two, A Clash of Kings, and so apart from a few dribs and drabs that have leaked through my spoiler filter, I’m a fresh and untrammeled audience. (Given the sheer scale of A Song of Ice and Fire, at five fat volumes and counting, the whole concept of spoilers seems beside the point, rendered irrelevant by a level of structural complexity that forces synchronic rather than diachronic understandings; it’s hard enough at any given moment to keep track of the dense web of characters, alliances, and intrigues without worrying about where they’ll all be two or three thousand pages later.)

Another reason I’m looking forward to the series’ return is its arresting title sequence, a compact masterpiece of mannered visualization that establishes mood, momentum, and setting in the span of ninety seconds:

Here is how the website Art of the Title describes the governing concept:

A fiery astrolabe orbits high above a world not our own; its massive Cardanic structure sinuously coursing around a burning center, vividly recounting an unfamiliar history through a series of heraldic tableaus emblazoned upon it. An intricate map is brought into focus, as if viewed through some colossal looking glass by an unseen custodian. Cities and towns rise from the terrain, their mechanical growth driven by the gears of politics and the cogs of war.

From the spires of King’s Landing and the godswood of Winterfell, to the frozen heights of The Wall and windy plains across the Narrow Sea, Elastic’s thunderous cartographic flight through the Seven Kingdoms offers the uninitiated a sweeping education in all things Game of Thrones.

“Elastic,” of course, refers to the special-effects house that created the sequence, and Art of the Title’s interview with the company’s creative director, Angus Wall, is enormously enlightening. Facing the challenge of establishing the non-earthly world in which Game of Thrones takes place, Wall developed the idea of a bowl-shaped map packed with detail. In his words, “Imagine it’s in a medieval tower and monks are watching over it and it’s a living map and it’s shaped like a bowl that’s 30 feet in diameter and these guys watch over it, kind of like they would the Book of Kells or something… they’re the caretakers of this map.” Realizing the limitations of that topology, Elastic put two such bowls together to create a sphere and, placing a sun in the center, arrived at the sequence’s strange and lyrical fusion of a pre-Copernican cosmos with a Dyson sphere.

Yet even more interesting than the sequence’s conceit of taking us inside a medieval conception of the universe — a kind of cartographic imaginary — is its crystallization of a viewpoint best described as wargaming perspective: as it swoops from one kingdom to another, the camera describes a subjectivity somewhere between a god and a military general, the eternally comparing and assessing eye of the strategist. It’s an expository visual mode whose lineage owes less to classical narrative than to video-game cutscenes or the mouse-driven POV of a real-time strategy game. Its ultimate root, however, is not in digital simulation but in the tabletop wargames that preceded it — what Matthew Kirschenbaum has evocatively called “paper computers.” Kirschenbaum, a professor at the University of Maryland, blogs among other places at Zone of Influence, and his post there on the anatomy of wargames contains a passage that nicely captures the roving eye of GoT’s titles:

Hovering over the maps, the players occupy an implicit position in relation to the game world. They enjoy a kind of omniscience that would be the envy of any historical commander, their perspectives perhaps only beginning to be equaled by today’s real-time intelligence with the aid of GPS, battlefield LANs, and 21st century command and control systems.

Earlier I mentioned the sprawling complexity of A Song of Ice and Fire, a narrative we might spatialize as an unmanageably large territory — unmanageable, that is, without that other form of “paper computer”: the maps, histories, concordances, indices, and family trees that bring order to Martin’s endlessly elaborated diegesis. Their obvious digital counterpart would be something like this wiki, and it’s interesting to note (as does this New Yorker profile) that the author’s contentious, codependent relationship with his fan base is often battled out in such internet forums, where creative ownership of a textual property exists in tension with custodial privilege. Perhaps all maps are, in the words of Kurt Squire and Henry Jenkins, contested spaces. If so, then the tabletop maps on which wargames are fought provide an apt metaphor both for Game of Thrones’s narrative dynamics (driven as they are by the give-and-take among established powers and would-be usurpers) and for the franchise itself, whose fortunes have become increasingly distributed among many owners and interests.

All of this comes together in the laden semiotics of the show’s opening, which beckons to us not just as viewers but as players, inviting us to engage through this “television computer” with a narrative world and user experience drawn from both old and new forms of media, mapping the past and future of entertainment.

Spared by the gorgon’s gaze

Well, it finally happened: a friend showed me his new iPad, and I lived to tell the tale.

As I indicated in this post a few weeks ago, Apple’s recent refresh of its game-changing tablet computer struck me as something less than overwhelming — an incremental step rather than a quantum leap. Though I haven’t changed my position, I should clarify that I remain a committed iPad user and Apple enthusiast; I don’t expect miracles at every product announcement, any more than I expect every word out of my own mouth (or post on this blog) to be immortal wisdom. It’s OK for the company to tread water for a while, especially in the wake of losing Steve Jobs.

I’ve been reading Walter Isaacson’s biography of Jobs, flipping through that white slab of a book (obviously styled as a kind of print analog of an iPod) to my favorite chapters of Jobs’s history: his garage collaboration with Steve Wozniak in the mid-70s on the first Apple computer, in its cheeky wooden case; shepherding the first Macintosh to market in the early 1980s; his series of home runs twenty years later, helping to develop first iTunes and then the iPod as part of a rethinking of the personal computer as a “digital hub” for local ecosystems of smart, small, simple devices for capturing and playing back media.

It’s a modern mythology, with Jobs as an information-age Odysseus, somehow always seeing further than the people around him, taking us with him on his journey from one island of insight and inspiration to another. His death threatens to leave Apple rudderless, and the gently revised iPad seems to me an understandable response to the fear of drifting off course. Too dramatic a correction at this point might well strand the company or distract it into losing its way entirely, and for a business so predicated on its confident mapping of the future — its navigation of nextness — that outcome is unthinkable.

The flipside, of course, is stagnation through staying the course, death by a thousand cautious shortcuts. Apple’s solution to the dilemma is symptomatized in the new iPad’s Retina display, the highest-resolution screen ever created for a mobile device. It’s hard not to interpret this almost ridiculously advanced visualization surface as a metaphor for the company’s (and our own) neurotic desire to “see” its way forward, boosting pixels and GPU cycles in an effort to scry, on a more abstract level, the ineffable quality that Steve Jobs brought to just about everything he did, summarized in that tired yet resilient word vision. We stroke the oracular glass in hopes of resolving our own future.

As in Greek tragedy, of course, such prophetic magic rarely comes without costs. The new iPad’s demanding optics send fiery currents surging through its runic circuitry, raising the device’s heat to levels that some find uncomfortable, though as Icarus learned, sometimes you have to fly close to the sun. Hungry for power, the iPad takes a long time to charge, and doesn’t always come clean about its appetites, but who is to say if this is a bug or a feature? Not us mere mortals.

What I most worried about was looking upon the iPad’s Retina display and being forever ruined — turned to stone by the gorgon’s gaze. It happened before with the introduction of DVD, which made videocassette imagery look like bad porn; with Blu-ray, which gave DVDs the sad glamour of fading starlets in soft-focus closeups; and with HDTV, which in its dialectical coining of “standard def” (how condescendingly dismissive that phrase, the video equivalent of “she’s got a great personality”) gerrymandered my hundreds of channels into a good side and a bad side of town. I was afraid that one glimpse of the new iPad would make the old iPad look sick.

But it didn’t, and for that I count myself lucky: spared for at least one more year, one more cycle of improvements. But already I can feel the pressure building, the groundwork being laid. As one by one my apps signal that they need updating to raise themselves to the level of the enhanced display’s capabilities, I imagine that I myself will one day awaken with a small circular push notification emblazoned on my forehead: ready to download an upgraded I.

Easing back into Mad Men

I am a fan of Mad Men, which puts me in a tiny minority consisting of just about everyone I know, and most of those I don’t. Perhaps it’s a holdover from my recent post on 4chan’s reaction to The Hunger Games, but I can’t shake the sense that it’s getting harder to find points of obsession in the pop-culture-scape that haven’t already been thoroughly picked over. Credit AMC, a channel that has branded itself as a reliable producer of hit shows that play like cult favorites. I suspect that my desire to hold onto the illusion that I and I alone understand the greatness of Mad Men is the reason I saved its season-five premiere for a full 24 hours, sneaking upstairs to watch it in the darkness of our bedroom, the iPad’s glowing screen held inches from my eyes, the grown-up equivalent of reading under the covers with a flashlight. Let me be alone with my stories.

Somewhat hardening the hard core of my fanboy credibility is the fact that I’ve followed the show religiously since it first aired in 2007; it jumped out at me immediately as a parable of male terror, a horror story dressed in impeccable business suits. That basic anxiety has stayed with the show, though it’s only one of its many pleasures. One aspect of the series that I once believed important, the 1960s setting presented in such alien terms as to turn the past into science fiction, turns out not to be so crucial, if the failures of broadcast-network analogs Pan Am and The Playboy Club are any indication.

As for the premiere, I enjoyed it well enough, but not nearly as much as I will enjoy the rest of the season. It always takes me a while to get back into Mad Men’s vibe, with the first episodes seeming stiff and obvious, even shallow, only later deepening into the profoundly entertaining, darkly funny, and ultimately heartbreaking drama I recognize the show to be. The slow thaw reminds me of how I react to seeing Shakespeare on stage or screen: it takes twenty minutes for the rarefied language to come into focus as something other than ostentation. Mad Men too initially distances by seeming a parody of itself, but I suspect that the time it takes for my eyes and ears to adapt to this hermetically sealed bubble universe has more to do with its precise evocation of a juncture in history when surfaces were all we had — when the culture industry, metonymized in Madison Avenue, found its brilliant stride as a generator of sizzle rather than a deliverer of steak.

Coming home

Sharing one of my exceedingly rare cigarettes with a friend at this weekend’s SCMS conference in Boston, I joked about writing an avant-garde academic text in the form of a giant palindrome: it would be a perfectly cogent argument up to the halfway point, then reverse itself and proceed backwards, until, on the last page, it ended on the same word with which it had started.

Now that my wife, son, and I are back in our house, I see that the act of travel, of being away and coming back, is a lot like that giant palindrome. All of the preparations we so carefully make before departure — packing the suitcase, loading the dishwasher, turning off the lights — reverse themselves on our arrival home, and the first small acts with which I began (always, for some reason, the zipping-up of toothbrush and razor in my toiletries bag) are the last to be performed at the other end of the experience, shuffling disordered cards back into the familiar and dog-eared deck of our everyday life. For me, there is nothing quite like the pleasure and relief of settling back in at home.

All of this has an added resonance and poignancy, because today marks one year since another act of coming home. On March 25, 2011, my wife and I lost a pregnancy at 23 weeks, after medical scans that showed profound defects in the fetus and left us with little hope for a healthy birth or normal life for the child if it survived. “Fetus,” “child,” “it”: all inadequate but protectively distancing approximations for the boy we named Arlo, delivered as the sun rose outside the windows of our room at the hospital. That room continued to brighten and warm throughout the morning as we sat with our son, saying hello and goodbye to this tiny pound-and-a-quarter person whose motionless face, after all our weeks of fear and dread, turned out to be not so scary after all: a gentle little visage, like a thoughtful gnome’s, with eyes that never opened.

There was a certain undeniable grace to that morning, a gift of release, but by nightfall much of the spell had worn off, and by the time we got home, the first real waves of pain had started to throb through the cushion of our shock. K’s mother was here to take care of us, and our dog (now deceased) here to need us in turn, and there was mail on the kitchen table waiting to be sorted, shows on the DVR to watch.

I don’t really remember how we got through the next several weeks (though I did have a moment of startled realization not too many mornings later, sitting at the kitchen table with my wife and mother-in-law, that the world had not in fact ended). We did the things we normally do: cooked meals, went for walks, paid bills. Early in April some robins built a nest outside our kitchen window and filled it with perfect blue eggs from which emerged a gaggle of adorably wrinkled and disgusting beasts that soon enough turned cute, sprouted feathers, opened their eyes, and flew away. In June we submitted our adoption profile. Six weeks later, we received a call from our agency. Two days after that, we found ourselves in another hospital room, meeting the newborn baby who would become our son.

We got home after that experience, too, and I guess the lesson here is that if luck is with you, you get to come home, put the pieces of your life back together, and move on. Sometimes leaving the safety zone is voluntary and sometimes it’s forced upon you, but either way it’s usually something you have to do in order to keep on growing.

Now that spring is here, robins are starting to show up in the trees and on our lawn. Irrational as it is, I hold out hope that one or more of them are the babies we watched through our kitchen window, grown up now and coming home themselves. The nest is still there, waiting for them to settle in and unpack their suitcases.

Hungry for recognition

As usual, I can’t say to what degree the fast-moving currents of vituperation and one-upmanship on the /tv/ board of 4chan summarize the opinions shared by wider communities of fantastic-media fandom. The most one can safely conclude is that this anonymous posting culture, which despite its lack of identifiers screams straight-white-maleness, at least speaks from the heart; and amid the jeering homophobia, misogyny, racism, and antisemitism that function as a kind of chainmail for the ego, the collective seems to feel genuinely offended by the huge box office of The Hunger Games’ opening weekend.

The gist of the complaints is that the new movie franchise and the books on which it is based borrow freely but without acknowledgment from other cherished fan texts such as Battle Royale and some of Stephen King’s early novels. I ran down this exact list in my post from last week on the icky and ironic parallels between the media makeover the main character, Katniss, receives in the story and the real-life glamorizing of the movie’s star, Jennifer Lawrence — but seeing the same litany of influences played out on 4chan in a more accusatory tone reveals a striking woundedness on the part of fans (and I think I’m talking primarily about fanboys) who feel betrayed by the explosive popularity of a story they believe they have encountered many times before. Particular ire is directed toward the trilogy’s author, Suzanne Collins, who is seen as not simply derivative but dishonest in crediting her creation to recent developments in U.S. military adventurism and reality-television programming, rather than to Japanese pop culture and the pulp dystopian fantasy of the late 1970s.

I suspect that the emotional stakes here are those of ownership as a byproduct of fannish familiarity and knowledge; /tv/’s readership feels sidelined by the mainstream success of material in which they were formerly the sole experts. It’s an interesting exercise in cult guardianship and the ethics of a citational economy in which Collins’s apparent refusal to give credit where credit is due flies in the face of fan practices predicated on the competitive display of intertextual knowledge. It’s another kind of hunger game, this appetite for the mantle of mastery, fought in the vertical arena of replies to replies on a website. Collins’s fiction and its cinematic adaptation commit the unpardonable crime of neglecting that arena outright, making up their own rules, and thinking outside the box, just as Katniss does to win her victory — and 4chan’s media fans find themselves in the angry position of Panem’s repressive government, fighting an insurrection that threatens to undo the grounds of its authority.

Conferencing

Big academic conferences have a strange energy — which is to say, they have an energy that is palpable and powerful but exceeds my ability to understand or, more importantly, locate myself within it. It is, in part, a concentration of brain power, the collected expertise of a scholarly discipline (in this case, cinema and media studies) brought together for five days and four nights in a Boston hotel. I experience this first aspect as a kind of floating cerebral x-ray of whatever room I’m in, the heads around me overlaid with imagined networks of knowledge like 3D pop-up maps of signal strength in competing cell-phone ads.

But there is another, related dimension, and that is the sheer social density of such gatherings. The skills we develop as students and scholars are honed for the most part in isolation: regardless of the population of the campuses where we work, the bulk of our scholarly labor transpires in the relative silence of the office, the quiet arena of the desktop, the soft skritch of pencil against paper or gentle clicking of computer keyboards still a million times louder than the galaxies of thought whirling through our minds. (Libraries are a good metaphor for what I’m talking about here: quiet spaces jammed with unvocalized cacophonies of text, physical volumes side by side but never communicating with each other save for their entangled intimacies of footnotes and citations.)

Bring us all together for a conference and instantly the silence of our long internal apprenticeships, our walkabouts of research, converts to a thousand overlapping conversations, like a thunderstorm pouring from supersaturated clouds. We’re hungry for company, most of us, and the sudden toggle from solitary to social can be daunting.

When we arrived, the hotel’s computers were down, and the lobby was jammed with people waiting to check in, dragging their suitcases like travelers waiting to board an airplane. A set of clocks over the reception desk read out times from across the world — San Francisco, London, Tokyo — in cruel chronological contrast to the state of stasis that gripped us. Amid the digital work stoppage, I met a colleague’s ten-year-old son, who proudly showed me a bow and arrow he had fashioned from a twig and a taut willow branch found outside in the city’s public gardens. Plucking the bowstring like a musical instrument, he modestly estimated the range of his makeshift weapon (“about six feet”), but all I could do was marvel at his ingenuity in putting wood to work while electronic technologies ground to a halt, stranding all of us brainy adults in long and weary lines. Maybe the whole conference would run better if we swapped our iPads and phones and laptops for more primitive but reliable hand-fashioned instruments; but then, just as our scholarship can’t proceed in a social vacuum, maybe we need the network.

30 days

Today marks one month of blogging every day. It’s a little hard to believe; since starting this blog in August 2007, I had racked up something like 100 posts, and if I were in the mood to do the math, it would probably work out to a depressingly low frequency — a far cry from the promise I made myself when I started (two posts a week, I believe, was the goal). Yet in the last 30 days, I’ve added almost a third of that total again.

It’s a small victory, but a victory nonetheless. My method for achieving it, if this earlier post didn’t make it clear, is to consciously lower the threshold for my own writing, making myself OK with contributing less-than-stellar content every time. It’s been a way of knocking the chip off my own shoulder, an exercise in getting over myself, and in that sense quite healthy — if humbling. (Are the humbling experiences of life always the healthiest? Tricky question, given that my phases of self-aggrandizing overconfidence are premised on, and interleaved with, deep insecurity.)

That it has resulted in the regular production of words and ideas in a more modest vein aligns this exercise with my long history of engaging in different forms of writing practice: I’ve kept a diary since 1984, when I was eighteen, and between 1990 and 1991 I made a point of writing every day for a solid year. On nights when I was too wiped out to open one of the black-and-white composition books in which I preferred to write, I would scribe sentences in the air with my finger. Was that writing? Sure, if what counts is only the commitment to the act, a playing-out of internal monologue. But definitely not, if the measure of writing is to record something outside oneself, externalizing a record that thenceforth coexists with you and even splits away to find a new audience.

What I’m talking about is publication, a concept that’s been much on my mind since becoming an academic, and even before. As a kid, I wrote scripts for short science-fiction movies I planned to shoot in Super 8, synched to the vinyl records I listened to in my bedroom: Stravinsky’s Firebird, Saint-Saëns’s Danse Macabre, Jerry Goldsmith’s music for Star Trek: The Motion Picture, and just about every film score composed by John Williams between 1977 and 1981. (My all-time favorite, though, was James Horner’s score for Star Trek II: The Wrath of Khan.) These screenplays morphed over time into scripts for plays I envisioned staging at my high school or in college, where for the first years of my undergraduate career I pursued a major in theater. Halfway through the eight years it took me to earn my B.A., I became an English major, and from there my writing ambitions recentered on short stories and novels. I would chase that dream throughout the 1990s, until it came abruptly to an end in 1997. The next year, I started graduate studies at the University of North Carolina.

Long story short, I’ve always been a writer, always wanted to be one, written a lot of words in pursuit of that dream. For the last six years, as an assistant professor at Swarthmore College, writing has become laden with the pressures and expectations of the tenure track, a challenge to which I’ve risen only sporadically, and often resentfully — as though, deep down, I’m still happier to sketch ideas in the air than commit them to paper.

The blog is yet another space, I understand: not quite a diary, not quite scholarship, but something in between. I struggle to locate myself within it, even as I try to find a magical bridge between the words that come so easily (well, fairly easily) to the screen and those I need to convince an academic press to accept. It’s an ongoing adventure as well as something of a slog. I don’t know if success is out there (or readers, for that matter). But for now I will keep posting — every day.

NBC’s pleasant surprise

For fans of NBC sitcoms, the return of Community is surely the week’s big news — but thanks to a packed DVR and a baby pushing the boundaries of his bedtime following the daylight-saving shakeup, I haven’t yet watched it. Instead, another and somewhat lesser show caught me by happy surprise. Whitney, still in its first season, featured an episode in which one of the side characters comes out of the closet. Neal — played by Jack’s former assistant on the increasingly moribund 30 Rock — finds himself confusedly but unerringly attracted to another man, and as the news travels around his circle of friends, I braced for stupidity or, worse, a conservative return to the status quo by story’s end. But none of this happened; Neal was allowed his realization, and his pals, usually a reliable generator of quippy sarcasm, took the announcement in stride. Not the most earth-shaking development, perhaps, but a victory nonetheless: a small bit of sanity and compassion amid an overheated political season and a television programming block otherwise prone to doofusy dismissals of difference and a disdain for the messiness of actual life. Bravo, Whitney, for doing something right.