Worldbuilding avant la lettre in Robert A. Heinlein

“The only mainstream writer to whom Heinlein acknowledges a debt is Sinclair Lewis, and it is not for literary style. Lewis laid out extensive backgrounds for his work which did not directly appear in the story. That way he understood how his characters should react in a given situation, since he knew more about them than the reader did. In Heinlein, this ultimately grew beyond the bounds intended by Sinclair Lewis, whose characters performed against a setting with which the reader might be familiar. The Sinclair Lewis method couldn’t work for science fiction unless an entire history of the future was projected: then individual stories and characters in that series could at least be consistent within the framework of that imaginary never-never land.

“In following just this procedure, Robert A. Heinlein inadvertently struck upon the formula that had proved so successful for Edgar Rice Burroughs, L. Frank Baum, and, more recently, J. R. R. Tolkien. He created a reasonably consistent dream world and permitted the reader to enter it. Heinlein’s Future History has, of course, a stronger scientific base than Burroughs’s Mars, Baum’s Oz, or Tolkien’s land of the ‘Rings,’ but is fundamentally the same device.”

— Sam Moskowitz, Seekers of Tomorrow: Masters of Modern Science Fiction (New York: Ballantine, 1967), 194.

Seaworthy

Back in January when I sent out my book manuscript, I had the weird sense of waving goodbye to a cruise ship I built myself, standing at the pier while this giant, white, overstuffed artifact bellied out to sea.

It was not the first time this particular ship had been launched. In August 2006 I printed out the whole thing, some 350 pages. This was after my dissertation defense but before I dropped off the text at the print shop in Bloomington where Indiana University dissertations are bound. Lots of other stuff was going on at the time—I was in the midst of packing for the move to Pennsylvania, my thoughts mostly focused on coming up with syllabi for the two courses I was contracted to teach at Swarthmore starting in the fall. But I took a moment, amid the mess of cardboard boxes and sorting stacks for the yard sale, to balance the fat block of pages in my hands, marveling that I had managed to produce such a thing.

About a year later I sent it out again, this time as a book proposal. I got polite notes back from two academic presses—saying, essentially, thanks but no thanks—and shelved the project until 2011 or so. It went out again at that point, and this time was met with a yes, just in time for my tenure case.

Then came the reader reports. Mostly positive, with a handful of suggestions for changes, they stopped me in my tracks; it would be almost four more years before I got around to patching holes, updating case studies, and clarifying the ambiguities that needed fixing before the book could clear its final hurdle.

I should explain, if it isn’t clear from the foregoing, that I am not a good writer. Process-wise, I mean. Faced with a task, I put it off; encouraged, I dig in my heels and work even more grudgingly. This goes deep with me, all the way back to childhood. Though I have, for the most part, achieved the level of wisdom that involves accepting myself as I am, procrastination is one of the traits I most want to change in myself. As soon as I get around to it.

Anyway, it turns out that publishing a book, at least a scholarly one, involves more than one goodbye; it’s less like Ilsa and Rick lingering heartlost in the fog than like dropping off a child at school, morning after morning. That’s probably the wrong metaphor here, because I adore my children, but have come to detest the book. Still, the other images that spring to mind—repeated skin biopsies, for instance—might express in a Cronenbergian way the connection between writing and excrescence, a putrefaction of words shed like skin dust, but they don’t capture the idea of an object consciously built. A model kit, seams puttied and sanded, paint sprayed and retouched, decals and weathering conscientiously applied. Doomed to show only flaws and mistakes in the eyes of its maker; to everyone else it’s probably, y’know, okay.

My book is looking more okay these days thanks to the copyeditors at NYU Press. I got the manuscript back for review and have been going through the chapters, weighing the changes. There are a few on every page, and I see the wisdom of every single one. That’s generally my response to being edited—gratitude. Harlan Ellison and a mob of similarly perpetually disgruntled writers would kick me out of the Tough Kids Club for saying so. You can find me over by the janitor’s closet, eating lunch with Strunk and White.

Fort

There’s nothing like a suddenly lost object to demonstrate the precarity of our systems for keeping order—the flimsiness of the illusion that the spaces we inhabit are at our mercy, rather than the other way around.

There are many sorts of object, of course, and many sorts of loss. I daily shed millions of dead skin cells without thinking about it, and it doesn’t trouble my world if a Lego block goes missing from the Tupperware footlocker where all our Lego pieces entropically end up. The absence I’m talking about is the shadow cast by a specific kind of item: it must be something so critical to daily function that I need it—at least need easy access to it—almost all the time; by the same token, its ubiquity as both physical item and psychic token must make it easy to take for granted. Glasses, keyring, wallet, phone, various iPods and iPads. Made almost invisible by ritualized use, these small but vital technologies don’t often vanish from the map. But when they do, they threaten to take the map with them.

This week I spent a disturbing and disorienting couple of days searching for my laptop, a silvery sylph of a MacBook Air, which did not disappear so much as slowly slip off my radar—not a jump cut but a slow dissolve. Like Pasteur’s germs, the loss became an official fact only retrospectively. First I had to shamble from spot to spot around the house to check all the places the MacBook tends to get left: the high shelf kids can’t get at, the table beside the wall outlet, under the couch, under the bed. Meanwhile my thoughts probed an alternative theory, treating the missing computer as a theft. Hadn’t I left my car unlocked, work case in the front seat plain for all to see, when I dropped my kids at school? It was only a few minutes. But how long would it have taken, really?

I did not like the feeling of these suspicions germinating and spreading vinelike through my view of the world. Too much of the U.S. is ensnared and immobilized in such thorny psychic tendrils. And just as the presidency is in a way the mass projection of a schizoid populace—a country whose constituent blocs have lost the ability to comprehend each other, an imagined community angry-drunk on its dark and fearful imaginings—my worries about some faceless thief are just a way of externalizing anxiety and disavowing my own responsibility for losing track of something valuable.

The computer finally turned up (isn’t it I who turned up? the laptop didn’t move) in my campus office. It was on a shelf at about shoulder height, a place where books belong. I had no memory of setting it there, but set it there I must have. So now my theoretical thief has become an inferred Bob.

That word: absentminded. Quick flash of Fred MacMurray and an infinitely receding four-dimensional array of old academics wearing one sock and polishing their glasses. A little past that tesseract of cliché is one very real person, my mother, whose memory loss has in recent years become profound. Because of her I suppose I watch my own slips and failings with a diagnostic eye, sifting random problems for systematic ones, signals in the noise that point to a larger noise in the signal.

The computer vanished the instant I put it somewhere it doesn’t usually go. What does that say about where the coordinates and position of any object reside? Is it all and only relational? Are there, in fact, only negative differences, dark matter? I think it’s less important to answer those unanswerables than to note how close they are to the surface, a magma of existential worry coursing under the brightness and rationality of waking life. Note it, remember it, honor it.

Starting the Last of Us

The remarkable opening sequence of The Last of Us was ruined for me — at my request, I hasten to add — and as much as it might be in keeping with the game’s ethos of cowing and disempowering its players, I don’t want to visit the same epistemological violence upon readers without warning. So proceed no further if you wish to remain unspoiled!

After a long sojourn in retro tidepools of emulation (via MAME and Nestopia) and the immediate, delimited pleasures of casual gaming (where usual suspects like Bejeweled and Temple Run share playtime with private-feeling discoveries like Alien Zone and Nimble Quest), I’m returning to modern videogaming with a PlayStation 3 — itself on the verge of obsolescence, I suppose, thanks to the imminent PS4. My motivations for acquiring both The Last of Us and hardware to run it on can be traced to an hour or so of gaming at a friend’s place, where, as my two companions watched and kibitzed, I walked, crouched, and ran TLOU’s protagonist-avatar Joel through a couple of early “encounters” whose purpose seemed to be to teach me the futility of fighting, shooting, or doing anything really besides sneaking around or flat-out running away from danger.

I find TLOU’s strategy of undermining any sense of potency or agency to be one of its most intriguing traits, but I will wait to talk more about that in a future post. For now I simply want to note the clever, evil way in which the game gets its hooks in you. You begin the game playing as Sarah, Joel’s twelve-year-old daughter, and the initial sequence involves piloting her around a darkened house in search of her father. It’s suitably creepy, with Sarah calling out “Dad?” in increasingly panicked tones as, outside the game, you adapt yourself to the basics of movement, camera placement, and object manipulation in the environment.

Such an opening is a now-standard method of starting a game in crypto-tutorial mode — apparently sometime within the last ten years instruction manuals ceased to exist. Controllers have become standardized according to their brands, but each videogame deploys its button-and-joystick layout slightly differently, and acclimatizing the player to this scheme in a way that feels natural is every game’s first design challenge, a kind of ludic bootstrapping.

When Joel arrives home in the middle of the night and spirits Sarah off in a pickup truck, TLOU enters another mode, the expository tour, in this case a bone-rattling run through a world in the process of collapsing: police cars screeching by with sirens blaring (and lenses flaring), houses burning, townspeople rioting. Rushed from one apocalyptic setpiece to another, you get something like Disney’s “Small World” ride filtered through Dante’s Inferno. By this point, avatarial focus has been handed off to Joel, but you barely notice it; he’s carrying Sarah in his arms as he runs, so it feels like he, she, and you have merged into a single unit of desperate, hounded motion.

And when it appears that the three of you have finally reached safety, a soldier appears, opens fire, and kills Sarah. Cut to black and the title card: THE LAST OF US.

It’s a great opening, harrowing and emasculating, and by breaking a couple of the basic expectations of storytelling (killing a child) and of gaming (killing an avatar we have grown used to inhabiting), it decenters and disorients us, readying us for what is to come by demonstrating precisely how unready we really are.

It put me in mind of Psycho, which similarly kills off its ostensible protagonist at the end of its first act — though by then the 1960 film has established a moral defect in Marion Crane that makes her, in retrospect at least, deserving of punishment in Hitchcock’s sadistic scopic regime. Sarah, by contrast, is an innocent, and as much a cipher as emblems of purity always are. Starting the game with her death is a manipulative but effective gut-punch that can be read both positively and negatively. It was enough to make me take the leap and reengage with contemporary gaming — well, it and a few other things. But more on that later.


Fun with your new head

The title of this post is borrowed from a book of short stories by Thomas M. Disch, and it’s doubly appropriate in that an act of borrowing arguably lies at the heart of the latest 3D-printing novelty to catch my eye: a British company called Firebox will take pictures of your own head, turn them into a 3D-printed noggin, and stick it on a superhero body. As readers of this blog probably know, I’m intrigued by desktop-fabrication technologies less for their ability to coin unique inventions (the “rapid prototyping” side of their operations) and more for the interesting wrinkles they introduce to the production and circulation of licensed and branded objects — especially fantasy objects, which are referentially unreal but tightly circumscribed by designs associated with particular franchises. Superhero bodies are among the purest examples of such artifacts, offering the immediately recognizable physiques and costumes of Batman, Superman, and Wonder Woman, all of which are among the bodies onto which you can slap your replacement head.
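Mechanically, such a service presumably amounts to mesh composition: photogrammetry turns the customer’s photographs into a head model, which is then grafted onto a stock body file. Here is a minimal sketch of that grafting step in Python, using the trimesh library; the filenames, scale factor, and offset are all hypothetical, illustrating the general technique rather than Firebox’s actual pipeline.

```python
# Toy illustration of grafting a scanned head onto a stock body mesh.
# Filenames, scale, and positioning are hypothetical; a real workflow
# would also need cleanup, hollowing, and print-readiness checks.
import trimesh

head = trimesh.load("scanned_head.stl")
body = trimesh.load("superhero_body.stl")

# Scale the head to match the figure, then sit it at the top of the
# body's bounding box (a stand-in for proper neck alignment).
head.apply_scale(0.45)
head.apply_translation([0.0, 0.0, body.bounds[1][2]])

# Merge the two meshes into a single printable object.
figure = trimesh.util.concatenate([body, head])
figure.export("custom_superhero.stl")
```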

Aside from literalizing the dual-identity structure that has always offered us mild-mannered Clark Kents a means of climbing into Kryptonian god-suits, what I love about this is its neat encapsulation of the deeper ideological function of the 3D-printed fantasy object, giving people the opportunity not just to locate themselves amid an array of mass-produced yet personally significant forms (as in, for example, a collection of action figures) but to materialize themselves within and as part of that array, through plastic avatars that also serve as a kind of cyborg expression of commercialized subjectivity. That Firebox (and, presumably, license-holder DC Comics) currently offers a controlled version of that hybridity is only, I think, a symptom of our prerevolutionary moment, poised at the brink of an explosion of such transmutations and transubstantiations, legal and illegal alike, through which the virtual and material objects of fantastic media will not just swap places but find freshly bizarre combinatorial forms.

Six weeks

Having sung the praises of Fridays and gloomed about Sundays, am I set on turning Mondays into self-reflexive meditations on method? Evidently so, at least until the spring semester ends and time opens up a little. Tonight I (gently) chastised my wife for skipping her picture-a-day project on Facebook,* but as usual when it comes to judging others, in truth I am just projecting my own anxiety — in this case, an almost superstitious dread of disrupting a chain of daily posts now six weeks long and counting. Keeping it up has meant sacrificing a number of things I once held dear: cherished notions of myself as a brilliant writer, lingeringly slow-cooked prose, a certain dignity and distance in my choice of topics. Laudable goals all, but maybe also a little hollow and egocentric, and often unconducive to productivity. Instead, I’m discovering the hard comfort of routine, the discipline of a writing practice, along with a new kind of notch to cut into the wall.

That said, I’m also feeling the need to start working these daily posts into something longer and more substantive — an actual, paper publication — so that will be the next horizon. It won’t happen without a deadline and some goalposts, so over the next several days I will begin mapping out an experiment in scholarship, an essay to be drafted, as it were, in public. I approach this with some trepidation but also excitement: as with the act of teaching, through which my body has evolved a new organ for converting anxiety to energy, writing this blog is helping to wear down the last vestiges of resistance to taking risks.

* She intends to post two pictures tomorrow.


Good Night, and Good Luck.

Once again, thanks to my TV & New Media course, it is time to watch Good Night, and Good Luck (George Clooney, 2005), and once again I am reminded what a beautifully intimate experience it is. On the manifest level of its narrative, the film details the crusade of Edward R. Murrow and the See It Now news team to take Senator Joseph McCarthy to task for his many transgressions against democracy, and it’s gripping stuff. But on the latent level of its mise-en-scène, the movie is all about the television studios, elevators, lobbies, and offices at CBS — pristine spaces rendered in crisp black-and-white cinematography (actually the result of shooting grayscale sets in color, then digitally timing them to a sublime monochrome) and redolent of technological and cultural power as only the broadcast TV era could embody it. In its period evocation it’s Mad Men played straight, and unlike the AMC series it forgoes exterior shots entirely, giving the whole thing the hermetic feel of a holodeck simulation.
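The desaturation at the heart of that digital timing is, at bottom, simple arithmetic: a weighted sum of the color channels. A toy illustration in Python follows; the filename is hypothetical, and this stands in for a colorist’s far more nuanced grading pass, not the film’s actual post-production workflow.

```python
# Toy illustration of "timing" a color frame down to monochrome via a
# weighted luma conversion (ITU-R BT.601 coefficients). Weighting the
# channels by perceptual contribution, rather than averaging equally,
# preserves tonal separation in the gray-painted sets.
import numpy as np
from PIL import Image

frame = np.asarray(Image.open("cbs_studio_frame.png").convert("RGB"),
                   dtype=np.float32)

# (H, W, 3) @ (3,) -> (H, W) single-channel luma image.
luma = frame @ np.array([0.299, 0.587, 0.114], dtype=np.float32)

Image.fromarray(luma.astype(np.uint8), mode="L").save("monochrome_frame.png")
```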

When I first saw the film, the U.S. was gritting its teeth through George W. Bush’s second term, and its messages about the abuse of governmental power and patriotic ideology were impossible to read as anything other than statements about our post-9/11 world. Seven years later, the connotative corset has loosened, and exciting resonances with the passionately essayistic journalism of Rachel Maddow and the breathless pace of blogging and spreadable media (an electrical feeling of liveness and deadline I experience, if only in a small way, in my new daily posting regimen) tie Murrow’s moment to our own, inviting us to see the “old” in new media, and vice versa. I’m looking forward to discussing it with my students!

Good Friday, Bad Sunday

It’s Easter weekend, according to the plastic eggs dangling from tree branches in our neighbor’s yard, and I am once again experiencing the odd non-sensation of my own long-lapsed Catholicism. I would like to say it’s something I still struggle with — indeed, struggling with things seemed to be the ur-lesson of most of the scriptural teachings to which I was exposed — but the truth is that I left the church as soon as I was doctrinally allowed to, following my Confirmation, and never looked back. Always suspicious of the soap-smelling classrooms of my Sunday school and grumblingly resistant to any commitment of a weekend morning (I remember complaining to my parents that I only got two mornings a week to sleep in, which made them laugh, and not in a nice way), I hit my breaking point when one of our teachers gently explained to me that no animals, including cats, could make it to the afterlife, since they had no souls. Maybe true and maybe not, but in any case, not a faith that fits or suits me.

Instead, I’ve spent most of my life engaging in the secular ritual of weekends, playing out my small personal drama fifty-two times a year, kicking off with the joyous arrival of Friday, bookended by the grim letdown of Sunday. At heart I will always be wired for weekends and summer vacations, patterns of leisure stamped into me by the school calendar, continued now in my career as a college professor. In the microcosm of the weekend, on Fridays I am young and just getting out of school; on Sundays, old, an adult preparing for the work of the coming week. Death and resurrection, not of the body but of the spirit.

Fridays lately have the added significance of being “family days,” devoted to Zach and Katie. You’d think that my role as a husband and father would mark the apogee of grown-up-ness, but in practice these days are about much more straightforward pleasures: putting aside schoolwork to experience the easy companionship of my wife’s love, the eternal and unflawed presentness of Z’s babyhood. Fridays remind me what my mind and heart used to be like before they got all kinked up and complicated, and I am as grateful for their simplicity as I am awed by their profundity.

The zen of model kits

I speak often and with great satisfaction of my Man Cave, our house’s finished basement where my nerdish technophilia is allowed free rein. My PC tower and its domino-line backdrop of external hard drives; my big, flat TV atop its nest of audio components and cables; a small museum of video-game consoles; and the nonelectronic pleasures of my John D. MacDonald paperbacks (inherited from my father, who, freshly arrived from Czechoslovakia in the 1950s, used detective and spy fiction to hone his English-language skills), white cardboard longboxes of unexamined comics which with every passing year come more to resemble stacked sarcophagi, and a dusty Millennium Falcon playset packed with Star Wars action figures in various stages of dismemberment (the latter a gift from my brother-in-law).

As this inventory suggests, the contents of the Man Cave embody not just arrested development but a certain ongoing regression: a march in reverse through the stages and artifacts of the enthusiasms that made me what I am today. For that reason, it’s fitting that I have opened a new wing whose title might be “Boy Cave”: a model-kit-building station in a side workroom where the heating-oil tank and cat-litter box vie with paint thinner and acrylic glue for the prize of most fascinatingly noxious scent.

Currently on the workbench is Polar Lights’s Robby the Robot, a kit I’ve been dabbling with for more than a year, but which a few nights ago I decided to buckle down and finish. (Pictured above, the 1/12-scale figure is still missing an ornamental arrangement of gyroscopes on top of its head, and over that a clear dome that seals its brain circuitry in place.) Model kits based on science fiction and fantasy have become a central preoccupation in my scholarship, and I guess in some ways I have returned to kit-building in order to (re)gain firsthand experience of this strange subculture of artifactual play and constructivist leisure — its material investments as well as its surrounding discursive community (see, for example, the reviews and build-guides here, here, and here).

But I’m also realizing a simple and zenlike truth, which is that to build a kit you must build it; it won’t finish itself. And the difference between dreaming and doing, which has so often constituted an agonizing contrapunto to my publishing life, is like the difference between the unassembled plastic parts still on their sprue and the built, painted, finalized thing: a matter of making. If I can fit the pieces of Robby together in stray minutes (and it turns out that the rhythms of model-kit assembly fit nicely into the scattered but semipredictable intervals of parenting), what else might I accomplish, simply by opting to complete — rather than just contemplate — the process?

Revisiting the virtual courtyard

Longtime readers of this blog (and I take it on faith that there are one or two of you out there) will know that I am obsessed with the clumsy sublime, Laura Mulvey’s term for the accidental beauty of old-school special effects in classical Hollywood — studio tricks and machinations meant to pass unnoticed in their time, but which become visible and available for fresh appreciation as years go by and the state of the art evolves.

The same reader(s) will be familiar with my interest in virtual spaces and more specifically with the “pocket universes” of storyworlds and certain photographic experiments, such as the one featured in this post on the 2008 Republican National Convention. Well, now I’ve found a new toy to think with: this lovely virtualization of the courtyard in Rear Window (1954), digitally stitched together to recreate not just the large and elaborate studio set on which Alfred Hitchcock filmed the tale of L. B. Jefferies (James Stewart), whose immobilization by a broken leg unlocks a visual mobility in which he scans through binoculars the social (and, as it turns out, criminal) microcosm of the neighbors in his apartment complex, but the passage of time across it. Here’s the video:

Within this composited space, part Holodeck, part advent calendar, the action of the movie unfolds with new seamlessness and unity, time’s passage marked by sunrises and sunsets, clouds rippling overhead, the moon rising on its nighttime trajectory as the small community bustles through an overlapping ballet of the quotidian. In the middle of it all, a harried husband builds to a murderous rage, disposes of a body, and is investigated by Grace Kelly sneaking in from the fire escape.
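The core technique behind such a reconstruction is image stitching: registering overlapping frames and compositing them into one continuous plate. Here is a minimal sketch using OpenCV’s high-level Stitcher API, assuming a folder of frames already extracted from the film; the paths are hypothetical, and the actual video clearly involved far more painstaking handwork.

```python
# Minimal sketch: composite overlapping frames into a single wide plate
# using OpenCV's high-level stitching API. Frame files are hypothetical;
# this illustrates the general technique, not the video's actual workflow.
import glob
import cv2

frames = [cv2.imread(path) for path in sorted(glob.glob("frames/*.png"))]

# SCANS mode suits flat, translating views; PANORAMA suits a rotating camera.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, plate = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("courtyard_plate.png", plate)
else:
    print(f"Stitching failed (status code {status})")
```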

One of Hitchcock’s undisputed masterpieces, Rear Window is also the go-to example in introductory film courses to illustrate voyeurism, scopophilia, and the cinematic apparatus — a raft of abstract yet efficacious concepts that come out of the “grand theory” tradition of film studies, themselves subject, perhaps, to their own form of clumsy sublimation. I wonder how we might update those ideas in an era of computational revisitation and transformation, in which the half-built, half-imagined territories of classical cinema can be unfolded into digital origami that makes them more “real” even as it renders their intricate artificiality apparent, recoding cinema’s dreamspaces into simulacral form.