Spared by the gorgon’s gaze

Well, it finally happened: a friend showed me his new iPad, and I lived to tell the tale.

As I indicated in this post a few weeks ago, Apple’s recent refresh of its game-changing tablet computer struck me as something less than overwhelming — an incremental step rather than a quantum leap. Though I haven’t changed my position, I should clarify that I remain a committed iPad user and Apple enthusiast; I don’t expect miracles at every product announcement, any more than I expect every word out of my own mouth (or post on this blog) to be immortal wisdom. It’s OK for the company to tread water for a while, especially in the wake of losing Steve Jobs.

I’ve been reading Walter Isaacson’s biography of Jobs, flipping through that white slab of a book (obviously styled as a kind of print analog of an iPod) to my favorite chapters of Jobs’s history: his garage collaboration with Steve Wozniak in the mid-70s on the first Apple computer, in its cheeky wooden case; shepherding the first Macintosh to market in the early 1980s; his series of home runs twenty years later, helping to develop first iTunes and then the iPod as part of a rethinking of the personal computer as a “digital hub” for local ecosystems of smart, small, simple devices for capturing and playing back media.

It’s a modern mythology, with Jobs as an information-age Odysseus, somehow always seeing further than the people around him, taking us with him on his journey from one island of insight and inspiration to another. His death threatens to leave Apple rudderless, and the gently revised iPad seems to me an understandable response to the fear of drifting off course. Too dramatic a correction at this point might well strand the company or distract it into losing its way entirely, and for a business so predicated on its confident mapping of the future — its navigation of nextness — that outcome is unthinkable.

The flipside, of course, is stagnation through staying the course, death by a thousand cautious shortcuts. Apple’s solution to the dilemma is symptomatized in the new iPad’s Retina display, the highest-resolution screen ever created for a mobile device. It’s hard not to interpret this almost ridiculously advanced visualization surface as a metaphor for the company’s (and our own) neurotic desire to “see” its way forward, boosting pixels and GPU cycles in an effort to scry, on a more abstract level, the ineffable quality that Steve Jobs brought to just about everything he did, summarized in that tired yet resilient word vision. We stroke the oracular glass in hopes of resolving our own future.

As in Greek tragedy, of course, such prophetic magic rarely comes without costs. The new iPad’s demanding optics send fiery currents surging through its runic circuitry, raising the device’s heat to levels that some find uncomfortable, though as Icarus learned, sometimes you have to fly close to the sun. Hungry for power, the iPad takes a long time to charge, and doesn’t always come clean about its appetites, but who is to say if this is a bug or a feature? Not us mere mortals.

What I most worried about was looking upon the iPad’s Retina display and being forever ruined — turned to stone by the gorgon’s gaze. It happened before with the introduction of DVD, which made videocassette imagery look like bad porn; with Blu-ray, which gave DVDs the sad glamour of fading starlets in soft-focus closeups; with HDTV, which in its dialectical coining of “standard def” (how condescendingly dismissive that phrase, the video equivalent of “she’s got a great personality”) gerrymandered my hundreds of channels into a good side and bad side of town. I was afraid that one glimpse of the new iPad would make the old iPad look sick.

But it didn’t, and for that I count myself lucky: spared for at least one more year, one more cycle of improvements. But already I can feel the pressure building, the groundwork being laid. As one by one my apps signal that they need updating to raise themselves to the level of the enhanced display’s capabilities, I imagine that I myself will one day awaken with a small circular push notification emblazoned on my forehead: ready to download an upgraded I.

The new iPad

I’m neither an automatic Apple acolyte nor a naysayer, but the company and its technologies do go deep with me: my first computer, purchased back in 1980, was an Apple II+ with 48K of RAM, and between my wife and me, our household currently holds six or seven Apple devices, including multiple MacBooks and iPods. That I integrate these machines with a powerful PC that serves as my primary workstation and gaming platform does not diminish the importance of the role Apple has played in my life.

All that said, today’s announcement of the latest iPad strikes me as a letdown, and I’ve been trying to figure out why. A Retina display with four times the resolution of the current device is nothing to sneeze at, and I’m glad to see a better rear-facing camera. But the quantum leap in capability and, more importantly, a certain escalation of the brand are missing. I am the happy owner of an iPad 2, bought a year ago during a difficult time for Katie and me; March 2011 was a profoundly unhappy month, and I am not embarrassed to say that my iPad was one of the small comforts that got me through long nights at the hospital. Perhaps if I were going through something equivalently tragic now, I might again turn to a technological balm, but I doubt that the new iPad would do the trick. It’s a cautious, almost timid refinement of existing hardware, and I daresay that were Steve Jobs still around, Apple might have taken a bolder leap forward.

I was struck by one statement in the promotional video I watched: the assertion that in an ideal Apple-based technological experience, the mediating device disappears from consciousness, allowing you to concentrate on what you’re doing, rather than the thing you’re doing it with. True enough, I suppose — I’m not thinking about the keyboard on which I’m typing this blog entry, or the screen on which I’m reading my own words. But such analyses leave out the powerful effect of the brand that surrounds those moments of “flow.” The iPad, like so many Apple innovations, is a potent and almost magical object in terms of the self-identifications it provides, and in off-screen moments I am always highly conscious of being an iPad user. It’s a happy interpellation, one I accept enthusiastically, turning with eagerness toward the policeman’s call. It’s anything but a transparent experience, and the money I give Apple goes at least as much to support my own subjectification as to underwrite a particular set of technological and creative affordances. The new iPad lacks this aura, so for now, I’ll stick with what I have.

Notes on Spacewar

What is it about Spacewar that so completely captures my imagination? Teaching my Theory and History of Video Games class, I once again crack open Steven Levy’s great book Hackers: Heroes of the Computer Revolution, which I have read at least a dozen times since it was published in 1984. A time now further away than the period of which Levy was then writing — the late 1950s and early 60s, when a motley assortment of brilliantly talented social misfits at MIT repurposed a PDP-1 to create, if not the world’s first computer game, then the first digital artifact to capture the spirit and culture of gaming that would explode over subsequent decades. Below, a bulletin board of sorts, collecting resources on this seminal software object and the matrix from which it was spawned.

Steve “Slug” Russell, posing with a PDP-1.

A bibliography on hacker/computer culture.

An article on Spacewar from WebBox’s CGI Timeline.

From the MIT Museum.

Origin story from Creative Computing magazine, August 1981. I remember reading this when it first came out, at the age of sixteen!

News snippet from Decuscope, April 1962. I was not alive to read this one at the time of its publication. Decuscope, one finds, is a newsletter for DEC (Digital Equipment Computer) Users; PDFs from 1961-1972 here.

About that PDP-1 and its capabilities. It’s always vertigo-inducing to consider how computing power and resources have changed. The TX-0 on which MIT’s hackers cut their teeth had something like 4K of storage, while its successor, the PDP-1, had the equivalent of 9K. By contrast, the Google Doodle below, at 48K, is more than five times as large:

Some tools for finding one’s bearings amid the rushing rapids of Moore’s Law: Wikipedia pages for the TX-0 and PDP-1; a byte metrics table; a more general-purpose data unit converter.
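For anyone who wants the arithmetic spelled out, here is a quick back-of-the-envelope sketch of that comparison in Python, taking the rough figures above at face value. Treating all three numbers as plain kilobytes is my own simplifying assumption; the TX-0 and PDP-1 actually measured memory in 18-bit words rather than bytes.

```python
# Back-of-the-envelope comparison of the storage figures quoted above.
# The kilobyte values are taken from the post and treated uniformly as
# bytes for illustration; real TX-0/PDP-1 memory came in 18-bit words.

KB = 1024  # one kilobyte, in bytes

tx0_storage  = 4 * KB   # MIT's TX-0, as described above
pdp1_storage = 9 * KB   # its successor, the PDP-1
doodle_size  = 48 * KB  # the Google Doodle mentioned above

print(f"Doodle vs. PDP-1: {doodle_size / pdp1_storage:.1f}x larger")  # ~5.3x
print(f"Doodle vs. TX-0:  {doodle_size / tx0_storage:.1f}x larger")   # 12.0x
print(f"PDP-1 vs. TX-0:   {pdp1_storage / tx0_storage:.2f}x larger")  # 2.25x
```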

What is … Watson?

We have always loved making our computers perform. I don’t say “machines” — brute mechanization is too broad a category, our history with industrialization too long (and full of skeletons). Too many technological agents reside below the threshold of our consciousness: the dumb yet surgically precise robots of the assembly line, the scrolling tarmac of the grocery-store checkout counter that delivers our purchases to another unnoticed workhorse, the cash register. The comfortable trance of capitalism depends on labor’s invisibility, and if social protocols command the human beings on either side of transactions to at least minimally acknowledge each other — in polite quanta of eye contact, murmured pleasantries — we face no such obligation with the machines to whom we have delegated so much of the work of maintaining this modern age.

But computers have always been stars, and we their anxious stage parents. In 1961 an IBM 704 was taught to sing “Daisy Bell” (inspiring a surreal passage during HAL’s death scene in 2001: A Space Odyssey), and in 1975 Steve Dompier made his hand-built Altair 8800 do the same, buzzing tunes through a radio speaker at a meeting of the Homebrew Computer Club, an early collective of personal-computing enthusiasts. I was neither old enough nor skilled enough to take part in that initial storm surge of the microcomputer movement, but like many born in the late 1960s, was perfectly poised to catch the waves that crashed through our lives in the late 70s and early 80s: the TRS-80, Apple II, and Commodore PET; video arcades; consoles and cartridges for playing at home, hooked to the TV in a primitive convergence between established and emerging technologies, conjoined by their to-be-looked-at-ness.

Arcade cabinets are meant to be clustered around, joysticks passed among an appreciative couchbound audience. Videogames of any era show off the computer’s properties and power, brightly blipping messages whose content, reversing McLuhan, is new media, presenting an irresistible call both spectacular and interactive to any nerds within sensory range. MIT’s Spacewar worked both as game and graphics demo, proof of what the state of the art in 1962 could do: fifty years later, the flatscreens of Best Buy are wired to Wiis and PlayStation 3s, beckoning consumers in endless come-on (which might be one reason why the games in so many franchises have become advertisements for themselves).

But the popular allure of computers isn’t only in their graphics and zing. We desire from them not just explorable digital worlds but minds and souls themselves: another sentient presence here on earth, observing, asking questions, offering commentary. We want, in short, company.

Watson, the IBM artifact currently competing against champions Ken Jennings and Brad Rutter on Jeopardy, is the latest digital ingenue to be prodded into the spotlight by its earnest creators (a group that in reaction shots of the audience appears diverse, but whose public face in B-roll filler sums to the predictable type: white, bespectacled, bearded, male). Positioned between Jennings and Rutter, Watson is a black slab adorned with a cheerful logo, er, avatar, conveying through chance or design an uneasy blend of 2001‘s monolith and an iPad. In a nearby non-space hums the UNIVAC-recalling bulk of his actual corpus, affixed to a pushbutton via a humble solenoid that does his ringing in for answers, at once a cute nod to our own evolution-designed hardware and a sad reminder that we still need to level the playing field when fighting Frankenstein’s Monster.

There are two important things about Watson, and despite the technical clarifications provided by the informational segments that periodically and annoyingly interrupt the contest’s flow, I find it almost impossible to separate them in my mind. Watson knows a lot; and Watson talks. Yeats asked, “How can we know the dancer from the dance?” Watson makes me wonder how much of the Turing Test can be passed by a well-designed interface, like a good-looking kid in high school charming teachers into raising his grades. Certainly, it is easy to invest the AI with a basic identity and emotional range based on his voice, whose phonemes are supplied by audiobook narrator Jeff Woodman but whose particular, peculiar rhythms and mispronunciations — the foreign accent of speech synthesis, as quaint as my father’s Czech-inflected English — are the quirky epiphenomena of vast algorithmic contortions.

Another factor in the folksiness of Watson is that he sounds like a typical Jeopardy contestant — chirpy, nervous, a little full of himself — and so highlights the vaguely androidish quality of the human players. IBM has not just built a brain in a box; they’ve built a contestant on a TV game show, and it was an act of genius to embed this odd cybernetic celebrity, half quick-change artist, half data-mining savant, in the parasocial matrix of Alex Trebek and his chronotopic stage set: a reality already half-virtual. Though I doubt the marketing forces at IBM worried much about doomsday fears of runaway AIs, the most remarkable thing about Watson may be how benign he seems: an expert, and expertly unthreatening, system. (In this respect, it’s significant that the name he inherited from IBM founder Thomas J. Watson evokes not the brilliant and erratic Sherlock Holmes but his perpetually one-step-behind assistant.)

Before the competition started, I hadn’t thought much about natural-language processing and its relationship to the strange syntactic microgenre that is the Jeopardy question. But as I watched Watson do his weird thing, mixing moronic stumbles with driving sprints of unstoppable accuracy, tears welled in my eyes at the beautiful simplicity of the breakthrough. Not, of course, the engineering part — which would take me several more Ph.D.s (and a whole lotta B-roll) to understand — but the idea of turning Watson into one of TV’s limited social beings, a plausible participant in established telerituals, an interlocutor I could imagine as a guest on Letterman, a relay in the quick-moving call-and-response of the one quiz show that has come to define, for a mass audience, high-level cognition, constituted through a discourse of cocky yet self-effacing brilliance.

Our vantage point on Watson’s problem-solving process (a window of text showing his top three answers and level of confidence in each) deromanticizes his abilities somewhat: he can seem less like a thinking agent than an overgrown search engine, a better-behaved version of those braying “search overload” victims in the very obnoxious Bing ads. (Tip to Microsoft: stop selling your products by reminding us how much it is possible to hate them.) But maybe that’s all we are, in the end: social interfaces to our own stores of internal information and experience, talkative filters customized over time (by constant interaction with other filters) to mistake ourselves for ensouled humans.
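To make that “window of text” a little more concrete, here is a toy sketch of the kind of display described above: candidate answers ranked by confidence, with a threshold for ringing in. This is my own invented illustration, not IBM’s actual question-answering system; the answers, scores, and cutoff are made up.

```python
# Toy illustration of the on-screen answer panel described above:
# candidate answers ranked by confidence, plus a buzz-in threshold.
# Everything here (answers, scores, cutoff) is invented for illustration;
# it is not IBM's actual question-answering pipeline.

BUZZ_THRESHOLD = 0.50  # assumed confidence needed to ring in

candidates = [
    ("Who is Bram Stoker?", 0.87),
    ("Who is Mary Shelley?", 0.08),
    ("What is Transylvania?", 0.03),
]

# Show the top three guesses, highest confidence first.
for answer, score in sorted(candidates, key=lambda pair: pair[1], reverse=True)[:3]:
    print(f"{score:5.0%}  {answer}")

best_answer, best_score = max(candidates, key=lambda pair: pair[1])
print(f"Ring in with '{best_answer}'" if best_score >= BUZZ_THRESHOLD else "Stay silent.")
```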

At the end of the first game on Tuesday night, Watson was ahead by a mile. We’ll see how he does in the concluding round tonight. For the life of me, I can’t say whether I want him to win or to lose.

Requiem for a Craptop

Today I said goodbye to the MacBook that served me and my wife for almost three years — served us tirelessly, loyally, without ever judging the uses to which we put it. It was part of our household and our daily routines, funneling reams of virtual paper past our eyeballs, taking our email dictation, connecting us with friends through Facebook and family through Skype. (Many was the Sunday afternoon I’d walk the MacBook around our house to show my parents the place; I faced into its camera as the bedrooms and staircases and kitchens scrolled behind me like a mutated first-person shooter or a Kubrickian steadicam.) We called it, affectionately, the Craptop; but there was nothing crappy about its animal purity.

It’s odd, I know, to speak this way about a machine, but then again it isn’t: I’m far too respectful of the lessons of science fiction (not to mention those of Foucault, Latour, and Haraway) to draw confident and watertight distinctions between our technologies and ourselves. My sadness about the Craptop’s departure is in part a sadness about my own limitations, including, of course, the ultimate limit: mortality. Even on a more mundane scale, the clock of days, I was unworthy of the Craptop’s unquestioning service, as I am unworthy of all the machines that surround and support me, starting up at the press of a button, the turn of a key.

The Craptop was not just a machine for the home, but for work: purchased by Swarthmore to assist me in teaching, it played many a movie clip and PowerPoint presentation to my students, flew many miles by airplane and rode in the back seat of many a car. It passes from my world now because the generous College has bought me a new unit, aluminum-cased and free of the little glitches and slownesses that were starting to make the Craptop unusable. It’s a mystery to me why and how machines grow old and unreliable — but no more, I suppose, than the mystery of why we do.

What happens to the Craptop now? Swarthmore’s an enlightened place, and so, the brand assures me, is Apple: I assume a recycling program exists to deconstruct the Craptop into ecologically neutral components or repurpose its parts into new devices. In his article “Out with the Trash: On the Future of New Media” (Residual Media, Ed. Charles R. Acland, University of Minnesota Press, 2007), Jonathan Sterne writes eloquently and sardonically of the phenomenon of obsolete computer junk, and curious readers are well advised to seek out his words. For my part, I’ll just note my gratitude to the humble Craptop, and try not to resent the newer model on which, ironically, I write its elegy: soon enough, for it and for all of us, the end will come, so let us celebrate the devices of here and now.