Digital Day for Night

A quick follow-up to my recent post on the new Indiana Jones movie: I’ve seen it, and find myself agreeing with those who call it an enjoyable if silly film. Actually, it was the best couple of hours I’ve spent in a movie theater on a Saturday afternoon in quite a while, and it seemed especially well suited to that particular time slot: an old-fashioned matinee experience, a slightly cheaper ticket to enjoy something less than classic Hollywood art. Pulp at a bargain price.

But my interest in the disproportionately angry fan response to the movie continues. And to judge by articles popping up online, Indiana Jones and the Kingdom of the Crystal Skull is providing us, alongside its various pleasures (or lack thereof), a platform for thinking about that (ironically) age-old question, “How are movies changing?” — also known as “Where has the magic gone?” Here, for example, are three articles, one from Reuters, one from The Atlantic.com, and one from an MTV blog, each addressing the film’s heavy use of CGI.

I can see what they’re talking about, and I suppose if I were less casual in my fandom of the first three Indy movies, I’d be similarly livid. (I still can’t abide what’s been done to Star Wars.) At the same time, I suspect our cultural allergy to digital visual effects is a fleeting phenomenon — our collective eyes adjusting themselves to a new form of light. Some of the sequences in Crystal Skull, particularly those in the last half of the film, simply wouldn’t be possible without digital visual FX. CG’s ability to create large populations of swarming entities onscreen (as in the ant attack) or to stitch together complex virtual environments with real performers (as in the Peru jungle chase) was clearly a factor in the very conception of the movie, with the many iterations of the troubled screenplay passing spectacular “beats” back and forth like hot potatoes on the assumption that, should all else fail, at least the movie would feature some killer action.
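
(A brief technical aside: the swarming-entity trick rests on a genuinely elegant idea. Craig Reynolds’s 1987 “boids” model, the ancestor of most cinematic flocking and crowd systems, conjures convincing swarms from three purely local rules: steer toward your neighbors, away from whoever is crowding you, and along the group’s average heading. The sketch below is a minimal NumPy illustration of that idea and nothing more; the constants are arbitrary tuning choices of mine, and I make no claim about how ILM actually animated its ants.)

```python
# Minimal boids-style flocking sketch (NumPy only) -- illustrative, not
# a production crowd system. All constants are arbitrary tuning choices.
import numpy as np

N = 200                                   # number of agents ("ants")
WORLD = 100.0                             # side of a square, wrap-around world
rng = np.random.default_rng(0)
pos = rng.uniform(0, WORLD, size=(N, 2))  # agent positions
vel = rng.uniform(-1, 1, size=(N, 2))     # agent velocities

def step(pos, vel, radius=10.0, crowd=2.0, max_speed=2.0):
    """Advance the swarm one tick using the three classic local rules."""
    diff = pos[None, :, :] - pos[:, None, :]       # diff[i, j] points from i to j
    dist = np.linalg.norm(diff, axis=-1)           # pairwise distances, (N, N)
    near = (dist < radius) & (dist > 0)            # neighbors within the radius
    has_nbr = near.any(axis=1, keepdims=True)      # agents with any neighbor at all
    counts = near.sum(axis=1, keepdims=True).clip(min=1)

    # Cohesion: steer toward the local center of mass.
    center = (near[:, :, None] * pos[None, :, :]).sum(axis=1) / counts
    cohesion = np.where(has_nbr, (center - pos) * 0.01, 0.0)

    # Separation: steer away from agents that are crowding in.
    too_close = (dist < crowd) & (dist > 0)
    separation = -(too_close[:, :, None] * diff).sum(axis=1) * 0.05

    # Alignment: nudge velocity toward the neighbors' average heading.
    avg_vel = (near[:, :, None] * vel[None, :, :]).sum(axis=1) / counts
    alignment = np.where(has_nbr, (avg_vel - vel) * 0.05, 0.0)

    vel = vel + cohesion + separation + alignment
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > max_speed, vel * (max_speed / np.maximum(speed, 1e-8)), vel)
    return (pos + vel) % WORLD, vel                # wrap around the world edges

for _ in range(100):                               # run a short simulation
    pos, vel = step(pos, vel)
```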

Call it digital day for night, the latest version of the practice by which scenes shot in daylight “pass” for nighttime cinematography. It’s a workaround, a cheat: like all visual effects, in some sense nothing more than an upgraded cousin of the rear-projected backgrounds showing characters at seaside when they’re really sitting on a blanket on a soundstage. It’s the hallmark of an emerging mode of production, one that’s swiftly becoming the new standard. And our resistance to it is precisely the moment of enshrining a passing mode of production, one that used to seem “natural” (for all its own undeniable artificiality). By such means are movies made; by such means, too, is the past itself manufactured, memory and nostalgia forged through an ongoing dialectic of transparency and opacity that haunts our recreational technologies.
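
(To see how literal the cheat can be: optical day for night was achieved by underexposing the negative and filtering toward blue, and its digital descendant is little more than arithmetic on pixels. Here is a toy sketch of such a grade in Python, strictly illustrative; the filenames are invented placeholders and the constants are my own guesses at a conventional “moonlight” look, not anyone’s production recipe.)

```python
# Toy digital "day for night" grade -- a sketch, not a studio pipeline.
# The filenames are hypothetical and the constants are guesses at a
# conventional "moonlight" look.
import numpy as np
from PIL import Image

def day_for_night(frame: np.ndarray) -> np.ndarray:
    """Darken a daylight RGB frame and push it toward moonlit blue."""
    img = frame.astype(np.float32) / 255.0

    # 1. Underexpose, as the optical version did by stopping down the lens.
    img *= 0.35

    # 2. Cool the color balance: suppress red and green, favor blue.
    img *= np.array([0.6, 0.75, 1.15], dtype=np.float32)

    # 3. A gamma curve crushes the shadows so the sky reads as night, not dusk.
    img = np.clip(img, 0.0, 1.0) ** 1.3

    return (img * 255.0).astype(np.uint8)

frame = np.asarray(Image.open("daylight_frame.jpg").convert("RGB"))
Image.fromarray(day_for_night(frame)).save("night_frame.jpg")
```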

We’ll get used to the new way of doing things. And someday, movies that really do eschew CG in favor of older FX methodologies, as Spielberg and co. initially promised to do, will seem as odd in their way as performances of classical music that insist on using authentic instruments from the time. For the moment, we’re suspended between one mode of production and another, truly at home in neither, able only to look unhappily from one bank to another as the waterfall of progress carries us ever onward.

11 thoughts on “Digital Day for Night”

  1. I certainly get your point, but I think there’s still something to the “movie magic” of old-fashioned, handmade effects. If you watch magicians on TV, they’re always quick to point out that the camera never cuts and there are no digital effects used. This is because, like it or not, people are fundamentally more impressed by the simplest of real-world tricks (pulling a quarter out of someone’s ear) than the most complex CGI that’s painstakingly rendered on an entire server farm over the course of a month.

    If you went to the filming of George Romero’s 1985 Day of the Dead (and stayed at just the right angles and only looked when the scenes were actually going on), you’d basically see about what you see in the movie theater. The prosthetics might be a bit more noticeable, but Romero (and effects wizard Tom Savini) used real pig intestines and a slew of other gross stuff to accomplish the effects. Romero’s 2007 Diary of the Dead might have more ambitious effects, such as a skull decomposing before your eyes, but if you went on set you might be disappointed to find that there’s not much to see, because all of these effects would be added in later on computers. There’s a sense in which any zombie filmmaker worth his salt shouldn’t be afraid to get his hands a little dirty in some good old-fashioned pig intestine.

    When an incredible (and improbable) image appears on a site like 4chan or digg, it’s common to hear hordes of jaded viewers declare that it’s “shopped” and move on. You rightly point out that CGI can enable us to do things that wouldn’t otherwise be possible, but that may be part of the problem. In a pre-Photoshop world (or if we can be convinced that a particular image has not been shopped), there are many images that would be amazing, even though they are easy to slap together on a computer. Even when something like the battle scenes in The Lord of the Rings involves extremely complex digital effects that take incredible amounts of artistry to make, once we’re in this CG world it becomes easy to take that artistry for granted, since we’re more and more exposed to films in which anything is possible. I would argue that what constitutes “movie magic” is not that anything is possible, but that we are allowed to break the rules of possibility a little bit while still remaining plausible.

    This is not to say that digital effects are completely worthless. I think that Forrest Gump is one film that brilliantly blended digital effects into a plausible story. I would argue that digital effects are enormously helpful in terms of creating a false reality that would be more difficult to accomplish otherwise but is nonetheless plausible, rather than generating things that we could never conceive of seeing in the real world (of course, genre plays into how much we’re willing to suspend our disbelief). A day-for-night filter is a perfect example of something that can alter the actual reality without calling plausibility into question. I think the problem with Indy 4 (which I did like) is that instead of blending effects into the “real world,” it felt like we would slip into little videogame-style FMV moments where, even though incredible things were happening, we knew none of it was real, so we could essentially write it off as “shopped” and move on. Interestingly, I don’t have this same reaction to nearly all-CG movies like Sin City, because there we don’t have this jump between the more-or-less-real world and the CGI twilight zone of infinite possibility.

    If, as you would argue, we will get used to this quantum leap between real world and twilight zone (note that I think in names of TV shows), then presumably once we’ve “adjusted our eyes,” the next generation will be able to look back on this movie and appreciate it more than we could in the present. I just don’t see that happening. I think this movie will get even more ridiculous-looking as it ages (and technology progresses past its capabilities), which just doesn’t happen as much to a movie like Day of the Dead. I think it’s less a question of whether viewers can get used to this jump than of whether technology can ever seamlessly blend the two worlds so that the jump cannot be detected.

    In another ’80s horror movie, Cannibal Holocaust, legend has it that the effects (which were more like physical magic tricks than camera tricks) were so realistic that there had to be a court case to prove that the actors were alive and it wasn’t a snuff film. People would certainly not make the same mistake about Indy 4. So, the real question is, when are we going to have a CGI movie that’s so real that people honestly believe that what they see with their eyes is really the truth? When will the first CGI YouTube murder alarm the authorities? Will we ever get to this point, or will we be stuck in an uncanny valley of sorts, in which everything still has a slight video-game sheen? If we can get past this problem, you’re likely correct in calling this merely a transitional period in filmmaking. If our waterfall of progress can only carry us so far before we’re hurled down into the uncanny valley, however, I think it’ll be time to pull out a bucket of pig intestines and get back to basics.

  2. I’m not a sophisticated enough moviegoer to determine whether my disappointment with Indy4 (to be sure, I enjoyed the movie, but it was like watching the Star Wars prequels–the event was great fun, but at the end of the day, the movie wasn’t that good) is due to its use of CGI effects (though the waterfall scene was obscene in its effects self-indulgence). What bothered me (and here is the literary critic in me coming out) is the implausibility of it all. As outrageous as the events in the first three movies were, they never felt implausible. Those events were wrapped up in a better story and in a genre that allowed for them. Someone told me that I simply didn’t understand Indy4 and that’s why I didn’t like it. He assumed that I didn’t get that this was an homage to ’50s sci-fi in the way that the original three were homages to ’30s serials. But of course I got that. It’s just that Indy is not believable as a ’50s sci-fi hero. He doesn’t make sense there. And so the movie, especially the last 30 minutes, felt, as I said, like some sort of parody of the Indyverse.

  3. Peter: great points, and I certainly hear where you’re coming from. I’m a fan of FX production/auteurship too, and it’s interesting how much of our reception of a given effect’s success or failure depends on having some knowledge of how it was done — Savini’s pig intestines being one such item of “authenticating” insider information. But I’m curious: where on the spectrum of real world <---> twilight zone would you put optical effects that are nondigital, e.g. traditional matte paintings or traveling mattes? One of my favorite shots in the first Raiders of the Lost Ark occurs at the end of the truck chase, showing the Nazis tumbling off a cliff into the sea. It was accomplished, if I remember correctly, with small stop-motion animated puppets bluescreened into a background plate that had both real scenery and a painted element. It looks clunky now, of course, but that’s part of the charm — you can see the labor that went into it. Would the current digital version of such a shot lack interest for you because it’s — in a sense — doubly unreal?
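
    (An aside for the technically curious: the digital descendant of the traveling matte is the chroma key, which at bottom is just a per-pixel mask. A toy version in Python, purely to make the mechanics concrete; the filenames here are hypothetical, both plates are assumed to share the same resolution, and the simple blue-dominance threshold is a crude stand-in for what real keying software does.)

    ```python
    # Toy chroma-key composite, the digital cousin of the traveling matte.
    # Filenames are hypothetical; both plates must share the same resolution,
    # and the blue-dominance test is a crude stand-in for real keying tools.
    import numpy as np
    from PIL import Image

    fg = np.asarray(Image.open("puppets_on_blue.png").convert("RGB")).astype(np.float32)
    bg = np.asarray(Image.open("cliff_plate.png").convert("RGB")).astype(np.float32)

    # Matte: a pixel counts as "screen" where blue strongly dominates red and green.
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    matte = (b > 1.3 * np.maximum(r, g)).astype(np.float32)[..., None]

    # Composite: background shows through the matte, foreground everywhere else.
    comp = matte * bg + (1.0 - matte) * fg
    Image.fromarray(comp.astype(np.uint8)).save("composite.png")
    ```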

    Consuela, I hadn’t thought about the genre angle before. I know one early draft of the script was titled “Indiana Jones and the Saucer Men from Mars” — a lurid title that gets a winking mention in Crystal Skull. But if Lucas and Spielberg were really trying for cheesy FX à la 1950s science fiction, I give them points for artistic chutzpah!

  4. Good point. I was mostly looking past the mattes because pig intestines were more useful in arguing my own point. I’d say mattes are somewhere between real world and twilight zone, but the difference between these kinds of effects and the twilight zone is that physical effects are naturally more constraining; something closer to reality, like intestines (which actually could happen in the real world), is just much easier to accomplish than some of Indy 4’s more improbable effects. Once we’re in the twilight zone, animating something real like intestines isn’t much easier than animating, say, those weird bugs in Indy 4.

  5. Whoops, pressed submit by accident. Anyway…

    Because of that, I’d say that CGI isn’t necessarily a problem in itself; but because it makes the impossible just as easy to render as the practical, filmmakers need to exercise a little more discretion. Just because CGI opens up all kinds of new possibilities doesn’t mean that the story allows for them or that the audience will believe them. When we don’t believe a physical effect but we can see how much work went into it (as in the Raiders scene), perhaps there’s a sense in which the filmmakers have earned the effect through its ingenuity, whereas we’re more likely to take CGI for granted. I think I’ll personally always be more impressed by real effects like the martial arts in Ong-Bak, and if we made that into a CGI movie (even if, or perhaps especially if, we were able to make the performers do even more amazing, impossible things) it really wouldn’t be the same. In some sense, then, I hope that we don’t fully adjust our eyes to CG, because there’s something special about accomplishing effects in front of the camera, even when they rely on obvious tricks like mattes.

    I also agree with Consuela that genre goes a long way toward defining how far we’re willing to suspend our disbelief, and therefore which impossibilities we’ll put up with and which we won’t. As Bob pointed out, the writers of Indy 4 seem to have written the story with the infinite possibilities of CGI in mind, and perhaps in the process they ignored the pre-established limits of their genre. It’s certainly possible that new genres will spring up in the wake of CGI’s infinite possibilities. Still, I think if you look at literature, which presumably doesn’t have the same technical constraints as film, it’s clear that stories are naturally more compelling when they push the limits of possibility, as opposed to discarding any sense of reality whatsoever.

  6. If they were actually going for cheesy, then job well done, though I’m reluctant to give them that much credit.

    Also, I’m not as grumpy about Indy as I may sound. I did enjoy Iron Man more, though (despite its appalling politics and the fact that Tony Stark is responsible for Steve Rogers’s death).

  7. Another passionate response coming from Dr. M.D. 🙂

    Sorry, Bob, I’m gonna have to go with Peter on this one, and I have to question some of the language you use. I’m kind of surprised at what I interpret in your essay as an almost complacent response to this movie; you’re saying, “Well, this is the way things are, this is the way they will be, and we just have to adjust to the changes.”

    Go ahead and throw me in with the “haters” if you like, but the serious dislocation and disjunction of elements in this movie, and the significantly negative fan/audience response to it, indicate to me that this movie has serious problems in both its story structure and its composition of “effects.” Sure, maybe nothing was going to live up to the anticipation, but I don’t think I’m misunderstanding when I say that it was alright to expect a lot better than we got. On top of that, where was the moral at the end of the story? Did Indy learn *anything* from this journey other than that aliens exist and “oh, I have a son”?

    “Their power was knowledge. Knowledge was their power.”

    This is the great revelation?

    I’m particularly surprised that two cinematic storytellers can get together after all these years — one of whom pioneered bringing the visual effects industry and its discussion into the mainstream, the other one of, if not *the*, most successful storytellers of the past 50 years — and churn out something this sloppy, loopy, and half-cocked (though I guess we shouldn’t be *that* surprised after seeing the Star Wars prequels). And I’m not even a great fan of the first three Indy films…or Star Wars, for that matter…

    You said:
    “Some of the sequences in Crystal Skull, particularly those in the last half of the film, simply wouldn’t be possible without digital visual FX. CG’s ability to create large populations of swarming entities onscreen (as in the ant attack) or to stitch together complex virtual environments with real performers (as in the Peru jungle chase) was clearly a factor in the very conception of the movie, with the many iterations of the troubled screenplay passing spectacular “beats” back and forth like hot potatoes on the assumption that, should all else fail, at least the movie would feature some killer action.”

    And so, this adds value to the movie? Isn’t designing a movie around visual effects sequences one of the chief criticisms of filmmaking these days? And doesn’t it make Spielberg and Lucas look even *worse* for getting sucked into this lowest-common-denominator notion of filmmaking?
    Like I said to you in your previous column, if they didn’t have a good enough story to ground the outlandish “effects,” this movie shouldn’t have been made. And Spielberg, especially, should have known better.

    You said:
    “We’ll get used to the new way of doing things.”

    I’m sorry, Bob, but this is what politicians tell us when they think they know better. This is in fact what Bush’s people said: “We build empires.”

    That doesn’t mean there aren’t new ways of doing things; I just get extremely skeptical when anyone says, “This is the way it has to be.” No, it doesn’t.

    “And someday, movies that really do eschew CG in favor of older FX methodologies, as Spielberg and co. initially promised to do, will seem as odd in their way as performances of classical music that insist on using authentic instruments from the time. For the moment, we’re suspended between one mode of production and another, truly at home in neither, able only to look unhappily from one bank to another as the waterfall of progress carries us ever onward.”

    I’m of the belief that many films have already done this — and the fact that audiences haven’t noticed it as much as they notice the faults in Indy 4 proves that certain mixed/hybrid approaches are working.

    See for reference:
    Dark City
    Solaris (Soderbergh)
    Bourne movies
    Anything by Guillermo del Toro
    and yes, Romero.

    But Peter has said much of this more eloquently than I did.

  8. Thanks, Michael, for the food for thought (strongly spiced though it was!). You put your finger on it: the new Indiana Jones movie reduced me, for the first time, to a state of relative complacency about digital FX in contemporary genre filmmaking. I share your surprise at this reaction. Like you, I’m rather a purist about such things: till now, it’s been important to me that good films get made and beloved media brands respected. Clearly, visual effects are a crucial element of this process: used artfully and intelligently, they catalyze brilliant storytelling. Used clumsily and indiscriminately, they kill the imagination while hurting the eyes (see: Van Helsing, Speed Racer).

    I agree with you that Crystal Skull is not an especially good film. Further, I’d agree that FX design is one of its key problems, symptomatic of an overall failure of vision. That said, the movie is passable entertainment, and maybe that was enough for me at that particular weekend matinee. Every spectatorial encounter has its own historical specificity, its own unrepeatable parameters and unpredictable determinants; I may well revise my estimate downward upon a second viewing. For now, what’s interesting to me in my own critical response (or lack of it) is that I seem to have arrived at a kind of shrugging acceptance that this will be the trend — CG FX are clearly here to stay, they’re clearly driving a shift in the fundamental mode of production for blockbusters, and clearly (to judge by the box office) most audiences don’t mind. And I stand by my assertion that, one or two generations down the road, Indy-style filmmaking will be received as unexceptional, save by us poor wretches, bedeviled by our own good taste, for whom the state of the art will too often result in digital abomination.

    OK, that’s a little glib. Let me put it this way: technological change of this sort doesn’t tend to obey anything except the marketplace. Movies went to sound, to color, to widescreen, and at each juncture something was lost and something was gained. Certain stylistic horizons were closed down while others were opened. Hacks will exploit it, auteurs will innovate and dazzle with it, and past masters of the form — such as Spielberg and Lucas — will founder or adapt. The verdict on all of this is unclear; the jury is out, perhaps permanently. Who knows how the cinematic tendencies of this first decade of the 21st century will be viewed in retrospect by future audiences?

    I dunno, maybe it’s the very embedded historicity of Indiana Jones that’s arousing such ardor in some and such passive acceptance in others (like me). The Indy series, as I suggested in my original post, has always been a kind of palimpsest of the cinematic past, a “lost horizon” of movie pleasure simulated for current audiences through filmmaking that was, perhaps, stylistically accomplished on the surface, but charmingly hoary at heart. Maybe with this newest Indy film we glimpse at last the franchise’s true essence: it was always the flimsiest of constructs, and today’s “masters” of the form — straining with their much-hyped “masterful” technologies of representation — have built something that for you is a disappointingly crude approximation of the original, but for me is evidence that the original itself never existed to begin with. In its mordant way, a liberation from false consciousness.

    Hmm, I sound grumpy (though I feel oddly cheerful). Maybe Hulk or Wall-E will perk me up?

  9. Thanks for tolerating my attitude. See, this is why I can’t reveal my own blog. It’s way too passionate to give me any security in a job search…

    (Plus I haven’t found the time to add anything to it in months)

    I’ve determined that you will be writing the Introduction to our book. Much of what you’ve written in the last few days would fit well into it.

    Wall-E looks like another wonderful success for Pixar.

    Hulk looks dreadful, but of course I’ll see it because I’m an old-school Marvel nut (You should know that Ed Norton has pretty much disowned the thing, because he disagreed with the studio over the cut of the movie — but hey, I’m sure his “preferred cut” will show up on DVD six months down the line anyway!).

  10. Just found and am enjoying the blog, and in particular the points about CGI in your post. Another thought about CGI and the old day-for-night method: when it came to day for night, I think that audiences forgave because on some level they recognized in those unconvincing scenes that filmmakers were bumping up against the limits of technology as they tried to include a big and real and recognizable piece of life in their stories — to include, that is, anything that happens outdoors after the sun goes down. If the process didn’t allow total visual fidelity to reality, well, audiences recognized the motivations as good and met the films halfway. With CGI, it’s a different and often reversed set of motives and results. Often it’s the perceived limitlessness of the technology that seems to be the starting point, that seems to goad filmmakers down the path not only of the impossible, but also of the ridiculous and the irrelevant (even for Indiana Jones). So combine the resulting artificiality of the effects (for in fact CGI is not yet limitless in what it can show) with the fact that the artificiality is being introduced for the sake of total nonsense, and there’s a source of audience resentment. Exceptions and nuances abound, but, still.

  11. Brian: welcome! Glad to hear you’re enjoying the blog.

    Your point about spectatorial perceptions of — and responses to — CG work is well taken. The question seems to come down to (A) how aware audiences are that a film is trying to put one over on them and (B) how that effort is framed in the audience’s minds. I see an analogy between the heavy CG work toward the end of Crystal Skull and the historical practice of day-for-night shooting because both, to my eye, are immediately spottable as illusions. Not because they are poorly done, but because they mark themselves as manufactured; they announce their artificiality rather than disavowing it. Both, in other words, are conventions. The digital convention seems unnatural to us at the moment, but we will soon become accustomed to it. More importantly, with the passage of time, it will come to be cherished as part of cinematic history; we’ll look back on Crystal Skull and even more egregious examples of (what seem to us now like) visual-effects excess as historical artifacts, morally neutral and aesthetically sui generis: “those nutty CG environments of the early 2000s.”

    I may sound more settled about this than I really am. I remember arguing years ago with one of my committee members about rear-projection shots in classical Hollywood films. He insisted that those shots were taken as unreal at the time, recognized instead as a convention on the level of a montage sequence — a snippet of cinematic grammar to convey the idea “time is passing” (or in the case of rear projection, “these characters are driving somewhere”). I disagreed, arguing that rear projection was an attempt to fool the audience, one that no longer works because the style and technology of such representations have changed. Within the current norm, what’s out of date becomes instantly spottable: either laughed at as a failure or nostalgically embraced as a hallmark of an earlier mode of production. (Laura Mulvey calls such moments the “clumsy sublime”; here’s a post I wrote about it.)

    So ultimately I’m not sure what I believe, except that it is one thing to assess a juncture of cinematic history while we’re living it, and another to assess it from a hypothetical future vantage point — the futur antérieur tense that Lacan was so fond of.
