With great power comes great responsibility. From the release of the seminal work of science fiction “Metropolis” (1927), through decades marked by classics such as “The Twilight Zone” (1959-1964), “Star Trek: The Original Series” (1966-1969), “2001: A Space Odyssey” (1968), “Star Wars: A New Hope” (1977) and “Alien” (1979), science fiction — especially science fiction set in space or on distant worlds — has had to build lasting, meaningful and awe-inspiring stories despite, in some cases, access to only the crudest of visual tools, whether miniatures, simplistic set design or practical effects.
For the past several years, however, sci-fi filmmakers have had the ability to synthesize fantastical worlds essentially photorealistically — all within the humming processors of computer banks. Computer-generated imagery (CGI) has opened the floodgates to nearly infinite visual possibilities.
Not all of them have used this power wisely.
I’m not talking here about the trend of CGI run amok with large-scale, numbing destruction à la New York City in “The Avengers,” or the jarring, unfollowable metal-on-metal sequences of “Transformers.” Rather, I’m talking about films that have often received a great deal of critical praise.
Take James Cameron’s “Avatar.” Many critics have recently pointed out that, for a movie that still ranks as the highest-grossing film of all time, “Avatar” has had embarrassingly little cultural impact; most of us can’t even remember the details of the plot, except that the characters rode some flying things and maybe kind of had sex with their hair.
What shocked the world most about “Avatar” was its use of CGI. Even critics who noticed the underlying weakness of the story didn’t seem to care, such as Shawn Levy, writing for The Oregonian. He starts by admitting that “Cameron’s script is often pedestrian in dialogue, familiar in plot, boyish in depth and complexity.” But he concludes that “The quality of being inside something alien and unreal that the film affords isn’t a novelty slapped onto a slender story but rather the essence of the story and of the experience of it.”
I not only disagree heartily, but I think this sentiment is dangerous. Sci-fi is the modern era’s grand cinema, far removed from the days half a century ago when historical dramas such as “Ben-Hur” could hold that status. Sci-fi, in part because it pushes the boundaries of how seamlessly real and computer-generated imagery can be blended, has a responsibility to get it right.
Levy, like many other critics who ignored the lackluster story of “Avatar” in deference to its graphics, perpetuates the idea that the “experience” of well-leveraged CGI is more important than a strong underlying story, that glorious computer renders can actually be the story. That thought process nearly netted the film an Oscar for Best Picture — an embarrassing thought, in hindsight. “Avatar” deserves enormous credit for developing many novel CGI techniques, but that doesn’t make it a good movie, and I think we can understand both concepts without conflating them.
Yet critics continue to do so. “Gravity” received near-universal acclaim, nearly all of which admits faults in its structure and characterization but waxes poetic about how beautiful the graphics are. I don’t like “Gravity.” The graphics are gorgeous, but the story just isn’t engaging enough, and I don’t cut it any slack for being pretty. “Interstellar” is a contentious film that most people either love or hate, and a lot of that boils down to whether one prioritizes story and characters or the film’s ambitious scale and grandeur.
If I had to choose, I’d choose the former. I’d rather watch 2009’s “Moon,” with its absolutely laughable CGI but half-decent attempt at a story, than a gorgeous film with a weak or hole-filled plot. Recent films such as “Star Trek Beyond” and “The Martian” don’t do anything to push the bounds of CGI in film, but their graphics are story-serving, and their stories, Simon Pegg’s script and the adaptation of Andy Weir’s novel, respectively, received deserved praise for their liveliness.
Maybe we can go further. Maybe we can capture the ambitious filmmaking techniques of “Avatar” and “Interstellar” without sacrificing the focused, character-driven dramas that make independent films such as “Moonlight” and “Hell or High Water” so great to watch. I don’t know that it’s been done yet. If anything, last year’s “Arrival” might be a model to build on — despite the film’s relatively small scale, it’s clear that director Denis Villeneuve focused both on trying to write an engaging story with relatable characters and on using CGI not just to tell the story, but as an artistic expression in its own right. Like any film, it has issues — I have my qualms with the third act — but it’s an intelligent film, and also a truly beautiful, atmospheric one.
If I had to make a prediction — or maybe it’s just a pipe dream — it’s that as amazing visuals become fully normalized and, indeed, expected, we will become less inclined to prop up a film’s score solely on the merit of its graphics.
Instead, we should take the time to give credit to the films with great stories and characters that also effectively wield the immense power of CGI, rather than praising the CGI in merely passable films and informing Hollywood that we will keep paying to watch CGI explosions for their own sake.
Imad Pasha covers film. Contact him at [email protected].