In 1995, cinema celebrated a distinctly ambivalent centenary, with most activity occurring at the intersection of Europe’s cinematheques, universities, and state-funded production centers. The collective commemoration yielded renewed scholarship on early cinema and even a few productions, such as the omnibus Lumière et compagnie and the BFI-commissioned ‘Century of Cinema’ documentary series. (Stateside, we made do with Chuck Workman’s nine-minute clip show ‘100 Years at the Movies,’ endlessly replayed on Turner Classic Movies and elsewhere.) Here’s to the next century!
The bureaucratic anniversary stirred a few notes of dissent, notably Susan Sontag’s widely published think piece about the decline of film culture. Almost entirely absent from these discussions, though, was the possibility that cinema would radically transform itself over the next fifteen years.
Since 1995, cinema has become an increasingly digital medium. With the introduction of spectacular CGI effects, the computer began to encroach on the generations-old crafts practiced in the film laboratory. (Look at the unfairly maligned Alien3 on the big screen and you’ll immediately appreciate the exquisite expertise that went into 65mm effects at the twilight of that particular art.)
Optical printing was hardly the only victim. Avid offered the possibility of non-linear editing, allowing the filmmaker to cut and re-cut sequences in virtual space, albeit at substantially diminished resolution. For a time, these lo-fi rough cuts provided a road map to the negative cutter, still charged with conforming hundreds of bits and bobs of the original camera rolls into the proper sequence laid out by the editor. (Because it is, by definition, unique and irreplaceable, the camera negative was traditionally withheld from cutting until very late in the process, with surrogates like 35mm work prints and video copies endlessly fussed over until a satisfactory ‘fine cut’ was achieved.)
Early in the new millennium, the digital intermediate process made this step obsolete: the camera negative could be scanned at such high resolution at the start of the game that its physical whereabouts became essentially irrelevant after ingestion. Now the editor would work from high-quality digital files that could themselves constitute the finished product—a self-conforming copy. The post-production process became entirely digital, with film entering the equation again only at the very end, when the digital files were recorded back to a 35mm internegative, which could be used to strike dozens or hundreds of release prints. (And, of course, a 35mm camera negative has also become a luxury, with 2K and 4K digital cameras, like the RED and the Arri Alexa, claiming a dominant market share on the ‘image capture’ end.)
Over the last three years, the print itself has been pushed out of the marketplace, with fully two-thirds of American screens offering digital presentations. The conversion rate has been even faster abroad, with many governments essentially subsidizing the replacement of 35mm equipment as a matter of state policy.
For many, these data constitute a purely technical, and fairly obscure, landmark. Movies are about stories and actors, whether delivered through analog or digital means. The cinema continues, because the idea of it is longer and grander than a mere recording and projection medium. (The jarring and inherently political contrast between cinematography and videography that was once a staple on the art house scene—think of Stanley Kwan’s Actress, Krzysztof Kieslowski’s Three Colors: Blue, Abbas Kiarostami’s Taste of Cherry, Wong Kar-wai’s In the Mood for Love, or Jean-Luc Godard’s Eloge de l’amour—has subsided, with many productions seamlessly and guiltlessly shifting between the two.)
In some respects, the field has been preparing for this moment. Academic film studies departments settled on more politically correct designations like ‘Cinema Studies,’ ‘Media Studies,’ or ‘New Media.’ The closest thing we have to a professional alliance for film archivists is the cagily named Association of Moving Image Archivists. Rochester, New York—once renowned for its optics titans Kodak, Xerox, and Bausch and Lomb—billed itself as ‘The World’s Imaging Center,’ a brand of committee-decreed boosterism that tourists themselves hardly understood.
Let’s consider the possibility that the celluloid film strip was not incidental to the notion of cinema, but an irreducible foundation of it. Such a possibility considerably shifts the very boundaries of cinema. As scholars and archivists have begun to suggest, ‘film history’ must take in all varieties of productions shot, edited, and exhibited on film—not just features from Hollywood and elsewhere, but cartoons, educational shorts, industry-commissioned promotional films, avant-garde art, home movies, telefilms, and commercials. Some definitions might even allow for microfilm as a critical cinema adjunct.
The age of film began in 1895 and will end, presumably, in the next few years. Its reign will have lasted slightly longer than a century, a framework that remains rhetorically productive, albeit in a different way than the Europeans envisioned in 1995.
The entirety of the twentieth century was filmed, and no other medium has ever been so inextricably and exclusively bound up with such a span, nor so clearly destined for extinction. On some level, we must acknowledge that interest in film is inevitably a study and validation of the twentieth century in social, economic, political, and technological terms.
This is a double-edged clarity. It demands an active and critical engagement with the idea of the twentieth century. So long as we believe that the century can still yield lessons, films will still be relevant. This is not a given. Years of nostalgia and confusion could well turn the film scholar into the functional equivalent of the Civil War re-enactor.
The history of the twentieth century could be studied in video versions, if we believe that film is merely an innocent carrier of content. We may as well retrace the Oregon Trail from the comfort of our Chevrolet.
A slightly different version of this article was published in The Moomers Journal of Moomers Studies in June 2012.