The Cinematic Exploration of Artificial Intelligence: From Fear to Fascination

I’ve witnessed visions that defy belief, to echo a line from Ridley Scott’s 1982 classic, “Blade Runner.” As a movie critic, I count such fantastical images as part of my landscape. Among my favorites are the walking, talking, and often chilling robots of the original “Westworld” and, especially, “The Stepford Wives.” During the 1970s, these films presented a starkly pessimistic outlook on our future, contrasting sharply with the more endearing robot companions that emerged in “Star Wars,” which would soon dominate both culture and cinema.

Throughout cinematic history, we have been haunted by these extraordinary machines, especially those humanoid creations that mirror us in unnerving ways. From the robot femme fatale in Fritz Lang’s “Metropolis” (1927) to the duplicitous android in Scott’s “Alien” (1979), these ingenious constructs are described as “virtually identical to a human,” echoing another quote from “Blade Runner.” More recently, the emergence of artificial intelligence has captivated and unsettled audiences both on and off the screen. In the latest installment of “Mission: Impossible,” Tom Cruise faces off against a sentient A.I.; meanwhile, in the upcoming post-apocalyptic thriller “The Creator,” John David Washington portrays an operative tasked with retrieving an A.I. weapon that takes the form of an innocuous child.

While I approach “The Creator” with curiosity, I can’t deny that the concept of artificial intelligence sends shivers down my spine. I attribute some of these anxieties to Stanley Kubrick—just kidding, mostly. However, my deep-seated suspicions surrounding A.I. have remained largely unchanged since the eerily emotionless voice of HAL 9000, the supercomputer in Kubrick’s 1968 masterpiece “2001: A Space Odyssey,” became ingrained in my psyche. It was HAL’s calm, measured, and relentless voice that resonated in my mind when I read the May 30 statement from over 350 A.I. leaders, which proclaimed, “Mitigating the risk of extinction from A.I. should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

By the time that alarming warning was issued, the Writers Guild of America had been on strike for four weeks, partly fueled by concerns that generative A.I. might encroach upon their livelihoods, potentially replacing them. Similar fears prompted SAG-AFTRA, the union representing approximately 160,000 performers and media professionals, to join the picket lines on July 14. This marked the first time since 1960 that both unions were on strike simultaneously. The Alliance of Motion Picture and Television Producers, the organization that negotiates on behalf of studios, dismissed union concerns with bland reassurances that all would be well. “We’re creative companies,” they stated in May, “and we value the work of creatives.”

If you found that statement laughable, you’re not alone. Considering the history of the film industry and the nature of capitalism, combined with the absurdity of using “creative” as a noun, it’s hard to accept this claim at face value. The writers’ concerns are indeed serious: they seek to prevent A.I. from being used to write or rewrite literary material or to serve as source material. In July, John Lopez, a member of the union’s A.I. working group, gave these stipulations a romantic gloss, telling Vanity Fair that “meaning in art always comes from humans, from having something to say, from needing to connect.” While I empathize with this sentiment, I can’t help but wonder if he’s ever perused the transcript of a Disney earnings call.

Unsurprisingly, given that companies are already scanning actors’ faces and bodies, SAG-AFTRA’s stance on A.I. is apocalyptic: “Performers need the protection of our images and performances to prevent the replacement of human performances by artificial intelligence technology.” As I read this, I couldn’t help but think of Andy Serkis, renowned for voicing and bringing to life motion-capture characters in the “Lord of the Rings” films and the rebooted “Planet of the Apes” series. Fans of his performances, including his co-star James Franco, rallied for Serkis to receive Oscar recognition. “This is not animation as much as it’s digital ‘makeup,’” Franco asserted in Deadline, a perspective that surely resonated with industry executives.

In the early, tumultuous years of cinema, filmmakers wore many hats: writing, directing, scouting locations, and acting. As the film industry transformed into a major enterprise in the 1910s, the quest for efficiency became a rallying cry, eventually evolving into a core ethos. The principles of scientific management were applied to streamline production, leading to the establishment of sprawling studio lots that centralized labor and created distinct departments (executive, wardrobe, electrical). This shift resulted in a significant division of labor. By the 1920s, directors, writers, and stars who once held sway over their work found themselves increasingly answering to producers and studio executives.

Some films seemed to nod toward the Hollywood factory model, such as Charlie Chaplin’s “Modern Times” (1936). In it, Chaplin’s Little Tramp toils in a factory designed for maximum efficiency, featuring a new “feeding machine” intended to serve workers while they labor, thus boosting production and minimizing costs. However, when the boss tests the machine on the Tramp, chaos ensues. Shortly thereafter, while tightening bolts on a conveyor belt, the Tramp suffers a breakdown, his movements becoming frantic as he is sucked into the machine—a striking image of radical dehumanization.

While some stars managed to carve out their independence within the system, especially those with savvy agents, the studios maintained tight control over the majority of performers. By the early 1930s, the industry’s most overt means of exerting dominance over its most prominent stars was the option contract, typically extending for seven years. Studios not only shaped and refined the stars’ images—changing their names and managing their public relations—but also retained exclusive rights to their services. They could drop or renew contracts, loan actors out, cast them in undesirable roles, and even suspend or sue those deemed problematic.

“I could be forced to do anything the studio told me to do,” Bette Davis lamented regarding Warner Bros., which signed her to a standard player’s contract in 1931. Frustrated with her roles, Davis realized that her only recourse was to refuse, a stance that led to her suspension without pay. “You could not even work in a five-and-dime store,” Davis remarked. “You could only starve.” She won her first Best Actress Oscar in 1936, yet as late as 1938 her contract still lacked a provision for star billing. Although her fame and salary had escalated, her power had not: her third contract with Warner Bros. dictated that she must “perform and render her services whenever, wherever, and as often as the producer requested.”

Directors and writers contracted by the studios similarly grappled with the struggle for control and autonomy, as companies operated under the belief, as screenwriter Devery Freeman once articulated, that when they hired writers, they owned their ideas “forever in perpetuity.” Each studio presented a different landscape, with varied employment terms. In 1937, independent producer David O. Selznick, known for “Gone With the Wind,” explained that at M.G.M., a director’s role was “solely to get out on the stage and direct the actors, putting them through the paces called for in the script.” Conversely, at Warner Bros., he noted, a director was “purely a cog in the machine,” often receiving the script only days before production commenced.

Given the ongoing tension between art and industry that characterizes much of Hollywood’s history, it’s unsurprising that the metaphor of “cogs in the machine” frequently appears in narratives about the industry’s past. I cherish many classic Hollywood films (and miss their craftsmanship), but for all its brilliance, the system took its toll. The outrages of sexual exploitation and racial discrimination are, in the end, merely the most grotesque examples of how thoroughly the system could—and did—devour its own.

“We have the players, the directors, the writers,” Selznick lamented in his resignation letter to the head of Paramount in 1931. “The system that turns these people into automatons is obviously what is wrong.” Selznick’s despair resonates with one of my favorite scenes in “Blade Runner.” Set against the backdrop of a futuristic Los Angeles, the scene involves Deckard (Harrison Ford), a gruff, Bogart-esque figure tasked with hunting down renegade replicants—lifelike synthetic humans produced as slave labor. Early in the film, Deckard visits the Tyrell Corporation, the manufacturer of replicants, to consult with its eerie founder. “Commerce is our goal here,” Tyrell states, exuding a disquieting calm as he explains his business. “‘More human than human’ is our motto,” he continues, echoing the sentiments of an old studio boss.

As in “Blade Runner,” many of the most memorable sentient machines in cinema take on human forms. This is also true in “Metropolis,” where a metallic automaton is designed to resemble a living woman, as well as in films like the original “Westworld,” “The Stepford Wives,” and the “Terminator” franchise. Even when A.I. lacks a physical body, the most impactful portrayals often feature recognizable human voices, such as Paul Bettany in “Iron Man” and Scarlett Johansson in “Her,” Spike Jonze’s whimsical yet poignant love story about a man (Joaquin Phoenix) and a virtual assistant—a disembodied entity that quickly transforms into an emotionally engaging character due to Johansson’s distinct voice and allure.

In “Blade Runner,” as in so much of Hollywood storytelling, A.I. takes on a human essence. Given cinema’s emphasis on character, this is hardly surprising. A robot formed from cold metal can evoke fear, but non-anthropomorphic machines lack the emotional resonance of the lifelike beings that traverse our screens. Alternately endearing and unsettling, these machines serve as companions, warriors, distractions, and ultimately, mirrors reflecting our own humanity. In Steven Spielberg’s “A.I. Artificial Intelligence” (2001), a poignant tale of a boy android named David (Haley Joel Osment) yearning for his human mother’s affection reveals a core reason for our unease: “In the beginning, didn’t God create Adam to love him?”

Isaac Asimov once noted that during his childhood, robot stories could typically be categorized into two types: “robot-as-menace” and “robot-as-pathos.” The emotional depth of Spielberg’s “A.I.” lies in its protagonist’s longing for love. Yet David is also intentionally disconcerting, embodying both machine and human traits, which ultimately renders him neither. In a sense, he becomes a troublesome child for his adoptive family and for Spielberg himself. This complexity is addressed with a fairy-tale conclusion, featuring ethereal robots known as “specialists,” slender beings that deactivate David. By that point, however, all organic life on Earth has perished, humanity having technologically advanced itself into extinction.

Whether intentional or not, films like “A.I.,” “Her,” “The Terminator,” and “The Matrix” have been foreshadowing a reality that now appears imminent. Since the launch of ChatGPT in November 2022, the term artificial intelligence has infiltrated headlines, congressional hearings, and the picket signs of writers and actors who, understandably, fear they might be ushered toward extinction. “A.I. is not art” has appeared on several protest signs, though I prefer the more biting sentiment, “Pay the writers you AI-holes!” It’s a clever phrase, reminding us that writers are irreplaceable, or at least that’s the mantra I’ve been silently repeating while navigating this brave new world. Siri, do you review movies?
