AI has arrived in Hollywood. It’s a lot more boring than you might think

We’re a long way off from a movie generated by a single prompt.

BY Ryan Broderick

The new horror film Late Night with the Devil hit theaters late last month amid a lot of really good buzz. It has a 96% on Rotten Tomatoes and has broken box office records for its distributor, IFC Films. It seemed poised to become the indie movie success story of the first half of 2024. But that buzz curdled quite a bit once word started to circulate that generative AI had been used in the film.

It didn’t take long for viewers to notice the use of AI in the film, and the film-centric social network Letterboxd soon filled up with negative reviews. “Listen. There’s AI all over this in the cutaways and ‘we’ll be right back’ network messages,” reads one top review for the film. “Complacency in accepting AI now is complacency for AI in the future—a very bleak future,” reads another. 

But the way AI was used in Late Night with the Devil was interesting because it highlights an often-undiscussed element of the tech: So far, it’s been deployed in pretty boring ways.

When you hear that a horror film used AI, you might assume that it was in place of practical effects or traditional CGI, but that’s not actually the case here. Late Night with the Devil is full of your classic puppetry, blood, slime, and levitating objects. Instead, the AI images only appear on screen briefly.

Late Night with the Devil is set on Halloween night in 1977 and follows a Johnny Carson-like late night host who ends up summoning a demon on live television. The movie is constructed like an episode of a talk show, and it cuts to ad breaks at different points throughout. It’s during these cutaways that the AI-generated images are used as interstitial cards. They’re basically just retro-looking pictures of skeletons with the fictional talk show’s name written on them. 

(Even more curious, they appear to be a relatively new addition to the movie. According to viewers who saw its premiere at SXSW 2023, the AI-generated images weren’t in that earlier version. Directors Cameron and Colin Cairnes told Variety, “We experimented with AI for three still images, which we edited further, and ultimately appear as very brief interstitials in the film.”) 

Doug Shapiro, a media consultant and analyst, tells me that generative AI is popping up most commonly in relatively small-stakes instances during pre- and post-production. “Rather than spend a ton of money on storyboarding and animatics and paying very skilled artists to spend 12 weeks to come up with a concept,” he says, “now you can actually walk into the pitch with the concept art in place because you did it overnight.”

Studios have also begun using AI to touch up an actor’s laugh lines or clean up imperfections on their face that might not be caught until after shooting has wrapped. In both cases, viewers might not necessarily even know they’re looking at something that has been altered by an AI model.

But Shapiro thinks that AI usage will increase in Hollywood as studios grow more comfortable with the tech. He also suspects the current backlash against it, as we’ve recently seen with Late Night with the Devil, is likely temporary.

“There’s this kind of natural backlash that tends to ease over time,” he says. “It’s going to get harder and harder to tell where the effects of humans stopped and AI starts.”

Of course, that’s assuming both that AI can get better and that unions will let studios continue to use it. Last year’s WGA and SAG strikes were inspired by a bevy of challenges facing America’s entertainment industry, with AI cited as a chief concern. And both WGA and SAG walked away with basic protections against AI encroaching on their livelihoods. But that hasn’t stopped the AI creep.

One TV editor I spoke with, who asked not to be named, tells me that at their last job they were asked by an executive to use Adobe’s new AI editing tool, which would have essentially replaced their job. They say that it was pitched to them as a way to automate a part of their work so they had more time to “do more other stuff.”

“The excitement from the senior executive who brought it up was troubling,” they say.

And this is the main way viewers are encountering generative AI in movies and TV shows right now: cheap filler used to speed up production or quickly fill in gaps.

All the way back in 2021, Marvel was making AI-generated replicas of extras to use in the background of scenes in WandaVision. And last year, Marvel’s Secret Invasion used a custom Stable Diffusion model trained on original artwork to generate an eerie opening credit sequence (which, if you ask me, was one of the only interesting things about the whole show). Earlier this year, HBO Max’s True Detective: Night Country was caught using AI-generated posters in one scene. And last month, R. Lance Hill, the screenwriter of the original Road House, sued Amazon Studios, accusing them of copyright infringement and claiming the studio used AI audio to do automated dialogue replacement during the SAG strike last summer. None of which are particularly exciting examples of this supposedly revolutionary technology.

“I think the people who make those decisions just . . . care about ‘Okay, well, do we have to hire an extra person to do this thing? No? Great. Let’s choose the cheapest option,’” says Zach Silberberg, a producer and TV editor who has worked on programs such as Patriot Act with Hasan Minhaj and The Problem with Jon Stewart.

But it’s not just Hollywood executives dreaming of cheaper productions driving the AI entertainment boom. AI companies have quickly realized that Hollywood is the perfect place to shop around their newest models. Bloomberg reported recently that OpenAI is now actively pitching studios on its new AI video generator, Sora.

“Someone put out this video they made with Sora,” Silberberg says. “It’s the first short film made with Sora. And it’s about a balloon-headed man. And every shot looks like stock footage. And there’s no intention or artistry.”

Finding actual artists who are willing to use AI tools with some kind of intention, however, is tough. Most major art-sharing platforms have faced tremendous user backlash for allowing AI art, and there’s even a new tool called Nightshade that artists are using to “poison” their images so they can’t be reliably used to train generative AI models. Most of the AI art you probably come across online right now is not coming from actual artists, but rather guys who pay for X verification and publish tech newsletters on Substack. That, and those pages on Facebook that post pictures of Shrimp Jesus.

But graphic designer and digital art pioneer Rob Sheridan has been experimenting with tools like Midjourney since they first launched. 

“My natural instinct was to say, Okay, how do I, how do I make this work for me? How do I validate it, fit it into me as a creator? And, and can I find authorship in it?” he says.

As he sees it, a large part of the backlash against AI technology in Hollywood is directly caused by both tech companies and studios claiming that it will eventually be able to spit out a movie from a single prompt. Instead, Sheridan says it’s already obvious that AI technology will never work without people who know how to integrate it into existing forms of art, whether it’s a poster or a feature film.

“The thing that is hurting that progress—for this to kind of fold into the tool kit of creators seamlessly—is this obnoxious tech bubble shit that’s going on,” he says. “They’re trying to con a bunch of people with a lot of money to invest in this dream and presenting this very crass image to people of how eager these companies are, apparently, to just ditch all their craftspeople and try out this thing that everyone can see isn’t going to work without craftspeople.”

Which is certainly true in the case of Late Night with the Devil. The AI interstitials aren’t so awful that they break the whole movie—they’re only on screen for a few seconds at a time—but they do stick out like a sore thumb in a movie that is so clearly handmade, full of practical effects. Which is really the ultimate question when it comes to using AI: Is the quick fix worth it? And it’s likely many studios are about to discover it’s not.

ABOUT THE AUTHOR

Ryan Broderick is a tech journalist who writes the Garbage Day newsletter and hosts the podcast The Content Mines. 


Fast Company
