Making a movie is a complex, multi-stage process, and from pre-production to post-production, there are several ways in which artificial intelligence (AI) can help to simplify the work. Many modern movies use AI in some capacity, whether it’s generating immersive backgrounds or even helping to write scripts.

You can probably guess a few uses of AI in film: from immersive backgrounds to completely alien characters, some are fairly obvious. But AI is doing more than generating the atmosphere of distant planets. It’s also helping professionals create engaging trailers, music, and more.

In this article, we’ll focus on five uses of AI in film:

Scheduling/pre-production

AI-based startup Cinelytic helps studios and independent film companies make smarter, faster decisions across the film value chain. Warner Bros., for example, partnered with the startup to implement an AI-based project management system that provides financial modeling, analytics, scheduling, and more.

Scriptbook, a Belgium-based company, provides an AI-based financial forecasting and script analysis tool that examines scenes and recommends whether or not they should be used for promotion.

It’s not meant to replace a decision-maker, but it can be a helpful tool, giving detailed scene breakdowns covering age restrictions, genre, MPAA (Motion Picture Association of America) ratings, and more.

Scriptwriting

The AI Furukoto has written the script for a short film called Boy Sprouted, which depicts a boy’s aversion to tomatoes and his mother’s efforts to get him to eat them. The 26-minute short was showcased at the Short Shorts Film Festival & Asia, which explored the theme of meta-cinema that year.

Following the Go victory of Google DeepMind’s AlphaGo, filmmaker Oscar Sharp and his technologist collaborator Ross Goodwin built a machine to write screenplays. The machine, named Jetson, was fed hundreds of scripts, along with random seeds from a sci-fi filmmaking contest. The result? Sunspring.

Benjamin 2.0, an AI model programmed by Ross Goodwin to write screenplays, is responsible for the sequel, It’s No Game, a sci-fi short film starring David Hasselhoff.

Visual effects

CGI, or computer-generated imagery, is being used to bring deceased actors back to the screen. In the Star Wars universe, both Carrie Fisher (Princess Leia) and Peter Cushing (Grand Moff Tarkin) were digitally recreated for the film Rogue One, with CGI making them look as they did in 1977’s Star Wars: A New Hope.

Carrie Fisher passed away before Episode IX: The Rise of Skywalker was completed, and CGI helped finish her character’s story.

How was Peter Cushing brought back to life?

British actor Guy Henry was cast for the physical role, impersonating Peter Cushing’s mannerisms in full costume. Industrial Light & Magic then applied motion-capture dots to his face, with a head-mounted camera capturing his facial movements.

Henry’s performance was then transferred onto a digital model of Grand Moff Tarkin, and the special effects team recreated Peter Cushing’s facial expressions on the model, based on the actor’s past performances.


Trailer creation

Researchers from the University of Edinburgh combined two neural networks to create an AI model that can generate engaging trailers. The system has now generated over 40 trailers for existing films.

The first neural network analyzes a film’s video and audio to pinpoint scenes of interest, while the second judges what’s interesting by reading a textualized version of the film, using natural language processing to identify emotional and important moments.

This trailer automation rests on low-level tasks, like action recognition, person identification, and sentiment prediction, alongside high-level tasks, like connecting events and inferring causality. By combining how the two networks process the input data, the model generates trailers using “movie understanding.”
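
To make this pipeline concrete, here is a minimal Python sketch of the selection step, assuming every scene has already been scored by the two networks. The Scene fields, the weights, and the example scores are hypothetical stand-ins for illustration, not the Edinburgh system’s actual code.

```python
# A minimal, illustrative sketch of the two-network trailer pipeline described
# above. Scene data and both score fields are hypothetical stand-ins: in the
# real system, each score would come from a trained neural network.
from dataclasses import dataclass

@dataclass
class Scene:
    start: float       # seconds into the film
    end: float
    av_score: float    # audiovisual "interest" from the first network (0-1)
    text_score: float  # emotional salience from the second network's NLP pass (0-1)

def rank_scenes(scenes, av_weight=0.6, text_weight=0.4):
    """Combine both networks' scores and rank scenes by trailer-worthiness."""
    return sorted(
        scenes,
        key=lambda s: av_weight * s.av_score + text_weight * s.text_score,
        reverse=True,
    )

def build_trailer(scenes, max_seconds=90):
    """Greedily pick top-ranked scenes until the trailer length cap is hit."""
    picked, total = [], 0.0
    for scene in rank_scenes(scenes):
        duration = scene.end - scene.start
        if total + duration <= max_seconds:
            picked.append(scene)
            total += duration
    return sorted(picked, key=lambda s: s.start)  # restore chronological order

# Example: three scored scenes from a fictional film.
scenes = [
    Scene(120, 150, av_score=0.9, text_score=0.7),   # action set piece
    Scene(300, 330, av_score=0.4, text_score=0.95),  # emotional dialogue
    Scene(500, 560, av_score=0.2, text_score=0.3),   # quiet exposition
]
for s in build_trailer(scenes):
    print(f"{s.start:.0f}s-{s.end:.0f}s")
```

The greedy, length-capped selection mirrors a practical constraint the real system also faces: a trailer has to fit a fixed runtime while keeping the highest-scoring moments.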

IBM Watson was used to cut a trailer for the film Morgan in collaboration with 20th Century Fox, creating the first-ever cognitive movie trailer.

Music composition

Reinforcement learning can help AI algorithms analyze data from various compositions, identifying patterns and learning which of them are associated with specific genres or tend to sound more enjoyable.

The AI model can also generate new musical patterns based on the data it has analyzed, creating templates for background scores suited to a particular genre or situation. Big technology companies are already using AI to create music autonomously or to help musicians compose.

Sony, for example, developed Flow Machines, an AI system that released the song Daddy’s Car.

The company entered the AI Song Contest 2021 with Nous sommes, a human-AI collaboration by the artist Whim Therapy. AI tools like BassNet, DrumGAN, DrumNet, NOTONO, and Poiesis were used to finish the song.
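
As a toy illustration of the learn-then-generate loop described above, the Python sketch below learns note-to-note transitions from example melodies and samples a new pattern. It uses a simple Markov chain, a deliberately simplified stand-in for the neural models these companies actually use, and the training melodies are invented.

```python
# A toy "learn patterns, then generate new ones" example: a first-order
# Markov chain over note names. Real systems (such as Flow Machines) use far
# richer models; the training melodies here are made up for illustration.
import random
from collections import defaultdict

def learn_transitions(melodies):
    """Record which notes follow which across the training melodies."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current, nxt in zip(melody, melody[1:]):
            transitions[current].append(nxt)
    return transitions

def generate(transitions, start, length=16):
    """Sample a new melody by walking the learned transition table."""
    melody = [start]
    for _ in range(length - 1):
        choices = transitions.get(melody[-1])
        if not choices:            # dead end: restart from the opening note
            choices = [start]
        melody.append(random.choice(choices))
    return melody

# Hypothetical training melodies in one "genre".
training = [
    ["C", "E", "G", "E", "C", "D", "E", "C"],
    ["C", "D", "E", "G", "E", "D", "C", "C"],
]
table = learn_transitions(training)
print(" ".join(generate(table, start="C")))
```

Sampling from learned statistics rather than copying whole phrases is what lets the generated pattern sound like the training genre without duplicating any single melody.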