How IBM's Watson Used AI to Edit the Trailer for Upcoming AI Thriller 'Morgan'
IBM's Watson computer system, famous for winning Jeopardy, is joining the filmmaking business.
There's an old story about Fellini: asked why there wasn't a great film school in Italy to train the next generation of filmmakers, he responded, "Are you mad? We should be stabbing them in their cribs! They'll take all of our jobs!" Today, the Italian auteur would be unplugging computers and taking a hammer to hard drives, because for the first time at the studio level, artificial intelligence has been involved in editorial decision-making.
We've had some level of automation coming to film for a while now, with limited success. Movies have been written with artificial intelligence. Auto-focus and auto-exposure tools were blunt and not professionally usable early on, but we're now seeing increasingly sophisticated tools both at the consumer level and on the high end. Post has seen some automation as well, with services like Animoto offering automated edits. However, the results are unpredictable and, as anyone who has tried the "auto-balance" color-correction button in their editing software has learned, not quite ready for prime time. Nonetheless, as it was developing the marketing for the AI horror/thriller Morgan, 20th Century Fox went to IBM to see whether high-end artificial intelligence could produce more useful results when it came to editing the film's trailer.
Using machine learning, Watson modeled the film's scenes visually and emotionally, then analyzed hundreds of trailers for other films in the same genre to learn what makes an effective trailer. Watson didn't have final cut, however; it functioned more as an organizer, surfacing scenes and moments that a human editor would then analyze and polish. The computer selected ten scenes it felt were important for inclusion in the trailer, only one of which was left on the cutting room floor.
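IBM hasn't published Watson's internals, but the workflow described above — score each scene on learned visual and emotional features, then surface the strongest moments for a human editor — can be sketched in miniature. Everything below is an illustrative assumption, not Watson's actual model: the scene names, scores, and greedy selection strategy are invented to show the shape of the task.

```python
# Hypothetical sketch of a scene-selection step like the one described
# above. Scores stand in for a learned visual/emotional relevance
# measure; this is NOT Watson's actual method.

def select_scenes(scenes, max_seconds):
    """Pick the highest-scoring scenes whose total runtime fits the budget.

    `scenes` is a list of (name, duration_seconds, score) tuples.
    Returns the chosen scene names and their combined runtime.
    """
    picked, total = [], 0.0
    # Greedy selection: best score first, skip anything that would overflow.
    for name, duration, score in sorted(scenes, key=lambda s: s[2], reverse=True):
        if total + duration <= max_seconds:
            picked.append(name)
            total += duration
    return picked, total

# Toy example: cut a handful of scored scenes down to a 6-minute (360 s)
# reel of selects for a human editor to polish.
film = [
    ("lab reveal", 90, 0.92),
    ("first escape", 120, 0.88),
    ("quiet dialogue", 200, 0.35),
    ("corridor chase", 80, 0.81),
    ("ending tease", 60, 0.95),
    ("exposition", 150, 0.40),
]
selects, runtime = select_scenes(film, 360)
```

The hard part in practice is, of course, producing the scores: that's where the genre-trailer training data comes in. The selection step itself is the easy, mechanical end of the pipeline.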
The trailer-making process is a laborious one, and studios regularly test dozens of different edits on audiences to see which is most effective. While it doesn't seem likely we'll be watching many entirely AI-created movies anytime soon, there is a real-world application here for marketers looking to create a wide variety of trailer edits in a short period of time. It took Watson about 24 hours to evaluate the 90-minute film and shave it down to six minutes of selects. With further development, it's likely we'll see AI move into evaluating all the footage shot, including outtakes, deleted scenes, and alternate performances, to create the most effective trailer possible.
It's also likely we'll start to see some form of AI in the edit room in the near future, helping editors get a handle on the massive amounts of footage shot, especially on documentary projects. An AI that can learn what is engaging, out of the ordinary, or emotionally evocative could help tremendously in getting footage down to a manageable level for an editor to work with.
Of course, if you don't trust the AI, it might hide all the best footage from you and keep it for itself. After all, we hear Watson's real ambition is to direct. How soon do you think AI will take over the world of trailer editing?