Sundance is an incubator for indie filmmakers. Its history is rooted in that world and has seen creatives just like you tell stories and launch careers. It’s where Tarantino debuted Reservoir Dogs, Nolan hit us with the Memento line “we all need mirrors to remember who we are,” and Harron premiered the “utterly insane” American Psycho. It’s also a quintessential stop to learn from other filmmakers through its labs, workshops, and panels.
While the Tarantinos, Nolans, and Harrons of film might not be considering the Extended Reality (XR) medium anytime soon, the New Frontier lineup continues to grow and will host 32 projects in 2020. It’s a curation of films and media from storytellers working in AI, AR, VR, MR, and the like.
If you don’t plan on punching your ticket to Utah, don’t worry, we have your back. We reached out to this year’s crop of filmmakers to give you the skinny on the latest techniques and innovations. Here’s what we found out.
Extended Reality (XR) Keeps Evolving
Visual effects are nothing new to filmmaking, but Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) are becoming more common. If those ideas are new to you, AR is the addition of digital elements to a live view or screen; think of the smartphone game Pokémon Go. What filmmakers can do with AR is endless, but it can be most beneficial in green screen work. Say you’re shooting in an empty space and the actor is having trouble visualizing the scene. AR allows you to add any inanimate object or character for the actor to view on a monitor. The result: hopefully a better performance.
VR immerses the viewer in an entirely new world through the use of a wearable headset. Oculus Rift, Google Cardboard, and HTC Vive Pro are today’s more popular models. Over the last decade, film and television storytellers predominantly used VR to extend the audience experience. Director Greg Nicotero of The Walking Dead used it on several occasions to reveal new parts of the series’ allegory. Netflix did the same for Stranger Things. Warner Bros. created a VR experience for Suicide Squad, IT, and Justice League. Darren Aronofsky produced the VR movie Spheres from director Eliza McNitt and it landed a seven-figure deal at Sundance. Neill Blomkamp opened his own VR/AR house dubbed Oats Studio… you get the idea. AR, VR, and MR – which is a combination of the mediums – are all viable.
There are several ways to create VR. Today’s more robust cameras include the Insta360 Titan, Vuze+ and the Jaunt One. If you’re looking for a first-person perspective, check out the Mobius POV VR 360 rig. You can also create environments from the ground up via software programs like Unity, Blender, Maya and Unreal Engine. The aforementioned Spheres used Unity.
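Whether shot on a 360 camera or built in Unity, all of these workflows ultimately store the spherical image in an equirectangular frame, where horizontal position maps to the viewer’s yaw and vertical position to pitch. As a rough illustration of that mapping (not taken from any of the tools above, and assuming a hypothetical 8K, 7680×3840 frame), here is a minimal sketch in Python:

```python
def direction_to_pixel(yaw_deg, pitch_deg, width=7680, height=3840):
    """Map a viewing direction to equirectangular pixel coordinates.

    yaw_deg:   -180..180, left/right of the camera's forward axis.
    pitch_deg:  -90..90, down/up from the horizon.
    Longitude maps linearly to x, latitude to y (simple
    equirectangular projection; frame size is an assumption).
    """
    x = (yaw_deg + 180.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return int(x), int(y)

# Looking straight ahead lands at the center of the frame
print(direction_to_pixel(0, 0))    # (3840, 1920)
# The zenith (straight up) sits on the frame's top edge
print(direction_to_pixel(0, 90))   # (3840, 0)
```

This is why, as several of the filmmakers below note, “off-camera” disappears in 360: every direction the viewer can face corresponds to a pixel somewhere in that one frame.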
Five Projects Behind the Tech
Flowers & a Switchblade
The Nic Koller / Weston Rio Morgan creation puts a spin on the avant-garde cubism movement and slaps with 360° VR. The 10-minute short follows two millennial women who meet for the first time at a park. The world they created is a fractured one, with hundreds of different video pieces that needed to be tied together.
It was shot completely on iPhones and finished in Adobe Premiere and After Effects. The VR rig was custom built to hold multiple iPhones at once. Up against the limits of After Effects’ capabilities in 360 environments, renders took multiple weeks and project files would grow larger than 3GB. “Through trial and error, stubbornness and many vacant stares at the screen, the collage slowly began to materialize,” says Koller. The duo admits they had an “insatiable need for computing power,” but the patience eventually paid off.
The 6-minute short from Brian Andrews follows the story of a creepy crawling creature dubbed the “arachnid hominid” (half human, half spider) as she struggles to raise her young in a hostile environment. To pull off the visual feast, everything was animated using V-Ray spherical cameras in Maya, and the compositing was done in Nuke using the Cara VR plugin. Images were rendered with Qube in 6K for the 360 video. The audio was mixed and spatialized in Pro Tools using the Facebook 360 Spatial Workstation plugin.
“To visually compose a story in 360 is very different than for a traditional film. If you want the audience to focus on a specific point for the story, you can't just cut to it in the edit,” says Andrews. “Rather you have to lead the audience’s attention with movement and sound and draw their eye where you want it to go.”
Chomsky vs. Chomsky: First Encounter
The Sandra Rodriguez 10-minute creation is part immersive and part interactive, made with VVVV, or 4V, a flexible and adaptive platform that allows creators to build digital assets for use in a VR environment. The story is a nod to Noam Chomsky, who devoted his life to understanding how our minds work, and explores a conversation about artificial intelligence. “It’s a conversation on AI, with AI, through AI,” says Rodriguez.
To pull it off, the content, digital assets, story, and user experience needed to be thoughtfully scripted and conceived while not necessarily knowing beforehand the actual output of the project. The entire experience starts off in VR but the user can navigate to different mediums along the way. Making it all work were four different systems that controlled chat, emotion, input learning, and sound as well as different algorithms. “The experience tackles the subject of opportunities, pitfalls, and bias in current so-called AI technologies. We therefore did not want to use a pre-made AI system, such as simply using IBM Watson or Google Plex. We had to question the technology by creating with it. And so, we had to create our own systems, made from small parts and tools,” she adds.
The ambitious Pierre Friquet project submerges users underwater, where they wear a waterproof VR headset and snorkel for the experience. The 9-minute journey takes you from the Earth to the moon. Created in Unity, 95% of it is CGI, while the remainder takes advantage of NASA’s Apollo 11 mission film archives.
Friquet first imported all the 3D assets, added particle effects, and enhanced materials before creating the virtual camera trajectories and layering in the NASA footage. “In order to multiply the richness of the images, I added VFX for the transitions between each shot, and for the stylization of the shots, I used Mantra VR tools for Premiere and After Effects,” he says. For all the editing and compositing he used Adobe Premiere and After Effects as well.
The interactive 360 video experience from creators Bianca Kennedy and Felix Kraus takes you down a rabbit hole where insects have become humanity’s main food source. The experience is part 360 and part interactive gameplay. The duo tried to come up with new methods to make the VR film more interactive and more about the physical presence of the viewer. The obstacles came from the scene layouts.
“A 360 project is so much more challenging than having the classic 16:9 frame to work with,” says Kennedy. “Suddenly everything becomes important, there's no off-camera anymore. It's incredibly difficult to sweep anything under the digital rug, so you need to think about a whole world in every frame.”
The plan was to create as much as possible offline. They made models and miniatures by hand, painting everything with watercolors and even creating a stop motion sequence with a Vuze XR camera. “In the end, we did 3D scans of all those handmade assets and animated them with the computer. We also recorded our own body movements and transferred them onto our characters. This way, you get a very unique, natural and human look, which is miles away from fully computer-generated assets or downloaded content. In every frame, you can feel the presence of the artists' hands, of running paint and real objects,” says Kennedy.
Have you considered starting or worked on an Extended Reality project? Tell us about the experience in the comments below.
For more, see our ongoing list of coverage of the 2020 Sundance Film Festival.