A Cut Above: The Editing Tools Behind the Films at Sundance
New and familiar tools help shape Sundance stories.
The Park City festival has curated 118 feature-length films this year from 27 countries, including 44 first-time feature filmmakers, and 107 of those films will be shown for the first time. What jumps out is the number of first-time filmmakers who will get to share their stories with festival-goers.
It’s well documented that Sundance starts careers and develops talent. Last year alone, 45 of the 112 films screened came from first-timers getting their shot. We’ve put a spotlight on those stats to encourage you to be one of them next year, and we also want to highlight the tools and technology used on Sundance projects as another form of encouragement.
Every project starts with an idea, and the technology we use to tell those stories is only a set of tools. The filmmakers behind the 2020 slate reaffirm that idea: the best tools are the ones that support the story you want to tell. For post-production workflows, many Sundance creators preferred familiarity over something new.
Ron Cicero of Happy Happy Joy Joy - The Ren & Stimpy Story tapped Adobe Premiere for the offline edit, Photoshop for retouching archival photos, Pro Tools for audio, and Flame for the online and color grade because he “felt comfortable using them again.”
Relic writer-director Natalie Erika James approached it similarly. “We used Avid for the offline and DaVinci Resolve to grade the film. No specific reason for either – just what we’re used to and what software our collaborators worked best with.”
The Cedric Cheung-Lau film The Mountains are a Dream that Call to Me, which follows two climbers trekking across Nepal, was cut in Final Cut Pro X, graded in DaVinci Resolve, and mixed in Pro Tools. “I don't believe there was a distinct reason that we used this software apart from the simple reason that this is what the relative collaborators used,” says the writer-director. “I did not want to interfere with our editor's creative capabilities.”
Collaboration is Key
Avid, Final Cut, and Premiere all have collaboration tools, making it easier for editors in remote locations to connect. On And Then We Danced, editors Levan Akin and Simon Carlgren used Premiere Pro out of necessity, as the story features multiple languages. “Simon already knew Premiere but since I was the only one who understood Georgian and lived in Sweden, we used Premiere to share the same project,” says Akin. Director Matt Wolf of Spaceship Earth used Avid to work collaboratively with 600 hours of archival footage across multiple systems simultaneously.
Other filmmakers took advantage of specific tools. For the trending comedy Save Yourself! from writer-directors Eleanor Wilson and Alex Huston Fischer, starring Sunita Mani (Glow, Mr. Robot), editor Sofi Marshall used Premiere Pro for its Frame.io integration, which allowed for several rounds of edit notes. DaVinci Resolve was used for dailies and syncing.
The short film The Deepest Hole from Matt McCormick highlights one of the oddest stories of the United States-Soviet Union Cold War era, when the two countries raced to dig the deepest hole. The project was built from animation and archival footage. In post, Premiere Pro and After Effects shaped the tale thanks to their Dynamic Link integration, and even the sound design was arranged in Premiere Pro.
Extended Reality (XR) Finds its Home
Many of the well-known programs made their mark on the filmmaking. Others, like the free audio editor Audacity, the node-based compositing and visual effects platform Nuke, and even Flash, got a nod too. But it’s the Extended Reality (XR) space where we’re seeing the most innovation.
Unity is one such platform. It was used on Jon Favreau’s The Lion King, where cinematographer Caleb Deschanel moved through a virtual environment to choose the best camera angle for each shot. Unity isn’t the only name in the game: Maya, Unreal Engine, and Blender are all viable options. Another, the open-source toolkit VVVV, was used to create the interactive short Chomsky vs. Chomsky: First Encounter, a story surrounding artificial intelligence.
Depthkit is another XR option, used for capturing volumetric video with the Microsoft Azure Kinect. Creators Antoine Viviani and Pierre-Alain Giraud used it on Solastalgia, a work of fiction and art in which people wearing a HoloLens mixed reality headset visit a planet in ruins and interact with various holograms. In post, the Depthkit footage was integrated into Unity. Then Holocene, software created by Asobo Studio, made the Unity project compatible with HoloLens devices.
What Can We Learn?
Stick with what you know. The technology is at a point where it matters less what you choose than what you’re comfortable with. You might read or hear phrases like “industry standard,” but the most important things to consider are what you need to tell your story and what your final deliverables will be. Are there mandatory specs? If so, can your tools meet them? Try to keep it simple. You never want to get bogged down in the minutiae of workflow – it hampers creativity.
Update: A previous version of the infographic above mistakenly placed FilmLight Baselight in the wrong category. It has been corrected.
For more, see our ongoing list of coverage of the 2020 Sundance Film Festival.
No Film School's podcast and editorial coverage of the 2020 Sundance Film Festival is sponsored by SmallHD: real-time confidence for creatives, and by RØDE Microphones: the choice of today’s creative generation.