While we might not be able to put our finger on it exactly, there’s something a bit different about this year’s Sundance Film Festival. It’s something in the air, something that almost feels like a mysterious cloud has enveloped this sleepy mountain town just as the film masses have made their way here from around the world.

No, we’re not talking about the ominous cloud monster from Jordan Peele’s Nope. We’re talking about the all-encompassing Adobe Creative Cloud, which is powering the future of filmmaking here at Sundance 2023, of course!

As with years past, Adobe has made its presence known at this historic film festival. Unlike past years, however, the team on the ground has brought the power of Camera to Cloud and remote workflows with them.

We chatted with one of the wizards behind Adobe and Frame.io’s bold new technologies to learn how this year’s fest is powered by the cloud.

No Film School: How is the Sundance Institute using Adobe for remote coverage at this year’s festival?

Michael Cioni: So, there are seven different departments shooting right now. We have seven different crews at the festival equipped with Camera to Cloud technology, and they’re out shooting different interviews and panels and uploading them directly to the cloud. They’re all set up with their Canon cameras, they each have an Atomos device, and they’re rocking around Park City in the cold with a modem that connects to cell towers and uploads everything.

So, they don’t even need WiFi. They’re doing it all over cell, which is just crazy because it allows editors to begin cutting these daily packages together while the crews are still shooting. And you can see it in some of the pictures: you have these situations where the team is sitting at a bar uploading after their shoot, watching the edits as they come in.

And that's the power of this new way of working: it's completely non-linear. Instead of the linear process where you shoot, you download, you ship, you download again, and you edit, we're breaking all those barriers down. Sundance is not the first to use this tech by any means, but it's the first time for them, and it's a new application because they have to turn around stories really fast. It's Sundance, there's so much happening so quickly, so they want all these crews up, shooting all day, with all these editors cutting it together, and we're basically making that the fastest turnaround possible. So that's what's happening at Sundance.

Credit: Adobe

NFS: That’s exciting. How do you see this technology evolving for festivals and other events in the future?

Michael Cioni: Well, what's happening right now that's super exciting is that creatives have been using the cloud in post-production, but the future of all filmmaking will be creating in the cloud during the production phase. That's still a ways off, but Adobe and Frame.io are positioned to build the technology that allows creatives to become cloud-native. At the very beginning, if you're a writer, you're probably writing in a tool that works in the cloud right now, and that's easy for text. But when you think about video, it gets a little bit harder. How do you get a camera shooting into the cloud? That's what these new devices with antennas are doing. It's the first time we're seeing antennas that are not wireless transmitters but cloud connections, and now you can shoot directly into the cloud. That is where filmmaking is going.

I mean, this is exciting! And then it unlocks this world of cloud automation and AI and machine learning and analysis. Because let's be honest, most filmmakers today still have to log takes and scenes by hand, labeling and syncing audio and video manually. They also have to hunt for footage by searching for “man with gun” or “night exterior at the diner.” They're looking for that manually.

Ten years from now, all of that will be automated, because the cloud will grab everything as it's shot, analyze it, sync it, merge it to the script, and provide all the results. So an editor can just search “spooky castle” and get all the takes. And then you can get even more granular: if you want it to be rainy, the cloud will actually generate rain in a scene that doesn't have rain. All of that is what's going to come about, so we're inventing the foundation for that world to become possible.

NFS: That’s awesome, if a little crazy to think about. How do you see these technologies being integrated into remote coverage and AI filmmaking in the future?

Michael Cioni: Yeah, I mean, in the AI world today, everybody's talking about generative AI. Generative AI, that word “Gen,” is all about creating an image out of nothing in the cloud, right? What generative AI does is let you take your imagination, type in a few words, and create something, and you get a still, right? And we're all impressed.

Everybody's freaking out about how amazing this stuff is. But what we've got to start doing is taking that little seed and following it to see where it's going to go over 10 years. Where is generative AI going to go? Well, first of all, it's going to start to allow you to manipulate images you shot, not just images generated from nothing. You're going to generate on top of a physical image, which isn't available yet.

You're also going to be able to do it in motion. It's going to do all the tracking and the rotoscoping and the lighting adjustments, and it's going to interpret all of that so it physically works, so you can put in an object or a person or change it in 3D space. You're also going to be able to do it ad hoc, and you'll be able to do it quickly because it's going to use cloud computing. There won't be any rendering, no time to wait. This means that a creative in 2030 will be able to shoot a scene and say, “I think I want more lens flares,” and then they show up. They'll be able to say, “I think it should be later in the day,” and it'll get later in the day. “I think it should be raining outside,” and it'll cut out the windows and put rain outside.

Imagine that power in the hands of a director, not a visual effects company, right? It means directors and cinematographers and all the creative stakeholders will be able to manipulate their own images and experiment at high quality, and even start to understand that they may not need to rent a rain machine. They may not even need to put in practical muzzle flashes. They'll just say, “I want muzzle flashes,” and they'll appear, bigger, smaller, or whatever. They won't have to ship it out, get feedback, and do it again: "Eh, too much, too little, more smoke, less smoke." It'll just happen.

Credit: Adobe

NFS: What advice would you give aspiring film editors who are starting out today and want to embrace these new technologies?

Michael Cioni: When I was in college, I won the student Emmys, and one of the awards was that they sent me to the Cannes Film Festival, where I got to go to these seminars. I remember sitting there, this is like 1999 or 2000, listening to Francis Ford Coppola talk about technology for filmmakers. In the year 2000, he was already known as a guy who was always pushing tech and making these cool films, and yet he made giant indie films, right? He's like the godfather of independent cinema. And what got me to the Cannes Film Festival was embracing the newest technology of the time, which was MiniDV and FireWire and Final Cut Pro.

And so my advice to filmmakers is that every single iteration of technological change is an opportunity for filmmakers to exploit. A lot of filmmakers make the mistake of being late adopters, when in fact it is the job of the artist to be an early adopter, to be a trendsetter.

Artists should be trendsetters, and one of the worst decisions an artist can make is saying, "I'll wait 'til it comes mainstream. I'll wait 'til that technology is proven." The people who jumped on Netflix and episodic content first capitalized on it. Now it seems easy, but people said, "Well, I'm not so sure about this internet thing," right? The people who jumped on digital cinema first, the RED camera, capitalized on it. Other people said, "I'll wait, I'll wait, I'll wait 'til it's matured," right? It doesn't matter what the tech is, because technology is a tool for trendsetters.

Artists should be setting trends, not following them. The opportunity now is to become a master of the cloud early so you can learn how to exploit it. We all hear about these YouTube stars who make all this money: they were early adopters of the technology, not late adopters. That's why artists should always be striving to set trends, not follow them.


No Film School's coverage of Sundance 2023 is brought to you by Adobe.