NAB 2023 was a whirlwind of innovation. If you want to know about everything we found during this year’s trade show, check out our articles and the No Film School Podcast.

Yet beyond the amazing tools we saw at the show, one of the coolest experiences we had this year was a one-on-one chat with Blackmagic Design CEO Grant Petty. Our two big takeaways? BMD has been using AI tools for years, and the Blackmagic Cinema Cameras have always been about delivering more than you need.

One quick note before we get started: This entire interview was recorded on the show floor using an iPhone's Voice Memos app. That noisy recording was run through the new AI transcription tool in DaVinci Resolve. The text wasn't corrected in any way, and we didn't find any mistakes in the transcription, which shows off the power of Resolve's new features.

How BMD Approaches AI

Throw a stone at any creative product nowadays, and you’ll hit an AI-supported feature. But Blackmagic Design has been shipping AI for years under a different name — the DaVinci Resolve Neural Engine.

“We've been doing it for a few years,” Petty said. “I think we've had AI features in [Resolve] for about three or four years now. We were quite early in it. But we use it mainly to speed up workflows and support creatives.”

DaVinci Resolve Patch Replacer. Credit: Blackmagic Design

For Blackmagic Design, AI isn’t about what it can create for you; it’s about how it can support you as a creative. Petty never saw AI as a tool for generating new content.

“I'm a bit concerned about the idea that the AI will replace creative people,” Petty said. “The problem with AI is it's really an aggregator of content. Like it's like a super-intelligent search engine.”

Petty continued to say, “[I]t's only going to be able to see what exists. It's scouring. Like, you know, you see someone do a bit of AI, and it writes a bit of code. That code is essentially derived from all the open-source code that's already out there. Otherwise, it wouldn't know what to do. So it's not actually intelligent. It's like the ultimate MBA.”

So, I asked, “What is the approach that Blackmagic Design is taking to implement AI tools into its products like DaVinci Resolve?”

“In DaVinci, we use it for isolating objects in the image, which is fantastic for creating windows that match like faces or elements of faces,” Petty said. “So, if you're really getting in there and doing intricate grading, you can do a hell of a lot with it and move really quickly on it.” 

DaVinci Resolve Face Retouching. Credit: Blackmagic Design

Petty continued, "AI should really support the creative because what we want to see is people going in directions that we didn't expect or in new directions."

I asked Petty to elaborate on what he meant.

“If you look at Resolve, it's got AI in all different places. There’s a lot you can do with it,” Petty said. “I mean, if you look at some of the stuff we're doing with it now, depth maps and being able to set light sources in the color page and all kinds of stuff, you know, analyzing the dialogue, the text out, the credits to subtitles. I mean, this is really, really useful stuff. It saves huge amounts of time.”

That was the foundation of our conversation: Time.

When filmmakers, editors, and colorists create something new, how much time are they putting into their craft? What happens when you can do more with the time you have? Those are the questions driving the development of DaVinci Resolve. 

Depth Map generated from an image in DaVinci Resolve. Credit: Blackmagic Design

“Are we using it a lot, and is it a time-saver?” Petty asked. “I think that's the big question of it. If it's supporting creative people, I think it's good, you know? And if it's saving time, it's good. Sometimes the simplest user interface changes are actually things that matter the most.”

For colorists, AI support can be invaluable when isolating different subjects in the frame. Petty said that Resolve “is using artificial intelligence to recognize where your eyes are. You can look at a picture and go, that's eyes, that's a nose, that's a mouth, that's cheeks, that's an elephant, that's a horse, that’s the rider on the horse. That's their pants, that's their shirt. You know, it can detect all these different things using artificial intelligence to work out all the pieces, then create [power] windows for them. Then you can color correct all the elements of the window.”

Having worked with Resolve’s power windows quite a bit, I would have loved to have those AI-assisted tools at my disposal.

But the support doesn’t stop there.

One of the coolest DaVinci Resolve features Petty mentioned was the audio clean-up tool, which removes background noise.

“In some situations, we want to clean up the audio, the presenter's audio. But what we do is we use an AI to recreate the dialogue,” Petty explained. “What we do is we essentially throw away the original audio, and we recreate the audio based on the characteristics of the person talking. So what you're actually hearing is, it sounds like the background sound's just gone away.”

Fairlight FX and Plug-ins in DaVinci Resolve. Credit: Blackmagic Design

Petty also gave another example. “A good example is the lens flare filter in DaVinci. It actually simulates the lens. It simulates the light,” Petty said. “DaVinci is a high-end Hollywood tool. So, when we have a lens flare, we're not just going to draw circles. What we do is we pick the lens, we analyze the lens characteristics because we've measured it and calculated it, and it works out what the light would be like by going through that lens, and it creates a realistic lens flare.”

“So this is the way we use it,” Petty said of AI. “You look at a filter in DaVinci, and you go, 'Oh, lens flare. Okay, why'd you just adjust the lens flare?' No, it's literally an artificial intelligence simulating a lens. And the way you see it is just this lens flare.”

Changing How We Edit Documentaries

For NAB 2023, Blackmagic Design released DaVinci Resolve 18.5, and with it came text-based editing, which I think will be an invaluable tool for creatives working on documentaries.

Petty explained the concept, saying, “You can go to a track and just say, I want to make subtitles. So now what it will do is it will build a whole subtitle track from the dialogue. Now it's using artificial intelligence to analyze the dialogue and then recreate the subtitles. So not only can it detect and create subtitles from the dialogue, [but] you can then also use text for searching. You can look and go, 'Okay, I want to find dogs.' And it'll get all the media you've got and group it by what's in it and create sort of categories.”

Using this approach, creatives can not only sort their footage faster but also use the text itself to edit. Just copy the text, paste it, and the connected clips are reordered to match the text you chose.

“What you see is better sorting, better editing features, and subtitle features,” Petty said.

Speech to Text in DaVinci Resolve. Credit: Blackmagic Design

Why 12K Matters

When the URSA Mini Pro 12K came out, people were floored by the resolution (for better or worse). But for Petty, the resolution wasn’t just about the pixels. It was how they were utilized in camera and upon delivery. 

“One of the things we did with the 12K is we went so high on resolution,” Petty said. “But it's actually not just a 12K sensor. It's a multi-resolution sensor. Because it's a symmetrical color pattern, we can scale the sensor.” With this foundation, the URSA Mini Pro 12K can shoot at multiple resolutions without cropping in on the sensor.

The URSA Mini Pro series. Credit: Blackmagic Design

But the resolution was only half the equation.

“So what I thought is if you've got a high enough resolution, not only do you get beautiful, subtle fine detail handling, but you also get full RGB because you've got enough pixels. When you reform the color, you've got more than enough color bandwidth,” Petty said. “So you get full RGB bandwidth, and that’s really just beautiful. Like, it creates a detail that's rich in color, like major scenes and things like that. It just looks insane. Because the detail and the fine anti-aliasing sort of just has a smoothness in the high detail.”
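To put rough numbers on Petty's point (my own back-of-the-envelope math, not Blackmagic's published pipeline): the 12K sensor is 12,288 photosites wide, so scaling down to common delivery resolutions can use the full sensor width at clean whole-block ratios instead of cropping:

```python
# Illustrative arithmetic only -- a sketch of why a 12K-wide sensor
# scales neatly to lower resolutions, not Blackmagic's actual code.

SENSOR_WIDTH = 12288  # URSA Mini Pro 12K sensor width in photosites

for name, out_width in [("8K", 8192), ("4K DCI", 4096)]:
    scale = SENSOR_WIDTH / out_width
    print(f"{name}: each output pixel draws on a {scale:g}x{scale:g} "
          f"region of photosites")
```

At 4K, every output pixel can draw on a 3×3 neighborhood of photosites containing red, green, and blue samples — one way to read Petty's claim of "full RGB bandwidth," since the color at each pixel comes from measured samples rather than Bayer-style interpolation.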

This idea of high resolution truly started with BMD’s first camera.

“Like even our first camera was 2.5K. You know, that was really for HD shooting. But we wanted the extra resolution to be able to handle fine detail, you know? We've always tried to go a bit further to get a bit more resolution,” Petty said. It was all about giving creatives more resolution than what they are eventually delivering.

The original Blackmagic Cinema Camera 2.5K. Credit: Blackmagic Design

“That’s what the cameras need to do,” Petty said. “They need to be higher resolution than the output standard. You want a much higher resolution, so you can create beautiful images, and you can manage all that detail. It's the detail that matters.”

Petty continued, stating, “You can definitely see the difference when you're running that 6K camera in Ultra HD versus the 4K camera in Ultra HD. The 6K ones, the images just look a little bit more subtle in the detail and everything. That's what I think the high resolutions give you. In HD you've got detail. You know, versus SD to HD. But in Ultra HD and high resolutions, what you get is that texture. And when you really look at the texture, that's what makes real life. Life is textures.”

On the Future of Blackmagic

So, where does a company go when it achieves 12K resolution using a symmetrical sensor? Is there a new camera on the horizon? Will BMD focus on DaVinci Resolve for the foreseeable future?

“Sometimes it's hard to know how things are going to come off,” Petty said. “What's going to happen when and even what's going to sell? Sometimes you can do things, and people don't really need it yet.”

Petty even mused that the URSA 12K might have been a bit early to market. But for him, it’s all about trust. “It's taken a long time for people to really trust us,” Petty said. “But I think people do now.”

While I didn’t get any hints about what’s coming down the pipeline for BMD, I’ve realized that I trust Petty and his team to deliver the tools I may need. I shot on BMD cameras for most of my early live production work, and now that I'm moving into the narrative world, Blackmagic Design is always at the top of my list of trusted tools.

But what do you think? Let us know in the comments!


No Film School's complete coverage of NAB 2023 is brought to you by Blackmagic Design and Lexar.