The iPhone 13 Pro is available today, and it comes with two super cool features filmmakers should take note of. The first one, ProRes video, actually isn't available as of the release and should be coming sometime this fall, so we can't talk in detail about it yet. It should be good; bigger files, yes, but more flexibility in post.

But the other key feature, Cinematic Mode, is here, and we got hands-on time with a unit as early as we could so we could report back on how it operates.


Cinematic Mode

First off, what is Cinematic Mode? It is a mode that creates an artificially smaller depth-of-field effect for your videos and allows you to rack focus between subjects.

Generally, smartphone video, especially in a bright light environment like a day exterior scene, has a massive depth of field because of its smaller sensor size. This means that everything is in focus, robbing you of the cinematic ability to use focus to drive attention.

Cinematic Mode takes advantage of not just the powerful processor in the iPhone 13 Pro, but also the LiDAR sensor that can tell how far objects are from the camera. This is the same tool that enables "portrait mode" in still photos, only here used in motion. Unlike the "artificial blur" something like Zoom uses, which keeps you sharp but blurs everything else equally, Cinematic Mode is more sophisticated. Because it can tell how far away something is, it can use that depth information to apply more blur to objects further away, creating a more realistic sense of depth of field.


Apple modeled the effect on cinematic imagery in films and on the behavior of real-world lenses, and in all the test footage released by the company, it shows. Apple is also using some pretty sophisticated technology to analyze the content of a shot and know where you want to focus before you do.

If a character turns from the camera, Cinematic Mode should rack focus to what they are looking at. If someone is about to walk into frame, the LiDAR (which has a wider field of view) should sense that and rack before they get there the way a pro first AC would. 

Does It Work?

So, here's the thing: for the right kinds of shots, it's kind of amazing.

You stack up some actors and have them look at each other, and it will magically rack back and forth between them depending on who is looking where. You set up a street shot and it'll rack between characters, and if you settle on one, it'll settle on them. It feels a bit like working with the amazing autofocus on the Sony a7S III, but in at least one way, slightly better.

There are of course flaws. 

There were definitely shots where we thought it would effortlessly rack, and it didn't. Maybe it was something about where folks were in the frame; it didn't seem to catch people if they were too close to the edge. But if we tried the shot again, or repositioned slightly, it mostly just worked. It felt very much like the kind of thing you'd quickly learn: how to frame a shot properly to take advantage of the focus features.

And really, this is no different from the countless times on a real set you ask an actor to walk in a banana to stay in frame or turn slightly to catch a light. It felt well within reason for a tool this surprisingly powerful. We're always learning to work with the limitations of various tools, and if there are slight framing and blocking tricks you'll need to use this properly, it won't take long to master them.

The major limitation right now is when someone is looking at something other than a person. We couldn't consistently get it to rack to a building in the background or to a dog. For narrative work, this isn't that big a deal, since you largely have people looking at each other in frame. But for travel videos, this will be frustrating. If you use Cinematic Mode for a selfie and want it to rack to the mountain or temple behind you when you look, it doesn't. We suspect this is something that will improve with time, and maybe we'll even see a "travel" or "monument" mode.

The "racking to someone walking in" works surprisingly well. This is one of those things where the technology is put to its best use. Because the LiDAR sees beyond the edges of the frame, the camera can tell someone is about to walk in and rack to them before they enter. It's the kind of thing a pro AC is doing all the time, but seeing it on a phone is frankly a pleasant surprise.

We tested with both actors and just folks on the street, and it's awesome. This is really the standout feature that puts it above almost all the other autofocus systems we use, where someone has to walk into frame before the camera realizes they are important.

Its other major limitation is when you have something very sharp against a very out-of-focus background, like our test shot of a hand, done on the 3x lens. It's just too "extreme" for the image to handle.

Image Comparison

This is, of course, artificial digital "out of focus," so we put an ND 2.1 filter on a 50mm prime lens and shot a quick side-by-side to see how the "digital" bokeh compares with actual bokeh.
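A quick aside on that filter choice: ND ratings are optical densities, and each stop of light loss corresponds to a density of log10(2), about 0.3. Here's a minimal sketch of the conversion (the function name is ours, but the math is the standard density-to-stops relationship):

```python
import math

def nd_stops(density: float) -> float:
    """Convert an ND filter's optical density rating to stops of light loss.

    Each stop halves the light, and one stop corresponds to a density of
    log10(2) (about 0.3), so stops = density / log10(2).
    """
    return density / math.log10(2)

print(round(nd_stops(2.1), 1))  # an ND 2.1 cuts roughly 7 stops of light
```

That roughly 7-stop cut is what lets you hold a wide aperture like T2 in daylight for the shallow depth of field in the comparison below.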

As you can see, the "bokeh" itself matches relatively well, but it's the contrast from bokeh to not bokeh that really makes traditional lenses sing.

50mm prime lens, ND 2.1 filter, T2, "real" bokeh.

iPhone 3x Cinematic Mode; look at the buzzing around the fingers.

Even this wider angle has fringing on the finger.

This fringing is likely a result of the lower resolution of the LiDAR. While the video is 4K, the LiDAR depth map is likely much lower resolution, so when it's used for an extreme example like this (a hand against a faraway background), the heavy "cinematic" effect creates image artifacts.

This isn't a dealbreaker; remember, you can always just turn down or turn off the Cinematic Mode in post. But it is something to be aware of as you plan out your shots and think about when and where you are going to be able to put this to use.

Editing After

One of the niftiest tricks is that in the iPhone's Photos app, you can just tap "edit" and change everything in post.

You have little focus points at the bottom that show you where the focus moves, and you can tap through them and change where it's focused in post. You can even change the artificial "f-stop" it's using to render the depth-of-field effect. It's delightful.

Right now there aren't any post apps that support this data. If you AirDrop the file, the effect gets "baked in" before it's delivered, and it arrives with your edits locked in.

However, it's very likely, practically guaranteed, that Final Cut Pro X will eventually support this data, and we bet there will be a way to export the file without baking it in, handing the settings over to FCPX so you can keep editing them in your final edit.

Macro Mode

The camera has a surprisingly useful macro setting, though it works best when working with normal video mode, not Cinematic Mode.

This is likely because macro images already have such a tiny depth of field, and at that close range the LiDAR sensor might not give accurate info, so it isn't worth keeping Cinematic Mode on.

There is a noticeable "pop" when you push in from a wide to a macro, but that isn't surprising and shouldn't be a deal-breaker. In a whole career, we can remember only one shot where it was really important to go from macro to normal in-shot without a pop.

The macro is, again, super impressive, and while macro work has limited applications in filmmaking, it's powerful when you need it, especially for transitions. It's also a staple of doc work and shouldn't be underestimated as a tool for gluing an edit together. 

ProRes

ProRes is unfortunately not supported at launch.  We'll have more on ProRes the second we can, but we aren't there yet.

Conclusion

Cinematic Mode is definitely a huge step forward for the iPhone in capturing "cinematic" imagery where you can focus the audience's attention, all on a phone camera.

It's surprisingly pleasant looking for a "digital defocus" effect, really taking advantage of the LiDAR. We are hopeful that the tools to edit the footage roll out for post soon, since we would really love to see it on a bigger screen as we make our tweaks and adjustments. But overall, this is absolutely a tool to keep on your radar. When you need a camera to squeeze into a small place, the iPhone 13 Pro continues to make an argument that it should be considered.

We've had the camera for mere hours at this point, and this is what we've learned so far. We're going to keep shooting, but there is definitely a ton of interesting technology here for filmmakers to keep an eye on.