That’s an odd implementation Adobe has chosen. It certainly demos well with such a short and simple timeline, but I can see that getting really messy and cumbersome with long sequences that have dozens of audio tracks. It would have made more sense if Adobe allowed the media types to be set as metadata at the browser level BEFORE the clips are cut into the sequence. FCPX works that way using its ‘roles’ feature, and it works quite well.
There is a huge caveat with Clip > Video Options > Scale to Frame Size: it resamples the image to the sequence frame size, so any move on that still image that zooms in will not be sampling from the original pixels, and the result is softness and (at worst) pixelation. Use the "Set to Frame Size" option instead to avoid this problem. This video explains it in detail: https://www.youtube.com/watch?v=OU9S2gjFyG8
Yep, I agree. Though I think Yedlin’s demo only contradicts “key takeaway” number 2 — the one which is total speculation without proof.
How about: don't use a camera that throws away more than half the color information in your scenes? 8-bit 4:2:0 and 10-bit 4:2:2 just won't cut it. Subsampled color was developed specifically for TV broadcast, not for digital cinema acquisition. If you want to be able to color grade your images to yield a cinema aesthetic, 12-bit 4:4:4 or 12-bit raw recording is what's required. So that still image of a DSLR in this article.... nope.
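To put numbers on "throws away more than half the color information," here's a quick back-of-the-envelope sketch in plain Python (the function name is mine, just for illustration): 4:2:2 halves the chroma samples, and 4:2:0 keeps only a quarter of them.

```python
# Back-of-the-envelope: fraction of chroma samples a subsampling
# scheme keeps, relative to 4:4:4 (one Cb and one Cr sample per pixel).

def chroma_fraction(scheme: str) -> float:
    """Retained chroma samples as a fraction of 4:4:4.

    4:2:2 halves chroma horizontally; 4:2:0 halves it both
    horizontally and vertically.
    """
    fractions = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}
    return fractions[scheme]

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    kept = chroma_fraction(scheme)
    print(f"{scheme}: keeps {kept:.0%} of chroma samples "
          f"(discards {1 - kept:.0%})")
```

So an 8-bit 4:2:0 codec really has discarded 75% of the chroma samples before you ever open the grade.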
The BMPCC is a lovely little camera, but neither it nor the GH4 would be a good candidate for what Yedlin is discussing here. He proved that the quality of each pixel recorded matters more than the number of pixels once you get to the limits of human vision, which hit somewhere just before HD. Supersampling at the sensor level still matters because of the Bayer pattern. On this count the BMPCC falls short: it has a 1920x1080-photosite sensor that delivers HD luma but far less than HD chroma. The Alexa s35 sensor is not 4K, but its ~3K of photosites is enough to yield HD chroma. The GH4 falls short because its codec is extremely compressed and its color sampling is limited to 8-bit 4:2:0. Yedlin's sources for the test are all at least 12-bit 4:4:4 (or film scanned to 12-bit 4:4:4), regardless of resolution.
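A rough sketch of the Bayer-pattern point above: on a Bayer sensor only a quarter of the photosites are red and a quarter are blue, so the native chroma sample grid is roughly half the photosite grid in each dimension. The function name and the sensor dimensions below are illustrative approximations, not official specs.

```python
# On a Bayer sensor, red and blue each occupy 1/4 of the photosites,
# so the native chroma grid is ~half the photosite grid per dimension.

def bayer_chroma_grid(width: int, height: int) -> tuple[int, int]:
    """Approximate red/blue (chroma) sample grid of a Bayer sensor."""
    return width // 2, height // 2

# A 1920x1080-photosite sensor yields only a ~960x540 chroma grid,
# well under HD; a ~2880x1620 (~3K) sensor yields ~1440x810, which
# after debayering is enough to support HD chroma.
print(bayer_chroma_grid(1920, 1080))  # (960, 540)
print(bayer_chroma_grid(2880, 1620))  # (1440, 810)
```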
No, do not always shoot extra footage at 60 or 120. If you're shooting overcrank without a specific plan for how it will be used in the edit, it's best to stick to multiples of the project frame rate so those clips can also be played at real-time speed without any interpolation if needed. So yes, 60 and 120 for projects that will be released at 29.97p (or 30p), but for 24fps (or 23.976) projects shoot your overcrank at 48, 72, 96, or 120. For 25fps projects, shoot overcrank at 50, 75, or 100. You get the idea.
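The rule above is easy to sketch in a few lines of Python: candidate overcrank rates are just integer multiples of the project's base frame rate, up to whatever the camera can record (the function name and the 120fps camera ceiling are assumptions for illustration).

```python
# Overcrank rates that are integer multiples of the project frame
# rate, so the clips can also play at real-time speed with no
# frame interpolation. camera_max is an assumed example ceiling.

def overcrank_rates(base_fps: int, camera_max: int = 120) -> list[int]:
    """Integer multiples of base_fps above it, up to camera_max."""
    return [base_fps * n for n in range(2, camera_max // base_fps + 1)]

print(overcrank_rates(24))  # [48, 72, 96, 120]
print(overcrank_rates(25))  # [50, 75, 100]
print(overcrank_rates(30))  # [60, 90, 120]
```

(The same logic covers the fractional rates: 23.976 overcrank at 47.952, 95.904, etc., scales identically.)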