Just to be clear, you can trim Blackmagic RAW source clips and send them along with a Resolve project file. You can't transcode other source codecs into Blackmagic RAW. It's an acquisition codec with metadata that allows for trimming, but it's not an intermediate codec that other formats can be transcoded into. The article confuses this important difference.
It’s a single file with a .braw extension, so infinitely easier to manage than the image sequences in cDNG.
That’s an odd implementation Adobe has chosen. It certainly demos well with such a short and simple timeline, but I can see that getting really messy and cumbersome with long sequences that have dozens of audio tracks. It would have made more sense if Adobe allowed the media types to be set as metadata at the browser level BEFORE the clips are cut into the sequence. FCPX works that way using its ‘roles’ feature, and it works quite well.
There is a huge caveat with using Clip > Video Options > Scale to Frame Size: it resamples the image to the sequence frame size, so any move on that still image that zooms in will not be sampling from the original pixels, and the result is softness and (at worst) pixelation. If you use the "Set to Frame Size" option instead, you avoid this problem. This video explains it in detail: https://www.youtube.com/watch?v=OU9S2gjFyG8
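To make the pixel math concrete, here's a quick back-of-the-envelope sketch (the UHD source size and 200% punch-in are just illustrative numbers, not from the article) showing why "Scale to Frame Size" runs out of real pixels on a zoom while "Set to Frame Size" doesn't:

```python
# Hypothetical example: a 3840x2160 still in a 1920x1080 sequence.
source_w, source_h = 3840, 2160
seq_w, seq_h = 1920, 1080

# "Scale to Frame Size": the clip is resampled down to the sequence size,
# so only seq_w x seq_h pixels remain available for any later move.
scaled_pixels = seq_w * seq_h

# "Set to Frame Size": only the Scale parameter changes; the full
# source raster is preserved underneath.
set_pixels = source_w * source_h

zoom = 2.0  # a 200% punch-in
# Source pixels the zoomed framing wants to sample from:
needed = int(seq_w * zoom) * int(seq_h * zoom)

print(scaled_pixels, set_pixels, needed)
# With "Scale to Frame Size", needed > scaled_pixels: the zoom has to
# upsample an already-downsampled image -> softness/pixelation.
# With "Set to Frame Size", needed == set_pixels: 1:1 sampling, stays sharp.
```

Same framing on screen, but one path samples from a quarter of the pixels the other one has.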
Yep, I agree. Though I think Yedlin's demo only contradicts "key takeaway" number 2, the one that is pure speculation without proof.
How about don't use a camera that throws away more than half the color information in your scenes? 8-bit 4:2:0 and 10-bit 4:2:2 just won't cut it. Subsampled color was developed specifically for TV broadcast, not for digital cinema acquisition. If you want to be able to color grade your images to yield a cinema aesthetic, 12-bit 4:4:4 or 12-bit raw recording is what's required. So that still image in this article from a DSLR... nope.
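For anyone who wants the "throws away more than half" claim quantified, here's a small sketch that computes the fraction of chroma samples each scheme keeps, using the standard J:a:b notation over a J-wide, 2-row pixel block:

```python
# Fraction of chroma samples retained by a J:a:b subsampling scheme:
# 'a' chroma samples in the first row, 'b' in the second, versus
# 2*J full-resolution positions in the block.
def chroma_fraction(j: int, a: int, b: int) -> float:
    return (a + b) / (2 * j)

schemes = {"4:4:4": (4, 4, 4),
           "4:2:2": (4, 2, 2),
           "4:2:0": (4, 2, 0)}

for name, (j, a, b) in schemes.items():
    print(f"{name}: keeps {chroma_fraction(j, a, b):.0%} of chroma samples")
# 4:4:4 keeps 100%, 4:2:2 keeps 50%, 4:2:0 keeps only 25% --
# i.e. 4:2:0 really does discard three quarters of the color information.
```

(Per-channel resolution, not perceived quality; the point is just how much raw chroma data is gone before the grade even starts.)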