May 18, 2017 at 11:42PM, Edited May 18, 11:50PM
Can we have a nerdy discussion about video bit rate vs resolution?
In the midst of talking over an upcoming shoot today where we'll be using a Sony A7SII recording S-Log out to an Atomos Ninja Flame (with Rokinon Cine 24mm/35mm f/1.4s and Zeiss 50mm/85mm f/1.4s), my partner (who DPs for our production company) and I got into a conversation about bit rate vs. resolution, and we realized we were both a little hazy on the relationship. Is the higher bit rate getting us the detail, or is the resolution? If they overlap, how would we prioritize for different sorts of shots?

If, for example, we want a wide master at 4K 24p with the 35mm (I think 100 Mbps? I don't have the menu open here), are we getting more detail in general in that wide than in a 1080p 60p medium shot (at 50 or 60 Mbps, I think), assuming the distances were accounted for and we punched in on the master? I'm sure it depends heavily on the content - one actor in a hazed-out, heavily gelled LED flood, cyc-wall situation, say - but neither of us was sure.

I guess we just aren't totally sure how bit rate plays against resolution where detail is concerned. If the new GH5 can do 10-bit 4:2:2 4K at 400 Mbps, versus our 10-bit (with 2 of those bits empty, due to the A7SII's HDMI limit) 4:2:2 ProRes 4K at 100 Mbps, what is it we're NOT getting? Is that extra 300 Mbps detail in MOVEMENT per second, or summed between frames? If you divide it into the frames per second (24, 60, etc.), is it fine detail in hair and fabric, or consistency in things like motion blur? I honestly just don't know what these numbers and metrics mean in IMAGES.

Any insight would be helpful - everything matters when it comes to telling the story well. I want to understand the technicalities at least well enough to make the necessary considerations so that movement, detail, color, light, etc. all have their best chance to play their role. Thanks in advance, and sorry if that was overly convoluted -_-
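To make it concrete, here's the rough back-of-the-envelope arithmetic I've been trying to do in my head - just a minimal sketch using my guessed Mbps figures from above, spreading the bit rate evenly across frames and ignoring that real codecs allocate bits unevenly between key frames and predicted frames, so treat it as an averaged sanity check only:

```python
# Rough per-frame / per-pixel arithmetic for the setups mentioned above.
# Assumes my guessed Mbps figures and a constant bit rate spread evenly
# across frames; real codecs compress between frames (I-frames vs.
# inter-frames), so these are only averages, not what any one frame gets.

def bits_per_pixel(mbps, width, height, fps):
    """Average compressed bits available per pixel per frame."""
    bits_per_frame = (mbps * 1_000_000) / fps
    return bits_per_frame / (width * height)

setups = [
    ("A7SII 4K 24p @ ~100 Mbps",   100, 3840, 2160, 24),
    ("A7SII 1080p 60p @ ~50 Mbps",  50, 1920, 1080, 60),
    ("GH5 4K 24p @ 400 Mbps",      400, 3840, 2160, 24),
]

for name, mbps, w, h, fps in setups:
    mb_per_frame = mbps / fps              # megabits available per frame
    bpp = bits_per_pixel(mbps, w, h, fps)  # bits per pixel per frame
    print(f"{name}: ~{mb_per_frame:.1f} Mb/frame, ~{bpp:.2f} bits/pixel")
```

If I've done that right, the GH5 figure works out to roughly four times the compressed bits per frame (and per pixel) at the same resolution and frame rate - which is exactly the difference I'm trying to understand in terms of what it actually looks like on screen.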