February 27, 2015 at 12:34AM
4K and HD bitrate
How can we interpret footage made with the following bit rates, e.g.:
4K 4:2:2 @ 240 Mb/s, codec XAVC Intra, 10 bit
HD 4:2:2 @ 176 Mb/s, codec ProRes, 10 bit
(Figures taken from the Sony FS7. This is an academic question, so please no brand discussion.) Different codecs are in use.
The 4K mode has 4.27 times as many pixels to cover as the HD mode, yet its bit rate is only 1.36 times higher.
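One way to make that comparison concrete is bits per pixel: divide each bit rate by pixels per frame times frame rate. The sketch below assumes DCI 4K (4096×2160, which matches the 4.27× figure above) and 1920×1080 HD, at an arbitrary 25 fps; the frame rate cancels out of the ratio, so its exact value does not matter here.

```python
def bits_per_pixel(bitrate_mbps, width, height, fps=25.0):
    """Average coded bits spent on each pixel of each frame."""
    return bitrate_mbps * 1e6 / (width * height * fps)

# Assumed frame sizes: DCI 4K and full HD (not confirmed mode details)
bpp_4k = bits_per_pixel(240, 4096, 2160)   # ~1.09 bits/pixel
bpp_hd = bits_per_pixel(176, 1920, 1080)   # ~3.40 bits/pixel

print(f"4K: {bpp_4k:.2f} bits/pixel")
print(f"HD: {bpp_hd:.2f} bits/pixel")
print(f"HD spends {bpp_hd / bpp_4k:.2f}x the bits per pixel")  # ~3.13x
```

So per pixel, the HD stream gets roughly three times the data, which is what the question is really asking about; whether that translates into visibly better footage depends on the codecs' efficiency.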
Is the HD footage better than 4K?
What criteria distinguish theoretically better footage?
1. efficiency of codec
2. . . .