It didn't have full color depth, only the luma equivalent of 8K sampling. But the color sampling was the same as any 4K camera.
I was being provocative with my sentence (maybe you were the narrow-minded one for not realizing it). I actually agree with you 100%. A movie should be anything the director wants it to be. The problem is, as I said above, it seems that in the future the director won't have a say if he wants his film delivered in SDR. That's already the case when it comes to resolution: you cannot have a 2K delivery for Netflix content, for instance. It's a technical standard imposed by bureaucrats on the mistaken belief that since 4K is a higher number than 2K, it's therefore better. And the director and the DP have no say in it. Personally, I believe that acquisition at 4K+ resolution is a good thing, but I hate 4K home exhibition with all my heart. I honestly believe 1080p home exhibition is a more pleasant experience. And I am afraid that in the future we won't have a say on the issue of SDR vs. HDR either.
I totally agree with you when you say it depends on what the director wants. I totally support any director who thinks his story will be better told at whatever technical specification he chooses, be it 3D, HFR, MiniDV or whatever. Look at Dogville, for instance. I think Lars von Trier nailed it when he chose to shoot with no real set and at 30fps instead of 24. It totally fits his story. What I will never agree with is being forced to accept new standards just because they are supposedly technically superior. Of course 60fps is more than 24fps, but is it better? Of course 4K is more than 2K, but is it better? (I personally think that for acquisition, the more the better, but for home exhibition 4K is too much.) Of course HDR is more than SDR, but is it better? For one thing, we know that HDR monitors must have brightness levels far superior to what we have now, and I can't see how anyone can think that looking at a 4,000-nit screen with the lights off in the middle of the night is a good idea, for instance. On HFR, here is the best explanation I have ever read on the subject: https://www.quora.com/Why-does-video-at-high-frame-rates-look-cheap/answ...
I personally believe that every time someone tries to make films "as experienced in real life," they fuck things up.
That's the case with 3D and with HFR, and I'm afraid it will be so with HDR (I hope I'm wrong, though).
A movie is not supposed to be like real life; a movie is supposed to be "larger than life." But for that, it needs to be technically different from what we experience in real life. I don't see the current limit in dynamic range exhibition as a limiting factor, I see it as a creative tool. When I'm shooting a silhouette, I want the viewer to experience it as a silhouette, not to see detail in the shadows as he would if he were actually there. If I want a blown-out window as a background, that's my choice; I don't want the viewer to see all the detail outside (if I wanted that, I would light it to make it so). When it comes to technical innovation, the scientists are so preoccupied with whether they can that they don't stop to think whether they should.
People who criticized the choice by saying that nobody will ever see the film in 8K, or that there are no lenses that actually resolve 8K, are missing the point. Shooting 8K for a 4K finish will yield better results than shooting at 4K. And even if the lenses didn't really resolve 8K (which they do), you get more color density and less noise the higher the resolution you shoot.
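The noise part of this claim is easy to check numerically: downsampling 8K to 4K averages each 2x2 block of pixels, and averaging 4 independent noisy samples cuts random noise by a factor of sqrt(4) = 2. Here's a minimal sketch of that (a simulated flat gray frame with synthetic Gaussian sensor noise, not real camera data; the frame size is just a small stand-in):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat gray "frame" with additive Gaussian sensor noise (stand-in for 8K).
h, w = 400, 400
signal = 0.5
noise_std = 0.1
frame_hi = signal + rng.normal(0.0, noise_std, (h, w))

# Downsample 2x in each dimension by averaging 2x2 blocks (like 8K -> 4K).
frame_lo = frame_hi.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Averaging 4 independent samples cuts the noise std by sqrt(4) = 2.
print(frame_hi.std())  # ~0.10
print(frame_lo.std())  # ~0.05
```

Real sensor noise isn't fully independent between neighboring photosites, so the gain in practice is somewhat less than the ideal factor of 2, but the direction of the effect is the same.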
RED didn't sue Arri because of that; they sued Arri because Arri spied on them, and they didn't drop it, they won. They did sue Sony for copying their technology, though, and I don't know how that ended...