Thanks for taking a crack, but that doesn’t really answer the question.
Let’s say I already have an HDR reference monitor. According to Nvidia, the Quadro graphics card that I have is capable of outputting a Rec. 2020 signal.
Why can’t Resolve (or Premiere, or others) send that Rec. 2020 signal from my Quadro directly to the reference monitor and provide an image accurate enough to analyse and grade? Why is extra hardware connected by Thunderbolt 3 even required? Is there a technical reason, or are they just intentionally not allowing that to happen?
“If you want to accurately monitor your video image while you edit or color grade, you need a hardware converter that outputs a true video signal from your computer to hook up to a monitor. You simply cannot accurately analyze video for color on a computer desktop.”
I’ve never really understood why this is the case. I’m not doubting it’s true, but would love it if someone could explain the technical reason for this.
It seems ridiculous that you can spend $5k-plus on the top Nvidia Quadro graphics cards but then still need to spend at least $145 more to get accurate colour out of your PC.
Can Nvidia / AMD not include this on the graphics card? Is the software limiting it? If so is that the individual app, or is it the OS?
The most expensive configurations are going to be stupidly expensive and only make sense for very few people, but I would love to know how you think you could achieve your last statement, particularly in something really RAM- or graphics-intensive. Because if you can purchase 1.5TB of RAM for less than $6k you’re doing pretty well. Not that the W-3175X supports that much memory.
And you’ve figured out a way to get 128GB of graphics memory into a single tower without buying three Quadro RTX cards at $5k each? Will parts from your 7940X build help with that?
I wish people would simply say, this is an impressively powerful computer, but it doesn’t make sense for me, rather than whining because they can’t afford it.
And I know plenty of people who use Windows “pro” who spend ages complaining about how much it crashes. Seems like there are plenty of companies that exaggerate what’s really going on.
Primarily my workstation is built for 3D work using 3ds Max, but it has to do AE work as well because I can’t justify a second workstation on top of a $15k machine. But to be honest, other staff in the office have different priorities and have Titans and 1080s, and the AE performance is just as bad. They do see some improvements in Premiere compared to my Quadro, but not in AE.
My understanding is the architecture between Quadro and GeForce is largely the same but the drivers are different. And the Quadros are available with a lot more video RAM, which I use/need.
So I guess my answer is twofold: I need the Quadro for other things, and the top GeForce cards don’t improve AE performance over my setup anyway.
Could not agree more.
My 5-year-old, quad-core MacBook virtually keeps up with my 24-core, 128GB RAM, Quadro P6000 workstation. And that's not an endorsement of the MacBook, but an infuriating truth about how bad AE is at using resources.
We've started looking at Nuke and Fusion. I think we'd recover the cost of Nuke relatively quickly compared to how long we spend waiting for AE to process anything.
Ha - I was involved in the research from which that NZILA document was produced. Some of it is very poorly worded, but the technical aspects of this guide are mostly correct.
If you take a panoramic image around the nodal point of a lens and stitch those images together accurately, you will get the same result with a 24mm lens as you will with a 50mm. The image produced with a 24mm lens will only require 3 individual photos to get the 124x55 degree FOV; the 50mm lens will require 10 individual photos. If you use the same sensor, the final resolution will therefore be much greater in the 50mm image, but once scaled, the relative position and scale of all objects within the shot will be the same.
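If anyone wants to sanity-check those shot counts, they fall out of the standard rectilinear angle-of-view formula, FOV = 2·atan(sensor dimension / (2 × focal length)). Here's a rough sketch of the arithmetic (my own illustration, not from the NZILA guide; the 30% frame overlap and portrait/landscape choices are assumptions, so real counts will vary with your stitching software):

```python
import math

def fov_deg(focal_mm, sensor_mm):
    """Angle of view of a rectilinear lens: 2 * atan(d / 2f)."""
    return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_mm)))

def shots_along_axis(target_deg, lens_fov_deg, overlap=0.3):
    """Minimum frames to cover target_deg when adjacent frames overlap by `overlap`."""
    if lens_fov_deg >= target_deg:
        return 1
    step = lens_fov_deg * (1 - overlap)  # fresh coverage each frame adds
    return math.ceil((target_deg - lens_fov_deg) / step) + 1

# Full-frame sensor is 36mm x 24mm; target panorama is 124 x 55 degrees.
for focal in (24, 50):
    h, v = fov_deg(focal, 36), fov_deg(focal, 24)
    cols = shots_along_axis(124, h)
    rows = shots_along_axis(55, v)
    print(f"{focal}mm: {h:.1f} x {v:.1f} deg per frame -> {cols} x {rows} grid")
```

The point survives whatever overlap you pick: the 24mm frame covers roughly 74 x 53 degrees, so a handful of frames finishes the panorama, while the 50mm frame covers only about 40 x 27 degrees and needs several times as many, which is exactly where the extra resolution comes from.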
If you're interested, this book is a good reference: https://www.amazon.com/Windfarm-Visualisation-Perspective-Alan-MacDonald...
Correction on my original post - the FOV of a 35mm lens on full frame is actually 55 x 38 degrees. I accidentally had my app set to Nikon DX and not full frame. My point is still correct: this is nowhere near the FOV of human vision.
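For anyone who wants to verify that figure themselves rather than trust an app, the same angle-of-view formula does it in a couple of lines (my own sketch, assuming a 36mm x 24mm full-frame sensor):

```python
import math

def fov_deg(focal_mm, sensor_mm):
    # Rectilinear angle of view: 2 * atan(d / 2f)
    return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_mm)))

# 35mm lens on a full-frame (36 x 24 mm) sensor
h = fov_deg(35, 36)  # ~54.4 degrees horizontal
v = fov_deg(35, 24)  # ~37.8 degrees vertical
print(f"{h:.1f} x {v:.1f} degrees")
```

That comes out at about 54.4 x 37.8 degrees, i.e. roughly the 55 x 38 above, and indeed nowhere near the human visual field.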