That's right, it was made for the GH5. Support for the S is non-reference for now; that said, I want to profile the S directly soon (which will come as an update for current GH5 users). I just got a new Alexa last week for further camera conversions.
The Firefox situation was a surprise to me too; the colors felt off, and I thought it was the streaming sites at first. All of the browsers are a bit off, except for Chrome with the Dev Fix.
Take a look at the raw footage side-by-sides from each camera. These are provided so that everyone may confirm for themselves that every single point in the VLog color space has been seamlessly matched to LogC, as measured through the lens. For some, this is worthwhile and meaningful; for others, it's not. And that's OK with me.
Hey there, Alex here. I love that you got around to Emotive Color. I'd like to suggest a small correction:
Granted that 1) the footage was shot under reference sunlight or halogen lighting, 2) with a neutral white balance, and 3) within the limited DR of the GH5, GHa really will make the GH5 (not the S) look *exactly* like the Arri Alexa at identical settings, with the same lens.
The actual core conversion, which Cameron does not show in the video, is a technical transform to a measured LogC (through the lens). I supplied raw side-by-side footage from both cameras for everyone to confirm the match: https://www.mediafire.com/#x3ub82eh21u63
There are post settings to simulate the OLPF texture, as measured in side-by-sides. I have recently been researching an optical addendum to that, and will publish the details a bit down the road.
I understand why the qualifier was given: to tamp down expectations in the face of uncertainty. Also, the website is dated, with graphics and video going back to V1 (an update is in progress).
Take a closer look. Over the past year, the response has been overwhelmingly positive, but there has been a group that doesn't actually look into what I'm doing, because of the prevailing reputation of look-up tables. Here is a response I gave on Reddit some time back:
"Hey, Alex here (author of the GHa conversion)
I understand LUTs have a poor reputation, and appropriately so: they have often been used to achieve lackluster color, not geared toward professional purposes (artifacts, garish saturation, weird hues). An uninspiring... something, not in any way connected to the cinema screen.
This is different. Since April of last year, I have continuously refined the model to achieve two things:
- Absolute accuracy to the Alexa's actual response (not just LogC format params), at every point in the color space
- Smoothness (silky gradients, not a single artifact)
How did I do it?
- Built my own sample system that measures ~27K color points for each camera
- Bought an actual Alexa Classic to iterate with (really)
- Coded a custom color engine to interpolate between those ~55K data points
One could think of it as an epic color chart match of one color science to another, cherished color science (well beyond a Resolve color chart match), with special provisions for rolloff. It is not an instant grade, a notion that runs counter to the ethos of the project (as noted in the PDF, at minimum the luma of each shot must be placed).
Additionally, the project has involved discovering and correcting for NLE distortions of formats and color grading tools (especially PPro), measuring and standardizing OLPF texture emulation, and developing a starting color base that rivals quality colorist pipelines (and emphasizes correct cinema luma placement). Raw footage side-by-sides are available for anyone to confirm, for themselves, the extent of the color science match.
The LUT format is a simple, elegant tool that may be used to achieve a thing of beauty, or most often, blunder. I hope you'll take a closer look with this one."
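For readers curious what "interpolating between measured data points" means in practice: the core mechanism behind any 3D LUT is trilinear interpolation within a lattice of sampled colors. The sketch below is purely illustrative; `apply_lut`, the tiny 2-point-per-axis identity cube, and the lattice layout are hypothetical examples, not the actual GHa engine or its ~55K-point measurement set.

```python
# Minimal sketch of 3D LUT trilinear interpolation (illustrative only).

def apply_lut(rgb, lut, size):
    """Map an RGB triple (components in 0..1) through a size^3 LUT.

    `lut[r][g][b]` holds the output triple at each lattice point; inputs
    between lattice points are blended trilinearly from the 8 neighbors.
    """
    def lerp(a, b, t):
        # Component-wise linear blend between two RGB triples.
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    coords = []
    for c in rgb:
        pos = c * (size - 1)
        i = min(int(pos), size - 2)   # lower lattice index on this axis
        coords.append((i, pos - i))   # (index, fractional offset)
    (ri, rf), (gi, gf), (bi, bf) = coords

    # Collapse the surrounding cube one axis at a time: R, then G, then B.
    c00 = lerp(lut[ri][gi][bi],         lut[ri + 1][gi][bi],         rf)
    c01 = lerp(lut[ri][gi][bi + 1],     lut[ri + 1][gi][bi + 1],     rf)
    c10 = lerp(lut[ri][gi + 1][bi],     lut[ri + 1][gi + 1][bi],     rf)
    c11 = lerp(lut[ri][gi + 1][bi + 1], lut[ri + 1][gi + 1][bi + 1], rf)
    c0 = lerp(c00, c10, gf)
    c1 = lerp(c01, c11, gf)
    return lerp(c0, c1, bf)

# A 2x2x2 identity LUT: every lattice point maps to itself.
identity = [[[(r, g, b) for b in (0.0, 1.0)]
             for g in (0.0, 1.0)]
            for r in (0.0, 1.0)]

print(apply_lut((0.25, 0.5, 0.75), identity, 2))  # → (0.25, 0.5, 0.75)
```

A real conversion LUT simply replaces the identity lattice with measured output values (here, VLog in, matched LogC out); the interpolation math is the same, which is why dense, smooth measurement data matters so much for artifact-free gradients.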