
No ROCKET Necessary: 3rd Party GPU Acceleration is Coming to REDCINE-X PRO

While proprietary technology used to be the name of the game, we are entering an age where camera systems and post solutions are choosing more open options for maximum compatibility (and a likely wider install base as a result). Look no further than the increasing use of CinemaDNG with cameras like the Ikonoskop, Blackmagic Cinema Camera, and Digital Bolex. Now it looks like RED is at least partially entering that arena with an update to REDCINE-X PRO.

Up until now, a RED ROCKET was all but necessary for real-time performance and faster proxies in REDCINE-X PRO. That's fine for big-budget productions headed to post houses, but for smaller indie productions, a ROCKET card is about twice as expensive as a fast computer system on its own. Jarred Land, who is now the public face of RED according to founder Jim Jannard, recently posted a tweet from Bill Bennett, ASC to REDUser that shows a new build of REDCINE-X PRO:

Jarred said this in the forum, and he also mentioned that having more than one GPU will increase performance:

Single Titan 6K @ 24fps. Rocket will be still be alot faster.. but it becomes more of a luxury rather than a necessity.

This is gigantic news for anyone who has dealt with RED workflows. You don't absolutely need a ROCKET card today, but going without one means much longer transcoding times and lower-quality debayering for playback. The ROCKET will still be useful, but if you've got a decent GPU, it likely won't be worth the money to spring for the proprietary RED card once this new REDCINE-X comes out.

They were originally going to announce this at IBC, but obviously the cat has been let out of the bag early. I imagine that we will get our first DRAGON .R3Ds right around IBC if that’s when the new version of RCX PRO is set to come out. It also coincides with the first DRAGON sensor upgrades for users, so it only makes sense that the software has to be public by then.

Bill also mentioned some other interesting stuff on Twitter:


Since the sensor is so clean, with such a high signal-to-noise ratio, pushing it to higher ISOs doesn't add much of a noise penalty. To my knowledge, RED isn't applying any gain on the sensor internally with DRAGON when you change ISO, so that would mean the sensor is just that clean. This seems similar to shooting RAW photos with DSLRs, where the cameras are actually native at a lower ISO but can be pushed a few stops higher without much of a noise or dynamic range penalty.

The good thing about DRAGON being able to shoot at lower ISOs with the same dynamic range is that you can use a stop or two less neutral density outdoors. Also, if the dynamic range is really that high, it should be possible to bring it down to 100 ISO or lower and still keep dynamic range somewhere around that of the MX sensor.
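The stop arithmetic behind that last point is straightforward: each halving of ISO is one stop of sensitivity, and each 0.3 of ND filter density also cuts one stop of light. A minimal sketch in plain Python (the specific ISO values here are just illustrative numbers, not RED specs):

```python
import math

def stops_between(iso_a, iso_b):
    """Number of stops between two ISO ratings (each doubling = 1 stop)."""
    return math.log2(iso_a / iso_b)

def nd_density_for_stops(stops):
    """ND filter density equivalent to a given number of stops (0.3 density per stop)."""
    return 0.3 * stops

# Dropping from an 800 rating to 200 frees up two stops of exposure...
stops = stops_between(800, 200)    # 2.0 stops
# ...which is the same light reduction as an ND 0.6 filter you no longer need.
nd = nd_density_for_stops(stops)   # 0.6
print(f"{stops:.1f} stops = ND {nd:.1f}")
```

So a sensor that holds its dynamic range two stops lower lets you leave roughly an ND 0.6 out of the matte box outdoors.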

Either way we should find out soon enough when we’ll get the newest version of RCX and hopefully some DRAGON .R3Ds to play around with.




30 COMMENTS

  • Since it's possible to use GPU acceleration for other RAW workflows, it's way overdue with R3D.

  • Stu Mannion on 08.20.13 @ 7:15PM

    Great news for those who use the RED workflow occasionally.

  • It’s about time. Sheesh.

  • This one guy who prefers to avoid blacklists on 08.20.13 @ 9:33PM

    Just as Jannard walks into the shadows, we see the end of money grubbing, proprietary technology from Red. Coincidence? Maybe. I am pleased with this decision. Open source will help us all.

    • This was hinted at a while ago, back in early July, so this was already in the works well before Jannard said what he did. Sony is just as bad, probably worse, when it comes to proprietary stuff. Sony has proprietary media and card readers for everything they’ve ever done.

      • Sony’s been like that in the consumer market forever. That cost it its Betamax video recorder line and the associated billion$ in the VCR and the camcorder markets. And don’t even bring up Elcaset.

      • This one guy who prefers to avoid blacklists on 08.20.13 @ 11:13PM

        I’m not saying Sony’s any better. I’m just saying that BlackMagic and others with CinemaDNG and a focus on employing open source tech are where we should be putting our money. It’s great to actually be able to figure out what the debayering algorithm is doing under the hood. If you know about the image processing, you can tweak it to do what you need for each occasion.

        I can’t wait until everything shoots CDNG, converting to OpenEXR in ACES colorspace, with OpenCL debayering algorithms, etc. The future of cinema is working collaboratively on technology and forgoing proprietary tech for the sake of transparency….I hope. Effects plates need to match the camera originals for instance.

        • Don’t get me wrong – I’ve been pro open source for a long time. Not only does it make sense for the editing and coloring portion of the biz, it makes all the sense for the screenwriters as well (not a Final Draft fan).

        • CinemaDNG currently has a terrible workflow, and the way it’s debayered is different in every program. Red footage on the other hand is supported in pretty much everything, and will look exactly the same in each program. It also seems to work great with ACES judging by Elysium and Oblivion.

          • This one guy who prefers to avoid blacklists on 08.21.13 @ 8:25AM

            That terrible workflow will absolutely improve and surpass the closed source ones. Give it time. CinemaDNG is relatively new to the industry.

            From personal experience and significant testing: Nuke's (as well as After Effects') output doesn't match that of RedCineX with IDENTICAL settings. I actually did a test where I output R3Ds with every imaginable iteration of settings from Nuke and A/B'd them with RedCineX's output (I was trying to get composited footage to match the camera original files... is that too much to ask?). The footage WASN'T EVEN CLOSE to matching.

            Maybe someone told you it does, but unless you have done tests (as I have), I can tell you that they are flat out lying.

            Compare that to even a quick test with CinemaDNG and you'll see that you're absolutely incorrect. That same test matched perfectly the first time I tried it (without resorting to iterations of the same frame) between Nuke, After Effects, and Resolve.

          • This one guy who prefers to avoid blacklists on 08.21.13 @ 8:31AM

            Has anyone informed you that Oblivion was shot with the Sony F65?

            The plates for the sky house were shot with REDs and then projected onto a backdrop on set.

          • Who's going to improve CinemaDNG though? Sometimes open source works out, and sometimes it stagnates because there isn't a company with motivation to make sure it works. Right now the only company that seems to be doing that is Blackmagic, and the CinemaDNG results in Resolve look great... but then you have to transcode everything in Resolve, which makes my workflow a lot clunkier than an R3D workflow. The default look I get in other programs is terrible, and I don't want to have to mess with a ton of different stupid sliders when it should be as simple as setting a curve and some straightforward color settings.

            I’ve personally never had issues with output being different in programs. Stupid question…are you sure you’re working in the same color space and gamma (not just in the R3D settings, but in the programs as well)?

            And yes I know Oblivion was primarily shot on the F65…the F65 was pretty much built for ACES though. So the fact that the Epic (which was also used for steadicam and vehicle mount shots) cut just fine with the F65 certainly indicates it fits into ACES. Well that, and you know, all of Elysium.

      • This is for "This one guy who prefers to avoid blacklists": you are aware that a good deal of Oblivion was also shot on the Epic? Almost all the handheld and Steadicam work used it; check out the BTS footage to take a look. The F65 is incredibly bulky for what it is. Also, RED's workflow is a breeze compared to the headache and CPU suck that is ARRIRAW, which is why most productions stick to ProRes.

  • I wonder if this will be opened to other programs as well (Premiere, FC, etc.). That would be huge…

  • 6400 ISO in video with clean image. Anyone got a spare $50,000.00?

  • About fookin time, Red.

  • The ISO claims by RED and by others actually using it in the field vary wildly: 320 vs. 800 on the Epic. The only way we'll know if they've really nailed the noise issue on the RED DRAGON is to see someone straight up test it.

    • Epic is 320 native, but they recommend shooting at 800 because it shifts the midpoint of exposure to give more stops in the highlights. Guessing that's why they said "ISO 2000" for DRAGON despite it coming out that it's 200 native…

  • One issue here is RED; the other is the need for some sort of industry-wide initiative to standardize measurements and software formats, a la H.264 or H.265. At this point, there's really no reason to put individual wrappers around the same basic codec. It just makes things difficult for filmmakers.

  • Forgive my ignorance, but does this mean that using this upcoming version of REDCINE-X on my MBP will speed up the R3D-to-proxy process?

    • Possibly, if you have a decent video card. They've just been teasing information; we'll need some proper tests and benchmarks to get a clearer sense of the impact this will have.

    • Angelo Lorenzo on 09.15.13 @ 10:20PM

      Agent55, I did a quick test with my late 2012 MBP with an Nvidia GT 650M, and the conversion speeds are substantially slower than just using the CPU. RED is open about this process being very graphics-RAM heavy, and Apple's mobile offerings just can't keep up. A GPU in a Thunderbolt chassis may be worth looking into, even if Thunderbolt can't sustain the transfer speeds. I tested my desktop with my RED Rocket off, using just my Nvidia GTX 560 with 2 GB of RAM, and I was getting maybe 4:1 real speed when debayering 4K, a substantial improvement over the 20:1 real speed of the CPU alone.

      RED also admits that the REDRocket is still the way to go.

