December 12, 2016

540 Terabytes of Data: 4 Takeaways from the Groundbreaking 'Billy Lynn's Long Halftime Walk' Post Team

What does it take to complete post on a film shot in 4K, 120fps, and stereo 3D?

Everyone who signed on to work on the new Ang Lee project Billy Lynn's Long Halftime Walk knew they would get to work on a project at the bleeding edge of film technology. But as David Bowie once said, “The first one to the bleeding edge gets cut.”

Lee’s vision for the film involved shooting in a format that no one had shot in before: 4K, 120fps, stereo 3D. Capturing what the team referred to as “the whole shebang” generated more than 540TB of dailies and a final delivery file of 84TB. To deliver on that vision, Lee assembled a technical and creative team to build a brand-new workflow from scratch. The team re-assembled to share what they learned at the HBO offices in Manhattan last week, in an event arranged by the Blue Collar Post Collective.

“We would go to all the big vendors, and explain what we needed and the format we were working in, and most would pass. In the end the only way to do it was internally.”

On stage were assistant editor Andrew Leven, dailies technician Derek Schweickart, colorist Marcy Robinson, technical supervisor Ben Gervais, VFX artists Alex Lemke and Michael Huber, and VFX producer Leslie Hough. The talk was moderated by Shahir Daud from The Only Podcast About Movies.

The Billy Lynn post panel. Credit: Sony

Here are four of the big takeaways the post team shared, worth keeping in mind for future projects where you might be working in a brand-new format and developing a workflow as you go.

1. Sometimes new technology feels like old technology

One of the first things technical supervisor Ben Gervais remarked on was working with the stereo rig. “With a native 800 ASA camera, shooting 120fps with a 360° shutter through a 50/50 split mirror, you end up with functionally 160 ASA sensitivity, so it was like shooting a '70s movie in 2016.” This required far more light on set than is typical of modern shoots. Combined with the tremendously heavy stereo rig and two Sony F65 cameras, the crew was simultaneously working with the most cutting-edge of formats and light levels that haven’t been required in decades.
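To put numbers on Gervais's figure: the 50/50 beam-splitter mirror costs one stop of light, and a 360° shutter at 120fps exposes each frame for 1/120 sec versus the conventional 24fps/180° exposure of 1/48 sec. Here's a minimal sketch of that arithmetic; the 24fps/180° baseline is our assumption, not something the panel specified.

```python
# Rough sketch of the effective-sensitivity math from the panel.
# Assumption: the "normal" reference exposure is 24fps with a 180-degree
# shutter (1/48 s); the F65's native 800 ASA figure is from the talk.
BASE_ISO = 800

def shutter_time(fps, angle_deg):
    """Exposure time per frame for a rotary-style shutter, in seconds."""
    return (angle_deg / 360.0) / fps

mirror_loss = 0.5                      # 50/50 beam-splitter: one stop gone
reference = shutter_time(24, 180)      # 1/48 s, conventional cinema exposure
billy_lynn = shutter_time(120, 360)    # 1/120 s on the stereo rig

effective_iso = BASE_ISO * mirror_loss * (billy_lynn / reference)
print(round(effective_iso))            # -> 160
```

The shorter per-frame exposure costs roughly 1.3 stops on top of the mirror's one stop, which is how a native 800 ASA sensor ends up behaving like 160 ASA stock.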

Ang Lee demonstrating the precision of eyeline required by the new format. Credit: Sony

2. Building it yourself can save you money, even on a big studio movie

VFX producer Leslie Hough perfectly articulated a theme repeated by many people on stage when she discussed getting bids for particularly complicated shots. “We would go to all the big vendors, and explain what we needed and the format we were working in, and most would pass, while the rest would give us a bid so high it was basically a pass. In the end the only way to do it was internally.”

For particularly complicated workflows, it’s sometimes cheaper to build the setup yourself and execute in-house than to go to an outside vendor. This is at least partially because the vendor will charge you for the time they need to learn how to deal with your format. Since you already understand its challenges, building elements, workflows, or equipment from scratch can save your production a lot of money.

Credit: Sony
3. There are still benefits to getting everybody together in a room

Outsourcing many parts of the post workflow is increasingly common, but when inventing new processes or working with non-standard formats, having the team all together can be a real lifesaver. “We could literally just walk one office down and talk to VFX, or another office down and preview things in the Baselight, and immediately know if something we were working on was going to succeed,” said Schweickart.

Those are conversations that are much harder to have over the phone, even with someone in the same town, and having to drive to another post house every day to preview shots would slow everything to a crawl.

The entire team worked out of Ang Lee’s office, and it was the only way to pull it off.

If you are moving to a new format, you’ll have to not just shoot principal in the format; everything is going to need to be in the new format.

4. VFX libraries aren’t ready for new formats

VFX artist Alex Lemke realized partway through how much he depended on previously built effects libraries. “Like any veteran of the industry, I have accumulated a library of elements over the years, bullet hits, flares, etc. that I could use at will. But they were all 24fps, and didn’t work at all in 120fps! We pushed for time shooting elements in production, but it was never possible, so we often had to create elements in CG where on another production we could have just used a flat plate we already had on our hard drive.”

Adding muzzle flashes and bullet hits to this shot would be easy on most movies, but required building fresh elements from scratch on Billy Lynn. Credit: Sony

Anyone considering a new workflow should definitely keep this in mind. Between stock footage (often used in sky replacements) and other VFX elements, most modern projects are assembled from a great variety of footage, and the vast majority of that is 2D, 24fps. If you are moving to a new format, you won’t just have to shoot principal photography in the format; everything is going to need to be in the new format.
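The frame-rate mismatch Lemke describes is easy to see in a sketch: a naive retime of a 24fps element to 120fps can only duplicate frames, adding no new temporal information. The element names below are purely hypothetical.

```python
# A naive retime of a 24fps library element to 120fps can only repeat
# each source frame -- no new temporal information is created.
def naive_retime(frames, src_fps=24, dst_fps=120):
    """Duplicate-frame conversion; assumes dst_fps is an integer multiple of src_fps."""
    repeat = dst_fps // src_fps            # 5 copies of every frame here
    return [f for f in frames for _ in range(repeat)]

# Hypothetical three-frame muzzle-flash element shot at 24fps:
element = ["flash_0", "flash_1", "flash_2"]
retimed = naive_retime(element)
print(len(retimed))   # -> 15: each source frame is held for 5 output frames,
                      # which stutters visibly at 120fps playback
```

Optical-flow interpolation can synthesize in-between frames, but for fast transients like bullet hits and flares the artifacts are severe, which is why rebuilding elements in CG was often the only option.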

For indie projects without a CG team, it becomes even more important to find time to shoot plates, elements, etc. in your custom format, or to find a way to access footage from that format in post. If you are planning to do some drone establishing shots, you'll need to find a way to shoot those in your new format. Getting to "the whole shebang" requires making sure everything in "the whole shebang" was captured in "the whole shebang."

Ang Lee directing Vin Diesel. Credit: Sony

You can hear the whole event on The Only Podcast About Movies.


12 Comments

For those that saw it in 120fps, what did you think of the movie?

December 12, 2016 at 3:06PM


According to American Cinematographer, there won't be a 120fps exhibition. The frame rate was chosen so it could be converted to both 60fps and 24fps without artifacts. I'm guessing that's why they used a 360 degree shutter, too. You wouldn't notice it as much if you lower the frame rate.

December 12, 2016 at 5:49PM


I saw it projected at 120fps at one of the two theaters in the US that is projecting it at this frame rate. The effect is a little hard to describe. I understand the creative decisions to go for something so disorienting, but it is exactly that - disorienting. Takes you out of the movie and the hyper-reality of it all just makes you hyper-aware of all the flaws. It makes everything look fake, almost like it's so real that it just feels like you are on set watching actors performing scenes instead of actually being swept up into a fictional story.

The movie itself isn't great either, so that might be part of the problem - the film had odd tonal shifts and pacing issues, and bizarre casting choices. I'm not even sure what Ang Lee really saw in the source material, to be honest.

December 13, 2016 at 1:35AM

Oren Soffer
Director of Photography

I've experienced similar sensations with some Blu-ray movies. One was a Hollywood blockbuster, but many scenes felt like they had a TV-soap kind of aesthetic. It was easier to be aware of a 'performance' rather than a 'role being played,' and it can take you out of the film.

December 26, 2016 at 10:26AM


Haven't made the time to listen yet, so I'm relying on someone who has. I know that's a faux pas, but whatever...

So, it was the final DPX that was 84TB, right, not the delivered file? Or was it the 4K, 120fps 3D DCP version? The only way to deliver an 84TB file is on some kind of array. That's not economically effective for a wide release, but maybe for a selective number of screens? Like, only the venues capable of projecting 4K @ 120fps?

December 12, 2016 at 4:34PM, Edited December 12, 5:00PM

John Sartin
Technical Director / Editor / Producer / Dude

File size for the HFR 3D version of "The Hobbit" was about 475GB, but that was only 48fps. I would estimate that a 4k 120fps 3D DCP would probably be around 2TB.

Normally the studios ship ALL the DCP versions (3D and 2D) of a movie on one hard drive, generally a 1TB drive. However, with "Billy Lynn," the few venues actually capable of projecting the movie will probably get a special version separate from the regular DCP release.

December 12, 2016 at 4:53PM

Film Voltage
Director

I went back and did the math on resolution, bit-depth, stereo imaging, frame rate and run time. I still didn't get 84TB, but I did get 54TB. That's pretty close. Like, close enough to be a typo on a 10-key.

December 12, 2016 at 5:18PM
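The uncompressed math actually lands near both figures depending on the bit depth you assume. The ~110-minute runtime and per-pixel byte counts in this sketch are assumptions; the panel gave only the 84TB total.

```python
# Back-of-the-envelope sizing for uncompressed 4K stereo at 120fps.
# Assumptions: ~110-minute runtime, DCI 4K frames, and two candidate
# bit depths (the panel quoted only the 84TB total).
W, H = 4096, 2160           # DCI 4K pixel dimensions
FPS, EYES = 120, 2          # 120fps, stereo (two eyes)
RUNTIME_S = 110 * 60        # assumed feature runtime in seconds

def total_bytes(bytes_per_pixel):
    """Total uncompressed storage for the whole stereo program."""
    return W * H * bytes_per_pixel * FPS * EYES * RUNTIME_S

print(round(total_bytes(4) / 1e12, 1))   # 10-bit packed DPX (4 B/px) -> 56.1 TB
print(round(total_bytes(6) / 1e12, 1))   # 16 bits per channel (6 B/px) -> 84.1 TB
```

At 10-bit packed DPX the total lands near the 54TB figure above; at 16 bits per channel it lands near the quoted 84TB, so the discrepancy may simply be a bit-depth assumption rather than a typo.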

John Sartin
Technical Director / Editor / Producer / Dude

Keep in mind, folks: DCI does not yet support 120fps 4K 3D in a DCP format. The 120fps presentation of this movie is not projected on standard cinema-series projectors, nor is it processed via a standard media block. 4K 120 in 3D is currently done by proprietary, non-industry-standard media players through AV-series projectors. That said, the claimed total storage needs of the movie would depend largely on the distribution format, media player, bit depth, etc. In the end there is a sh** load more information per second, and that equates to a lot of bandwidth and storage.
Have fun.
In time, like everything else, compression will get better and more efficient, and storage will become cheaper and faster. Looking back from the not-too-distant future, these specifications will seem insignificant and we will likely be discussing 8K or greater resolutions at 240+ fps.

December 12, 2016 at 7:18PM

mr120
CTO

Thank you! I didn't know that DCI didn't have a 4K 120fps 3D format yet. That 84TB makes more sense when we're talking about a proprietary format/delivery method.

December 13, 2016 at 10:53AM

John Sartin
Technical Director / Editor / Producer / Dude

Can anyone speculate as to why the 360 degree shutter was chosen? I'm very curious what the reasoning was behind this decision. Is it because it was shot at 120 fps, and the higher frame rate needed to be compensated for by slowing the shutter rather than the opposite? Would this make the footage appear more fluid rather than jumpy?

Also, one thing I don't understand is how 360 degrees would work with 120 fps. Maybe this is my fault for these assumptions, but 360 degrees converts to around 1/24 sec. Is that valid for rotary shutters? Because you're shooting a single frame at 1/120 of a second, how is each "frame" "exposed" to 1/24 sec of light? Or am I wrong to try to convert here? Does this simply mean that for every "frame" captured, the shutter did a 360 degree revolution?

Regardless, my assumptions and knowledge could be way off here. I've just been trying to figure out the reasoning here for this choice and it's stumping me!

December 13, 2016 at 10:29AM


Or... is it that, because of the high frame rate, each frame is exposed to only a portion of the 360 degree shutter? i.e., a few frames are exposed while the 360 degree shutter is rotating, in turn equating to each frame getting a portion of the shutter opening? So in reality it's something like 300°, 270°, 180°, etc.

Or I could be wrong in all aspects here..

December 13, 2016 at 11:02AM


I'm pretty sure it's because the movie wasn't designed to be viewed at 120 fps. That frame rate was chosen because it's easily divisible by both 60 and 24. I think 60 fps was what they were aiming for, but they had to protect for 24 fps for certain theaters. That might be kind of disappointing for people to hear.

The shutter speed that the shutter angle describes is relative to the frame rate. Mechanical shutters are a disk with an opening that allows light to reach the sensor. So a 180 degree shutter is half closed and half open, which makes a shutter speed of 1/48 sec at 24 fps, but 1/240 sec at 120 fps. A 360 degree shutter is completely open, so the exposure time equals the frame interval: 1/24 or 1/120 sec at those frame rates. So if you convert 120 fps with a 360 degree shutter to 60 fps, it will look like it was shot with a 180 degree shutter. At 24 fps, it will look like a 72 degree shutter.

December 14, 2016 at 1:46PM
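The shutter-angle arithmetic above can be sketched directly. The `equivalent_angle` helper here is hypothetical (it assumes conversion by simply keeping every Nth frame), not a standard API:

```python
# Shutter angle expresses exposure as a fraction of the frame interval:
#   exposure = (angle / 360) / fps
def shutter_speed(fps, angle_deg):
    """Exposure time per frame, in seconds."""
    return (angle_deg / 360.0) / fps

def equivalent_angle(angle_deg, src_fps, dst_fps):
    """Apparent shutter angle when frames shot at src_fps are decimated to dst_fps."""
    return angle_deg * dst_fps / src_fps

print(round(1 / shutter_speed(24, 180)))   # -> 48   (1/48 s, the classic film look)
print(round(1 / shutter_speed(120, 360)))  # -> 120  (1/120 s on the Billy Lynn rig)
print(equivalent_angle(360, 120, 60))      # -> 180.0 degrees after a 60fps conversion
print(equivalent_angle(360, 120, 24))      # -> 72.0 degrees after a 24fps conversion
```

Each retained frame keeps its original 1/120 sec exposure, so dropping to a lower frame rate shrinks the apparent shutter angle proportionally.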


Remember, digital does not follow the rules of traditional film cameras. The carry-over of terms like 180 or 360 degree shutter is a conversion to make digital cameras friendly to the long-standing terminology and understanding of DPs. So when you refer to the physical revolution of a shutter, it is not a direct translation.
Billy Lynn was shot with, I believe, the Sony F65 at a 360 degree global shutter. The statement about shooting 120fps is correct; it is easily divisible into traditional film and video frame rates like 24, 30, 60, and 120 fps, and even 240 fps when you talk about dual-projection 3D presentation like the NAB demo on Christie Mirage projectors.
Shooting 120 fps leaves the sensor with a very short exposure per frame and requires much more light to compensate. Holding the shutter at 360 degrees (1/120 of a second vs 1/240 of a second for a 180 degree shutter) not only allows twice the light in but captures 100% of the motion instead of half. It's easier to remove data in post than fill it back in! The 360 shutter at these high frame rates also allows for a little natural motion blur, which helps maintain the "film look" better than smaller shutter angles, which sharpen edges dramatically and give a more video look.

Sorry, all I have time for right now.... TBC...

December 17, 2016 at 11:47AM

mr120
CTO