April 28, 2016
NAB 2016

EXCLUSIVE: Watch Lytro Change Cinematography Forever

The Lytro Cinema Camera could be the most groundbreaking development in cinematography since color motion picture film. We go in-depth with the entire Lytro system in this exclusive video:

Since Lytro first teased their Cinema Camera earlier this month, articles have been written and press conferences held. Lytro's presentation at NAB 2016 was standing room only, and hundreds were turned away. Several press outlets did write-ups of the demo; we've been writing about the concept behind this technology for five years. But words don't do it justice: you have to see the new Lytro cinema system in action, including its applications in post-production, to understand just how much this technology changes cinematography forever.

On its own, it would be a supreme technical accomplishment to develop a 755 megapixel sensor that shoots at 300 frames per second with 16 stops of dynamic range (for reference, the latest ARRI and RED cinema cameras top out at 19 and 35 megapixels, respectively). But those outlandish traditional specifications might be the least interesting thing about the Lytro Cinema Camera. And that’s saying something, when developing the highest resolution video sensor in history isn’t the headline.
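For a sense of scale, here is a back-of-envelope calculation of the uncompressed data rate those specs imply. The 755 MP and 300 fps figures are Lytro's; the 10-bit raw depth is our assumption, since Lytro hasn't published bit depths or compression details:

```python
# Rough uncompressed data rate implied by Lytro's published specs.
# 755 MP and 300 fps come from the article; the 10-bit raw depth is
# an ASSUMPTION on our part (Lytro has not published bit depths).
pixels = 755e6        # sensor resolution: 755 megapixels
fps = 300             # maximum frame rate
bits_per_pixel = 10   # assumed raw bit depth

bits_per_second = pixels * fps * bits_per_pixel
gb_per_second = bits_per_second / 8 / 1e9       # bits -> gigabytes
print(f"~{gb_per_second:.0f} GB/s uncompressed")  # prints "~283 GB/s"
```

Even if the real pipeline compresses aggressively, numbers of this order help explain why the camera is tethered to a rack of servers (more on that below).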

The headline, as Jon Karafin, Head of Light Field Video at Lytro, explains, is that Lytro captures "a digital holographic representation of what's in front of the camera." Cinematography traditionally "bakes in" decisions like shutter speed, frame rate, lens choice, and focal point. The image is "flattened." By capturing depth information, Lytro is essentially turning a live action scene into a pseudo-CGI scene, giving the cinematographer and director control over all of those elements after the fact.

Lytro Cinema Camera focus in post

The technique, known as light field photography, does not simply enable ex post facto control over shutter speed or frame rate: the implications for visual effects are huge. You can "slice" a live scene by its different "layers." Every shot is now a green screen shot. But it's not an effect, per se; as Karafin notes, "it's not a simulation. It's not a depth effect. It's actually re-ray-tracing every ray of light into space."

To fully understand the implications of light field photography requires a lengthy video demo… so we released one (watch our 25-minute Lytro walkthrough above). Karafin gave us a live demonstration of "things that are not possible with traditional optics" as well as new paradigms like "volumetric flow vectors." You can tell the demo is live because the CPU starts heating up and you can hear the fans ramp up in our video… and the computer was on the other side of the room.

You can "slice" a live scene by its different "layers." Every shot is now a green screen shot.

Lytro Cinema Camera depth screening

Visual effects implications

If light field photography fails to revolutionize cinematography, it will almost certainly revolutionize visual effects. Today, visual effects supervisors tag tracking markers onto various parts of a scene—think of the marks you often see on a green screen cyc—so that they can interpret a camera's movement in post. The Lytro camera, because of its understanding of depth, knows not only where it is in relation to the elements of a scene, but where the subjects are in relation to the background (and where everything is in between). This depth data is made available to visual effects artists, in turn making their integration of CGI elements much more organic, because now everything in the scene has coordinates on the Z-axis. They're not matting out a live-action person to mask out a CGI element; with Lytro they are actually placing the CGI element behind the person in the scene.

Green screening takes on a new meaning, too. As you can see in the demo, it's no longer chroma- or luminance-based keying but true "depth-screening": you can cut out a layer of video (and dive into more complex estimations for things like hair, netting, etc.). You don't need a particular color backdrop to separate the subject; you can simply separate the subject based on their distance from other objects.
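To make depth-screening concrete, here is a minimal sketch of the difference between the two kinds of matte. This is our illustration, not Lytro's code; the per-pixel depth map, the color thresholds, and the 1.5 m to 3 m slice are all assumed for the example:

```python
import numpy as np

def chroma_key(rgb: np.ndarray) -> np.ndarray:
    """Classic green-screen matte: keep pixels that aren't 'green enough'.
    Thresholds are illustrative."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return ~((g > 0.5) & (g > r + 0.15) & (g > b + 0.15))

def depth_key(depth: np.ndarray, near: float, far: float) -> np.ndarray:
    """Depth matte: keep pixels whose distance falls inside a chosen slice,
    regardless of what color sits behind the subject."""
    return (depth >= near) & (depth <= far)

# Stand-in data: a 1080p frame plus a per-pixel depth map in meters.
h, w = 1080, 1920
rgb = np.random.rand(h, w, 3)
depth = np.random.uniform(0.5, 10.0, size=(h, w))

# Isolate everything between 1.5 m and 3 m from the camera.
matte = depth_key(depth, near=1.5, far=3.0)
foreground = rgb * matte[..., None]   # composite-ready cutout
```

The chroma matte depends on what color happens to be behind the subject; the depth matte only asks how far away each pixel is.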

Lytro Cinema Camera 4D light field

Every film is now (or could be) Stereoscopic 3D

Stereoscopic 3D is another matter entirely. To shoot in 3D currently involves strapping two cameras together in an elaborate rig, with a stereographer setting and adjusting the interaxial separation for every shot (or doing a 3D "conversion" in post, with artists cutting out different layers and creating parallax effects manually, which yields inferior results). The Lytro camera, because it has millions of micro lenses scanning a scene from varied perspectives, can do 3D in post without it being a simulation. You don't need to shoot on two cameras for a 3D version and then just use the left or right camera for the 2D version. With Lytro you can set the parallax of every shot individually, choose the exact "angle" within the "frustum" you want for the 2D version, and even output an HFR version for 3D and a 24p version for 2D—with the motion blur and shutter speed of each being "optically perfect," as Karafin notes. Even if you shoot your film on a Lytro only with 2D in mind, if advances in 3D display technology later change your mind (glasses-free 3D, anyone?), you could "remaster" a 3D version that doesn't have any of the artifacts of a typical 3D conversion. With Lytro you're gathering the maximum amount of data independent of the release format and future-proofing your film for further advances in display technology (more on this in a bit).
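For intuition, here is a toy sketch of deriving a stereo pair after the fact from a single view plus its depth map. This is a crude disparity warp, not Lytro's re-ray-tracing, and every number in it (the pinhole focal length in pixels, the 32 mm half-baseline) is an illustrative assumption:

```python
import numpy as np

def synthesize_view(rgb: np.ndarray, depth: np.ndarray,
                    baseline_m: float, focal_px: float = 1500.0) -> np.ndarray:
    """Forward-warp one image horizontally by per-pixel disparity to fake
    a second viewpoint: disparity = focal * baseline / depth (pinhole model)."""
    h, w = depth.shape
    disparity = np.round(focal_px * baseline_m / depth).astype(int)
    out = np.zeros_like(rgb)
    xs = np.arange(w)
    for y in range(h):
        new_x = np.clip(xs + disparity[y], 0, w - 1)
        out[y, new_x] = rgb[y, xs]  # nearest-pixel warp; occlusion holes stay black
    return out

# Stand-in image and depth map (meters).
h, w = 540, 960
rgb = np.random.rand(h, w, 3)
depth = np.random.uniform(1.0, 20.0, size=(h, w))

# The interaxial separation becomes a per-shot parameter in post:
# +/- 32 mm here gives a 64 mm total baseline, roughly eye distance.
left = synthesize_view(rgb, depth, baseline_m=-0.032)
right = synthesize_view(rgb, depth, baseline_m=+0.032)
```

The stereographer's on-set interaxial decision becomes the baseline_m parameter, adjustable per shot, per scene, or per release format in post.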

What light field photography doesn't change

The art of cinematography—or as many of its best practitioners deem it, a "craft" as opposed to an art—is not limited to camera and lens choices. Cinematography is often referred to as "painting with light," and the lighting of a scene, with or without a Lytro camera, is still the primary task. While Lytro is capturing depth information, my understanding is that the actual quality and angle of light are interpreted and captured but not wholly changeable in post (it's also possible that, if they are, it simply went over my head). As Karafin notes, Lytro's goal with the cinema camera (as opposed to their Immerge VR technology, which allows a wholly moveable perspective for virtual reality applications) is to preserve the creative intent of the filmmakers. This means your lighting choices and your placement of the camera are, for the most part, preserved (the current version of the cinema camera can be "moved" in post by only 100mm, or about 4 inches). As a director you are still responsible for the blocking and performances. As a cinematographer you are still responsible for all of the choices involved in lighting and camera movement.

This is part of a continuum. As cinematography has transitioned into digital capture, it has in many ways become an acquisition process, with more and more decisions made in post. The Digital Intermediate gives the colorist a greater amount of control than ever before. Cinematographer Elliot Davis (Out of Sight, The Birth of a Nation) recently told NFS that if he could only have one tool besides the camera, it would be the D.I., not lights or any on-set device. Lytro is maximizing the control filmmakers have in post-production, but it is not actually "liberating your shots from on-set decisions" (our fault for not fully understanding the technology, pre-demo).

The short test film Life

I can already hear people reacting to some of the shots in the demo with "but this effect looks cheesy" or "that didn't look realistic." The same can be said of any technique in inexperienced hands; think of all the garish HDR photographs you've seen out there. With those photographs, HDR itself isn't the issue; it's the person wielding the tool. And in the case of Lytro, there is no such thing yet as an experienced user; the short film Life will be the first real test. Even in the hands of experienced Academy Award winners like Robert Stromberg, DGA and David Stump, ASC (along with The Virtual Reality Company), the Lytro Cinema Camera is still a brand-new tool whose techniques and capabilities are unknown.

The success or failure of Lytro as a cinema acquisition device has little to do with how Life turns out, given that the technological implications extend far beyond one short film. Lytro has so far publicly released only a teaser; the full short film will be coming in May:

Cameras aren’t done

On our wrap-up podcast from NAB, my co-host Micah Van Hove posited that "cameras are done." His thesis was that the latest video cameras have reached rough feature parity when measured by traditional metrics like resolution, dynamic range, frame rates, and codecs. We deemed it a "comforting" NAB because, as filmmakers, there wasn't something new to worry about that was going to make our current cameras obsolete.

And then the next day I went to the Lytro demo. So much for "nothing new." As for making current cameras obsolete… you can see a scenario where the future, as envisioned by Lytro, is one in which light field photography is ubiquitous. Remember when a camera phone was distinct from a smart phone (which was distinct from a "regular" phone)? Now that the technology is mature, they're just "phones" again; capturing images and using the internet have become part of what's considered standard. Similarly, maybe light field photography will just be called "photography" one day. Is it only a matter of time until all cinematographers are working with light fields?

"Our goal is ultimately to have light field imaging become the industry standard for capture. But there’s a long journey to get there."

Lytro Cinema Camera cinematography

Democratization of the technology

Right now the tools and storage necessary to process all of this data are enterprise-only. The camera is the size of a naval artillery gun. It is tethered to a full rack of servers. But with Moore's law and rapid gains in cloud computing, Lytro believes the technology will come down in size to the point where, as Karafin says, it reaches a "traditional cinema form factor."

If Lytro can also get the price down, you can see a scenario where on an indie film—where time is even shorter than on a larger studio film—the ability to "fix" a focus pull or stabilize a camera without any corresponding loss in quality would be highly desirable. That goes for documentary work as well—take our filming of this demo, for example. The NFS crew filmed what will end up being over a hundred videos in 3.5 days at NAB. To take advantage of the opportunity to film this demo we had to come as we were—the tripod was back in the hotel (the NAB convention center is 2,000,000 square feet so when you’re traversing the floor, you are traveling light). As a result, I’m sure viewers of our video demo above will notice some stabilization artifacts on our handheld camera. Had we captured the demo using light field technology, that stabilization could be artifact-free.

Lytro Cinema Camera motion tracking stabilization

In filmmaking, for every person working in real-world circumstances, where time and money are short and Murphy’s Law is always in effect, there are ten people uninvolved in the production who are quick to chime in from the sidelines with, "that’s not the way you’re supposed to do it." But experienced filmmakers know that no matter how large your budget or how long your shooting schedule, there is no such thing as an ideal circumstance. You try to get it right on the day, with the personnel and the equipment you have on hand, but you are always cleaning up something after the fact. Even Fincher reframes in post. Lytro allows for much more than reframing.

The ability to "fix everything in post" is surely not the primary goal of Lytro (see "storytelling implications" below), but it’s an undeniable offshoot of capturing all of this data. And should the technology be sufficiently democratized, it would be enabling for all levels of production.

For anyone opposed to this greater amount of control, let me ask: do you shoot with LUTs today? If so, you are using a Look Up Table to help the director and cinematographer see their desired look on the monitors during the shoot; as long as you are recording RAW (or log), that look is not "baked in," and in post you still have all of the original data to go back to. In a way, to argue against Lytro's ability to gather the maximum amount of data and retain maximum flexibility in post would also be to argue against something like shooting RAW (not to mention that celluloid could be developed in any number of ways… which always took place after shooting).
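In code terms, a monitoring LUT is a view applied on the way to the monitor, while the camera-original data never passes through it. A minimal sketch, with our own toy display curve standing in for a real LUT:

```python
import numpy as np

def apply_1d_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map linear pixel values through a 1D LUT, per channel."""
    idx = np.clip((image * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
    return lut[idx]

raw = np.random.rand(1080, 1920, 3)                    # stand-in camera-original data
display_curve = np.linspace(0, 1, 1024) ** (1 / 2.4)   # toy "look" for monitoring
monitor_feed = apply_1d_lut(raw, display_curve)        # what the director sees on set

# `raw` is untouched: the colorist still has the full original in post.
```

Lytro's pitch is the same idea pushed much further: focus, shutter, and framing decisions also become reversible views over the captured data rather than commitments baked into it.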

With these display advances, storytelling will change—just as it did when the basic black-and-white motion picture added sound, color, visual effects, and surround sound.

Corresponding advances in display technologies

Imaging isn’t going to advance only from an acquisition standpoint; there will be a corresponding advance in display technologies. Taking all of this information and displaying it on a flatscreen monitor (even a conspicuously large one) feels like a compromise. Lytro surely isn’t aiming to bring the best-looking images to the existing ecosystem of 50" flat screens; they must be thinking way beyond that.

Imagine your entire wall is a display, not just a constrained rectangle: you're going to want that extra resolution, going well past 4K. Imagine you can view material in 3D without glasses or headaches: you're going to want the depth information and control. Imagine you're wearing a headset and you can look around inside of the video (and each eye, in stereoscopic 3D, effectively doubles your resolution needs): you’re going to want all of these things together.

What are the storytelling implications of Lytro?

With these display advances, storytelling will change—just as it did when the basic black-and-white motion picture added sound, color, visual effects, and surround sound (I was going to add 3D to the list, but there's still so much backlash against it, and the technology is still in its infancy, that it's not a great example… yet). Suffice it to say that the experience of watching Clouds Over Sidra on a VR headset is entirely different from watching it contained within the boundaries of a small rectangle on a flat screen. The storytelling was informed by the technology.

In an era where video games and computer-generated animation are seemingly advancing faster technologically than traditional filmed live-action, Lytro shows there are still plenty of new techniques in the “live action” toolbox. But on its own a tool does not change anything; it will always come back to the same age-old question of, can you tell a story with it? Lytro is a newfangled, technically complex, eyebrow-raising, mind-blowing tool. But, as ever, what you do with it is up to you.


No Film School's complete coverage of NAB 2016 is brought to you by My RØDE Reel, Shutterstock, and Blackmagic Design.




31 Comments

"i understand how this technology works" - not me

Alex Mallis (Director / DP / Editor), April 28, 2016 at 4:43PM

As a VFX artist, I think this would be so awesome for compositing live action. So much more control! Does this basically make the roto artist's position obsolete? The VFX field is already flooded with talent; would this cut jobs down even further, or create new ones?

April 28, 2016 at 4:54PM

That's a good point. It'll be interesting to see how much technical skill it takes to actually perform these tasks.

Michael Muench (DP, Editor), April 28, 2016 at 5:24PM

This is technology for you, mate. Instead of crying about losing jobs, adapt. It was always like this and it always will be. Ask analog film lab workers and film loaders.

YES, MR. WHITE, SCIENCE.

Terma Louis (Photographer / Cinematographer / Editor), April 30, 2016 at 6:52PM

I wonder if you can put it on a gimbal.

Dantly Wyatt (Musical Comedy & Content Creator), April 28, 2016 at 5:10PM

Why do you need a gimbal when the camera tracks its own position all the time? Smoothing that out will be no big deal.

subhakar tikkireddy (Filmmaker), April 28, 2016 at 11:32PM

Why not? How else can you get a proper image when you are holding this camera in your hands, maybe while walking up and down stairs or holding it out of a car's window?

P.S.: joking ;-)

April 29, 2016 at 3:59AM

There was a HUGE joke there. I think you missed it haha

Charles C. (Editor / Director / Director of Photography / Wannabe Thinker), April 29, 2016 at 2:03PM

My mind. Wow, we are on the edge of something truly innovative.

Michael Muench (DP, Editor), April 28, 2016 at 5:22PM

The future of warp stabilization should mean settings adjustments in the 1-5% range. I'd rather see the B-cam subject shake than watch the background jiggle and twist while we wait for this technology to fix everything for us.

Gordon Byrd (Owner, Byrd Pictures), April 28, 2016 at 7:01PM

Fincher will fucking love this.

Terma Louis (Photographer / Cinematographer / Editor), April 30, 2016 at 6:52PM

Ryan, thank you for your closing comment. The very reason Lytro is in the position they are in is that the concept, while sound, is not sexy in practice... and that has been true of their first two hardware offerings, which is why they are still living [and potentially dying] on investors' money.

My greatest hope for Lytro is that they become something like a "Panavision of light field cinema": relevant at the high end, and eventually free (or generously loaned) to students and indies.

Seth Estrada (Director), April 28, 2016 at 7:20PM

Hypothetically, this technology is great, but in practice the results seem abysmal. All they seem to be showing off is "hey, just shoot literally wherever you want and fix it in post," and still, what they end up with looks no better than a mediocre green screen, if that. I'm sure this could be used creatively to some extent, but it inevitably leaves me wondering why someone would go through all the trouble and cost to shoot like this, when it would be fundamentally easier to decide what you want beforehand and just shoot in real locations. Like we've always known, real things look way better than CG things; that's why sets have not been eclipsed by green screens, and why the Star Wars prequels' effects are so often derided.

In both articles written about the Lytro cinema camera, I've seen this site gush over the possibilities and how amazing it looks, but, frankly, everything they've shown footage-wise looks terrible. It all has this depthless uncanny-valley effect, compounded by the awful post-production DOF effects and poor backgrounds. I've heard VFX professionals declare it a godsend, but I have yet to see any of them prove its potential. And aside from being able to remove little screw-ups or fix focus or stability in post (things that any decent filmmaker can and should be able to manage for cheaper and with much simpler tools), I've yet to see anything that really makes this camera "revolutionary" (let alone the greatest advancement since color film). NFS has yet to call out any of the problems, nor ask real questions regarding practical use of this camera. So far it's mostly just been marketing for them and gawking at the tech.

It kind of makes me wonder if the site is sacrificing journalistic cred in order to maintain these interviews and close-ups with Lytro, or if they really aren't thinking critically about what is being presented to them. Either way, it's not good for anyone except Lytro themselves. Lytro has a long history of promising the moon and delivering an essentially useless or poor-performing product that ends up forgotten very quickly. This looks to be much the same, from everything shown thus far. I could be wrong; I won't rule that out. All I'm saying to the writers, editors, and readers of NFS is: slow down, calm your hype train, and really consider everything.

Jacob Floyd (Writer / Videographer), April 29, 2016 at 1:22AM

"Let's fix it later in post" are the first words that came to my mind watching the first minutes of this video. And they are the words I hate the most.

David R. Falzarano (Director / Writer / Editor), April 29, 2016 at 4:17AM

In an age of studio control, you're not going to find many DoPs or directors willing to lose even more control of the final image.

April 29, 2016 at 10:53AM

I see one real advantage: you never miss your shot for technical reasons. No more out-of-focus shots. It doesn't mean you don't make the choice beforehand; it's just that now one person can control it all very precisely afterward. It is a real benefit. How many times do we still see a slightly out-of-focus close shot, no matter how big the budget? It happens, as it is almost impossible to follow the movement of a character when you have very shallow depth of field.

AvdS, April 29, 2016 at 12:52PM

As someone who is constantly being asked to fix it in post anyway, I would welcome this baby with open arms.

May 9, 2016 at 8:12PM

Intriguing concept, but in practice... all I know is that the much-hyped Lytro Illum still camera takes really lousy pictures, or is so complicated to use that it's not worth it. But with an 800x400 pixel screen (you heard me right), what do you expect? They're on sale right now, by the way, for about as much as a carton of cigarettes. Deduce what you will from that.

Darby Powell (Filmmaker), April 29, 2016 at 9:17AM

Too bad the examples all look so shit.

April 29, 2016 at 10:47AM

All these videos, and everything people are excited about, come down to the ability to fix mistakes. Even if you don't use a green screen, if the lighting of the subjects you are compositing doesn't match the environment you are compositing them into, then it will look like garbage. Avoiding that requires pre-planning so the DP knows how to light the subjects, so I actually don't see how that gives you any more control in post.

Cinematography is not just about painting with light, or whatever book title you want to quote. Lighting is one element of it -- lens choice, framing, camera movement, timing, and blocking are all equal elements.

*LUTs are in fact baked in on any camera that isn't shooting RAW, and RAW outside of RED is not the majority of the workflow people are using. Use a LUT on a Canon, Sony, or ARRI (when recording ProRes) and that look is baked in.

Derek Means (Director of Photography), April 29, 2016 at 11:57AM

Higher-end Canon and Sony cameras, and all ARRI cameras, have the ability to output a LUT for live viewing while recording a log image. It's not baked in unless you've changed the record settings to those parameters.

chris larsen (1st AC), May 13, 2016 at 1:11AM

At last, behold the power... of the God camera!

Edgar More, April 29, 2016 at 12:45PM

So is this tech going to turn us into lazy (or lazier) filmmakers, because we won't have to decide on anything until we get to it in post?

Everyone will be singing in the "We'll Fix It In Post" choir...

Guy McLoughlin (Video Producer), April 29, 2016 at 3:21PM

Did anyone else get incredibly turned on by that tech demo (especially the motion tracking and compositing workflow possibilities)? Was that just me?

Also, based on the article below, this company engenders in me close-to-Elon-Musk levels of science joy, to think they scrapped their plans to make consumer cameras and came up with all this magic.

http://petapixel.com/2016/04/04/lit-lytro-scrapped-strategy-ceo/

chris allight (Writer/Editor), April 29, 2016 at 8:25PM

This is amazing. Also, is it just me or does Jon Karafin speak just like Joe MacMillan from Halt and Catch Fire? https://youtu.be/gNygCiX_uVs

April 29, 2016 at 9:43PM

I have an Illum stills camera. It's pretty limited as a normal camera (2.5K by 1.5K 2D image resolution, needs a lot of light, built-in lens, big) but fascinating in its possibilities. So I don't think it's fair to assume this technology is just about fixing shots in post. For example, with the Illum's image, the software lets you create a video where you fly around the image, changing focus and perspective as you progress. It's like a distant cousin of bullet time, and a really interesting effect. You can also have multiple things in focus that shouldn't be (in terms of how your eye sees). So imagine each frame of your film offering up those things: not to fix a missed focus (you don't really focus light field stuff that way; you have to think in terms of a focus depth range, and where you want the centre of that range), but rather to do things dramatically in your final output of the scene that other technology cannot facilitate. That doesn't particularly make it better or worse, just different, if you allow yourself to think creatively.

April 30, 2016 at 5:11AM

I don't think this specific camera is going to change how we shoot all that much. It's just not particularly practical, is it? Cinematography is definitely a human discipline, where emotion can be portrayed through movement etc.

However, I do think that some of this technology could make it into more conventional/affordable cameras in the future and be of huge benefit.

It's always good to see companies push the boundaries as these things eventually filter down to be beneficial to us all.

April 30, 2016 at 10:48AM

Awesome... eager to work with this camera before I die... lol.

prasun (director-writer), April 30, 2016 at 12:38PM

OH NO! The art of Cinematography is finished!! Baaah, baaaah!

Terma Louis (Photographer / Cinematographer / Editor), April 30, 2016 at 6:49PM

I wonder how well it handles very small particles, like smoke with light coming through it, and whether it actually understands where each little particle reflecting light is located in z-space. Imagine you have a shot with smoke and light, and you want to add a CG robot walking through the smoke.

John-Erling H. Fredriksen (Cinematographer), May 1, 2016 at 3:50AM

Would love to try it... Can't afford it :)

May 5, 2016 at 9:15AM

This technology definitely looks amazing. And it was a great interview. And I know this is the kind of stuff NFS readers eat up. But is it just me, or does this guy sound like an android? I can't help but listen to him talk about the tech and think, "Wow, where's the heart and passion for filmmaking?"

I couldn't help but think of that opening scene in Star Trek IV: The Voyage Home, when the reincarnated Spock is retraining his brain, and after correctly answering a series of complicated mathematical, scientific, and philosophical questions, on three separate terminals no less, the computer asks him "How do you feel?" and he gets stumped. Ha! Maybe I'm over-romanticizing it all. :)

Ron Dawson (Filmmaker & Host of "Radio Film School"), May 6, 2016 at 10:18AM