July 5, 2011

In Case You Missed the Crazy Future Camera That's Refocusable in Post, Here It Is

Years ago a reader emailed me about plenoptic cameras, also known as light-field cameras, which allow an image to be refocused after the picture is taken. Sometimes referred to as a 4D camera, this crazy technology is now headed to a consumer camera from new manufacturer Lytro. News of this development, which utilizes technology first seen in a 2005 Stanford research paper, hit the internet last week, with Lytro now taking reservations for the device. Check out the refocusable images in action, and let me know what you think -- game-changer or gimmick?

First off is the company's pitch video:

https://www.youtube.com/watch?v=7babcK2GH3I

And here are a few images from the company's full gallery; click anywhere on these images to refocus:

Here's an explanation of this form of computational photography in an article at The Economist:

Dr Ng's camera recreates the light field thanks to an array of microlenses inserted in between an ordinary camera lens and the image sensor. (Dr Ng declined to reveal the precise specifications for the commercial device, but prototypes from his academic days sported 90,000 minuscule lenses, arranged in a 300-by-300 grid.) Each microlens functions as a kind of superpixel. A typical camera works by recording where light strikes the focal plane—the area onto which rays passing through a lens are captured. In traditional cameras the focal plane was a piece of film; modern ones use arrays of digital sensors. In Lytro's case, however, light first passes through a microlens and only then hits the sensors behind it. By calculating the path between the lens and the sensor, the precise direction of a light ray can be reconstructed. This in turn means that it is possible to determine where the ray would strike if the focal plane were moved.
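To make that concrete, here's a toy shift-and-sum refocusing sketch, the standard approach from the light-field literature (this is not Lytro's actual pipeline, and it assumes the raw capture has already been decoded into a 4D array of sub-aperture views; the function name and parameters are mine):

```python
import numpy as np

def refocus(lightfield, alpha):
    """Synthetic refocusing by shift-and-sum.

    lightfield -- 4D array (u, v, y, x): one low-resolution sub-aperture
                  view per direction sampled by the microlens grid.
    alpha      -- relative depth of the synthetic focal plane
                  (1.0 leaves the original focus unchanged).
    """
    U, V, H, W = lightfield.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its angular offset from
            # the aperture centre; the scale factor selects which plane
            # ends up in focus after averaging.
            du = int(round((u - U // 2) * (1 - 1 / alpha)))
            dv = int(round((v - V // 2) * (1 - 1 / alpha)))
            out += np.roll(lightfield[u, v], (dv, du), axis=(0, 1))
    return out / (U * V)
```

Because every ray's direction is known, picking a different focal plane after the fact is just a different set of shifts before the average — no new exposure needed.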

This technology obviously has a lot of potential for professional applications, but there's no word on higher-end implementations at present.
Why? Because, as the article points out, the camera's resolution is limited, presumably coming in far lower than current digital cameras (the resolution is capped by the number of microlenses, each of which produces one pixel of the "final" image). As a result, "the new device might just reignite the once-furious race for ever more megapixels." With the added depth element in the image, a twenty-megapixel sensor will output an image with much lower horizontal and vertical resolution.
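The arithmetic behind that trade-off is simple to sketch (the numbers below are purely illustrative assumptions, since Lytro hasn't published its sensor specs):

```python
# Rough spatial-resolution arithmetic for a plenoptic sensor.
# All numbers are illustrative assumptions, not Lytro's actual specs.
sensor_pixels = 20_000_000     # a hypothetical 20 MP sensor
angular_samples = 10 * 10      # pixels behind each microlens (10x10 grid)

# Each microlens yields one pixel of the final refocusable image.
microlenses = sensor_pixels // angular_samples
print(microlenses)             # 200,000 pixels, i.e. roughly 0.2 MP
```

For comparison, the 300-by-300 Stanford prototype grid quoted above works out to 90,000 final pixels, i.e. under 0.1 megapixels — which is why the megapixel race may heat up again.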

This brings up another question: what about for video? Presumably the data throughput would be tremendous -- not to mention the processing power on the consumer's end in terms of decoding such a signal -- but imagine watching a movie and focusing on what you want right now instead of what the director wanted when it was shot. I'm not saying that would be better -- but it would certainly be different. Hook up such a technology to an eye-tracking mechanism that allows viewers to focus with their eyes instead of a mouse or touchscreen, and heads may explode.

For more on the camera and company, which is being hyped as "the first time picture-making has been changed since 1826," here's an interview with Lytro's founder, Ren Ng:

As Ng says in the video, pricing and availability are not public at present, but the camera should be out by the end of 2011 at a "competitive consumer camera price."

I'll ask again -- what do you think, game-changer or gimmick? What other applications could you see for light-field cameras?

Link: Lytro

[via FilmmakerIQ]


16 Comments

WOW!!!!!!
what a concept. i wonder what the implications are for the lens.
no need for follow focus. concentrate on the framing and then take care of the shot in post? sign me up

July 5, 2011

gili

Agreed. This basically cans the 1st AC job if it's applied to video. This is the next generation of photography and cinematography as we know it. I predict in 5 years we'll all be scrambling for HDSLR-like cameras with these types of lenses. And then in 5.25 years, our clients will be saying "what, you can't just fix that in post?" if we happen to have the old-fashion-y manual focus lenses : D.

July 5, 2011

RevBenjamin

I do think there will always be a place for a 1st AC and keeping good focus. But the ability to change the focus afterwards would be great to fix any mistakes, or for a run-and-gun docu shooter who has no 1st AC. Even switching to digital media didn't can the 2nd AC; instead of a loader, now we have a "Data Management Technician" who, at least around here, is considered part of the Camera Department (as much as it is almost assistant editor work). Likewise, if this new lens cans the 2nd AC job, the workload of Assistant Editors in post will be so much heavier that we'll see 2nd AEs or some other name pop up to fill all those extra jobs that the AE does now. These technologies never really replace people, they just move them to different departments.
I welcome the new technology, it's a great power to have in a pinch, but nothing beats getting footage from a seasoned pro and his team that needs the slightest curve and colour adjustment to fit into sequence.

Fix it in PRE, not POST.

July 5, 2011

MRH

Sorry that should say "Likewise if this new lens cans the 1ST AC job..."

July 5, 2011

MRH

I agree. Interesting development and a *potential* game-changer. Adobe already offers a software-based solution where you can change the focal plane on something that was shot with deep focus. This can be a workable solution, but requires yet another few hours in post.

IMHO, these technologies are best suited for those who develop their story in the edit bay rather than from a well thought-out script beforehand. I mainly do live marketing events, so I can see the benefits, although my clients are generally not willing to pick up the tab for fixing it in post...

July 6, 2011


The little discussed facts about plenoptic lenses are that they have less sharpness and contrast than ordinary lenses, and they require massive amounts of data storage and processing bandwidth to actually work. I would be surprised if these lenses make their way to video any time soon, and even if they do, they will not touch the quality and clarity of pro lenses.

For further research you might want to read about Adobe's Plenoptic camera system. http://www.engadget.com/2010/09/23/adobe-shows-off-plenoptic-lenses-that...

July 5, 2011

Rob

I think this will turn into a Kinect-like device which will eventually allow you to capture hi-res depth information straight from the actual sensor (I'm guessing you could build a z-depth map from those pictures using highpass filters). The applications in post production and interface design are endless.
This thing makes me very happy :)
totally a game changer!

July 6, 2011


This is very cool. If it moves to video cameras it would be insane! Being able to just film something and not worry about it being in focus is crazy. Definitely a game-changer, just WOW.

July 6, 2011


If this is the real deal? I'll take it in a heartbeat. Nothing against my buds who have great-paying union gigs on studio shoots; but for us indies... I need to shoot on a shoestring, just getting my own stuff going.
The smaller the crew, the better. That's just the way it is.

July 7, 2011

MARK GEORGEFF

The AC's job is essentially over. We will go the way of the negative cutters.
Aahh, but we will still be needed, that is, until they find a robot to move and service the camera.

Everette

July 7, 2011

EVERETTE NICOLLS

I think this may, at some point, spell doom for the 1st AC, but it's a long, long way off: the cost of a skilled person is still much less than the investment in technology--including even MORE storage and back-up (it's cheap, but it's not free)--and even the increased time ($$) on the back-end (if the files are, say, five times bigger, that means five times longer to transfer, right?) And this is all assuming a plenoptic system that delivers video at professional quality.

For now, it's an interesting curiosity...a bit more than a gimmick, but not yet a game-changer.

It will be interesting to see what impact it has on editing systems, though: real-time focus shifts that don't need to be rendered until the final print? Hm...

July 8, 2011

Don Strong

It looks fake to me! Go through the pictures in the complete gallery; you might find one that is clearly Photoshopped. Better to ask a photographer's opinion.

July 8, 2011

amandeep

This is so lazy. Surely the image is in the eye of the beholder, fixed in the timeless capture of a photograph. Is this a photograph? Is it a still image? Or is it something else?

July 12, 2011

anonymous

"imagine watching a movie and focusing on what you want right now"

Of course, porn would be the obvious early adopter of this potentially interactive format.

July 14, 2011

Rich

Image quality is such crap though! Worse than an iPhone camera. Let's see some good glass on that and then I'll listen.

October 28, 2011

eph

Showed this to a VFX artist and they were convinced it was a digital blur (Gaussian, fast blur, etc.), i.e. a deep DOF shot with real-time post processing on top. What would be the best way to explain how it really works?

December 17, 2012

Antony Alvarez