August 16, 2013

Meet KaleidoCamera, a Plenoptic Add-On that Brings Amazing New Functionality to Your DSLR

Modern imaging technology never ceases to amaze me. In the past, we've talked about light field cameras, such as the Lytro, which allow the user to refocus the image in post. We've also talked about new sensor technologies that use color diffraction instead of traditional filtration. Now, a group from Saarland University in Saarbrücken, Germany has developed a camera add-on, mounted between the lens and the sensor, that brings an array of impressive plenoptic features to any DSLR camera. What features, you might be asking? Light field refocusing, HDR imaging, and light polarization without filters, just to name a few. Interested? Read on for the details:

So what exactly is the KaleidoCamera, and what does it do? Well, in short, it's an optical accessory that sits between your lens and your camera body and creates plenoptic information through a system of mirrors and filters.

These channels of information can then be manipulated independently of one another to create various effects, such as HDR (without having to combine separate exposures), multispectral imaging, filter-less polarization, and of course, light field imaging for managing depth information.

The KaleidoCamera accomplishes this impressive list of tasks through a process in which light is broken into various elements within the add-on, and then re-combined before reaching the camera's sensor.

Essentially, when light enters the KaleidoCamera, it hits a kaleidoscopic element that creates multiple copies of the information, which are then individually modified by different filters before being re-combined into a single image.
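To build some intuition for that split-filter-recombine idea, here's a toy simulation in Python. To be clear, this is my own sketch, not the authors' actual pipeline: it shows how a single exposure, split into copies behind neutral-density filters of different strengths, can be merged into one result with more dynamic range than any single copy.

```python
import numpy as np

def split_and_merge(scene, nd_factors=(1.0, 0.25, 0.0625), full_well=1.0):
    """Simulate single-shot HDR from filtered copies of one exposure.

    scene: linear radiance values (may exceed the sensor's full-well capacity).
    Each ND factor produces one clipped 'copy' of the scene; merging keeps,
    per pixel, the brightest copy that did not clip, scaled back to radiance.
    """
    copies = [np.clip(scene * f, 0.0, full_well) for f in nd_factors]
    merged = np.full(scene.shape, np.nan)
    for copy, f in zip(copies, nd_factors):
        usable = (copy < full_well) & np.isnan(merged)
        merged[usable] = copy[usable] / f
    # Pixels clipped even behind the darkest filter stay at the
    # maximum radiance that filter can represent.
    merged[np.isnan(merged)] = full_well / min(nd_factors)
    return merged

scene = np.array([0.5, 2.0, 10.0])  # 10.0 would blow out a single exposure
hdr = split_and_merge(scene)        # all three radiance values recovered
```

Real merging would also have to handle noise and filter calibration, but the basic trade is the same one the video describes: you spend copies of the image to buy dynamic range.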

Here's a demonstration video from the KaleidoCamera's creators that shows some of what the device is capable of:

This device seems to have quite a few applications in both the photographic and scientific communities. However, the ability to manipulate plenoptic depth information could potentially be huge for filmmakers, especially those doing extensive green screen work.

If the KaleidoCamera allows depth information to be pulled from a shot, it may become possible to separate actors based on their distance from the camera rather than by keying out a particular color. If that's the case, green screens could become a thing of the past.
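As a purely hypothetical sketch of that idea (the function names and thresholds below are my own assumptions, not anything the KaleidoCamera provides): if a light field capture yields a per-pixel depth map, a matte can be pulled by thresholding depth instead of keying a color.

```python
import numpy as np

def depth_matte(depth, near, far):
    """Return a 0/1 matte selecting pixels between `near` and `far` (in metres)."""
    return ((depth >= near) & (depth <= far)).astype(np.float32)

def composite(fg_rgb, bg_rgb, matte):
    """Standard over-composite using the depth matte as alpha."""
    alpha = matte[..., None]  # broadcast the matte over the RGB channels
    return alpha * fg_rgb + (1.0 - alpha) * bg_rgb

# 2x2 toy frame: an actor at roughly 2 m, a wall at roughly 6 m
depth = np.array([[2.0, 6.0], [2.1, 5.8]])
matte = depth_matte(depth, near=1.5, far=3.0)  # selects only the actor
```

No green paint required: the matte comes entirely from geometry, which is exactly why depth-aware capture is so interesting for compositing.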

One of the biggest drawbacks at the moment is that the device seems to diminish the visual quality of the image. In the video, it's mentioned that it currently produces slightly less than full HD results. However, considering that the KaleidoCamera is still in the early stages of development, it seems likely that image quality will improve as the product matures.

More in-depth information about the KaleidoCamera and how it works can be found in this scientific paper.

What do you guys think? Is the KaleidoCamera going to revolutionize photography or filmmaking? What other applications might this device be able to achieve? Let us know in the comments!


22 Comments

My brain exploded about one minute into that video. I know he was speaking English, but I think I only understood about 20% of that information. Darn you public schools!!!

August 16, 2013

0
Reply
Sean K

I will wait for them to develop the product more but I am definitely interested!

August 16, 2013

-1
Reply
Kareem Kamahl

It seems like their different features could be useful for different areas of employment. HDR for the high-end/pro users (although that Magic Lantern high ISO/low ISO HDR solution seems adequate for now). The refocusing is mostly for the low-end consumer level and smartphones (the Canon 70D AF system is arguably a better system for the "enthusiast" tier). And high-budget productions will need a perfected design, which may be years away (plus, the CGI monster flicks are going away, or so we had been told by Spielberg and Lucas).

August 16, 2013

1
Reply
DLD

After I finished watching this video, I realized my nose was bleeding. I think that was too much info at one time.

August 16, 2013

-1
Reply
alex

never go to graduate school

August 19, 2013

1
Reply
HVdude

They post their samples at 360p? Even if it isn't delivering full HD quality just yet, they should at least show the best that's available. Theory is great, but the proof is in the pudding! I would bet that this type of image manipulation will never deliver the image quality of a lens that directly projects light onto the sensor. But I hope I'm wrong, as it would be great to have such capabilities! Oh, and they never mention what effect this device has on the speed of a lens.

August 16, 2013

2
Reply

+1

I was waiting for the moment they'd mention losing a few stops.

August 16, 2013

1
Reply
Rodrigo Molinsky

One of the alternative technologies is in the new Nokia phone. When you have a 41 megapixel sensor, you can then pick/magnify any portion of the screen down to 5 MP or so and it will already be in focus.

August 16, 2013

0
Reply
DLD

OK... is this a plug? A Nokia phone having 41 megapixels has absolutely nothing to do with, and is not even in the same league as, an add-on that simulates wider apertures, high dynamic range in one shot, and a whole bunch of other awesome stuff that I barely understood!

August 18, 2013

0
Reply

Awesome, I knew someone would invent some kind of filter to increase dynamic range...

August 16, 2013

-2
Reply
paul

It's really not that hard to follow along with the explanation of the technology but I will admit the actual calculations involved to achieve this are far beyond my comprehension. There are some amazing possibilities with this tech, potentially, but at the moment the image quality leaves a lot to be desired. The fact that they have to run the image through a diffusion filter just kills the sharpness completely.

I hope they can refine this.

August 16, 2013

-1
Reply
Damon

Do you suppose that light field photography could be combined with pupil tracking so that what you look at is always in focus on the screen? So in a 3D film you could decide to look at something in the back of the scene and it would adapt instantly to what you were looking at? This would simultaneously increase realism and take a tool away from cinematographers, who could no longer control what you saw in focus, but it would be very, very interesting all the same. I think it is one of the barriers to convincing 3D too, because in real life, your eyes are forced to focus on a given plane.

August 16, 2013

-1
Reply
Dovahkiin

Please do it!!!

August 16, 2013

-1
Reply
andrea

This is exactly the first thing I was thinking of when I heard of light field photography too!

Then maybe combine it with 3D goggles that track head movement and move an HD image around within a bigger image.

August 17, 2013

1
Reply

Just show me how to use it, and show me the results. I'm not too interested in the science behind it just yet.

August 16, 2013

2
Reply

Tremendous possibilities. Very cool.

August 16, 2013

-5
Reply

The ultimate "fix this in post" crutch.

August 16, 2013

1
Reply
ronn

Very cool, but I wonder if it's too much of a middle-man approach. I think this idea could be implemented more efficiently (as far as resolution and light loss are concerned) if the plenoptics were built into the camera body, using multiple CMOS sensors to capture the image. It would be an expensive camera, but man....

August 16, 2013

0
Reply
David Sharp

Did they have a custom built camera for interpreting the light field data? Or was it being output as a RAW file and they were able to translate the data from that?

August 16, 2013

3
Reply

Even at the developmental stage, this is pretty impressive stuff. In particular, the ability (on a limited scale) to navigate around the subject in three dimensional space *after* the image has been captured reminds me of Deckard's look-around-the-corner technology in Blade Runner. :D

Also the capacity to extend or retract the planes of action in the image provides an 'effect' equivalent to something which - outside the animal kingdom - has largely been impossible till now: a flexible lens surface. In the future, with this technology perfected, will we only need one lens? With this device, one could shoot extreme telephoto from a distance and then extend the p.o.a. to render the image like a 50mm or wide angle capture, which would be a mind-blowing capability, if realized.

Finally, away from filmmaking, there's been much talk about plenoptics as a possible game-changer for astronomical photography: imagine being able to separate out stars from one another in an image?

Anyway, I hope some serious money is thrown behind this technology.

August 16, 2013

1
Reply
Dolly

Good to know I'm not the only one who feels stupid after watching this video ;-O

August 24, 2013

0
Reply
Evelina