October 5, 2013

Who Needs a $20,000 Lens? New Technique Creates Amazing Images from Cheap Lenses

If you've taken photos or video with a cheap prime lens, like a newer 50mm f/1.8, you might have been surprised by how well it performed. That's because lenses have been designed with the help of computers for some time now, and even cheap lenses can correct for many of the issues that must be accounted for to get a sharp, error-free image. But lens development for a given sensor size can only get so good, and if you want near-perfect lenses, like Zeiss Master Primes or Leica Summilux-Cs, the cost is very, very high. What if we're going about this all wrong, though, and we should instead use the considerable power of cameras or post-production to make our lenses essentially perfect? That's exactly what the SimpleLensImaging research group is working on.

The group has taken an extremely cheap single-element lens and created fantastic images from it using a complicated algorithm:

Modern imaging optics are highly complex systems consisting of up to two dozen individual optical elements. This complexity is required in order to compensate for the geometric and chromatic aberrations of a single lens, including geometric distortion, field curvature, wavelength-dependent blur, and color fringing.

Many cameras already correct internally for specific lenses, and some can even do it for video -- but this is another situation entirely. While we've got products like the Metabones Speed Booster that actually improve performance as long as the lens was designed for a larger format than the sensor, lens development has more or less plateaued for lenses under $2K or $3K. That's not to say lenses aren't getting better -- they absolutely are -- but most of the improvement over the last 5-10 years has been in zoom lenses, which until recently lagged behind primes in performance.

Cameras will reach a point where they have far more processing power than they need, and they'll be able to perform all sorts of calculations in real time, for both photo and video. Lenses won't need to cost $20,000 to be perfect, because cheap lenses will be able to perform almost as well thanks to these kinds of algorithms. That doesn't mean there won't be benefits to quality lenses: getting something right optically in the first place may still beat even excellent algorithms, but something like what is being done in the video above closes the gap tremendously.
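The core idea is deconvolution: you measure how the simple lens smears a point of light (its point spread function, or PSF), then invert that blur in software. The researchers' actual algorithm is considerably more sophisticated (it uses cross-channel priors to handle chromatic aberration), but a minimal, hypothetical sketch of the underlying operation -- plain Wiener deconvolution on one channel, using NumPy -- looks something like this:

```python
# Illustrative sketch only -- NOT the paper's actual algorithm.
# Assumes the lens's blur kernel (PSF) has been measured separately.
import numpy as np

def wiener_deconvolve(blurred, psf, noise_power=1e-3):
    """Estimate the sharp image from a blurred one, given the lens PSF."""
    # Pad the PSF to the image size and center it at the origin so the
    # FFT treats it as a circular convolution kernel.
    kernel = np.zeros_like(blurred)
    kh, kw = psf.shape
    kernel[:kh, :kw] = psf
    kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H = np.fft.fft2(kernel)
    # Wiener filter: conj(H) / (|H|^2 + noise) regularizes the inversion
    # so frequencies the lens destroyed don't blow up into noise.
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))
```

The `noise_power` term is the whole trick: a naive inverse filter would divide by near-zero frequency responses and amplify sensor noise enormously, which is why the real research spends most of its effort on smarter regularization than this.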

You can read more about the techniques here.

Link: High-Quality Computational Imaging Through Simple Lenses


39 Comments

But lens development for the same sensor sizes can old get so good?

October 5, 2013 at 6:49PM, Edited September 4, 11:21AM

0
Reply
Larry Vaughn

*Only

October 5, 2013 at 6:53PM, Edited September 4, 11:21AM

0
Reply
Joe Marine
Camera Department

Technology is awesome sometimes.

October 5, 2013 at 7:16PM, Edited September 4, 11:21AM

0
Reply

Does this mean the algorithm would be applied from the lens to the camera in real time, or is this something that has to be installed in the camera and/or applied in post? If the latter, I can't even imagine what problems a 1st AC might have pulling focus.

October 5, 2013 at 8:03PM, Edited September 4, 11:21AM

2
Reply

If this is implemented, there will likely be a new mount, new cameras, and new lenses where the lens contains the correction information for the relevant camera (upgradeable via lens firmware) and it all adapts nicely. It probably won't catch on, though, and will instead be put into cellphones, to reduce production cost and as a gimmick to increase price.

October 5, 2013 at 8:30PM, Edited September 4, 11:21AM

3
Reply
Tyler

Since it would all be done in firmware/software, you really wouldn't need a new mount, but yes, if you wanted to do it all in-camera live for video, you'd need new hardware. At the moment this is a software solution, but just like cameras already correct for lens issues, this is the next logical step, and I wouldn't be surprised if Nikon and Canon develop their own versions. They could bring manufacturing costs down on their lenses by making them simpler, and let the cameras do a lot of the work.

October 5, 2013 at 9:20PM, Edited September 4, 11:21AM

0
Reply
Joe Marine
Camera Department

How long will some of these positions like 1st AC even be around for though? In ten years it'll probably only take 1/6 of the people to make a film...

October 6, 2013 at 2:42AM, Edited September 4, 11:21AM

2
Reply
bwhitz

Lenses will now require their own software packages, but they can obviously be made lighter and cheaper under this concept. The happy medium probably won't be a single-element contraption for most products, with the possible exception of camera phones; at the consumer level, 4-6 element optics may be sufficient. The higher-end market will probably respond by investing in larger glass that passes more light, for higher dynamic range.

October 5, 2013 at 8:29PM, Edited September 4, 11:21AM

0
Reply
DLD

Maybe they can make a lens or program that takes 8-bit video and turns it into 12-, 14-, or 16-bit color video.
This looks interesting.

October 5, 2013 at 8:30PM, Edited September 4, 11:21AM

0
Reply
VinceGortho

I was going into optical engineering, but it looks like that might not be a wise choice any more.

October 5, 2013 at 8:55PM, Edited September 4, 11:21AM

3
Reply
Zach

Since being bitten by the anamorphic bug I'm hooked... maybe we'll end up getting smaller and cheaper anamorphics down the road. I'm working with a Helios 44-2 58mm in the back and a Hypergonar STOP 16mm in the front, and so far I'm impressed. Just waiting for my diopters to come in and I'll post some footage. It really adds to the cinematic quality we all long for. Thanks

October 5, 2013 at 9:47PM, Edited September 4, 11:21AM

1
Reply
Anthony Marino

You're going to be flooded with cheap anamorphics in the next 2 years. It's going to be great.

October 6, 2013 at 1:40AM, Edited September 4, 11:21AM

0
Reply
marklondon

A cheap anamorphic is just that, in every sense. There is a reason no one can make a lens comparable to the Iscorama 36. The thing cost a mint when it came out because they are difficult and expensive to make. Chinese knock-offs like the 1.33x SLR Magic look barely any different from just letterboxing footage.

October 6, 2013 at 2:57AM, Edited September 4, 11:21AM

0
Reply
Stew

Music to my ears! Thanks Mark. :)

October 6, 2013 at 11:09PM, Edited September 4, 11:21AM

0
Reply
Anthony Marino

It's partly already applied in point-and-shoot cameras.

October 5, 2013 at 9:57PM, Edited September 4, 11:21AM

3
Reply
Nazdar

I was watching this on the Magic Lantern page; it got some of the developers excited. I hope for the best.

October 5, 2013 at 11:40PM, Edited September 4, 11:21AM

0
Reply
Edgar

This is cool. Now, when are we going to see an inexpensive set of auto-focus lenses with focus and aperture controlled remotely by an inexpensive hand-held device and utilizing LSI's technology?

October 6, 2013 at 12:26AM, Edited September 4, 11:21AM

0
Reply

Or you might see more two-piece cameras like that new Sony smartphone attachment. You clip the processor+battery to your belt, hold the lens in your hand and control the focus/aperture/exposure via Google Glass using voice commands. Or you could have a GoPro size one handed stabilizer for the lens alone without a need for a much heavier MoVi.

October 6, 2013 at 12:37AM, Edited September 4, 11:21AM

0
Reply
DLD

Well, it can't add new information, except perhaps fill in blanks. What I could see reading some of the out-of-focus words was higher contrast, but they were no easier to read. Those lenses looked awful, and what we term "bad" lenses are much better than unusable. I think it would be more useful as a way to rescue slightly out-of-focus footage if there were no alternative. Might be an idea to put a speedbooster in front of such awful lenses.

October 6, 2013 at 3:59AM, Edited September 4, 11:21AM

2
Reply
Mark Scott

I have a few old manual lenses which have a big problem when the sun is in the picture. I tried to create the sunstar effect (with a closed aperture) with them, but it can't be done. The sun spills everywhere. Is this a problem of bad anti-reflective coating?

October 6, 2013 at 7:04AM, Edited September 4, 11:21AM

0
Reply
Laurel

Well, some "simplified" version of these algorithms already exists in some camera/lens duos. Panasonic implements this in some of their high-end ENG/studio cameras (and in simplified form in cheaper cameras like the HPX-250 and HVX-200), correcting especially for lateral chromatic aberration. They will win a technical Emmy next year for this.

http://www2.panasonic.com/webapp/wcs/stores/servlet/prModelDetail?storeI...

October 6, 2013 at 8:46AM, Edited September 4, 11:21AM

0
Reply
Francisco

I don't quite understand; if this is some sort of software algorithm, could it have implications for image processing in general? E.g., might it be possible to take some old '70s TV video and substantially improve it, like the demos here?

October 6, 2013 at 9:03AM, Edited September 4, 11:21AM

0
Reply
Saied

It looks like software-based processing for a given lens and a given sensor (in the above example). Hypothetically, of course, you could model any lens and any sensor and mix and match throughout, as long as someone bothers to write instructions for whatever is out there. This is why the Magic Lantern guys were so excited. If this comes down to writing software, they should be up for it.

Now, it'd be interesting if they could get an anamorphic image out of a regular lens. As long as you're just manipulating numbers...

October 6, 2013 at 11:49AM, Edited September 4, 11:21AM

0
Reply
DLD

Good question.

October 6, 2013 at 4:58PM, Edited September 4, 11:21AM

0
Reply
Sam Biel

Put this code in ML! :)

October 6, 2013 at 1:42PM, Edited September 4, 11:21AM

0
Reply
Premini

This is complete and utter nonsense. Throw out a bunch of words that nobody understands and draw up some pretty mathematical formula and all the idiots eat out of your hand.

You can't make something out of nothing. If the camera didn't capture the information, then you don't have it. Computers can pretty up images in all kinds of ways, but a crap lens is still a crap lens, and a good lens will capture better images, period.

October 6, 2013 at 5:04PM, Edited September 4, 11:21AM

0
Reply
buzzkill

lol, and how do you think Photoshop works, or how you change the speed of your footage in the editing timeline? All of this is calculated by formulas which almost nobody understands ;-) What about Shazam? That is freaky f... stuff!
The camera captured the information, but it's scattered: if you know the right order, you can put it back together :) It's maybe not 100% the same as reality, but you can barely tell the difference.

October 6, 2013 at 7:36PM, Edited September 4, 11:21AM

0
Reply
hawaj

I've got a bridge I can sell you - cheap!

October 6, 2013 at 7:45PM, Edited September 4, 11:21AM

3
Reply
buzzkill

That's right. It's sort of like rearranging 1, 2, 3, 6, 8, 7, 5, 9, 4 into sequential order. If you know where the errors are, you can correct them.

October 6, 2013 at 11:01PM, Edited September 4, 11:21AM

2
Reply
DLD

The camera DID capture the information; it's just that the visual information is distorted. If you know how it is distorted, you can correct for it and sacrifice (some artifacts aside) very little quality. I don't know if it will ever be possible to enable purely software-based anamorphic shooting (that's pretty extreme distortion), but in a way shooting anamorphic is a good example: when the exact nature and amount of the distortion is understood, it can be compensated for with very little loss of quality.

I do think that, like anamorphic, this technique will probably yield tell-tale signs or signatures which JJ Abrams may one day use to ruin movies, but overall it works. I did notice that while sharpness can be brought back, macro-contrast is still pretty terrible.

I personally wonder if this can be applied to lens profiles for existing camera/lens combos.

October 15, 2013 at 12:34PM, Edited September 4, 11:21AM

4
Reply

Could this be used in post? Possible After Effects plugin?

October 6, 2013 at 5:47PM, Edited September 4, 11:21AM

0
Reply

Settings for eliminating distortion from certain glass already exist for Lightroom, Ps, AE, and other software.

October 6, 2013 at 7:39PM, Edited September 4, 11:21AM

1
Reply
hawaj

What does this mean for the average user?
Can I do something with this information?

It seems like they found something which will have to be adopted by software companies before users can get their hands on it, right?

October 7, 2013 at 3:06PM, Edited September 4, 11:21AM

2
Reply
Masterman

So glad to see I was not the only one confused around here! Same questions.

October 19, 2014 at 1:05AM

2
Reply
Rebecca Pelagio
film student

Olympus and Panasonic Micro 4/3 camera bodies are already doing this in-camera, with the algorithms tweaked for individual lens performance. ( applies to both still photos and video )

October 10, 2013 at 5:36PM, Edited September 4, 11:21AM

0
Reply
Guy McLoughlin

@buzzkill: What's with the nasty tone? You haven't got a clue what you are talking about. If you can't understand the ideas involved in this research, what makes you think you can comment at all, much less make derogatory comments about "idiots"? This concept has nothing at all to do with "mak[ing] something out of nothing," as you have labeled it. Those of us who do understand the words used in the article are in fact pretty impressed. It is basically the same thing for moving images that is used to make studio-recorded audio sound as if it had been recorded in a cathedral, or a stadium, or whatever. You use the measured response of the sensor/lens combination to known stimuli, compare it with what the response would look like if the capture were perfectly transparent, and from this you can calculate with fairly high accuracy how to adjust the capture of an unknown, random stimulus to mimic the response of a perfect sensor/lens combination. That's it, in a nutshell. You should really examine your motives in posting such a bitchy, know-it-all response to something which is obviously beyond your intelligence. Have a nice day.

October 10, 2013 at 5:55PM, Edited September 4, 11:21AM

2
Reply
NedB


Amazing!

...Thank you

October 15, 2013 at 6:10PM, Edited September 4, 11:21AM

0
Reply

Hmm, I've read the whole thing plus their website, but I'm still not sure how, and if, I can apply this technique to my own equipment.

October 19, 2014 at 1:04AM

0
Reply
Rebecca Pelagio
film student