Sony plans to ship a 4K home theater projector, the catchily-named VPL-VW1000ES, for 25 grand in early 2012. Given the $13.50-a-ticket price to see a movie here in New York City, I've found myself disappointed at a few recent films where the image felt soft. Sony is on the record about 4K in theaters (PDF link), and I'm convinced that it is indeed the future for the big screen. But at home? I have a 720p projector in my apartment, and it looks pretty damn good. I can only imagine that 1080p would look better, and I don't know that I could ever tell the difference between 1080p and 4K. Still, that's not stopping Sony -- and RED -- from pushing 4K projection in the home.
Always one to push 4K, the folks at RED also have a 4K laser projector on the way, for "theater and home." The two companies take different approaches to 3D, with Sony relying on active-shutter glasses, an approach RED founder Jim Jannard has called "incredibly stupid." But even in only two dimensions, 4K projection is only half the battle. You need a playback system, and with current 4K players costing $65,000, that price is going to have to come down a lot before 4K reaches the home. RED's solution is called RED RAY, and I got a look at it this past April. Here's my aptly-named hands-on video:
The consumer version of the above was, at the time at least, to be priced at "$1,000 or under." But as with everything RED, take specs and dates with a grain of salt, as word was floating around at NAB (April, mind you) that the RED RAY would be shipping "in the next eight weeks."
The math shows that on a 40-50" screen at normal viewing distances, 1080p is plenty, which makes it unlikely that 4K will ever have the adoption rate of 1080p HDTV. Is 4K on a plasma screen overkill? Certainly. But if you've got dreams of one day building a proper projection-based home theater, then 4K may be in your future. Even older film restorations look better at 4K. I also wonder about the relevance of higher resolution for 3D applications: as far as production is concerned, on the latest installment of Pirates of the Caribbean they used the 4.5K resolution of the RED ONE MX to set the 3D convergence in post, which leads me to wonder if an ultra-high-resolution projector could somehow take two video streams and set the appropriate convergence in real time... okay, now we're talking "overkill."
Consumers are already swamped in 3D offerings. At some point the public's willingness to buy this stuff for incremental gains stops. Furthermore, Blu-ray is good enough for the average viewer (especially when you introduce additional costs into the equation).
I'm already getting tired of hearing about 4K.
November 30, 2011 at 10:27AM, Edited September 4, 7:54AM
I am not sure I am the average consumer, but DVD is good enough for me. There is a point of diminishing returns or bang for your buck. But if some people want to buy the premium stuff, have at it I say. More power to you.
November 30, 2011 at 10:48AM, Edited September 4, 7:54AM
IMAX did an experiment with 4K. They showed two pixels, one white, one black; then four, then eight, and so on. Before they got to 4K the screen was completely gray if you were more than about six rows back. This is in IMAX. So yeah, I feel it's fair to say that a 4K projector for any sort of home use is overkill.
November 30, 2011 at 10:48AM, Edited September 4, 7:54AM
Wouldn't that make it perfect for home use, then? Since in a home you'd be sitting closer to the screen, you'd therefore get greater benefit from the 4K projection. Maybe? (I'm not sure myself, just thinking aloud.)
Coincidentally, yesterday I re-read a couple-year-old Creative COW piece by John Galt where he mentions this anecdote.
November 30, 2011 at 11:15AM, Edited September 4, 7:54AM
Kevin, you're right: at home you're sitting way closer to the screen than in a cinema, but you most probably also won't have a screen as big as an IMAX theater's.
Resolution isn't the only thing that matters; the pixels per inch (PPI) projected onto the screen matter too, as does the viewing distance. That means you can get a much higher PPI from an HD projector than from a 4K one: all you have to do is make the HD projector's projected area small enough (assuming the viewing distance stays unchanged).
Then again, if you project the same area with a 4K and an HD projector, you have to sit close enough to even see a difference. How close, I can't tell, but I have heard: pretty close. :)
As far as I know, at a reading distance of approx. 30 cm you won't notice any pixels as long as there are around 300 PPI (see the iPhone's Retina display).
Please correct me if I'm wrong.
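For what it's worth, the arithmetic behind this is easy to sketch. Here's a rough Python calculation; the 16:9 aspect ratio and the commonly quoted ~1 arcminute of visual acuity are my assumptions, not hard vision science:

```python
import math

def min_distance_to_blend(diagonal_in, h_pixels, acuity_arcmin=1.0):
    """Distance (inches) beyond which adjacent pixels can no longer be
    resolved, assuming a 16:9 screen and ~1 arcminute visual acuity."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pixel_in = width_in / h_pixels                    # size of one pixel
    # A pixel "disappears" once it subtends less than the acuity angle.
    return pixel_in / math.tan(math.radians(acuity_arcmin / 60))

# On a 100" projection screen: how far back before extra pixels stop mattering?
for label, px in [("1080p", 1920), ("4K", 4096)]:
    print(label, round(min_distance_to_blend(100, px) / 12, 1), "ft")
# → 1080p 13.0 ft, 4K 6.1 ft
```

In other words, under these assumptions you'd need to sit closer than about six feet to a 100" screen before 4K buys you anything over 1080p.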
November 30, 2011 at 12:48PM, Edited September 4, 7:54AM
My own empirical tests say my eyes can see slightly above 2K, but certainly not 4K.
You can run my test yourself with the video at the end of this post.
December 1, 2011 at 8:27AM, Edited September 4, 7:54AM
I ran my test again, on a third screen (in this case a smallish PC screen: 22", 1080p), and the result is the same as with previous tests: I can't see anything above 1080p, as long as my eyes are farther from the screen than the length of its diagonal.
Sitting closer than that would be crazy, so yes: for me, 4K is overkill.
December 1, 2011 at 8:40AM, Edited September 4, 7:54AM
I am very happy with 1080p for home use. I don't think that anyone, besides the home theater enthusiasts, needs 4K. There have been too many format changes in a very short time (DVD -> HD DVD/Blu-ray -> 3D) and I don't think the market will be ready for another one...
November 30, 2011 at 11:59AM, Edited September 4, 7:54AM
You've got a point there. I was actually wondering just how big a disc would have to be to hold, say, a 93-minute movie shot at 4K. I know most Blu-ray movies are somewhere in the 25GB to 50GB range, so what would a 4K movie be?
November 30, 2011 at 12:33PM, Edited September 4, 7:54AM
Since 4K is twice as many pixels as 2K, then if we assume Blu-Ray quality levels, you'd be looking at 100gb to 200gb for one film.
But why stop there? Why not use a DCP style playback mechanism where every frame is a separate JPEG2000 image? A typical 2K DCP that would be projected digitally in a theater might come in around 200gb. I guess that means a 4K DCP could weigh in at 800gb, nearly a terabyte.
Personally I'm sticking with 1080p in the home until 4K gets a whole lot cheaper. Maybe someday.
November 30, 2011 at 1:46PM, Edited September 4, 7:54AM
4K has four times the pixels of 2K: it's an area, so both width and height double.
Sorry for the smart-assing.
December 1, 2011 at 6:37AM, Edited September 4, 7:54AM
A 4K video would have about 4x as many pixels as a 2K video. So if you make your "4K class" video system quadHD (e.g., 3840x2160), you'll be encoding four times as many pixels as 1080p.
That doesn't mean four times the bitrate, any more than Blu-ray required six times the bitrate of DVD for six times the pixels. In fact, Blu-ray peak bitrates are less than four times that of DVD. Of course, using the best Blu-ray CODEC, AVC, you have about twice the coding efficiency. But in general, Blu-ray rates are more like 20-30Mb/s average than peaked out. They vary considerably, and of course, anything encoded in MPEG-2 will be higher bitrate... you wouldn't want MPEG-2 for your 4K disc format.
So, you'd expect 3-4 hours of video on a Blu-ray in AVC. You might get away with 50Mb/s for the 4K film, perhaps as high as 75Mb/s. That's over the rate of standard Blu-ray, but within the rates of 3D (up to 60Mb/s) and 2x BD drives, so it's not new tech. That would give 1.5-2.0 hours on a disc... not great.
Using today's tech, though, there's already the BD-XL format, up to 125GB per disc. So that's 4-5 hours per disc in 4K, or a bit higher AVC bitrates (up to 80Mb/s or so, on a 2x disc). Typically, doubling the resolution doesn't require doubling the bitrate, since the block-oriented compression algorithms are working on smaller blocks, slicing the image finer. And even at the same level of compression artifacts, your eye won't pick out the smaller artifacts on a higher-definition image.
So this can all be made pretty much from off-the-shelf tech. A video DSP that can decode quadHD would be the real issue... maybe run several in parallel, which is kind of the tech, in reverse, behind JVC's 4K prosumer camcorder (it works like four synched separate cameras, even recording to four parallel SD cards).
Of course, if the powers that be wanted more, they could also use the H.265 algorithm for video, which is being finalized early next year. This is expected to have twice the coding efficiency of H.264/AVC, so a 4K video might be only 1x-1.5x larger than a current HD film using H.265. But the decoding hardware would need even more horsepower.
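As a sanity check on the capacity arithmetic in this thread, here's the bitrate-times-runtime math in a few lines of Python. The bitrates are the comment's own ballpark figures, not official spec numbers:

```python
def hours_per_disc(capacity_gb, bitrate_mbps):
    """Playback hours a disc holds at a constant video bitrate.
    Uses decimal GB (10^9 bytes), as disc capacities are marketed."""
    capacity_bits = capacity_gb * 8 * 1000**3
    return capacity_bits / (bitrate_mbps * 1e6) / 3600

# 50GB dual-layer Blu-ray at a hypothetical 50Mb/s 4K AVC stream:
print(round(hours_per_disc(50, 50), 1))    # → 2.2 hours
# 125GB BD-XL at 75Mb/s:
print(round(hours_per_disc(125, 75), 1))   # → 3.7 hours
```

So a standard 50GB disc really does top out around two hours of 50Mb/s video, while BD-XL gets you into feature-length-plus-extras territory, matching the rough figures above.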
September 6, 2012 at 12:10PM, Edited September 4, 7:54AM
I'm waiting for 8K, to hell with 4K. Next we'll see 4K iPad screens, and endless other shit nobody needs.
The thing with TV and movies is that I have no desire for them to be projected, presented, played in reality resolution. To me the whole point of film and TV is that it's an escape. The moment it looks real, you lose me.
Sony will sell a lot of 4K projectors to basement porn watchers, 'cause there's nothing better than "in-grown hair reality" resolution.
November 30, 2011 at 12:17PM, Edited September 4, 7:54AM
Actually, there's an industry group working on 8K. I suspect they've realized that 4K, for the home anyway, is likely just a mode added to existing 2K/HD gear, like Blu-ray, rather than a whole new thing. So starting 8K now makes a whole new set of technologies possible.
But whether the line is drawn at 2K or 4K or 8K, there's definitely a point of diminishing returns here. The quality improvements may technically be about the same (6x boost for HD over SD, 4x most likely for consumer 4K being quadHD, the 8K systems being quad 4K, etc.). You certainly don't need 4K or 8K, and most won't budge until software for consumers and full toolchains for pros (though honestly, that's already kind of "in there," at least for some tools) are available and priced right.
And then there's football. Unless 4K football looks better than 2K/HD football, don't expect any big push, anytime soon. But for the tech sector, there's always the "because we can" factor. If someone's selling me a low-enough cost 4K television, I'm probably buying. That could phase into the available units, just as Blu-ray has moved into the cheaper segments of the market... anyone who just wants a good DVD player may have very few non-BD choices. And given I have 3K on my home desktop already, I can see a 4K monitor there in another 5-6 years, no problem. On a PC, that extra screen space is useful (and that's with two monitors, and an occasional third).
Unless, of course, Microsoft relegates us to nothing but full screen applications... that'll kill the need for more screen resolution, once every application is designed for tablets and tablets only.
September 6, 2012 at 12:20PM, Edited September 4, 7:54AM
Yes, I think it's overkill unless you have a crazy home theatre. Honestly, I don't even know why they're trying to push video beyond 1080; nobody is going to get a big enough TV to notice any difference at all (again, unless you're rich and have a crazy home theatre). What's ridiculous is that once 4K is widely used, they'll try to push another, bigger resolution on everyone. Obviously, improving technology is a noble goal, but it's useless if nobody's going to notice that it's changed.
November 30, 2011 at 1:00PM, Edited September 4, 7:54AM
At a certain point in the near future, resolution will be like audio bit depths and sample rates: anything beyond a certain number will be completely negligible. Example: 24-bit audio at 48kHz is, and has been, a standard of "high quality" audio for years, even with the advent of 96kHz and higher. Fact: you can't hear the difference. When computers get faster, 4K will most likely be that visible limit, and anything beyond it will not make a difference to the human eye, just as 96kHz does not make a difference to the human ear.
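The audio half of that analogy can be checked with one line of arithmetic: the Nyquist theorem says a given sample rate can represent frequencies up to half that rate, and human hearing tops out around 20kHz. A quick sketch (the 20kHz hearing limit is the usual textbook figure):

```python
# Nyquist limit: a sample rate can represent frequencies up to half that rate.
HEARING_LIMIT_KHZ = 20.0  # approximate upper bound of adult human hearing

for rate_khz in (44.1, 48.0, 96.0):
    nyquist = rate_khz / 2
    headroom = nyquist - HEARING_LIMIT_KHZ
    print(f"{rate_khz} kHz sampling -> {nyquist:.1f} kHz ceiling "
          f"({headroom:+.1f} kHz beyond hearing)")
```

Even 44.1kHz already clears the audible band; 96kHz only adds headroom no ear can use, which is the commenter's point about resolution.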
November 30, 2011 at 3:08PM, Edited September 4, 7:54AM
Thanks but I'll wait for 8k 4D and the free zero gravity spacesuit that comes with it but must be worn to experience the sensory effects built into the film.
November 30, 2011 at 3:49PM, Edited September 4, 7:54AM
Try LSD. Then you don't have to hassle wearing a suit.
November 30, 2011 at 8:36PM, Edited September 4, 7:54AM
LSD, Chris Nolan film...same thing, actually LSD would produce a better structured narrative.
December 1, 2011 at 4:03AM, Edited September 4, 7:54AM
My parents still can't tell the difference between HD and SD. I don't think this is uncommon.
November 30, 2011 at 6:17PM, Edited September 4, 7:54AM
My dad got a 42-inch LCD Full HD TV with one of those horrid fluid-motion gimmicks installed that makes everything look like it was shot on interlaced home video (this technology is so impressive it was probably designed by Satan himself). That TV set is in their bedroom, and in their living room they've got a similar TV set that shows progressively shot material as it should be.
I asked my parents if they could see a difference between the Ryan Reynolds going about in their living room and the Ryan who was also being displayed in their bedroom.
They couldn't tell a difference.
For me, it felt like not being able to tell the difference between a baby's head and a football.
When I pointed out there's a difference in motion and asked again if they could spot something, they said they did notice something was kinda different, but they didn't really care. Oh yeah, and my mom has complained that Full HD broadcasts feel like knives penetrating her eyes. So at least they can 'feel' some difference between SD and HD. Ordinary everyday consumers rock.
December 1, 2011 at 1:54AM, Edited September 4, 7:54AM
Now, Stewie, don't get upset about that snide remark!
September 6, 2012 at 12:21PM, Edited September 4, 7:54AM
It's overkill. I'm probably the only one who thinks like this, but I never thought HD was something worthwhile, and I still don't. I mean, come on, did anyone complain when the only thing we had was SD? Nope, no one said a word. But as soon as the HD marketing machine started, everyone was like: "Oh yeah, SD looks totally crap, the only way to go is HD."
Bullcrap. This is just the usual tendency to milk consumers for more money. It's a useless gimmick. Did any of the past "advancements" enhance the feel of a movie? I doubt it. A shitty movie will be a shitty movie regardless of whether it was shot in SD or 4K or whatever.
Also, just as Erik said, "generic" everyday people can't tell the difference between these things, nor do they care. They can enjoy 30fps SD movies, while we constantly blab about the need for 24fps and higher resolution; the "bigger side" is clueless about and uninterested in all this.
December 1, 2011 at 2:25AM, Edited September 4, 7:54AM
You can't possibly think HD is a gimmick! You're just not big on celluloid, are ya... You go to the movie theater and see an incredible image, so you decide you want to own the film when it comes out. Except your crappy TV is showing the wrong colors at a fraction of the resolution; it's not the same movie, because it's not how the people who made it intended it to be seen... I would be disappointed! You saw cyan in the theater, but the home version only comes in blue.
Now that HD is possible for everyone, I go to the theater half as much as I used to, thanks to enjoying a comparable experience at home. It's a wonderful thing. Would I spend $25k for 4K res at home? Probably not. Do I want it? Sure! Does HD make a crappy movie better? No, I'm not saying that either. But saying HD is a gimmick?! Do you buy ten-dollar headphones and call Bose and Sennheiser sound quality a gimmick too? Just because you're happy with the inferior quality doesn't make the best a gimmick.
I for one was never happy that the home experience was not like the one at the movie theater. It didn't sound the same; it didn't look the same. Did I still love the movie at my crappy home quality growing up? Yes! Would I go back to that? NEVER!! If you play HD and SD back to back, you'll think your sight is going bad. I SAY SD WAS THE GIMMICK!
December 1, 2011 at 4:15AM, Edited September 4, 7:54AM
So you watch a movie to see details and accurate colors? I don't know about you, but I watch a movie to see a story. I couldn't give a rat's ass if the cyan looks blue to me.
Sorry, you've just proven the man's point. I watch movies in HD and I watch my DVD flicks on my plasma and I get equally engaged by good stories. HD looks better... of course, but it's not a requirement for me to enjoy a film. In that sense... his point is quite valid and both HD and 4k are gimmicks.
You should try watching a movie. I mean... really watching a movie, not sitting there counting pixels and measuring color accuracy. ;)
December 1, 2011 at 6:58AM, Edited September 4, 7:54AM
How is HD a gimmick? It does bring the picture closer to the image one would have experienced in the theater. While marketing may have used it as a tool to generate revenue, the technology itself isn't gimmickry, just an improvement in image reproduction.
And to be honest, as long as films have been projected, there's never been a point where we "only had SD." In the broadcast world, yes, due to television sets' physical limitations, but not in film. Hell, if you wanted, you could have ordered your own 16mm or 35mm prints to enjoy them in "higher resolution."
I agree, story comes first, but I do appreciate a better scan of my favorite films more than a low-resolution one.
December 8, 2011 at 10:16PM, Edited September 4, 7:54AM
You mean, complain about fuzzy images, Never Twice the Same Color, that weird feeling that you saw a completely different film in the cinema than at home, etc. Yeah, I complained. Never bought a single pre-recorded video cassette (did some transfer work to my SVHS deck, and the occasional bit of time shifting), because I didn't think the image quality made it sufficient for pay-viewing.
It's like many other things. Some people are totally happy with the quality of their iPod earphones, or their iPhone as a camera, etc. Others have found it worthwhile to own DSLRs with many lenses, to buy music on Blu-ray or from HDTracks, etc. Some watch television on a 20" screen, others a much larger screen (mine is 71", and I could go for larger).
I think it's true that with each major improvement, the set of those who care is smaller, and perhaps substantially smaller. That, and the format war, pretty much killed the first shot at commercial better-than-CD audio. Blu-ray has done better vs. DVD, but in part by becoming DVD... all but the cheapest players today do both. And still, the growing thing is heinous-quality online video (3.8Mb/s peak in 720p VC-1 for Netflix)... but hey, if you only care about the story, not the actors' expressions, the soundtrack, the cinematography, etc., that'll do ya.
September 6, 2012 at 12:29PM, Edited September 4, 7:54AM
I've seen my own 720p and 1080p footage on the big screen, and I hate to say it, but the 720p looked perfectly fine. Nobody said anything to me about it, and the other partner on that production said he even preferred the look of the 720p footage to the 1080p. This was all talking-heads stuff, so I guess the stubble on the guys' faces wasn't as, how to put it, 'jarring' to the eyeballs. Put it this way: on the 720p stuff the stubble wasn't attention-grabbing, whereas on the 1080p talking heads your eyes get distracted by the sharpness of the stubble and you can't help focusing on it. Maybe it was just me, and everyone else focused on what he was saying. It's a hard call for the shooter to make on stuff like this, though, since by then we've totally had enough of the flow of the storyline and are still picking the editing to bits, back of mind racing along with the story: could have done that a bit better, etc. Or am I the only one who thinks like that when watching my own content being shown for the billionth time?
December 2, 2011 at 8:57PM, Edited September 4, 7:54AM
With 1080 we should probably think about using soft-focus filters ;)
Nobody complains about 24MP still pictures being annoyingly sharp, usually because they have been Lightroomed/Photoshopped to look pleasing.
With video, we were used to the fact that nothing ever was really sharp, and cameras had to do a lot of digital sharpening so the SD picture would look good. That's over with 1080. We have to get used to it (and by "we" I mean the shooters and editors).
December 3, 2011 at 7:33AM, Edited September 4, 7:54AM
Friends, you all are really expert, so please can anyone explain to me what 1920x1080 means? I'm not an expert, and whenever I watch a 1080p video on my screen I can't visualize what the hell this pixel height is at all. And one more thing: I've read that The Wizard of Oz was scanned at 8K, which took a total of 22TB, and then came down to 22GB after converting it to 1080p. How can a movie from 1939 come in 8K? Please try to explain in very simple words, because I'm just a normal customer, not an expert like you all are.
December 8, 2011 at 6:32PM, Edited September 4, 7:54AM
One word: film. It doesn't really have all that much to do with age. Oh, and of course, The Wizard of Oz was shot on monochrome stock, using the original three-strip Technicolor process (one reel for each of the RGB colors). So it's actually much higher effective resolution than a modern film shot on color stock.
An 8K scan is 32 megapixels. That's certainly overkill for a single 35mm half-frame image. But at 4K, you have 8Mpixels... you might get more useful resolution out of very high quality film images. I know from stills: I did 10Mpixel scans of my old Kodachrome 25 full-frame transparencies, back when you shot and scanned if you wanted the best quality digitals to work on. My then-current Ektachromes were showing visible grain at 10Mpixel, but the K25 and Velvia 40 were not. If you don't see the limit of the film image in your scan, there's more there to be had. And these guys are using high-end professional gear, too, so chances are they're getting a bit more from every frame than I manage, just on principle.
I can't imagine there's any reason to scan higher than 8K for anything on 35mm film. But there's still 70mm and IMAX, eh? Higher-still effective resolution.
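For reference, the pixel counts work out like so. I'm assuming DCI-style container sizes here; the 8- and 32-megapixel figures above line up with these, give or take the exact scan aspect ratio:

```python
# Pixel counts for DCI-style 2K/4K/8K containers (width x height assumed).
containers = {"2K": (2048, 1080), "4K": (4096, 2160), "8K": (8192, 4320)}

for name, (w, h) in containers.items():
    # Each step doubles both dimensions, so the pixel count quadruples.
    print(f"{name}: {w * h / 1e6:.1f} Mpixels")
# → 2K: 2.2 Mpixels, 4K: 8.8 Mpixels, 8K: 35.4 Mpixels
```

The quadrupling at each step is why disc capacity and decoder horsepower, not the projector, become the bottleneck so quickly.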
September 6, 2012 at 12:37PM, Edited September 4, 7:54AM
A lot of image "softness" has nothing to do with the projector and resolution. Often it has to do with the cameraman not finding critical focus, and with crappy rack focusing. Indie festivals are strewn with this. Some scenes are just shot poorly, even in Hollywood-budget films that are in a rush to get things delivered as quickly as possible. I can't stress enough how important an individual cameraman's skill is. Sharpness is the most overrated thing in the world, because most cameramen can't even get the sharpest image out of the camera due to their own physical limitations. In other words, a better cameraman can get a sharper image out of the same camera than another guy.
September 25, 2013 at 7:33AM, Edited September 4, 8:21AM