
Joe Rubinstein and his team at Digital Bolex have been developing a new, fully digital version of the most sought-after 16mm camera brand in the world, the Bolex. After some lively debates on their forums, Joe decided to address a topic he found particularly engaging for fans and naysayers alike: is 4K worth it? Joe delves into what 4K is all about after the jump.
This is a guest post by Joe Rubinstein, founder of Digital Bolex.
There’s a particular discussion that has come up again and again on our Digital Bolex forum that we’ve now noticed cropping up in reactions to the footage we posted last week. So I decided to open it up here for a more public debate. To 4K or not to 4K?
CAN OF WORMS
I know that this discussion is a heated one, and there is no correct opinion. The intention of this post is to tell you about the decisions we have made when designing our camera and about how we arrived at those decisions.
To me the core functions of digital cameras are:
- Drive the sensor in a clean way with good A/D conversion.
- Transport and store the image data collected by the sensor in the best way possible.
- Provide the user a good experience and a high-value proposition.
To me this means we create the electronics that run our amazing Kodak designed sensor, and then get out of the way so that filmmakers can have an image as close to sensor data as possible. Kinda like a film camera does with film.
Many camera makers believe their job is to make your life easier by giving you a few limited shooting styles and smaller file sizes through compression, again limiting your choices, this time in post. We believe our job is to make a camera that gives the maximum control and freedom to the artist, both on set and in post. This is our North Star, the guiding light behind all of our design choices. How do we get the most accurate representation of what the sensor captured to the filmmaker in the most pliable format?
RAW VS. COMPRESSION
Debayering is hard. Running a really nice debayer algorithm at 2K resolution, most desktop computers can only manage a few frames per second at best, and 4K takes even longer. To do this on the fly, most cameras use inferior algorithms.
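To make the idea concrete, here is a minimal sketch of what a debayer does (my illustration, not Digital Bolex's pipeline): the simplest approach, bilinear interpolation, just averages the nearest same-color photosites to fill in the two channels each sensor pixel never recorded. The high-quality algorithms that can't run in real time do far more work per pixel than this.

```python
def bilinear_debayer(mosaic):
    """Minimal bilinear debayer for an RGGB Bayer mosaic.

    mosaic: list of lists of floats, one sample per photosite.
    Returns an HxW grid of [R, G, B] pixels, each missing channel
    estimated by averaging same-color samples in the 3x3 neighborhood.
    """
    h, w = len(mosaic), len(mosaic[0])

    def color_at(y, x):
        # RGGB layout: R at (even, even), B at (odd, odd), G elsewhere.
        if y % 2 == 0 and x % 2 == 0:
            return 0  # red
        if y % 2 == 1 and x % 2 == 1:
            return 2  # blue
        return 1      # green

    out = [[[0.0, 0.0, 0.0] for _ in range(w)] for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sums, counts = [0.0, 0.0, 0.0], [0, 0, 0]
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        c = color_at(ny, nx)
                        sums[c] += mosaic[ny][nx]
                        counts[c] += 1
            out[y][x] = [s / n if n else 0.0 for s, n in zip(sums, counts)]
    return out
```

Even this crude version touches every pixel nine times; the better algorithms (gradient-corrected, adaptive, etc.) add edge detection and channel correlation on top, which is why doing them well at 4K in real time is out of reach for most camera hardware.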
D16 footage is impressive. Our designers and engineers have worked really hard, researching components, tuning the sensor to perfection, designing amazing analog to digital conversion modules, optimizing data paths and write speeds, and generally doing everything we can to protect the image integrity as it travels through the camera from sensor to storage. Basically it takes a lot of work to protect a 12-bit raw file as it travels through the camera. It isn’t automatic. Cameras are either built for raw or they’re not.
In the near future, when people inevitably make their camera comparison tests comparing raw footage on the D16 to other cameras, they will be impressed, even when the other cameras are much more expensive. But if/when we add compression formats, that will change completely.
The processing power in our camera won’t be good enough to run the best debayer algorithms. And when people do their camera comparison tests and compare our compressed footage to other cameras’ compressed footage, the image will be pretty much the same, except without the rolling shutter. All of our other advantages, all of the research, all of the hard work, all of our design efforts will be washed away by the tide of compression.
This is why I am hesitant to do it.
COLOR DEPTH VS RESOLUTION
There has been a big push from a lot of companies recently for 4K. They say it is the future, and I’m sure it is. But there is another, more quiet tech revolution happening, and it is one I think may be more important in the long run. It’s the Color Revolution.
When you go to a movie these days, most of the time you are seeing a 2K resolution image from a DCP, which in size isn’t that different from the 1920 x 1080 resolution of a Blu-ray disc (yes, there are 4K theaters, but I’m talking about your average screen in an average movie theater).
However, there is no way a Blu-ray looks anywhere near as good as the 50-foot movie theater projection. Part of the reason is that theaters use amazing projectors that are DCI compliant, but another reason is that the images they are projecting have 12-bit color depth. This is a huge difference from the 8-bit color we see at home, and the 8-bit color most reasonably priced cameras shoot, including many of the new 4K cameras.
Let’s break it down. With 8-bit color you get 256 shades each of red, green, and blue, which combined gets you 16,777,216 colors. That sounds like a lot, but it’s not when you compare it to higher bit depths. With 10-bit color you get 1,024 shades per channel, giving you over a billion different colors. And 12-bit is 4,096 shades per channel and over 68 billion colors! That’s some color rendition.
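For readers who want to check the arithmetic, it really is just 2^bits levels per channel, cubed for the three channels:

```python
# Shades per RGB channel and total displayable colors at each bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits        # levels per channel
    colors = shades ** 3      # every R, G, B combination
    print(f"{bits}-bit: {shades:,} shades per channel, {colors:,} colors")
# 8-bit: 256 shades per channel, 16,777,216 colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 colors
# 12-bit: 4,096 shades per channel, 68,719,476,736 colors
```

Note the growth is exponential: each extra bit per channel multiplies the total palette by eight.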
Why does this matter? Because just like resolution, bit depth is advancing too. There are affordable 10-bit monitors and 10-bit video cards these days. They don’t get as much radio play as 4K does, but they are every bit as revolutionary (and possibly more so). So in the future, when everything is Ultra HD, it will also be high bit-depth.
Bit depth vs. resolution in imaging is analogous to bit depth vs. sample rate in audio. In my opinion, it is much easier to hear the difference between 16-bit and 24-bit recordings than it is to hear the difference between 48K and 96K sample rates. It’s true that both 24-bit and 96K probably make recordings sound better, just as the extra detail in 4K does, but in audio the focus is usually fairly even on providing both simultaneously. People in audio don’t generally push 96K together with 8-bit the way that video/digital cinema companies push 4K together with 8-bit. When they do, it seems a little wonky to me.
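One way to put numbers on the audio half of that analogy: each extra bit of linear PCM buys roughly 6.02 dB of theoretical dynamic range (the standard 20·log10(2) approximation, not a figure from this article), which is why the 16-bit to 24-bit jump is so audible.

```python
import math

# Theoretical dynamic range of linear PCM audio in decibels.
def pcm_dynamic_range_db(bits):
    return 20 * math.log10(2 ** bits)

for bits in (16, 24):
    print(f"{bits}-bit: ~{pcm_dynamic_range_db(bits):.1f} dB")
# 16-bit: ~96.3 dB
# 24-bit: ~144.5 dB
```

Going from 48K to 96K, by contrast, only extends frequency response beyond what most listeners can hear, which matches the author's point that depth tends to matter more than rate.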
High bit-depth has been around for years, just like 4K. And professionals and tech junkies have been preaching about it for years, just like 4K. And it is finally getting to a price point normal people can afford, just like 4K. And just like 4K, the distribution side of the industry isn’t really ready for it yet, unless you are going theatrical in a major theater chain. There are very few computers and monitors that can handle 10-bit images right now.
I’m not suggesting anyone go out and purchase a new computer/video card/monitor in order to work in 10-bit right this minute. I’m proposing that when thinking about the future of imaging, we consider color depth at least as important as resolution.
Technology moves fast and we need to keep up, or at least we feel that way. But it actually isn’t moving that fast. The first CDs were released in 1982, 30 years ago, yet it has only been in the last five years that digital music distribution has become a major player in that marketplace. Blu-rays were first released in 2006. It’s entirely possible that it will take Blu-ray as long to dominate the marketplace as it did the CD and DVD, which both took about 15 years to reach a 75% market share.
In today’s fast-paced, high-tech YouTube world there are still almost no TV broadcasts in 1080p. Most of the big players in online media delivered to your TV, like iTunes and Netflix, adopted 1080p just a little over a year ago, and most of the content on these platforms is still 720p. In television, 720p is even considered a premium that subscribers pay extra for.
The current HD standards were put into place in the mid-’90s, yet standard-definition DVDs still outsell Blu-rays almost 4:1. Many analysts thought Blu-ray would be outselling DVD by 2012, but adoption has been slower than expected. Many financial papers are still talking about the growing popularity of HD even today. HDTVs only hit 75% market saturation here in North America last year!
How long will it take for all of our content delivery to be in HD of any kind? How long before it’s 1080p? How many years will it take for a majority of screens to be 4K? How many millions/billions of dollars will it take? How much will it cost for servers to host libraries of 4K content? How long will it take to create the infrastructure/bandwidth capable of streaming 4K online to average homes?
In essence, how long will it take to even show your 4K film to an audience in the format it was created in? Probably longer than we expect, considering all of the tiny moving parts it takes to embrace a new technology on a worldwide scale.
So is 4K the future? Yes it definitely is. Is it here today? Well sort of, but not really. Netflix/iTunes in 4K? Sure, in 2030. Is 4K necessary for me to make movies? Absolutely not. Is 4K right for me? That’s really the question at the heart of this debate, and only you can answer it.
I would say if you get hired to make Avatar, by all means, use the highest K you can find. But if you’re making a gritty indie film, or most TV shows, I think 2K is more than appropriate. In the film world there were dozens of formats in the early years, and eventually the market settled down to S8, 16/S16, 35/S35, and 65. I believe the same will happen with digital. Over the next 20 years the markets will settle into a few tiers. 4K will be one of them, but so will 2K.
I’m a low-budget filmmaker, and I’m proud of that. To me, a higher bit depth is more important than a faster sample rate or more pixels. I think in the end what’s most important is that you can fall in love with the creative work you’re doing.
I had that years ago with 16mm film, and I’m finding it again with the D16. If you fall in love every time you see a 4K image, then that’s a good choice for you. I just don’t want you to feel that if you don’t have 4K you can’t have great images, and you can’t tell stories.
At the end of the day resolution is only one of many, many factors, and they all should be considered evenly, at least in my opinion.
This post originally appeared on the Digital Bolex blog.
Joe has 7 years experience as a Director of Photography for independent films, and 6 years experience in start ups. He worked with engineers to develop the custom hardware and software solutions that turned Polite in Public Inc, his previous company, into one of the most successful photography based event marketing companies in the country. He has extensive experience with 16mm film.
178 Comments
Really great article. I've been confused since the beginning about why so many people on the forums were so excited about 4K. It's overkill for the vast majority of us. Personally, all other things being equal, I'd still go for the 2K camera.
August 23, 2013 at 7:12AM, Edited September 4, 8:21AM
Great post. I own a 4K camera and love it, but ironically not for the resolution, but for the color depth compared to say, a DSLR.
Why don't I appreciate the resolution as much? Well, for starters, I've NEVER SEEN a 4K motion image, haha. As odd as that sounds, it's true, and likely true for many of us. Still, as has been preached many times, 4K is nearly as important for downsampling as it is for outright 4K viewing (or so I've heard; gotta stop by a Best Buy and see 4K). So I appreciate that about it, of course.
But if I had to choose between 2K/12-bit and 4K/8-bit RAW cameras, I'd take the one with the greater color-depth.
August 23, 2013 at 7:17AM, Edited September 4, 8:21AM
12-bit color seems like a no-brainer to me. Can anyone remember the early compact digital camera days, when companies boasted about their megapixels but everything else was terrible? The same thing is happening in video with 4K at the moment.
August 23, 2013 at 7:46AM, Edited September 4, 8:21AM
Agreed. I find images shot in 4K extremely distracting to the story, or lack thereof. I watch lots of raw 2.5K and 4K videos on Vimeo, and they are boring as hell. It's mostly landscape "porn" videos and really bad dialog with bad continuity, because the 4K picks up on all the mistakes. Film is meant to be an illusion, folks!!! Screw this zit-macro image-reality phase. It will pass!
August 23, 2013 at 7:33PM, Edited September 4, 8:21AM
IMO, the article mixes together two different discussions: one of economics, such as whether or not 4K will be accessible to an average viewer via any physical format or streaming, and one about the science of image acquisition.
With regard to 4K, a physical format is, by and large, unneeded, because the various streaming/server options from Red, Sony, and Netflix (expected in 2014) are either out already or will be soon. The follow-up question is to ascertain the potential financial interest, on the part of filmmakers of all shapes and sizes, in engaging in what will be a limited but fairly high-end market. To that, it's quite possible that the visual quality of the new 4K sets and the initial scarcity of 4K footage will create a market for "eye candy" showcase clips, in which case a videographer may as well rent that 4K cam.
As to color science, 14-bit raw 2K is already achievable on a (reputedly stable) 5D Mk III Magic Lantern hack. No doubt, given that 1080p is presently available on $330 point-and-shoot cameras and smartphones, a raw option will soon be made available on higher-grade DSLR-type pieces. The only unknown variable is the price point at which such features become available. It may be as low as $1K (as in BMD's Pocket Cam) shortly.
August 23, 2013 at 8:13AM, Edited September 4, 8:21AM
Just wanted to say, keep up the good work! Can't wait to try out your cam.
On the 4K issue, we have two prodco clients (commercial and industrial clients, not drama) who require 4K acquisition. As more sophisticated digital 2/2.5k cameras appear that may change, but that's what they've had the best results with since '10. Only the Alexa budged them, and then not for long.
In broadcast, Europe is already gearing up for a format change, which I believe will be greater than 4K.
Again, cheers.
August 23, 2013 at 8:51AM, Edited September 4, 8:21AM
What will 2/2.5K do when 6K crashes the scene?? For that matter, what will 4K do?
August 26, 2013 at 10:28PM, Edited September 4, 8:21AM
Well, I like the 14-bit raw video from the 5D Mk III!!!
August 23, 2013 at 9:05AM, Edited September 4, 8:21AM
1080p, 2K, 4K, there are indeed a lot of flavors, but at the end of the day most movie screens are 2K, and honestly, when you think about it, 1080p has done quite well; 2K to 2.5K downsampled is good enough.
The fact that AVATAR, a 1080p digital movie, was projected on IMAX speaks volumes about the resolution wars; 1080 HD never looked better on the big screen.
August 23, 2013 at 3:07PM, Edited September 4, 8:21AM
What could James Cameron do with 6K?
August 26, 2013 at 10:30PM, Edited September 4, 8:21AM
I believe 4K existed when Cameron shot Avatar, but it seems he didn't think it was necessary. Hmmmmm.
August 27, 2013 at 1:37AM, Edited September 4, 8:21AM
You did forget one reason to film in 4K:
To have more editing space. If your film is shaky, it's a lot easier to stabilize and still have enough resolution left for 1080p.
It will also let you zoom in on part of your picture without losing any quality at 1080p.
So while I'm sure I won't be able to show 4K movies anytime this year, I know the TVs are coming soon, and as an amateur filmmaker, 4K will allow me more play room, more room for error, and more chances to make my footage look good.
August 23, 2013 at 9:08AM, Edited September 4, 8:21AM
A friend of mine who does interviews, is switching to 4K. With one camera he can shoot his wide shot, medium shot and close-up. This may be handy for interviews, but I don't see much reason to use this technique with narrative films. YMMV.
August 23, 2013 at 9:26AM, Edited September 4, 8:21AM
UGH! The 4K crop from a medium to a CU is bad. While I have no issue with minor framing adjustments, shooting everything wide and cropping it for closer shots is just lazy/cheap. There are the unknowing who run around shooting REDs with a 24mm lens on all day to do exactly this. The thing is, once you start doing these more radical crops you can see changes in sharpness. Back in the days of shooting 35mm I did this: with ISO 200 film you could go from full frame to *maybe* 200% before the image started to get soft in the transfer using optical zoom... and that was going to SD. The rules of physics haven't changed for digital....
August 23, 2013 at 9:45AM, Edited September 4, 8:21AM
About cropping 4K: when you take a close-up shot, for example, you change lenses because you want a shallower depth of field, another style of bokeh, a different focal length, and so on. To film everything in one shot is not a very good idea if you wanna tell stories. And even in an interview it's not the best idea. You might lose some sharpness when you crop in, but what's worse is that you lose the ability to express something. I understand it's convenient, but framing your shots right and understanding which lenses to use is a very important part of the job.
August 23, 2013 at 10:28AM, Edited September 4, 8:21AM
I guess it all depends on the budget available. Doing a single take edit is much easier in post than a multi-camera / multi-take shoot & edit. If it works for their type of work, who are we to judge?
August 24, 2013 at 7:53AM, Edited September 4, 8:21AM
See my piece above. Everyone does it. As I say, we have clients who require 4K acquisition, and this is one of the main reasons. The people below here who are upset by it, well, that's your prerogative.
Often when you're shooting a CEO/World leader, especially outside the US, your setup time is zero. And I mean, zero.
In some cases even getting two cameras into the country can be difficult. Being able to reframe an interview with a subject whose English may not be great, where you have to show them at their best, will SAVE YOUR ASS in an edit.
/apologies for being off-topic.
August 23, 2013 at 11:18AM, Edited September 4, 8:21AM
Well, it depends what you do. For the work you mention, 4K sounds like a great solution. I agree on that. But for some productions it's much better to have other choices. The D16 is not a bad solution for the right kind of films. Some aspiring filmmakers think they can use 4K as a shortcut, but I think that's wrong. You've got to learn to work with what you have in most cases.
August 23, 2013 at 12:15PM, Edited September 4, 8:21AM
Oh I think the D16 is terrific! I wish it had existed 20 years ago!
August 23, 2013 at 3:54PM, Edited September 4, 8:21AM
That's crazy!!! When you cut between different shot sizes without changing camera angle, the editing looks awkward... it sort of feels like a jump cut.
August 23, 2013 at 3:04PM, Edited September 4, 8:21AM
Ha! You put other shots in between. Trust me, you'd be amazed how many times you've seen it (and its brother, the slow zoom in post).
August 23, 2013 at 3:57PM, Edited September 4, 8:21AM
That sounds like a horrible reason that 4K is a good thing. You're essentially saying that if I owned a 4K camera, it wouldn't matter if my handheld craft was poor, and it wouldn't matter if I had no cinematic craft behind my composition. I know big films shoot 5K for 4K, and I know the advantage the extra resolution of 4K has for stabilization, but as a DP, this statement of yours is telling me that over time 4K, 5K, and 6K are going to lead to poorer and poorer levels of camera operation and cinematic planning, or, to put it another way, "fix it in post."
Low-cost cameras with massive levels of DR are going to do the same to lighting skill: work with a cheap AF100 and pull off great images and you learn something; do it with a raw camera and maybe not so much ;)
August 23, 2013 at 10:21AM, Edited September 4, 8:21AM
Did you think the cinematography in Girl With A Dragon Tattoo was poor?
August 23, 2013 at 12:43PM, Edited September 4, 8:21AM
Ugh, I HATE Jeff Cronenweth's cinematography! It's just a super-naturalistic style I can't get into. He pulls it off well and I can tell he MEANS to light it the way he does but man... I can't say every shot is bad but overall just flat and uninspired.
What sucks is that he's done some great flicks (which is why I feel he's been nominated by association). I just never realized how bad-looking the cinematography was until I went back and re-watched some of them.
Did you see Hitchcock? One of the ugliest movies I've ever seen in terms of cinematography.
August 23, 2013 at 1:29PM, Edited September 4, 8:21AM
I would agree about Hitchcock.
August 23, 2013 at 2:24PM, Edited September 4, 8:21AM
I guess I'll have to go back and re-watch Girl With A Dragon Tattoo. Nick what are some of your movie favorites in regard to cinematography?
August 23, 2013 at 2:37PM, Edited September 4, 8:21AM
Hey Razor,
My favorite cinematographic style is one in which the film would look equally as beautiful in black and white as it does in color. The perfect example would be my favorite cinematographer Conrad Hall (R.I.P.). If you watch Road to Perdition or even American Beauty to a lesser extent, you can tell it would be just gorgeous in black and white.
Dion Beebe is another good example of this style.
Another of Fincher's DPs who is better than Cronenweth (IMO) at the naturalism thing is Darius Khondji (Se7en, The Game).
Roger Deakins is a great mix of the Hall/Beebe and the Khondji/Cronenweth style.
August 26, 2013 at 4:26PM, Edited September 4, 8:21AM
Oh yeah but the films. Sorry.
Road to Perdition
Gangster Squad (Say what you will about the film. It still looked gorgeous!)
Sweet Smell of Success (B&W)
Anything directed by Wong Kar Wai like Chungking Express (with Doyle) or My Blueberry Nights (with Khondji)
I Saw The Devil
Sympathy for Mr. Vengeance
Chicago (yes, Chicago!)
Bronson
Glengarry Glen Ross
to name a very select few.
August 26, 2013 at 4:46PM, Edited September 4, 8:21AM
No, but neither Fincher nor the DP shot Dragon wide and reframed it to death. Professionals don't do that unless they have a very, very good reason to do so. Usually it's done in an emergency. There is more to choosing a particular focal length than FOV.
August 23, 2013 at 2:26PM, Edited September 4, 8:21AM
Every shot that was shot on the Epic was reframed from 5K to 4K... does it matter if composition is done on set or in post? How do you feel about push-processing film?
August 23, 2013 at 3:06PM, Edited September 4, 8:21AM
The opposite is one way I would consider doing a 4k shoot: shoot everything silky smooth and add shakiness in post with the wiggler as much as you like.
August 24, 2013 at 7:56AM, Edited September 4, 8:21AM
Some of us want the extra resolution to downsample into better 2K final outputs, or need the headroom for visual effects plates. I also get the confirmation bias in this article: we get it, you're making a camera that's 2K, and doing your best to justify your camera's output while the rest of the industry is gaining more dynamic range AND resolution. It also seems a little odd to be getting a technical lecture about a camera that's not even close to being released. *shrugs*
Frankly, the processing power to debayer 5K or 6K images isn't that tough to come by, and very few of us need to edit at full resolution; most are happy to edit at 1/4 or 1/2 with minimal performance hits on a decent machine. If your working output is the web, fine, 2K is great. If you're doing broadcast and need to do post work, it's good to have the resolution headroom. I wouldn't tell a photographer that because his final output is for the web, he should only grab a 12MP point-and-shoot. We shoot much higher res to have that resolution when and where it's needed.
August 23, 2013 at 9:28AM, Edited September 4, 8:21AM
I like the part where you completely missed the point in the article...
He's coming from an independent filmmaking perspective. He's not saying you can't use 4K; he's merely stating that you don't need 4K to achieve a great image. He mentions that in his opinion there are more important things to look at.
You cite visual effects plates, downsampling, and broadcast headroom.
None of these points addresses what the article is trying to communicate to the reader. Instead of criticizing the points made, you move the subject to broader concerns that the article, which is about independent filmmaking, was never referring to.
On the topic of bias, everyone is biased in one form or another. It's the reader's job to weigh these statements logically. Do these statements have merit? I think so.
August 23, 2013 at 10:32AM, Edited September 4, 8:21AM
No need to be defensive about S16 and 1080p/2K, fellow hipstagrammers. I love 16mm and there's a ton of gorgeous glass for it. A couple of points, though. 1. The live-action parts of Avatar were shot in 2K. 2. "Netflix/iTunes in 4K? Sure, in 2030" You don't know how wrong you are, darling.
August 23, 2013 at 10:11AM, Edited September 4, 8:21AM
"Netflix Chief Product Officer: expect 4K streaming within a year or two ...
Neil Hunt : Streaming will be the best way to get the 4K picture into people's homes. That's because of the challenges involved in upgrading broadcast technologies and the fact that it isn't anticipated within the Blu-ray disc standard. Clearly we have much work to do with the compression and decode capability, but we expect to be delivering 4K within a year or two with at least some movies and then over time become an important source of 4K. 4K will likely be streamed first before it goes anywhere else ..."
From this past March's Verge column, but also available across the webs.
http://www.theverge.com/2013/3/14/4098896/netflix-chief-product-officer-...
PS. And don't be calling Neil Hunt "darling". I read he prefers "honey bunch" instead.
August 23, 2013 at 11:16AM, Edited September 4, 8:21AM
Again, I'm not saying the technology for Netflix 4K will be 17 years off; the adoption will be.
When will the average person you don't know, five states away, see a movie you shot presented in 4K?
The adoption curve to get 4K into 75% of homes, AND 4K Netflix, is very, very slow. It will take many, many years.
Maybe not 17, but at least 10.
This is the underlying point of this article. No matter how any tech company tries to drive the market, the market does its own thing. And the market is what we have to pay attention to, instead of listening to the tech companies shouting in our ear.
Even if you shoot a movie tomorrow in 4K, unless you're Spielberg, and he has trouble doing this too, you aren't going to be showing it in 4K anytime soon.
August 23, 2013 at 11:53AM, Edited September 4, 8:21AM
An article all 4K hipsters should read.
August 23, 2013 at 10:42AM, Edited September 4, 8:21AM
To those of you who use 4K to crop, re-compose, stabilize, or 'punch in': if we ever meet, please inform me up front that you do that, so I don't waste any time talking to you.
August 23, 2013 at 10:52AM, Edited September 4, 8:21AM
Please contact David Fincher, who has done that on all his films using the RED. By all means I'd love for you two to compare resumes.
August 23, 2013 at 11:02AM, Edited September 4, 8:21AM
I used to work for David and he would stab you in the face if you shot a scene like that. It's done in an emergency or to stabilize footage.
August 23, 2013 at 2:44PM, Edited September 4, 8:21AM
First: you can capture and create beautiful images at any resolution.
For anybody who has shot and finished S35, 4K shouldn't be an alien concept. We've been working with it since the '90s via 4K film scanning and laser recording. A good deal of productions preferred to work at 2K due to cost and time expenses, but S35 does indeed resolve 4K, and a bit more if shot perfectly.
4K projectors have been in theaters since 2007, I believe. There are thousands of them worldwide now. More and more features, not just big-budget VFX work, will be finished in 4K.
4K has actually been a goal the entire industry has been after, and now that technology has allowed it, we're finally getting to a place where our digital cinema cameras are matching or exceeding motion picture film on a format and technical basis. In terms of resolution, the Sony F65 and Red Epic are the cameras that today exceed S35's resolving power. Red Dragon actually exceeds film in both resolution and dynamic range. This is the first time that has happened in the history of digital cinema cameras. I need to stress that.
There are different cameras for different types of shooters, and each shooter needs to weigh their own format and resolution preferences. S16 resolves about 2K when shot carefully with good glass and scanned in, which is why, in my mind, the Digital Bolex D16 is, on a resolution basis, a good idea. BlackMagic's cameras also land somewhat in this ballpark, although I personally have thoughts on using them for production work.
The best thing about all of this is actually the camera raw formats. This was/is the right move from a capture and post standpoint. While sometimes not friendly for rushed ENG styles of shooting (although it's being done on a daily basis), raw allows the most creative freedom, at the cost of recording media space.
4K digital distribution is coming. ODEMAX has just had a soft launch and should be available in Fall/Winter 2013. 4K displays are hitting households at $700-$6,000 and up if you have the dough. Any modern video card can output UHD resolution for playback. The new Sony PS4 and Microsoft Xbox One both support 4K playback and will be available this year. 4K tablets, cell phones, and laptops were demonstrated in January of this year, along with Netflix's prototype 4K tech, which should be out next year. DirecTV has invested in it. Sony has invested in it. Gaming enthusiasts have even been playing PC games at 4K for a couple of years now.
Market acceptance is some funny business, but if you think 4K is 20 years away, I'd say that's a fairly wrong number. I personally have been working exclusively in 4K+ for three years now and have been working with 4K material for about 14 years. That's my own unique personal and professional choice, based on the industry I'm in and what I expect/demand out of the content I shoot. It's also what my clients hire me for, as it's what I specialize in.
I come from a film background, and my particular format preferences have been S35 and VistaVision. I've even shot 65mm and IMAX, and while beautiful, they're somewhat unpleasant for a great deal of shooting conditions, not to mention the expensive film and processing costs.
A few questions everyone should ask when purchasing a camera: What format do you want to shoot in? How many frames per second do you need? You want raw, right? What camera fits your budget with these specs? And what if you take money out of the equation?
Right there you have the camera you can afford and the dream camera you desire to own. A great deal of those caught in this somewhat difficult quandary choose to rent, which is a good way to go if you are working on a project-by-project/job-by-job basis.
I work in a world where we target 40- and 60-foot screens. Newer 4K laser projectors are on the horizon and provide an unprecedented level of consistency and quality in the theater, something film has always had an issue with. Also, there are 4K projectors coming to homes too.
So basically, different tools for different folks. 4K in some ways has been here for a while; you've actually experienced that level of quality in theaters for years on good optical prints. Sooner than many expect, 4K will be in homes in very affordable and approachable ways. You haven't seen a sports broadcast like what you're going to see in 4K. It's tremendous, and it's all happening right now. Motion pictures are equally impressive if they are shot and finished properly. If you've seen it, you know exactly what I'm talking about.
Cheers and have a good weekend.
August 23, 2013 at 6:19PM, Edited September 4, 8:21AM
I knew you'd pop in at some point Phil :)
I of course understand and agree with everything you're saying.
You are in a very unique position in this industry.
What I'm trying to get people to understand is that yes, 4K is "here" tech-wise, but for most people making an indie film, a distribution platform won't exist for several if not many years. If you make your Bubba Ho-tep in 4K, no one will see it in 4K for many years. It will most likely not get a theatrical release, and if it does, they won't put you on a 4K screen instead of Avengers 2; most likely you will go DVD, VOD, and eventually Netflix. And again, if those services offer 4K, they won't be showing your movie that way; they'll be showing Avengers 2 again.
So yes 4K is great, but for most indie films shooting 2K will give you almost exactly the same delivery products.
August 23, 2013 at 6:58PM, Edited September 4, 8:21AM
Another wannabe pro who probably shoots horrible wedding gigs for friends. Maybe if you had worked with real professionals in the industry who use After Effects, Maya, and Nuke, you would know how often editors, directors, and producers edit, re-compose, or stabilize a shot for VFX. A mindset like yours will never get you a job in the real world. The internet is where you belong. haha!
August 23, 2013 at 11:39AM, Edited September 4, 8:21AM
This is just a silly, arrogant comment. How much post stabilization have you done? As a previous poster already said, Fincher uses 4K for this purpose already. It makes perfect sense to use it for this. I would call him at least competent, if not one of the more talented directors working today. You could probably make an argument re: the reframing, as it is a bit of a cheat, but are you really that much of a snob?
August 23, 2013 at 12:02PM, Edited September 4, 8:21AM
What exactly makes stabilization and composition in post worse than stabilization and composition on set?
Does making something harder to do make it better? Should we still be using hand crank cameras with 25 ASA film stock?
Film history is made up of clever people who looked at what they had and got the most out of their tools to realize their vision. Those are the kind of people I want to work with.
August 23, 2013 at 12:53PM, Edited September 4, 8:21AM
Sounds fine to me. I get the impression it would be me wasting my time.
No one sets out to do it - it happens due to the many vagaries of production. The key is to get the story told. Everything else (quality of individual actors, lighting, set design) is gravy.
If you have a moral stance about it fine. I'll treat you the way I treat others whose irrational moral stances I don't agree with. I'll smile patronizingly, as if at a small child, and move on.
August 23, 2013 at 4:03PM, Edited September 4, 8:21AM
Photographers crop and reframe all the time. Seems silly to suggest that a cinematographer who does it is somehow impure.
August 23, 2013 at 6:18PM, Edited September 4, 8:21AM
To those of you who won't use every tool at your disposal to achieve the best possible results: perhaps you should reconsider...
August 23, 2013 at 11:10AM, Edited September 4, 8:21AM
I think this article makes good points.
Not everyone needs or has to have 4K. To me, though, this honestly seems like a media pump for his company and camera. I mean, how long has it been since it was supposed to release? The first time I read this, I thought it made sense. The more I think about it, the more I think there is a very real possibility Joe is trying to get people to still want his camera. I mean, 4K is around the corner. His choice of 10-30 year comparisons is extremely out there and random.
Netflix has already said 4K streaming is coming in a year or two. Some broadcast networks have already made the jump. I think it is for sure a matter of choice for most of us. I still think 2K is good enough for work now.
This whole article just comes off as a big hidden "well, we're super late delivering a product which has since been surpassed by the 5D III with raw, the BMCC, the BMPC, and new cameras every month it seems, so it's time to write a press release with a bunch of facts and evidence that will validate why our camera is still worth it."
I totally get it, had I put that much time and effort into something to have it take 2-3 years to develop and release and it's almost becoming more obsolete tech by the day, I'd totally do the same thing.
I'm not saying the camera doesn't have a place (it does; I'd love to test one out), but to me this just seems like a reach to hold onto customers and people who may have been on the fence about the camera. I think 4K is coming a lot faster than most think... I still own and shoot 1080 cams as well. I could totally be reading this wrong. It just seems like a marketing attempt to keep all the work he's done relevant.
August 23, 2013 at 11:18AM, Edited September 4, 8:21AM
+1
August 23, 2013 at 11:20AM, Edited September 4, 8:21AM
You guys just never learn. Back in the day everybody said that 1080p wasn't needed and 480p or 720p was "just fine." 4K still doesn't match the human eye (and barely keeps up with a real film scan), and we won't stop until we get there. So have fun with your 2K products that will be obsolete in 3 years.
August 23, 2013 at 11:22AM, Edited September 4, 8:21AM
And meanwhile, high-budget productions are moving toward 8K.
August 23, 2013 at 11:28AM, Edited September 4, 8:21AM
If you're talking about the F65 that's a bit of a misnomer. That camera shoots at 8K but really debayers down to 4K. Some lovely marketing voodoo on Sony's part.
August 23, 2013 at 11:31AM, Edited September 4, 8:21AM
No, I mean real NHK-style 8K. When someone says that 4K is a decade away, what they should say is that 8K is a decade away. 4K is basically here and now. Even new smartphones will have it before Christmas.
PS. @Joe Rubinstein: the following info has been widely disseminated. Between now and NAB 2014, you may be seeing as many as twenty different 4K models. If true, rentals will come down to no more than a few hundred dollars a week - let's say, a current C100 fee - and folks using 2K cams will be looked upon as using inferior equipment by their brethren. Decent-looking 1080p is already available in the $300-$400 price range. 2.5K is already at $2K and heading even lower. Between now and summer '14, serious filmmakers will just have to make that jump.
August 23, 2013 at 12:37PM, Edited September 4, 8:21AM
Again, I hate to sound like a broken record, but you keep pushing the perspective of an electronics company trying to drive the market. Take a step back. Don't think about what new tech is being pushed at you in the next 18 months or whatever. Think about when your parents (I don't know your age, but you get my meaning) are going to upgrade to 4K. Most likely they just upgraded to 1080p in the last 3 years (according to statistics). Think about whether they care to have the latest and greatest. Because it isn't till they upgrade that we really start hitting market adoption rates anyone takes seriously.
If you're a filmmaker you can look down on people shooting 2K if you want, but it's very likely that their films will look just as good as your films on 99% of screens in the home and theater, but you'll have shelled out all that dough and gone through that 4K workflow to get there.
August 23, 2013 at 2:49PM, Edited September 4, 8:21AM
For those who haven't seen it: the 8K camera, with H.265, for 8K TV, projected for 2016 in Japan. They're working on getting to 120 Hz in real time. I'm feeling out of it now. I'll go entertain myself with an Etch-A-Sketch.
[ http://www.youtube.com/watch?v=XhLLjrkSroQ ]
August 23, 2013 at 10:50PM, Edited September 4, 8:21AM
The 8K in the F65 is marketing (typical Sony, haha). It's 8K supersampled.
August 23, 2013 at 1:40PM, Edited September 4, 8:21AM
Just look at how long an average TV set lasts inside a household - from what I know, it's anywhere between 3-10 years, with the majority skewing to the latter. That's how long it takes for a format change to break through. My Full HD cameras are anything but obsolete as of NAB 2014. NAB 2016, maybe, but by then I will have bought new cameras anyway.
August 24, 2013 at 10:59PM, Edited September 4, 8:21AM
I'm not trying to say that 4K isn't coming, or that 4K isn't good, just that if you take a look at market adoption rates at the very fastest big tech changes like this take 10 years.
We are amazed all the time by the speed of technology: look how fast the iPhone and iPhone clones took over the market, right? But the iPhone runs on cell phone technology and networks that were invented (and only incrementally improved) decades ago.
You can say that I have a vested interest in 2K because I am producing a 2K camera. That is true, no doubt, but it is also true that massive migrations in tech take time. I'm just trying to point that out. Tech companies push things hard with early adopters, and early adopters often bear the brunt of the cost and difficulties. I know because I am often an early adopter. I paid $600 for my first iPhone and went through its growing pains. I don't regret it, but that was only a phone. If you go full-on into 4K you need new cameras, new computers, new monitors, TVs, possibly a lot of other things. This is a very expensive move. And most likely, unless you are shooting presidents or really need the reframing capabilities, it doesn't seem worth it to me.
You will not be showing your content to anyone in 4K (except in your studio) for a very long time.
Let's say theaters get to 25% 4K screens in the next few years. Those rare few screens are going to be taken up with Transformers movies for years to come. And let's say Netflix gets a limited number of movies up in 4K; those are also going to be the same Transformers movies for years to come.
So as I said in my post, if you are shooting the next Avatar movie (yes, I know Avatar wasn't shot in 4K), please do use the highest K you can get your hands on. But if you are not, and you are shooting an indie film, please CONSIDER whether shooting 4K is right for you.
That is all :)
Thank you all for reading and for your comments, Joe
August 23, 2013 at 12:12PM, Edited September 4, 8:21AM
I agree with everything Joe writes. The problem with 4K is distribution. Many of you dream of making movies for the big screen. Not many of you will; that's reality. Even if you distributed it yourself, most people would choose the version they could watch on their screen: in most cases a DVD (SD) or HD on the internet (if they choose to watch it in HD).
TV, web channels, and independent filmmakers don't distribute 4K yet, and it will take some time before they do. In some cases a much longer time than you believe. Not every market in the world is the same. TV or Netflix will only change to 4K distribution if they can make more money.
Other things to consider are the camera's usability and production budgets. It makes sense to shoot 4K if you can't do more than one take; it's the quickest way. But in most cases you want to keep down file size, disk drive costs, and work hours, and use a camera that's not too heavy... There are many things to consider when you make a film.
August 23, 2013 at 12:53PM, Edited September 4, 8:21AM
4K distribution via Netflix depends on the implementation of H.265. So far, that's hardly available anywhere...
August 24, 2013 at 11:05PM, Edited September 4, 8:21AM
I think there may be more 4k projectors out there than you realize...even 3-4 years ago the projectors in my small college town's multiplexes had been switched to 4k Sony projectors. In Seattle I haven't done a count, but it seems like every time I go see a movie it has the Sony 4k thing before it.
My new phone has a 1080p screen. 4K monitors and TVs have been rolling out all year. Next year they'll be cheaper. The year after, pretty much anyone will have the option to buy.
August 23, 2013 at 1:08PM, Edited September 4, 8:21AM
Okay, so I looked up some numbers... according to this document from 2012 (page 22), there are almost exactly 40,000 screens in the US: http://www.mpaa.org/resources/3037b7a4-58a2-4109-8012-58fca3abdf1b.pdf
Sony says they've installed over 10,000 4k projectors according to this 2012 article: http://www.studiodaily.com/2012/11/sony-strives-to-sign-more-theaters-fo...
That means at least 25% of screens were 4k last year, without even counting Christie and Barco 4k projectors.
August 23, 2013 at 1:25PM, Edited September 4, 8:21AM
Outside the US (and Japan) things are very different. In Europe we use 4K now and then to make films, but there's hardly any distribution to be found, at least not right now. And there are much poorer countries with their own film industries, both in Europe and outside it, that don't have the money to change to 4K distribution. In many places slow internet connections are still very common and TV is still in SD... It will take time, trust me. At least in the rest of the world.
August 23, 2013 at 1:30PM, Edited September 4, 8:21AM
I'm starting an "indie film" on the 7th, and we are going 4K because why not? Rental rates for Scarlets and Epics are at rock bottom, and small clients like us benefit greatly from that. I'd rather drop $2-3,000 for a couple weeks of high-end equipment rental than drop $2-3,000 on buying a prosumer-oriented "cinema camera" that only does 2K or 2.5K and will be replaced within a few years, or even a few months.
4K is also closer resolution-wise to film (as mentioned earlier), it is nice to be able to re-frame and crop in for stabilization, and honestly, who DOESN'T want prettier, sharper pictures?
If your products aren't future-proof, you'll die. End of story.
August 23, 2013 at 1:33PM, Edited September 4, 8:21AM
I disagree. I'd rather sell my clients a new film every year (or every few years) than create one that is "future proof." Of course, if you depend on selling the same stuff for years to come, things are different.
August 24, 2013 at 11:08PM, Edited September 4, 8:21AM
Joe, it sounds like you're done here but I'll throw my 2 cents in anyway. It seems like what a lot of people, including you, may not be taking into account is the transition to HD from SD basically happened simultaneously with the transition from 50 years of analog to digital. There was a lot of legacy to overcome. Now that everything is digital, I'm guessing adoption rates will happen faster. Using HD adoption rates as a guide without considering those factors may lead to the wrong conclusions.
In addition, who's going to wait on 75% saturation? How long will it take to get to a 40% saturation? That amount seems significant enough to make the 4K leap, don't you think? In the end, people will buy what they can afford. If they can get 4K for the price of what they have no problem paying for 2K/HD now, they're probably going to do it whether they "need" it or not. Look at how many people have HD sets but no HD service. I want the D16 to do well. But, just as you want people to CONSIDER sticking with 2K for the near future if they don't absolutely need 4K, you should also CONSIDER you might have to produce a 4K camera sooner than you think to stay competitive. Either that, or produce the best looking 2K camera the world has ever seen.
August 23, 2013 at 1:56PM, Edited September 4, 8:21AM
Hey Brian,
I appreciate your comments. I agree that a direct comparison of SD -> HD may not be 100% accurate for a current adoption model because of other factors, but there are still more factors that you may not be considering. We saw a huge economic boom in the 90's and early 2000's that helped HD; we are currently in an economic downturn. We are also living in smaller spaces than before, so huge TVs may not sell as well. We are also hyper-aware of the environment, and TV/monitor production isn't exactly environmentally friendly.
And you're right, 40% market share is a significant number, but who is going to wait for 75%? Well, Apple and Netflix did (outside of small test markets). So yes, many content providers won't spend the money on pushing out a majority of their content in 4K if there is only a 40% adoption rate.
And no, especially in this economy, people don't just buy things because. At least not at a significant rate. They buy things when they need to, because their current thing doesn't work anymore. So when Netflix releases House of Cards in 4K ONLY and you have to buy a 4K TV to watch it, people will buy it. But till then, many, many people (probably the majority of the market) will feel HD is "good enough."
August 23, 2013 at 3:05PM, Edited September 4, 8:21AM
It's all about the economy. Most people in the world can't afford to buy 4K TVs even if they are cheap. And 96% of the world's population lives outside the US. The 4K revolution might happen in the US, but it won't happen elsewhere, except Japan. In Europe we have so many other things to think about now. The economy is terrible, and it's going to get much worse. China is doing badly; India, the US, Europe, Japan... The world has bigger problems than choosing between Full HD and 4K. To make big investments in a time like this means big risks.
August 23, 2013 at 3:38PM, Edited September 4, 8:21AM
Thanks for the reply, Joe.
Economics are a factor which may slow 4K adoption. However, smaller spaces only help a 4K image. Sitting way back from a television is a holdover from the NTSC days, when sitting too close let you see the deficiencies in the picture quality. If history and "Star Trek" have shown us anything, it's that screens will get bigger (it may be through projection, considering the environment and all). 4K really needs bigger screens and closer viewing distances to get the full benefit. TV manufacturers will probably be the driving force there. They'll simply lower the price of 4K to HD levels (in 4-6 years), set the minimum 4K display to 55", and start phasing out HD sets altogether. That's provided the economy stays put or, better yet, takes an upturn.
When I asked about waiting for 75% saturation, I meant on the acquisition end, which is where you live. 40% saturation (and rising) in the viewing audience is big enough for people to start acquiring in 4K, I think. Content providers may still be heavy into HD at that point but, if they're thinking ahead, realizing the inevitable, may start to finish in 4K and down-rez to HD with 4K ready to go for syndication (if they're lucky) as saturation approaches 70% or so. I'm no authority, so don't take me as anything more than a guy playing devil's advocate. However it shakes out, I hope Digital Bolex is around to see and impact it.
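The viewing-distance point above can be sanity-checked with a simple visual-acuity model. This is a rough sketch only: the one-arcminute acuity limit and the 55-inch 16:9 panel are my own illustrative assumptions, not figures from this thread.

```python
import math

def max_benefit_distance_ft(diagonal_in, h_pixels, aspect=(16, 9)):
    """Farthest viewing distance (feet) at which one pixel still
    subtends at least 1 arcminute, i.e. extra resolution is visible."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_in = width_in / h_pixels                  # width of one pixel
    one_arcmin = math.radians(1 / 60)               # rough 20/20 acuity limit
    return pixel_in / math.tan(one_arcmin) / 12     # inches -> feet

# On a 55" panel, 4K only pays off if you sit closer than ~3.6 ft;
# beyond ~7.2 ft even 1080p already out-resolves the eye in this model.
print(round(max_benefit_distance_ft(55, 3840), 1))  # 4K UHD
print(round(max_benefit_distance_ft(55, 1920), 1))  # 1080p
```

By this crude model, the "bigger screens and closer seats" point holds: at typical living-room distances a 55" 4K set looks just like a 1080p one.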
August 23, 2013 at 3:48PM, Edited September 4, 8:21AM
I probably should've said "content creators". I'd hope to be given the benefit of the doubt but I know how some people like to pounce around here. ;-)
August 23, 2013 at 4:13PM, Edited September 4, 8:21AM
A 55" TV set doesn't fit inside the TV cabinet in my living room! I would have to change furniture first...
August 24, 2013 at 11:18PM, Edited September 4, 8:21AM
That's untrue. When we had 4:3 letterbox and crap analog cable TV, or worse, antennas, the movie theater industry was booming. People wanted film resolution; TV resolution sucked! Now people don't care about theaters as much, and many theaters are going out of business. Why? Because resolution is at equilibrium in the home = 1080p/2K. How many screenings do you go to these days where you and 3 other people are sitting in a huge theater? Don't believe the hype. 4K was developed to line Hollywood's cash coffers. It's a myth. I think 4K and up has its uses for sports, porn, and Nat Geo, but for film, I disagree. Not everyone wants to see the crevices of someone's acne scars and zits.
August 24, 2013 at 6:39PM, Edited September 4, 8:21AM
I'm not so sure porn in 4k is going to be better ;)
August 25, 2013 at 8:15PM, Edited September 4, 8:21AM
You can reframe the money shot.
August 25, 2013 at 10:42PM, Edited September 4, 8:21AM
Wow, so many people missing the interesting points in this article...
I agree with Joe that things will settle down over the next few years into a small number of formats, much like the film world. Hate to say it, but this 4K vs. 2K vs. 1080p discussion seems more like ego and a pixel-peepers' pissing contest. Each format has its own unique look and aesthetic, and there is room for all. Those who choose to shoot 4K should not be dismissed for cropping to get their close-ups, just as those who choose 2K shouldn't be dismissed as behind the times or as amateurs. I've worked with many world-class, award-winning DoPs over the years, and plenty of them still shot 16mm or Super 16mm rather than 35mm - it was an artistic choice based on the nature of the story.
Oh, and can we please be done with the hipster comments? It's a cheap insult and lacks respect for someone who has worked their butt off to bring a new camera to market. I don't understand why so many of you seem to want Digital Bolex, Blackmagic, even RED to fail. More cameras means more options for us to shoot with and, more importantly, options at many budget levels. Some of you folks have no idea how good we have it. It's a fantastic time to be a filmmaker.
August 23, 2013 at 1:14PM, Edited September 4, 8:21AM
Amen
August 23, 2013 at 2:29PM, Edited September 4, 8:21AM
+100 :)
August 23, 2013 at 3:08PM, Edited September 4, 8:21AM
Yes, it's a wonderful time to be a filmmaker!
August 23, 2013 at 3:45PM, Edited September 4, 8:21AM
I think most of the people interested in the Digital Bolex (at least at its current price point) wouldn't be able to handle 4K raw. 2K is DEFINITELY plenty for the market you are aiming at. Even if 4K wouldn't up the camera cost that much, it'd up the cost of every aspect of post production. I personally would probably lose some interest in the camera if it switched to 4K.
August 23, 2013 at 1:21PM, Edited September 4, 8:21AM
Re: "Netflix/iTunes in 4K? Sure, in 2030." - That statement is completely wrong. The new H.265 (HEVC) codec will make 4K easily available to the masses quite soon. The current H.264 video standard would require about 45 Mbps to deliver 4K video, but HEVC can deliver 4K at 20 Mbps, maybe even less. Netflix is already planning to deliver 4K streaming by 2014-2015. http://www.rapidtvnews.com/index.php/2013060628150/netflix-plans-to-star...
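Using the bitrates quoted above, the per-hour download math is easy to check. A quick sketch ("GB" here means decimal gigabytes):

```python
def gb_per_hour(mbps):
    """Data consumed by one hour of streaming at a given megabit-per-second rate."""
    return mbps * 1e6 * 3600 / 8 / 1e9   # bits/s -> bytes/hour -> decimal GB

print(gb_per_hour(45))  # H.264 at 4K: 20.25 GB per hour
print(gb_per_hour(20))  # HEVC at 4K:  9.0 GB per hour
```

So HEVC roughly halves the pipe needed, which is exactly why it matters for 4K streaming over ordinary home connections.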
August 23, 2013 at 1:41PM, Edited September 4, 8:21AM
And I'm wondering what H.265 will do to a wonderfully graded image... lower bitrates typically mean information gets thrown out, leaving banding issues and other yuck.
August 24, 2013 at 11:24PM, Edited September 4, 8:21AM
Actually the color options will be far better in H.265 (HEVC). The most common H.264 profiles are 8-bit color with 4:2:0 chroma sampling. HEVC has support for 10-bit color 4:2:0/4:2:2 chroma and high frame rates; 12-bit color with 4:4:4 chroma is coming in a future HEVC profile, called Main 12.
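To get a feel for what those chroma and bit-depth profiles mean in raw terms, here is a quick sketch of uncompressed frame sizes for a UHD (3840x2160) frame. This is illustrative arithmetic only; the profile labels in the comments are loose shorthand, not exact HEVC profile definitions.

```python
# Samples per pixel for common chroma subsampling schemes:
# 4:2:0 = full-res luma + quarter-res Cb/Cr = 1.5; 4:2:2 = 2.0; 4:4:4 = 3.0
SAMPLES = {"4:2:0": 1.5, "4:2:2": 2.0, "4:4:4": 3.0}

def frame_mb(width, height, chroma, bit_depth):
    """Uncompressed frame size in decimal megabytes."""
    bits = width * height * SAMPLES[chroma] * bit_depth
    return bits / 8 / 1e6

print(frame_mb(3840, 2160, "4:2:0", 8))   # ~12.4 MB (typical H.264 delivery)
print(frame_mb(3840, 2160, "4:2:2", 10))  # ~20.7 MB (10-bit 4:2:2)
print(frame_mb(3840, 2160, "4:4:4", 12))  # ~37.3 MB (12-bit 4:4:4, Main 12-style)
```

The jump from 8-bit 4:2:0 to 12-bit 4:4:4 triples the raw data per frame, which is why richer profiles lean on a more efficient codec.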
August 25, 2013 at 11:16AM, Edited September 4, 8:21AM
So... not just size but also girth that matters? ;P
August 23, 2013 at 1:55PM, Edited September 4, 8:21AM
I agree with many of the points of this article. I shoot doc-style sports content where I often encounter scenes and lighting that are beyond my control. Sure, you can plan your shooting schedule to optimize time of day and lighting, but often the window of time and weather is fleeting and you're stuck shooting in less-than-perfect situations. This is one of the reasons the production company I often work for (we are normally shooting on snow) stuck with film so long. Much of this year's annual film is still shot on S16 along with Alexa and RED Epic. Back to the film days: I'd challenge you to pop an SD copy of one of their films from 5 or 10 years ago (before I worked with them) into your DVD player, shot at 720x480 and color-corrected by experts, and say that the finished product lacks any quality whatsoever. We didn't yet know what we were missing when it came to resolution. Watch an SD copy of your favorite classic film: do you enjoy it any less because it's not 4K? Obviously it's not as sharp as 1K, 2K, or 4K, but it's going to be much easier to watch than something shot at 4K 8-bit with 8 or 9 stops of DR. Sure, I love shooting 4K, but I'll sacrifice resolution for DR and color information every time.
August 23, 2013 at 2:25PM, Edited September 4, 8:21AM
I'll be buying a newer 2K 10-bit/raw camera in the next year that fills the niche of the smaller-budget owner-operator jobs I do very often, and for bigger-budget jobs, when a client wants 4K they can pay to rent me a 4K camera, as they very often do. I think 4K is great, but if the client doesn't want it, I'd hate to have to eat that cost on my end (either it's sitting around collecting dust while you use another camera, you're shooting at a lower res that a less expensive camera could handle, or you're shooting an extra 2K for free). I'll buy cheap now and let clients rent expensive.
If I were going to make major purchases (I don't think $4,000-$5,000 is that major) as an owner-operator of a small production company, I'd probably start with lenses and tripods (it makes me laugh every time I see $60,000 of kitted camera sitting on an $800 Manfrotto head). I like to buy things that not only hold their value but will work on whatever camera I use for the job. I invested in a used prime kit 3 years ago, and its overall value has increased at least a couple of thousand dollars since I bought it. If I buy a lightly used carbon fiber Sachtler Video 18 or 20 tripod tomorrow for $4,000, I bet it will be worth $3,500 in 5 years. How much is a used Red One worth today, or a used Epic from 2 years ago?
I'd say the majority of the people totally stressing this 2K vs. 4K debate (besides camera and TV salesmen) are the current 4K owner-operators who have been sandbagging in a low-budget market. If 4K stalls out as the market gets more and more saturated, they are going to be increasingly underbid on 2K jobs by flexible operators with lower-priced cameras that are getting better and better, and forced to rent their cameras at a cut rate for the 4K in the hopes that they will be paid off in five years.
Renting solves several problems: first, you always have the newest and best stuff (Epic owner: can you afford the Dragon upgrade a year from now when clients will be asking for it?); second, you always have the camera the client wants, whether it's 10K, 4K, 2K, or 1K (what if they want you to shoot on another camera, or the same model from a different preferred rental source?).
What did Jim Jannard say about bringing a gun to a knife fight/bazooka statement? That really might be the best description of the 2k vs 4k debate yet. If sometimes bringing a gun is overkill, what about a bazooka?
Anyway, sorry for the tangents. Basically what I'm saying is that the 2K niche the Digital Bolex falls into is great and getting better, with more and more choices (including more raw and ProRes options) for owner-operators, and if you need 4K, why not just rent now and see what happens later.
August 23, 2013 at 6:23PM, Edited September 4, 8:21AM
Lame. Even if you disagree with a product, be constructive, and if you can't, at least use your real name, coward. I'm not rushing out to buy a Digital Bolex; it's not the right fit for me. But I sure as hell respect the effort and drive of someone pursuing their dream, not to mention the added competition people like Joe bring to the marketplace. This is an amazing time to be a filmmaker.
August 23, 2013 at 3:33PM, Edited September 4, 8:21AM
By the way, about the arguable premise we sort of accepted that a disc-based 4K system won't be coming any time soon:
"The future of 4K video distribution may be via optical media after all: Sony and Panasonic are jointly developing a next-gen disc format. An early agreement suggests discs capable of storing "at least" 300GB of information will be first out of the gate for a 2015 release."
http://www.wired.co.uk/news/archive/2013-07/30/4k-blu-ray
PS. The Sony 4K video server/player is already out.
August 23, 2013 at 3:42PM, Edited September 4, 8:21AM
OK, again: you can tell me about all the tech that's "out," but 1080p Blu-ray market adoption has been really slow and hasn't hit anywhere near industry-standard rates, even though major companies have been pushing it since 2006. So you think 4K Blu-ray will somehow magically get there 16 months from now?
August 23, 2013 at 6:14PM, Edited September 4, 8:21AM
Doesn't Oppo from China already have a 4K DVD player?
August 23, 2013 at 9:56PM, Edited September 4, 8:21AM
Oppo's is fake/upscaled 4K. Once you put in powerful processors, you can get a decent image out of the lower-res formats too.
New 4K TVs, whose manufacturers are well aware of the 1080i reality, also have pretty powerful upscalers. In the olden days, you usually had to buy a stand-alone scaler from a company like Faroudja. Back then, someone with a bit of cash to splash on a two-piece projector setup would either buy a Barco with an upscaler already built in - an expensive option - or a more reasonably priced 480i projector from someone like Mitsubishi plus a separate upscaler. The Barco, of course, looked better but also cost 4-6 times more.
The funny thing is that most old single-piece front projectors had a curved screen. Thirty years later, LG and Samsung are bringing the curve back in their 55" OLED units. They're only 1080p but run $15K retail. The colors are supposedly phenomenal.
August 24, 2013 at 12:16PM, Edited September 4, 8:21AM
Exactly. TV is 1080i, not 1080p. 720p at best.
August 24, 2013 at 11:31PM, Edited September 4, 8:21AM
FabDex, you have amazing conversation skills; I really wish everyone contributed with well-established opinions the way you do. Unfortunately, not everyone is as sophisticated or witty as you. You are one of the very reasons I love the internet so much: nameless nobodies with inexplicable aggression.
Fincher shot one of his best films, Zodiac, in 1080p only, and it looks absolutely marvelous. But I guess we are all so over that.
August 23, 2013 at 3:44PM, Edited September 4, 8:21AM
Are you the Konstantinos from Urban Visuals?
August 23, 2013 at 5:37PM, Edited September 4, 8:21AM
Jim Jannard, is that you?
I thought you retired?
Bored already?
August 23, 2013 at 3:57PM, Edited September 4, 8:21AM
You reading the news with your ass? He's not retired... he just doesn't want to post anything on the internet because of loser assholes.
August 24, 2013 at 2:28PM, Edited September 4, 8:21AM
I guess maybe the future of 4K in America and Europe is far off, but where I live in Japan, it is here right now. I've been enjoying 1080i and 1080p content sent over the airwaves here for about six years now. That is full definition over the air, never mind some copper string from the late 70's being piped into a TV. I also have a fiber-optic internet line running directly into my house, another "in the future" concept for Americans and Europeans. I can purchase 4K televisions and computer monitors right now, for cheap, that won't ever see release in the States and the Union. So the hold-off on 4K is only relative to where you live. Here in Japan, 12-bit 4K is very, very relevant right now, not in the future.
August 23, 2013 at 4:46PM, Edited September 4, 8:21AM
You're watching 12 bit 4K TV in your house over airwaves? Damn!
August 23, 2013 at 5:00PM, Edited September 4, 8:21AM
Agreed, it's relative to where you live. Most people live in countries that don't have IT infrastructure as developed as Japan's. I have fiber optics too in the house I'm living in, and I live in Europe, so it's a reality at least for me and some other people here. I could easily watch 4K here, but no one is distributing it... Television here has serious financial problems that must be solved first. Public service TV will remain, and the private ones will start to go down; the next step is bankruptcy for many. And I wonder if it's in the public interest to have 4K television and pay for it with tax money? The question is: who will pay for 4K TV in Europe?
August 23, 2013 at 5:03PM, Edited September 4, 8:21AM
You guys in Japan will be getting 8K tv in a couple years. Ok. I'll go watch an SD movie now.
August 23, 2013 at 10:06PM, Edited September 4, 8:21AM
Here is the point of all this I was trying to get at.
When considering a serious platform upgrade like 4K for your work don't listen to the people that are trying to sell you cameras and TVs. It's their job to get you to buy stuff before you really need it. Don't even listen to me since I am making a camera.
Watch market acceptance, see how many of your friends and relatives have the tech and use / watch it on a regular basis. I have a 3D projector, but I think I have only used the 3D aspect maybe twice ever! That doesn't count. Understand the world around you and think about it as objectively as you can.
Platform acceptance is all about timing, and there are serious drawbacks to upgrading too soon. You may price yourself out of a whole tier of customers, you may burn yourself out with unneeded workloads, and you may pass over the gear that would actually help your work improve.
4K means 4× the file size (for same-format files) and some workflow issues, but only maybe a 15% increase in perceived image quality when presented on a 2K or smaller screen (if that). That is a crazy set of numbers.
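For anyone who wants to sanity-check the file-size math, here is a quick back-of-envelope calculation. It assumes DCI 2K/4K frame dimensions and the same bit depth and format for both; uncompressed raw size scales linearly with pixel count, so comparing pixel counts compares file sizes directly:

```python
# Pixel counts for DCI container resolutions.
pixels_2k = 2048 * 1080   # DCI 2K frame
pixels_4k = 4096 * 2160   # DCI 4K frame: double in each dimension

# Same bit depth and format => data per frame scales with pixel count.
ratio = pixels_4k / pixels_2k
print(ratio)  # 4.0 -- four times the data per frame
```

Doubling each dimension quadruples the pixels, which is where the "four times the storage" figure comes from.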
I guarantee people will still be watching Skyfall and Avatar years from now, and if you shoot in 2K and your movie is good, they'll be watching your movie too.
That's all, thank you all for reading and for your comments. I know this is a touchy subject; I just haven't been hearing people talk about it from the only really important vantage point, which is market acceptance (in my opinion). I personally don't think I'll shoot a 4K movie till my dad buys a 4K projector for his living room, and since he just last year got his Sony 1080p, I know I have a few years :)
All the best, Joe
August 23, 2013 at 4:56PM, Edited September 4, 8:21AM
"Exponential multiplication creates powerful numbers." The cost and time increases alone are a major drawback for independent producers. Those with money and resources to burn are not your target customers. The KISS principle applies. Nirvana may not sound like Vivaldi, but that did not stop them from being extremely successful and liked by later generations.
August 23, 2013 at 5:26PM, Edited September 4, 8:21AM
You can't have ANY market acceptance without content in the first place. And you can't have 4K content if no one is shooting in 4K. Digital Bolex 2K = $3,299 and Blackmagic 4K = $3,995. So if 4K is only a few bucks more, then why not? It's only an "optional" menu select on the Blackmagic 4K. I guess I don't like the tone that experimenting or shooting in 4K is such a crime.
August 24, 2013 at 4:09AM, Edited September 4, 8:21AM
Hi Razor,
Shooting in 4K is definitely not a crime, I'm sorry if my post made you feel like that was what I was trying to say.
The only thing I was trying to get across was that for us indies, no matter what we shoot in, 2K is the best presentation format we will have to show in for a long time. At least a decade, maybe more.
August 24, 2013 at 12:11PM, Edited September 4, 8:21AM
I agree with Kris. In Australia, 4K is here, limited, but here, and it's not as expensive as we thought. I could get a 4K computer screen and television this afternoon if I wanted to. Believe me, when you see a 4K TV, even at 30 inches, and compare it with HD, there is no comparison. The US and European screen markets need to play catch-up with Japan and Australia. The future in both countries is so 2011 :)
August 23, 2013 at 5:04PM, Edited September 4, 8:21AM
The US and Europe have lots of financial problems. I'm sure you can get the screens really cheap, but who will distribute the content, and at what price? That's the problem here. Someone needs to invest in infrastructure and bandwidth. Most countries in Europe, and many places in the US, don't even have really fast internet yet. There are different systems in all the countries of Europe, for example. In some countries we are starting to get fiber now, but there's still a long way to go.
August 23, 2013 at 5:15PM, Edited September 4, 8:21AM
If 2K raw images compare to S16, which has been used on Academy Award movies recently, and U-matic was used for a '70s-era political drama, plus the fact that we do not know if we are going to die in the next few days, should anybody serious about making movies be concerned with image resolution, or with getting their stories done?
Picasso said: "Only put off until tomorrow what you are willing to die having left undone." Joe, just get the D16 out and only be concerned by those interested in your product or message. Thank you for offering an alternative.
August 23, 2013 at 5:12PM, Edited September 4, 8:21AM
Joe, my cousin bought a 4K projector in Melbourne 3 weeks ago for $1490. I think you need to visit Australia!
August 23, 2013 at 5:12PM, Edited September 4, 8:21AM
And what content is he watching on it?
August 23, 2013 at 5:29PM, Edited September 4, 8:21AM
She just purchased the Sony media player that has 8 4K titles, but it's mostly her own and her students' content, shot and mastered in 4K (we have about 45 shorts and 19 indie features to go through and watch).
August 23, 2013 at 8:37PM, Edited September 4, 8:21AM
We are reeaaallly behind in America.
August 23, 2013 at 10:10PM, Edited September 4, 8:21AM
I'm not sure of the reason why, because the US is at the forefront of many industries, but I have noticed that existing US networks are resistant to change and infrastructure investment, and that distributors are not shipping next-generation equipment like 4K screens and home entertainment systems/content the way other countries are. Because Australia is attached to the Asia-Pacific market, we are sometimes the test bed for new kinds of technologies and formats coming out of Japan. And our new rollout of the NBN (National Broadband Network), which will take years to complete, is making our internet faster and more reliable (we haven't got it yet, so I can't say first-hand), so that when 5G bandwidth systems become operational we could download a feature film at 1080 in about 10 seconds. With that in mind, I think the US needs to review its antiquated broadcast and internet systems, because as 4K rolls out to replace HD as the standard broadcast and acquisition format, broadcasters still clinging to 720p will not give users and subscribers the same kind of viewing experience that folks in other countries can have.
August 24, 2013 at 4:30PM, Edited September 4, 8:21AM
In America we should already have 16K cameras. America has taken its potential for granted. We have the minds and machinery to do things like 16K cameras. And if we don't have the machinery we could invent it. But instead we use our Unions to go on strike and demand more comfortable wages and benefits when we already have one of the most comfortable lifestyles in the world. The more we get the more we take things for granted. We aren't the #1 Nation in the world. And we don't even really care. We've fallen behind and don't see a problem with it. We don't feel the need to fulfill our potential. You Aussies don't have better tech there from Asia just because of the region you're in. You have it because America is spoiled, fat, and lazy.
August 24, 2013 at 11:02PM, Edited September 4, 8:21AM
Hang around. I'll tell you what I really think.
August 24, 2013 at 11:03PM, Edited September 4, 8:21AM
I own a Red Scarlet. Love it. It's been a tremendous step up from the 5D for working predominantly on dramas and music videos. But I'm very excited about the D16 and am in awe of what Joe, Elle and the team have been doing. It looks like a serious, production-ready, heavy-duty tool creating beautiful images (from what I have seen). Lack of global shutter is the one thing I feel held back by on RED; I can't shoot frantic Bourne style. I really hope that Digital Bolex finds great success, that solid state drives get bigger and cheaper, and that they later make a 4K D35 :-) Would love to try out the D16 with a really nice S16 lens.
August 23, 2013 at 7:07PM, Edited September 4, 8:21AM
For the budget cameras, SSHDs (hybrid solid state/magnetic drives) may do the trick in the near future.
Anyway, I would like to offer another audio analogy. Let's go back to the 1960s: just because the majority of people had crappy mono systems, the recording studios didn't back off from recording in stereo. When CDs appeared, many FM stations proudly proclaimed their use despite their listeners' frequently awful car radios. Similarly, when 4K cams saturate the scene in Hollywood, the fact that the slow adopters may still be watching 480i won't restrain clients from ordering product shot with the latest gear. It's how it's always been.
August 23, 2013 at 8:52PM, Edited September 4, 8:21AM
Not sure I fully understand your audio analogy, but no, studios didn't record in "stereo", since almost all studio microphones were and still are mono ;)
If your clients demand "the latest gear" then you should definitely buy or rent it, no argument here. :)
August 23, 2013 at 9:21PM, Edited September 4, 8:21AM
Of course you can record in stereo...it's common to use different kinds of mics placed in different places to capture different parts of a voice or instrument and use that to craft the sound. You didn't think they just put one mic out in the middle of the studio to record everything did you?
August 24, 2013 at 1:12AM, Edited September 4, 8:21AM
You can record in stereo, with stereo microphones and XY-style mic setups, but placing "different kinds of mics placed in different places to capture different parts of a voice or instrument and use that to craft the sound" is not stereo.
Stereo is a very specific term with a specific meaning, and it is more commonly associated with sound reproduction and speakers than with microphones or recording...
ster·e·o (ˈsterē-ōˈ)
noun: Sound that is directed through two or more speakers so that it seems to surround the listener and to come from more than one source; stereophonic sound.
So no, they don't generally record music in "stereo".
August 24, 2013 at 2:56AM, Edited September 4, 8:21AM
"'Different kinds of mics placed in different places to capture different parts of a voice or instrument and use that to craft the sound' is not stereo."
It is if you place the mics to record in stereo. "Stereo" mics are still just mono mics, but packaged together. Unless you're doing environmental sound, most recording engineers I've worked with prefer doing their own placement.
All you need to get a stereo effect are multiple mics; it doesn't have to be a straight stereo set-up... just mixing different parts of a voice or instrument into different ears is still a stereo effect. And I know for sure Motown was doing that in the '60s.
So here's my question to you, did you not realize that, or were you just being anal to be irritating?
August 24, 2013 at 6:01AM, Edited September 4, 8:21AM
I guess I'm just irritating. Sorry.
I was trying to be funny at first. Obviously a fail.
You can use one mono mic and mix it in stereo if you want.
I was just pointing out that stereo, i.e. binaural stereo placement, is something you typically do in post, not while recording.
August 24, 2013 at 12:04PM, Edited September 4, 8:21AM
The old studio recording technique "bounced" multiple tracks down to one. In the '50s, the most popular studio recorder was a 3-track; Les Paul had Ampex custom-build him an 8-track in 1958, but the output was still mixed down to one. Similarly, the 4-tracks didn't produce quadraphonic sound. They simply allowed a higher-quality mix. It was common practice then to begin with the drummer (the rhythm section is most often recorded first, the vocals last) and overlay everything on top of that. After these overdubs were done, they often called the drummer back because the original drum track was by then inaudible. Bands like Led Zeppelin used a shitload of overdubs even with their 4-track units, and that gave their early albums a lot of tape noise/hiss. Their first-generation CDs sounded pretty awful because of that. Only after the albums were digitally remastered was the hiss taken out. (Of course, other artifacts were introduced with it, meaning that getting the high-quality original is still the best way to go... and the high-quality original is 4K, not an upscaled 2K.)
August 24, 2013 at 12:31PM, Edited September 4, 8:21AM
4K is WAAAAAAAAAAAY overrated. Sharpness is a distraction from story, as we have seen recently in 4K theatrically released films. It takes AWAY from the story. Again, sharpness is OVERRATED!!!!
In any event, broadcast cable networks and even Netflix can't do consistent 1080p. Most "HD" is still broadcast at 720p because of the limitations of our existing networks in the good old US of A.
August 23, 2013 at 7:29PM, Edited September 4, 8:21AM
4K is sharp when the DP or VFX supervisor allows it to be, but it can also be as soft and organic as you like. Start with glass and f-stops... Regarding broadcast, you might want to look outside the US for an example of how 1080 HD and 4K broadcast can work effectively, because the US broadcast system is Neolithic at best, which is a pity when the talent and content coming out of the country can be stellar.
August 23, 2013 at 8:51PM, Edited September 4, 8:21AM
Then tell me why, even today, all the big studios are scanning old movies in 4K. There is a huge difference between a 2K-scanned negative and a 4K scan.
Also, this article basically says: you were stupid to bet on the future, now you're stuck at 2K, so just buy MY FUCKIN' HIPSTER camera..
August 24, 2013 at 7:10AM, Edited September 4, 8:21AM
marketing is all about dick size envy.
August 23, 2013 at 7:40PM, Edited September 4, 8:21AM
I don't disagree with the article. I'm only interested in 4K cameras if they give a better image when downsampled. I shoot with a DSLR at the moment for broadcast television at 720p. It looks pretty good. If a 4K camera could give me a better image, once downscaled, than a DSLR or a 2K camera, I'd consider getting it. The ability to reframe if I really need to is nice, but I'd rather compose shots properly to begin with. Yeah, I'm happy watching a movie on a standard DVD, but those films were usually shot on film, not on 720p handycams.
It's all about how it looks once it ends up at the lower resolution. We'll know about the Bolex once they actually release it. If it looks better than the competition, then the market will decide.
August 23, 2013 at 9:59PM, Edited September 4, 8:21AM
I love the smell of 6K in the morning..........
(not that I can afford it now)
August 23, 2013 at 10:00PM, Edited September 4, 8:21AM
There is 8K too, 7680 × 4320:
http://www.astrodesign.co.jp/english/product/8k4k-products
August 24, 2013 at 8:24AM, Edited September 4, 8:21AM
Excellent post. I can take or leave the arguments for resolutions over 2K, but the key to the kingdom for making beautiful images is bit depth. I've been in post doing VFX for 14 years. The leaps in resolution have had less of an impact on overall quality than moving from 8-bit scans, to 10-bit Cineons, to 12- and 14-bit whatevers now. Blends are nicer; gradients (skies) have improved tenfold. And, of course, latitude is way better (hello, underlit greenscreens). I get a kick out of seeing footage shot on a hacked 5D that exceeds the tonal range of scanned plates from tentpole blockbusters from 10 years ago. Which, btw, were 2K. And they did just fine at your local movie theater. I paid to see them in the theater and so did you!
I'm not anti-progress. Oy, with the early CineAltas, and yay! to RED, Alexa, the Digital Bolex project, BMC, and the geniuses driving the Magic Lantern movement. I'm just not big on being led by the nose by companies trying to make products I've just invested in obsolete so they can sell me new stuff. Especially if my personal viewing experience is going to be exactly the same as before. Which I think Joe outlined nicely.
August 23, 2013 at 10:44PM, Edited September 4, 8:21AM
I was a bit surprised about the statement that "There are affordable 10-bit monitors and 10-bit video cards these days." According to SmallHD "10-bit panels today are very expensive, bulky and typically reserved for professional color-grading applications." 10-Bit Panel Drive "[…] only indicates the hardware driving the panel is capable of 10-bit. NOT that the actual panel is 10-bit." [http://www.smallhd.com/oled/why-oled.html]
August 24, 2013 at 3:48AM, Edited September 4, 8:21AM
10-bit hardware = an Eizo ColorEdge monitor, plus an Nvidia Quadro or AMD FirePro card, plus a DisplayPort cable.
August 24, 2013 at 5:12AM, Edited September 4, 8:21AM
I tried to do some research on 4K projection. According to a statement in this article, roughly 20K out of 90K theaters in the US are 4K capable:
http://www.hollywoodreporter.com/behind-screen/nab-75-percent-theaters-a...
That seems like a lot but someone in the piece was saying that investing in 4K is not a priority for theaters right now, and they question whether general audiences would care enough to spend more for it.
I haven't found data on the number of films actually shown in 4K. I would assume it's quite small right now. My mostly ignorant impression is that facilities for finishing and outputting films in 4K do not exist on an industrial scale yet.
As for broadcast channels, this page has info on the current formats for most channels:
http://en.wikipedia.org/wiki/High-definition_television_in_the_United_St...
Most are 1080 but ESPN and FOX channels are still 720.
As for television displays, I haven't done a lot of research, but this article says that 4K market share is currently minimal (<0.1%) and isn't projected even to reach 1% for another 5 years:
http://www.isuppli.com/Display-Materials-and-Systems/News/Pages/4K-Telev...
I know there is a graph about this somewhere, but someone in the article suggests that you would need a 60-inch monitor to perceive a difference between 4K and 2K.
Overall, my impression is that widespread consumption of 4K images on 4K displays (as opposed to widespread use of 4K cameras), whether in theaters or at home, is many years away. For theaters it might happen in the next 10 years. For television/internet, it looks like it will be at least 10 years before 4K displays have significant market share, and it may be decades before most people are watching at home in 4K, barring an unforeseen technological/economic revolution.
Bottom line, if you can comfortably afford a 4K camera or if your job pays for it, of course it will be "better" in many respects. But if it's a matter of viewing an image in 2K, while 4K cameras will be better downscaled in general, it's reasonable to point out that a 4K camera, which costs a minimum of 10X more than a high quality 2K camera, in no way produces an image that is 10X better when viewed at 2K. Maybe it's 10% better, I don't know. But if you take a RED EPIC and a BMCC/5Dmk3 at 2K, the difference is small (but obviously important) for camera people and basically non-existent for the general public. Vastly more important will be production values, acting, story, etc.
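As a rough sanity check on the 60-inch figure mentioned a few paragraphs up, here is a back-of-envelope script. The assumptions are mine: a 16:9 panel, roughly 20/20 visual acuity (about one arcminute of resolvable detail), and an 8-foot viewing distance; real perception is messier than this, so treat it as a sketch, not a vision-science result:

```python
import math

def pixel_pitch_in(diagonal_in, horizontal_px):
    """Width of one pixel, in inches, on a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    return width_in / horizontal_px

def acuity_limit_in(distance_in, arcmin=1.0):
    """Smallest detail a ~20/20 eye resolves at this distance, in inches."""
    return distance_in * math.tan(math.radians(arcmin / 60))

viewing_distance = 8 * 12  # 8 feet, in inches
for diagonal in (39, 60, 84):
    # 4K can only look sharper than 1080p if the eye can
    # resolve detail finer than 1080p's pixel pitch.
    sharper = pixel_pitch_in(diagonal, 1920) > acuity_limit_in(viewing_distance)
    print(f'{diagonal}": 4K visibly sharper than 1080p? {sharper}')
```

Under these assumptions the crossover lands a little above 60 inches at 8 feet, which roughly matches the claim: on smaller screens at couch distance, 1080p is already at or beyond the eye's limit.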
August 24, 2013 at 9:36AM, Edited September 4, 8:21AM
Thanks for taking the time to research this!
Very useful data.
August 24, 2013 at 11:54AM, Edited September 4, 8:21AM
@Jackson "90K theaters in the US are 4K capable" - Correction that's not US, that's 90,000 movie screens worldwide. The US has 40,194 movie screens with 85% digital and around 22% of those digital screens 4K capable.
August 24, 2013 at 3:20PM, Edited September 4, 8:21AM
"But if you take a RED EPIC and a BMCC/5Dmk3 at 2K, the difference is small (but obviously important) for camera people and basically non-existent for the general public."
Usability and support are the main reasons for the price difference (not to mention frame rates: 200 fps at 3K vs 30 fps at 2.5K). It seems the real price difference for 4K compared with the BMCC is only 2×, since that's the camera Blackmagic is coming out with soon.
August 24, 2013 at 5:10PM, Edited September 4, 8:21AM
There are no low-budget 4K projection systems available yet to "democratize" distribution of independent films... that is the point.
Most of you first-time feature directors are in Hollywood dreamland thinking you are shooting like Fincher or Jackson; the truth is you should be channeling your inner Tarkovsky, your inner 16mm Aronofsky.
Second-tier film festivals and art houses are mostly using 2K, so until a 4K projector comes in at a reasonable price, it doesn't warrant shooting for.
RED has had a $10K 4K projector in development for a few years, but there's no sign of it yet.
http://www.engadget.com/2011/09/22/red-ceo-teases-4k-3d-laser-projector-...
August 24, 2013 at 11:54AM, Edited September 4, 8:21AM
In terms of projectors, one could use the "edge blending" designs, where a 4K image is obtained by stacking four 2K projectors, each responsible for a quarter of the screen. Below is the basic explanation page: http://www.mviewtech.com/Solutions_listen.asp?ProdId=120830154709
Additionally, as I kept web-surfing, I found a Chinese company, PallasLCD, that has recently come up with seamless LCD walls, which previously were deemed unachievable due to their bezels. Apparently, this company made the bezels transparent and essentially unnoticeable (or their videos are heavily blurred). On YT and their own site, they are showing 5x4 stacks (5 wide x 4 tall) of various LCD flat screens. Considering that a 55" LCD can be had for under a grand, a 20-foot-wide wall of 20 screens can be had for about $20K (plus the bezel surcharge and a software or hardware control box). A 6x6 should run under $40K, et cetera, et cetera.
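The panel arithmetic in that last paragraph checks out, assuming the quoted ~$1K-per-55"-panel price and ignoring the bezel surcharge and controller (both unknowns here):

```python
# Video-wall cost sketch: panels only, at the comment's assumed price.
panel_cost = 1_000  # assumed ~$1K per 55" LCD

wall_5x4 = 5 * 4 * panel_cost  # 20 panels
wall_6x6 = 6 * 6 * panel_cost  # 36 panels

print(wall_5x4)  # 20000 -- "about $20K" before extras
print(wall_6x6)  # 36000 -- comfortably "under $40K" before extras
```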
August 24, 2013 at 5:55PM, Edited September 4, 8:21AM
There's a 4K Ultra HD screen available now in the US for $699. http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/dp/B00DOPGO2G... It's upscaling, but a good cheap start...
August 24, 2013 at 9:09PM, Edited September 4, 8:21AM
I wouldn't be so excited about Seiki 4Ks without first knowing the facts. Reviews are coming in on Seiki's cheaper panels: poor black levels, murky shadow detail, noticeable uniformity issues, and inaccurate color. Tested games are blurring or skipping frames. I don't see the point in wanting to demonstrate the new era and grandeur of 4K if the panel displaying the content looks like crap. For the best panel quality, upscaling, and superior color you'll need a Sony 4K.
August 27, 2013 at 5:17AM, Edited September 4, 8:21AM
Or just buy a 2.5K monitor for under $400. In that size, there won't be much, if any, difference between that and 4K.
August 27, 2013 at 5:37AM, Edited September 4, 8:21AM
Maybe, just as he is late on his product, he is late on his technology info. He says that 4K would probably not be dominant by 2030. He is clearly not seeing how technology is advancing faster every year. 4K TVs are already on the market, so I believe yes to 4K. I think he is only trying to convince you to buy his product, seeing the great threat of the Blackmagic 4K. He would probably have to match BMC pricing or go lower to sell his 2K camera. I do find the Bolex design nice, but I'll get it as my backup or BTS camera.
August 25, 2013 at 12:47AM, Edited September 4, 8:21AM
Hi Michael,
Thanks for your comment. I feel a little like I am repeating myself, even within the comments section here, but what I'm trying to get everyone to see is that as independent filmmakers our adoption of any technology should be based on market acceptance not on what camera manufacturers or TV companies are trying to push at any given time.
As Jackson pointed out in a comment above 4K TVs are currently at less than one percent market acceptance. To me that is FAR too early to be jumping into a 4K workflow if you are an independent filmmaker.
Understand your market, know who is going to consume your product, and choose your gear and everything else based on that. If 99% of the people you are presenting your material to will only see it in 1080, you don't NEED to shoot 4K. You of course can if you want, but make these decisions because you understand your market. I guess if you are making films for the Japanese or Australian markets, maybe 4K makes sense, as other commenters have pointed out.
You know how there are three rules of (local) business? Location, location, location.
There should be three rules of indie filmmaking. Know your market, know your market, know your market!
And it's not that hard these days: a few days of internet research to find out who's buying, what they are buying, and who the end user is. It used to be really hard to find this information, but now it's very easy. There really is no excuse for making films "in the dark" anymore.
August 25, 2013 at 2:54PM, Edited September 4, 8:21AM
If your film is going to a TV audience, then yes, 1080 is going to be it for a long time. If your film is going to a movie screen, or online as an HEVC encode, then 4K is something to consider. The US has 40,194 movie screens, with 85% digital (34,164) and 22% of those digital screens 4K capable (7,516).
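For what it's worth, those screen counts reproduce exactly from the quoted percentages, if you truncate the fractions the way the comment does:

```python
# US movie screens, per the figures quoted above.
total_screens = 40_194
digital = int(total_screens * 0.85)  # truncates to 34,164 digital screens
four_k = int(digital * 0.22)         # truncates to 7,516 4K-capable screens

print(digital, four_k)
```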
August 26, 2013 at 3:36PM, Edited September 4, 8:21AM
85% are digital. Someone saw the handwriting on the wall.
August 26, 2013 at 10:25PM, Edited September 4, 8:21AM
True but theatrical releases are very rare for indie films, and if they do show your film in theaters will the theater owners want to show your film in 4K instead of Avengers 5? Probably not. If you get a theatrical it will almost definitely be in 2K.
August 27, 2013 at 1:13AM, Edited September 4, 8:21AM
In my view of the future, especially now with high-quality low-cost cameras (Digital Bolex, BM4K), I see resourceful and talented indie directors disrupting the older Hollywood model. By the way Joe, I do appreciate what you are doing with the Digital Bolex. It's a clever camera with a great image. And thanks for taking time-out to chat with us!
August 27, 2013 at 4:56AM, Edited September 4, 8:21AM
Also, if it is going "online", still almost no one has a 4K screen to view it on. So again, it doesn't really matter.
August 27, 2013 at 1:15AM, Edited September 4, 8:21AM
Excellent post... now I'm really happy with this post, as I've learnt a whole lot from it. Not so fascinated by 4K anymore, at least not for another 5 years LoL
August 25, 2013 at 10:14AM, Edited September 4, 8:21AM
@Joe Rubinstein ... Very good analysis. Ignore the haters and trolls. Bit depth and color rendition are far more important to me than having 6K, or 8K, etc. Even if everyone could enjoy our movies in 4K, the only way you will truly be able to appreciate the increased resolution is on a VERY large screen.
Accurate color rendition, on the other hand, is far more important from an emotional and technical standpoint when it comes to cinematography, because it greatly influences the "feel" and tone of a movie and is conducive to storytelling, as is lighting. That's my two cents.
August 25, 2013 at 6:52PM, Edited September 4, 8:21AM
I really am hoping this camera works out well for them. I hope they have brisk sales. I hope there's a good market for them.
I think seeing the Red Dragon has got me dreaming of what even higher K's could look like. But there still is a market for 2K. 4K isn't exactly on fire in America. And the camera does look so cool. And it does have excellent audio specs. I do hope they make a good business with it!
August 26, 2013 at 12:06AM, Edited September 4, 8:21AM
I've seen 4K, and I've even seen demo footage of 8K at NAB. Honestly, I could not tell the difference, and I have very good eyes. You need a HUUUGE immersive screen to reap the benefits. We are at the point of diminishing returns. Sure, supersampling is nice, but there is more to an image than just Ks.
August 26, 2013 at 7:46AM, Edited September 4, 8:21AM
Weird that the 8/25/13 and 8/26/13 comments are taking a higher position than the 8/23-24/13 comments.
August 26, 2013 at 12:09AM, Edited September 4, 8:21AM
That's how the comment board operates. It's a distinct flavor with its overflows and mini-threads (but, to make a positive out of it, it makes you go back to see where the followups to your own comments might have appeared).
Ryan did update the boards tonight (server upgrade?) but the software looks identical.
August 26, 2013 at 10:15PM, Edited September 4, 8:21AM
I'm with Joe.
The future place to watch your film is the internet. 720p is enough of a struggle for most people. YouTube had 4K for a while: how many people watched it? How many care? Most people are starting to watch things on their iPhones. Unless H.265 starts taking off, the web still struggles even with 720p. And how many non-filmmakers even make things full screen?
I have the RED One, I have the Sony F35. I didn't go the 4K route and get the F55 or F5; I have the F3, and it's good enough for me. I just shot a commercial that aired nationally. We shot on the Alexa and the F3 and the F55, and it all looked good enough that the average viewer wouldn't notice, and that's a 4K CAMERA mixed with a 2K camera mixed with a 1080p camera!! OH MY! No one called me up to complain. Dynamic range, color, and motion are as important to me as perceivable sharpness. 4K is nice for reframing, but it's also a pain in the butt to deal with. Shooting a project with 2 cameras for a week, I'd rather not shoot 4K raw and have to process it, and turnaround time on most projects is fast. It's all a compromise.
August 26, 2013 at 10:05AM, Edited September 4, 8:21AM
Even if people are watching more content on their phones, phones keep ratcheting up their resolution... my new phone is 1080p, and now all the mobile streams look like complete shit on it. However, it's not just phones; more people will be watching on tablets (which are replacing standard PCs)... there will be 4K tablets within the next year for sure.
August 26, 2013 at 4:07PM, Edited September 4, 8:21AM
As mentioned by Gabe, the new smartphones will have 1080p screens, and new laptops are drifting toward at least 2.5K. Anything with a hard drive (and this is where the current streaming servers are inching) will be capable, technically speaking, of 4K delivery, if not always instantly so.
August 26, 2013 at 5:20PM, Edited September 4, 8:21AM
Panasonic is coming out with a 20" 4K tablet any day now. With how good 6K looks, 4K displays may be a speed bump on the way to higher Ks. At least I hope so.
20" 4K tablet: [ http://www.youtube.com/watch?v=ze8AGAFaagQ ]
August 26, 2013 at 6:14PM, Edited September 4, 8:21AM
So you think average consumers are going to buy a 20" 4K tablet? The video even says it's for professional photographers, which means expensive.
And again, I feel like a broken record, but I'll say it again...
It doesn't matter what tech the companies make and sell; the only thing that matters is market acceptance.
So when a significant portion of the country buys 4K phones or tablets let me know.
Till then it just doesn't really matter, unless you are making a major Hollywood FX movie.
August 27, 2013 at 1:33AM, Edited September 4, 8:21AM
I didn't set out to buy a 1080p phone, but the phone I got happened to have it anyway.
August 27, 2013 at 2:57AM, Edited September 4, 8:21AM
Exactly.
August 27, 2013 at 8:36AM, Edited September 4, 8:21AM
Joe, I agree with you that your market is the primary factor. But there are also workflow logistics. Right now, dealing with even 2K uncompressed raw is burdensome for many, if not most. When 4K becomes generally reasonable, people will originate in 4K even if the end product is 2K/HD. Ubiquitous 4K cameras (RedShark predicts 20 new ones will be announced between now and NAB 2014), codecs (e.g., CineForm RAW/VC-5), higher-speed interconnects (e.g., HDMI 2.0, 6G-SDI, Thunderbolt 2.0), and even more massive storage will make 4K production feasible long before general consumer adoption.
September 1, 2013 at 10:16AM, Edited September 4, 8:21AM
I think it's interesting that the drive for image resolution is so hyped at the moment within grassroots and mid-level moviemakers' chatrooms.
Sony likes 4K; it's a good way of selling more stuff.
16mm was a liberating approach in the '60s, and then later on there was Dogme. Now we need a bit of that spirit again.
I still write analogue scripts with analogue characters. I'll worry about 4K when I write a 4K story.
August 26, 2013 at 11:27AM, Edited September 4, 8:21AM
Two technology-based notes from today:
1) Sony released a compact camera, the a3000, that has the same 20.1 MP sensor, a Zeiss lens, and 24 Mbps AVCHD at 24 fps as their current Alpha offerings. It also has HDMI out. Some of the mid-tier SLR features are missing but, based on early shots, the image quality is pretty darn good. The most important feature of the unit is its $400 price. Projecting upward, this should mean a pro-feature-packed 1080p piece for well under $1K (whatever replaces the A65, A77 and A99) and a potential 2.5K piece for under $2K. If the difference is another processing chip (this one has BIONZ as well) and the de facto free software, this is all within Sony's capabilities.
2) AT&T U-verse just upped the max download speed to 45 Mbps. They're using h.264 for their TV package but, should they switch to h.265 soon, this is more than enough for 4K-quality streaming.
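That "more than enough" claim can be sanity-checked with rough numbers. A minimal sketch, where the 35 Mbps h.264 figure and the "h.265 needs roughly half the bitrate of h.264" rule of thumb are assumptions about typical streaming encodes, not measured U-verse specs:

```python
# Back-of-envelope headroom check for 4K streaming on a 45 Mbps link.
# The bitrates below are rough assumptions, not vendor specifications.
link_mbps = 45                     # U-verse max download speed
h264_4k_mbps = 35                  # assumed 4K h.264 streaming bitrate
h265_4k_mbps = h264_4k_mbps / 2    # h.265 rule of thumb: ~half of h.264

fits_h264 = link_mbps >= h264_4k_mbps  # fits, but with little headroom
fits_h265 = link_mbps >= h265_4k_mbps  # fits, with room for other traffic
headroom_mbps = link_mbps - h265_4k_mbps
```

Under these assumptions an h.265 4K stream would leave well over 25 Mbps of the link free, which is the point the comment is making.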
August 27, 2013 at 10:57AM, Edited September 4, 8:21AM
And, just like that, Sony and Samsung drop their prices on the 4K sets. What was $5,500 (55") is now $4,000.
August 28, 2013 at 1:40AM, Edited September 4, 8:21AM
We dropped something for you too...
http://www.digitalbolex.com/know_your_market/
Or why it doesn't matter what Samsung prices their 4K TV at ;)
August 29, 2013 at 2:44PM, Edited September 4, 8:21AM
You guys discussing in this forum are all geniuses in your own way. I started producing short movies using a Sony EX3 and a Kinomatik MovieTube. My results are always mind-blowing, and I sincerely still do not know why all this 4K, 5K, 6K when in reality the additional K's are not useful for 90% of those paying for all these K's.
August 29, 2013 at 12:48PM, Edited September 4, 8:21AM
4K resolution (at least) is necessary if theater chains start to get back to the days of GIANT wall-to-wall CinemaScope screens (I hope to God they do). 2K is great for your local mini-mall cineplex, but starts to fall apart at that scale.
The ingredients that MUST be in place for 4K theatrical and UHD consumer media to have a viable chance are, like the article stated: there must be a much wider color gamut and higher bit depth, in addition to more frame rate options and faster frame rates.
And the deliverable compression codecs must be visually transparent to the master files, or why bother? Too much compression and inefficient codecs have killed the potential for consumer 1080 HD. Most broadcast and internet content is artifact-riddled.
If UHD suffers the same fate... again, why bother?
August 29, 2013 at 1:11PM, Edited September 4, 8:21AM
95% of movie goers just want a glimpse of a Hollywood girl's T&A doing a bumpty-bump. Or a guy in a superhero suit. It could be 480p and sell.
August 29, 2013 at 1:31PM, Edited September 4, 8:21AM