I suggest doing a search because this question gets asked at least every couple of weeks. I can tell you up front: $500 doesn't buy much of a camera. I myself got a refurbished Canon G20 because it's the only camera I found for under $1,000 that didn't annoy the snot out of me. That said, pixel count doesn't really matter much. HD video is 2.1MP, and cameras that try to impress people with massive pixel counts are simply throwing away dynamic range. Case in point: on a sensor of the same physical size, a 3840 x 2160 (UHD) chip has 4x the pixels of a 1920 x 1080 (HD) chip, so each pixel gets 1/4 the light, which means much more noise and much less sensitivity. This is why professional video cameras still have 2/3" 1920 x 1080 sensors and are natively around ISO 400 without noise reduction trickery.
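If you want to see the pixel-size math for yourself, here's a quick sketch. It assumes both resolutions sit on the same 2/3" 16:9 sensor, and the ~9.6 mm x 5.4 mm active area is my own assumed figure for that format, not something from this thread:

```python
# Hypothetical 2/3" 16:9 active area (assumption: ~9.6 mm x 5.4 mm).
sensor_w_mm, sensor_h_mm = 9.6, 5.4

def pixel_area_um2(h_pixels, v_pixels):
    """Area of one pixel in square micrometers for this sensor."""
    pitch_w = sensor_w_mm * 1000 / h_pixels  # pixel width in um
    pitch_h = sensor_h_mm * 1000 / v_pixels  # pixel height in um
    return pitch_w * pitch_h

hd_area = pixel_area_um2(1920, 1080)
uhd_area = pixel_area_um2(3840, 2160)

# Each UHD pixel gathers 1/4 the light of an HD pixel on the same chip.
print(f"HD pixel area:  {hd_area:.2f} um^2")   # 25.00
print(f"UHD pixel area: {uhd_area:.2f} um^2")  # 6.25
print(f"light per pixel ratio: {uhd_area / hd_area:.2f}")  # 0.25
```

Four times the pixels on the same silicon means a quarter of the photons per pixel, which is the whole sensitivity argument in one number.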
Getting back to my own choice of camera: I absolutely despise cheap still cameras for video. They have tons of rolling shutter, aliasing artifacts, often inaccurate color, etc., so I opted for a camera that was less sharp because those issues were less prominent. The FZ18 has a CCD, so no rolling shutter, but it seems to only capture standard-def video (0.3MP), and an 8MP 1/3" sensor is going to be SUPER noisy and insensitive. In fact, I had an 8MP 1/3" Sony for a few months that had noticeably worse image quality than my current 2.1MP 1/3" Canon, and a Panasonic 8MP 1/3" camera for a couple of weeks with the same problem. It literally took over 2,000 watts to optimally light a corner of a room (the equivalent of a cloudy day).
Sharpness is mostly determined by the quality of your optics. That's why we use $10,000 lenses on 2.1 MP cameras in the TV industry.
So to recap: noise and sensitivity are both determined by pixel size; sharpness is determined by optical quality. Quality optics are expensive, so long, inexpensive zooms tend to be very "slow", soft-looking, and often show chromatic aberration, especially as you zoom in on the subject more. On my film cameras, I stopped using zooms altogether in favor of primes: 28mm, 50mm, 85mm, and 135mm. Four primes wound up being cheaper and better quality than a single zoom.
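For anyone curious why that particular prime set covers a useful range, here's the angle-of-view arithmetic. The 36 mm frame width is my assumption (standard 35mm still frame; the post doesn't say which film format):

```python
import math

# Assumed: 35mm still frame, 36 mm wide.
frame_width_mm = 36.0

def h_angle_of_view(focal_mm):
    """Horizontal angle of view in degrees for a given focal length."""
    return math.degrees(2 * math.atan(frame_width_mm / (2 * focal_mm)))

for f in (28, 50, 85, 135):
    print(f"{f:>3} mm prime: {h_angle_of_view(f):5.1f} degrees horizontal")
```

The four lenses step down from roughly 65 degrees to 15 degrees, which is about what a mid-range zoom would span, just with better glass at each stop along the way.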
I can say with confidence that not many people are making 4K DCPs. The thing is, professionals are more concerned with dynamic range than pixel count. Arri Alexas are so popular because they have 8.6 µm pixels, which is why they have almost the latitude film does. Cramming more pixels into the same size sensor throws away its biggest advantage. Also, 4K DCPs require 4x as much compression to stay within the bandwidth restrictions, so while there are 4x as many pixels, there isn't really more information there. It also takes 4x as long to encode, and encoding is already really slow. Since many theaters are still 2K, there isn't necessarily an advantage to making 4K DCPs.
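The "4x as much compression" point falls straight out of the DCI bitrate ceiling. The 250 Mbit/s maximum for JPEG 2000 in a DCP is from the DCI spec; 24 fps is assumed here:

```python
# DCI maximum JPEG 2000 bitrate for a DCP, in bits per second.
MAX_BITRATE = 250_000_000
FPS = 24  # assumed standard cinema frame rate

def bits_per_pixel(width, height):
    """Compressed bit budget per pixel per frame at the DCI ceiling."""
    return MAX_BITRATE / FPS / (width * height)

bpp_2k = bits_per_pixel(2048, 1080)  # 2K DCI container
bpp_4k = bits_per_pixel(4096, 2160)  # 4K DCI container

print(f"2K: {bpp_2k:.2f} bits/pixel")  # ~4.71
print(f"4K: {bpp_4k:.2f} bits/pixel")  # ~1.18
```

Same bandwidth ceiling, four times the pixels, so each 4K pixel gets a quarter of the bit budget a 2K pixel gets.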
If you shot UHD (thank you for knowing there's a difference), you should probably author a DCP at HD res (1920 x 1080). You can also make a UHD-res DCP, but definitely don't make a 4K one. You'll either be applying a lossy digital zoom of 6.67% to the image or adding black bars to pad the width out to 4096, which is just a waste of time. Either way, the eye only resolves about 0.4 arc-minutes, which equates to roughly 2K resolution at THX-approved viewing angles.
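Here are the two UHD-to-4K-container options from above with the numbers worked out:

```python
uhd_w, uhd_h = 3840, 2160      # UHD source
dci4k_w, dci4k_h = 4096, 2160  # 4K DCI container

# Option 1: upscale (a lossy resample of every pixel).
scale = dci4k_w / uhd_w
print(f"upscale factor: {scale:.4f} (~{(scale - 1) * 100:.2f}% zoom)")

# Option 2: pillarbox (pad the width with black).
pad_total = dci4k_w - uhd_w
print(f"black bars: {pad_total} px total, {pad_total // 2} px per side")
```

Either a 6.67% blow-up of every pixel or 128 pixels of black on each side, which is why neither is worth doing over a straight HD or UHD master.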
They have a dedicated camera crew who has practiced the shot. That is, there is a person standing next to the camera whose job is to adjust the iris and focus while another handles the pan/tilt. Also, it's impossible if your lens is controlled by the camera; it has to be a fully manual lens. When I was primarily doing cinema camera work (yet another distinction between professional film and prosumer video), we would make little wedges of white scribble tape to put on the lens so we would know what f-stops and focus distances were required for those shots. As far as I can tell, most current DSLR/video lenses can't do that.
Agreed, I am about done with this forum because it is getting hard to find the real content.
Mr. Slomka, that was an exception (which there always have been), not the rule. Now I LITERALLY see photographers on the street talking about how they love using crappy lenses and shooting into the light. I guess it's like how when people were recording music on analogue equipment, they strove for the cleanest signal possible as a general rule, and then when everybody went digital, they started making the audio far dirtier than ever. You know, for that "classic warm sound". :D
Oh, Sony et al already started development on 8K cameras a few years ago. They know there's no benefit to end users but they have to keep people wanting to buy new products somehow.