I understand where this article is coming from, but I think the author may be misinterpreting key information from the original comment. The difference between a camera that shoots 4K DCI (4096 × 2160) and one that tops out at UHD (3840 × 2160) is that the second camera literally cannot produce an image that is projectable, without scaling, at the highest current standard of digital theatrical display. The difference is between images designed with theatrical cinema as the end of the pipeline and images designed with television as the end of the pipeline. UHD is a TV standard; 4K DCI is a theatrical standard. One is television, one is cinema. Hence the argument that a camera that cannot shoot 4K DCI should not be called a cinema camera.
Of course, most DCPs are projected at 2K, and UHD can be upscaled to 4K. I'm not suggesting there is any necessity to shoot 4K; I've never done it, and I've made several feature films. But there is a simple technical distinction here that may be getting lost in the broader discussion about resolution.
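To make the distinction above concrete, here is a minimal sketch (mine, not from the original comment) of the arithmetic involved when fitting a UHD image into a DCI 4K container. The container sizes are the published standards (DCI 4K from the DCI spec, UHD from the ITU-R television standard); the helper function names are just illustrative.

```python
DCI_4K = (4096, 2160)   # DCI "full container" 4K, theatrical projection
UHD = (3840, 2160)      # UHD, television standard

def pillarbox_padding(src, container):
    """Black padding per side if src is centered, unscaled, in the container."""
    return (container[0] - src[0]) // 2

def width_upscale(src, container):
    """Scale factor needed to stretch src's width to fill the container width."""
    return container[0] / src[0]

print(pillarbox_padding(UHD, DCI_4K))        # 128 px of black per side
print(round(width_upscale(UHD, DCI_4K), 4))  # ~1.0667x upscale required
```

Either way the UHD source needs intervention: pillarboxing leaves black bars, and upscaling the width by ~1.067x means interpolating pixels (and cropping height, since both standards share a 2160-pixel height).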