Have you ever wondered why cameras shoot 29.97fps? Why not just 30?
If you've ever picked up a camera, looked at the frame rate settings, and wondered why you have the option to shoot both 29.97fps and 30fps, well, there's a reason for that. It's pretty much the same reason there's an option to shoot in 23.98fps, too, and it has to do with math, and the introduction of color, and TV, and interlaced video, and the NTSC system, and, well, just let stand-up comedian/mathematician Matt Parker explain it to you:
Okay, to simplify (that's a joke), here is Parker's explanation of the calculations that led technicians to come up with 29.97fps:
North American television has a frame rate of 29.97fps because if you multiply that by the number of horizontal rows in each frame and then multiply that by an integer, which happens to be 286, you get out a whole number which matches exactly the frequency window this data is sent over.
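If you want to see the arithmetic for yourself, here's a quick sketch of Parker's explanation. The specific values are the standard NTSC numbers (525 horizontal lines per frame, a 4.5 MHz carrier, and the integer 286), and it also shows the 1000/1001 scaling that relates 30fps to 29.97fps and 24fps to 23.98fps:

```python
# NTSC color frame rate, worked backwards from the whole-number
# frequency the signal had to fit:
CARRIER_HZ = 4_500_000   # the whole-number frequency window (4.5 MHz)
LINES_PER_FRAME = 525    # horizontal rows in each NTSC frame
INTEGER_FACTOR = 286     # the integer Parker mentions

frame_rate = CARRIER_HZ / INTEGER_FACTOR / LINES_PER_FRAME
print(frame_rate)        # ~29.97

# Equivalently, it's 30fps scaled by 1000/1001 -- the same factor
# that turns 24fps into 23.976fps (usually written "23.98"):
print(30 * 1000 / 1001)  # ~29.97
print(24 * 1000 / 1001)  # ~23.976
```

In other words, 29.97 isn't an arbitrary number: it's exactly 4,500,000 / (286 × 525), which is the same thing as 30 × 1000/1001.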
This is a pretty nifty piece of trivia for video nerds, but it can also save you from some major confusion and aggravation, namely when a client asks for a project to be shot/delivered in 30fps or 24fps when what they really need is 29.97fps or 23.98fps. If you know the distinction and the instances in which one must be used over the other (interlaced video vs. progressive scan, for example), you'll save yourself a lot of time, money, and heartache from angry clientele.