My disappointment is with Apple. They were close to taking over the NLE market with a decent product in FCP7, and they threw it away. The NLE industry was mature. It worked. It wasn't broken. There are many aspects to it that are part of the viewing experience. An example is the transition from 35/70mm projectors to DCP projectors. Instead of reinventing the entire experience, the industry took the best of theater viewing and created what we have today. DCP projectors use xenon bulbs just like before. Playback is still 24fps, but instead of running film in front of the bulb there's a transparent chip that changes its image 24 times per second. That preserved the viewing experience.

Apple is in a unique position to leverage its hardware with its software, and FCPX is very fast because of that. However, Apple was seeing a problem that professional editors were not seeing. Producer/editors were, though. Because the industry was restructuring, many jobs that used to be done by multiple people were now being done by one person. Projects were being handed off to individuals to produce and direct entire shows, and many people were having trouble accomplishing so much. Along came Apple with FCPX. It simplified the workflow, but it had to cut a lot of corners to accomplish that. Corners that professional media creators weren't really happy with. In the end it has a niche market. Too bad...
FCPX was written with a mistaken impression of what an editor does. Firstly, the magnetic timeline: an unnecessary piece of eye candy. For professional editors, audio and video are separate animals. The idea that sync must always be maintained is mistaken; editors create their own sync. Audio and video content is very often captured and ingested by different devices and at different times. How far out of sync audio and video are is useful information, but a professional editor handles that. Many times I purposely throw sync off to either cheat or compress the content. Tracks are also a good tool. Media is created by cameras and sound recorders on tracks, and mapped keyboards in Avid and Premiere maintain that workflow through final delivery. FCPX's deliverables seem to be just iPhones and MacBooks. It's always easier to create simple mixes for simple playback devices, but we're still using TVs and showing films in theaters. Professional editors have to be able to create for all of that.
Secondly, media management and collaborative workflow. A few years ago metadata was all the rage in media software. Though metadata has its uses, most editors just want to get their work done, and wasting time organizing and tracking metadata for minor rewards isn't worth it. Media's most useful metadata is the date it was shot, who shot it, and what was shot. Beyond that you're entering the realm of the annoying. Then there's the issue of where the media is and what form it's in. Avid has solved that very well over the last few years. There is no automatic transcoding in the background to some drive where the "Event" resides. The media belongs on the media drive, and if that drive gets full, more media goes on another drive. Where the project or "Event" lives is secondary and separate; the media and the edits don't have to be together. Each sequence isn't an event, it's just one stage of a project. In Avid these are all shareable, passed around or kept on a server that multiple editors use. Premiere does this too.
I know that some of what I've written about is possible in FCPX, but it requires using FCPX in an alternative way, not the way Apple wants it to be used.
Some random thoughts Apple should consider: Final Cut Pro 7 was a good program; FCPX isn't. Apple needs to accept that, keep the very few things FCPX gets right, and correct the mountains of useless workflows, or just plain dump it.