July 20, 2015 at 11:54AM, Edited July 20, 11:55AM


NextStage Pro - Realtime Matchmoving with the Kinect

Hello No Film School!

For the past year I’ve been developing an application called NextStage that turns the Kinect for Windows and the Kinect for Xbox One into a realtime virtual production camera.


Some of the key features are:

- Realtime Camera Tracking
By tracking retroreflective markers, NextStage is capable of instantly and accurately tracking the Kinect’s position and rotation in 3D space.

- Instant Matchmoving
6DOF tracking lets users easily combine live action footage with virtual objects and sets, without the need for tedious frame-by-frame post processing.

- Depth-based Keying
Separate live action subjects from the background in realtime. Depth mattes let users place live action people or subjects on a virtual set without the need for green screen.

- Creative Effects
Depth mattes can be used as an instant, high quality garbage matte for green screen footage, or to quickly rotoscope actors and objects.

- HD Capture
Capture uncompressed RGBA footage in 720p with NextStage Lite, or sync tracking data to an external camera with NextStage Pro.

- Flexible Workflows
NextStage Pro lets users export 30 Hz tracking data to sync external cameras and devices at 24, 25 and 30 frames per second.
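The frame-rate conversion in that last feature can be pictured as resampling: the Kinect samples poses at 30 Hz, and values for 24 or 25 fps camera frames fall between those samples. A minimal sketch for a single scalar channel (this is a generic linear-interpolation approach, not necessarily what NextStage does internally, and a real camera track would interpolate positions and rotations, ideally quaternions, rather than one number):

```python
def resample_track(samples, src_fps=30.0, dst_fps=24.0):
    """Linearly interpolate a track sampled at src_fps to dst_fps."""
    n_out = int((len(samples) - 1) * dst_fps / src_fps) + 1
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_fps          # position in source-frame units
        lo = int(t)
        hi = min(lo + 1, len(samples) - 1)
        frac = t - lo
        out.append(samples[lo] * (1.0 - frac) + samples[hi] * frac)
    return out

track = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # 7 samples at 30 Hz
resampled = resample_track(track)            # 5 samples at 24 fps
```

Each 24 fps output frame lands 1.25 source frames after the previous one, so the resampled values here are 0.0, 1.25, 2.5, 3.75, 5.0.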

There is a free version called NextStage Lite that captures video using the Kinect’s onboard cameras. NextStage Pro allows users to export the tracking data to other applications like Blender and Maya.
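The depth-based keying described above boils down to thresholding the depth image: keep pixels within a chosen distance band, drop everything else. A toy sketch (the near/far values are arbitrary, and the zero-means-invalid convention follows how the Kinect SDK reports unreadable pixels; NextStage's actual matte generation is presumably more involved, with edge cleanup and smoothing):

```python
def depth_matte(depth_mm, near=500, far=2000):
    """Return 1 where a depth reading (in mm) falls in [near, far), else 0.

    A reading of 0 means the Kinect couldn't measure that pixel,
    so it is always excluded from the matte.
    """
    return [[1 if (near <= d < far and d != 0) else 0 for d in row]
            for row in depth_mm]

frame = [
    [0,    800, 1500],   # 0 = invalid pixel, no depth reading
    [450, 1200, 2500],   # 450 is too near, 2500 is too far
]
matte = depth_matte(frame)  # [[0, 1, 1], [0, 1, 0]]
```

The resulting binary matte can then be used as an alpha channel, or as the garbage matte mentioned under Creative Effects.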

More information can be found at:

I’ve been developing this application pretty much in a vacuum, but I’m very excited to finally get it out into the world. Please let me know if you have any questions, comments or concerns.


Pretty cool use for the Kinect. I also see that you used the newer version of the Kinect (the Xbox One model)?
A friend of mine developed a tool that uses a Kinect to track focus while shooting as well, but that was with the Xbox 360 Kinect.
You can read his whole paper on it here: http://www.jku.at/cg/content/e48361/e60689/e228418/e228430/MScPracticum_...

July 23, 2015 at 6:19AM

Philip Drobar
Video Editor

Looking good. A few questions:
1. If I shoot something and want to extract the camera movement and tracking points from the shot, am I supposed to strap the Kinect to the camera itself, or does it sync with the camera in some other way?

2. I currently use Boujou + Maya for matchmoving and 3D blending. Will I be able to use the tracking data from NextStage Pro and just skip the Boujou stage?

3. Is there a feature that understands optical compensation?


July 24, 2015 at 8:53AM

Daniel Falcon
Director, VFX artist

Hi Daniel,

1. If you capture video using the Kinect's onboard cameras there is no need to sync. The exported video and tracking data will have the same number of frames.

If you use an external camera you will need to mount the Kinect onto the camera (or mount the camera onto the Kinect). Right now sync has to be done manually in an external application, but I am working on a more streamlined workflow.

2. Tracking data is exported as a COLLADA (.dae) file and can be imported directly into Maya. All you need to do is change the parent object's rotation order to ZXY and you are good to go.
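The rotation order matters because the same three Euler angles produce different orientations depending on the order the axis rotations are applied. A small sketch of why (assuming the common convention where "ZXY" means the Z rotation is applied first, then X, then Y, which for column vectors composes as Ry * Rx * Rz):

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# ZXY order: Z applied first, then X, then Y.
def euler_zxy(rx, ry, rz):
    return matmul(rot_y(ry), matmul(rot_x(rx), rot_z(rz)))

# The same angles interpreted in XYZ order give a different matrix,
# which is why the wrong rotation order breaks an imported track.
a = euler_zxy(0.3, 0.5, 0.7)
b = matmul(rot_z(0.7), matmul(rot_y(0.5), rot_x(0.3)))  # XYZ order
```

If the parent's rotation order in Maya doesn't match the order the exporter assumed, every keyframe gets decoded into a different orientation, so the track drifts even though the angle values are identical.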

3. NextStage Pro has a calibration tool for calculating the offset and focal length of an external camera. With a calibration loaded, NextStage exports the tracking data with that offset and focal length already applied.
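NextStage's calibration format isn't documented here, but "applying the offset" amounts to composing a fixed rigid transform (the Kinect-to-camera mount relationship) with each tracked pose. A minimal sketch with 4x4 matrices, using a made-up 5 cm mount offset and a translation-only pose for simplicity:

```python
def matmul4(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Hypothetical mount offset: the external camera sits 5 cm above the Kinect.
kinect_to_camera = translation(0.0, 0.05, 0.0)

# One tracked Kinect pose (translation only, to keep the example small).
kinect_pose = translation(1.0, 1.5, -2.0)

# The external camera's pose is the tracked pose composed with the offset.
camera_pose = matmul4(kinect_pose, kinect_to_camera)
```

Doing this once per frame on export means the delivered track already describes the external camera, not the Kinect, so no per-shot correction is needed downstream.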

July 25, 2015 at 2:23PM

Sam M
