It can be frustrating for a filmmaker to have an imagination full of amazing stories and no budget to tell them with. What if the tools used on big-budget productions were suddenly made available to you? “The holy grail for filmmakers is to be able to work in real time, iterate in photorealistic environments, and do anything you want in the moment,” said Frank Patterson, President of Pinewood Atlanta Studios. “When you start iterating this way, think, what would your brain come up with if you had these tools?”

Participating in a SXSW panel about Virtual Production, Patterson was joined by Chris Edwards, CEO of the Emmy Award-winning visualization company The Third Floor; Shannon Justison, Sr. Previs/Postvis Supervisor at The Third Floor; and Wes Ball, director of The Maze Runner. The session, featuring players at the forefront of previs and virtual production, gave an inside look into their process and at how advances in real-time iteration could soon put these tools within reach of the indie film scene.

No Film School was on the ground at SXSW and compiled useful information from this session, and The Third Floor, whose recent work can be seen in Black Panther and Pacific Rim: Uprising, gave NFS a follow-up look into the tools they use in a Virtual Production workflow. If you’re wondering what virtual production is and why you should consider experimenting with it, here is a breakdown.

Robert Zemeckis's 'The Walk.'

What is virtual production and why should indie filmmakers care?

Many filmmakers hear the word “virtual” and stop listening, assuming the conversation is about something far-off, like big-budget Avatar-esque CG aliens or the spectacle of effects. Take heed, filmmakers: as the technology of virtual production evolves and becomes more intuitive, the biggest beneficiary may be the indie director. “I want a kid halfway across the world to take their phone and populate a whole world on it with AR,” said Ball, referring to the effect virtual production tools could (will?) have on the most low-budget, guerrilla filmmakers of today.

If you want to start experimenting with the possibilities, you’ll need an understanding of what exists right now and how to use it. For starters, here is The Third Floor’s definition of virtual production:

 “Virtual Production, at its core, is the idea of combining motion capture with real-time rendering—shooting a movie with real-time computer graphics on stage.”

You can’t really talk about virtual production without a little primer on the stages that lead up to it, or follow it. Filmmakers don’t often start production without a long period of pre-production, and so virtual production is usually preceded by previs, a world-building and planning stage so closely linked to production that the two are often inseparable.

Preliminary stages: Pitchvis and Previs

  • Pitchvis is a previs-style trailer created to show investors and production companies, which can help get your project funded or greenlit.

Here’s a look at the final pitchvis of World War Z.

  • Previs, as the name suggests, is when you previsualize some or all of a film, often for very complex scenes. That can include using storyboards and animatics (as well as asset building) to get a 3D visualization of the world of your story, letting you try out every angle before you film.

As Wes Ball described in the SXSW session, “Previs is about communications between departments. You’ve got a limited amount of money, and so you ask, 'How am I going to do this shot, this idea?' Previs tools are now being used inside of a game engine, leveraging the power of what we can do now. We’re so close to photoreal already, and soon we’ll be at a place where game engines can be presented as final image.”

Here’s a look at previs from several films, including The Walk, courtesy of The Third Floor.

Interested in playing around? Check out TTF's Chris Edwards in this tutorial on how he uses Maya to create Previs.

Virtual Production Workflow

Now let’s dive into production. The world of Virtual Production is changing rapidly, and The Third Floor (TTF) was kind enough to break down the elements of their Virtual Production workflow at the moment. 

  • Motion Capture (Mocap, Motion tracking, Performance capture)

Those BTS videos with actors in suits covered in tiny dots? Yes, that’s MoCap. As The Third Floor describes it:

The motion capture process digitally records human performances. It allows directors to work one-on-one with an actor, coaching and capturing the authentic performance to be put onto a digital 3D character. It hands an animator’s tool to other artists: the actor and the director.

Industry Standard for Motion Capture: Motive by OptiTrack, used by TTF for data streaming, motion capture, volume/stage calibration, and retargeting.

NFS Suggestion for Low-Budget MoCap to Watch: Smartsuit Pro

Here’s a short BTS featurette of MoCap on War for the Planet of the Apes to give you an idea of how this works on a large scale:

  • Virtual Camera

As The Third Floor describes it, the virtual camera allows directors to play and explore in a virtual environment:

“The director (or cinematographer or cameraman) is given a physical camera with a sensor that feeds into the virtual world. The director can pan, tilt, dolly, and crane the camera to explore this world. The camera movement is recorded, and the captured sequence can be played back when the director calls 'action.' The virtual camera is used to scout the location or to re-shoot the previs animation from different angles. It gives the director the reins.”

Programs for Virtual Camera: MotionBuilder, used by TTF to characterize 3D character rigs onto mocap data and for data “stitching,” the process of piecing data together into a cohesive action displayed on a 3D character.

To get an idea of a virtual camera at work, here’s a demo from Sawmill Studios:

  • Techvis

Once a team like The Third Floor has previs in the can, they can run a technical analysis on how the film can be physically shot. According to TTF:

TTF’s workflow captures and stores all technical information—lens type, height of a camera, size of a green screen, speed of a vehicle—as sequences are created. Whatever information the client needs is provided via diagrams and animated visualizations that show real-world measurements.
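To get a feel for the kind of math behind those techvis diagrams, here is a minimal sketch (our own illustration, not a TTF tool) that turns stored lens and distance data into real-world framing numbers, using the standard angle-of-view formula:

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view from focal length and sensor width
    (thin-lens approximation, full-frame sensor by default)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def frame_width_at_distance(focal_length_mm, distance_m, sensor_width_mm=36.0):
    """Real-world width covered by the frame at a given subject distance,
    by similar triangles: width = distance * sensor_width / focal_length."""
    return distance_m * sensor_width_mm / focal_length_mm

# Example: a 35mm lens on a full-frame (36mm-wide) sensor
fov = horizontal_fov_deg(35.0)             # ~54.4 degrees of horizontal view
width = frame_width_at_distance(35.0, 10)  # ~10.3 m of scene width at 10 m
```

This is exactly the kind of question techvis answers for a crew: with this lens at this distance, how big does the green screen need to be?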

  • Motion Control

Once the shots have all been pre-planned in techvis, the data can be sent to a real camera to be performed to the exact specifications. The description of this crazy craftiness from TTF:

“Once all of the technical information is recorded, we transfer the digital camera’s movement to a real camera so the shot can be reproduced effortlessly.

This is a technique used to enable precise control, and optionally repetition, of camera movements. It can be used to facilitate special effects photography. The process can involve filming several elements using the same camera motion and then compositing the elements into a single image. Today's computer technology allows the programmed camera movement to be processed, such as having the move scaled up or down for different-sized elements.”
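That "scaled up or down" trick is easy to picture in code. Here is a rough sketch (purely illustrative, not any real motion control software) of scaling a keyframed camera path so a full-size move can be replayed on a miniature:

```python
def scale_camera_move(keyframes, scale, pivot=(0.0, 0.0, 0.0)):
    """Scale a programmed camera path about a pivot point, e.g. to replay
    a full-size move on a 1:4 miniature (scale=0.25). Frame timing is
    unchanged; only the positions shrink or grow."""
    scaled = []
    for frame, (x, y, z) in keyframes:
        sx = pivot[0] + (x - pivot[0]) * scale
        sy = pivot[1] + (y - pivot[1]) * scale
        sz = pivot[2] + (z - pivot[2]) * scale
        scaled.append((frame, (sx, sy, sz)))
    return scaled

# A 4 m dolly move over 48 frames, replayed at 1:4 scale, becomes a 1 m move
move = [(1, (0.0, 1.5, 0.0)), (48, (4.0, 1.5, 0.0))]
miniature_move = scale_camera_move(move, 0.25)
```

Real rigs also scale speed and adjust focus, but the core idea is the same: one recorded move, re-performed at any size.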

Here is a list of a few more platforms that The Third Floor uses:

Simulcam: "The name given to a number of real-time on-set tracking and compositing systems. Known here at TTF as ‘real-time postvis,’ it is the process of combining real-world actors and sets with CG actors and sets during a live-action shoot, in real time."

Ncam: "An augmented reality platform that captures photorealistic virtual elements in real-time. Here at TTF, we use this to help our clients visualize the scalability of environments and the creatures and objects in them."

  • Virtual Rapid Prototyping (VRP)

Virtual Rapid Prototyping is something being opened up to all kinds of industries, and in filmmaking, it’s getting us closer to that real-time iteration. As TTF describes it:

“VRP is a unique adaptation of the previs process, accelerated with virtual production techniques. Utilizing only a small crew and an actor in a motion capture suit, a director can stage, shoot, and edit sequences in real time, ‘sketching’ the sequence quickly.

An entire film can be quickly and cheaply prevised using VRP to test for marketability, providing, in essence, a feature-length pitchvis. An incredibly scalable solution, VRP can be executed with a compact team or a full-scale production.”

  • Virtual Asset Department (VAD)

Big-budget films can now create realistic virtual environments without ever bringing the cast and crew on location. Just think of Jon Favreau's The Jungle Book, which didn’t have a single frame actually shot in a jungle. Here’s a description from TTF:

"Under the direction of the production designer, our virtual art department builds the assets and virtual environment (whether fantastical or based on an existing location), for directors to explore with the virtual camera. If a location exists, our virtual production team can physically scout the location, take measurements, and then model the digital replica to scale. This is vital in cases where access to a physical location is limited."

  • Photogrammetry

Photogrammetry isn’t for capturing a simple object like a box, which could easily be computer-generated; it’s for something complex that you can capture from numerous angles and regenerate in your film. From TTF:

"Photogrammetry is a technology that collects reliable information about physical objects and the environment. This is done through a process of recording, measuring, and interpreting aerial and terrestrial photographs.

This is an extremely helpful tool if you have access to a set or prop and want to create a model based on real-world dimensions. Virtual production has found that generating models from photos is a huge time saver and when integrated into virtual camera or simulcam, can aid in creating a more accurate user experience."

Industry standard: Photogrammetry eBook from Unity

Low-budget experimentation: Photogrammetry on your Camera Phone 
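If you're curious what's under the hood, the core of photogrammetry is triangulation: intersecting viewing rays from two or more photos to recover a 3D point. Here is a toy sketch (an illustration, not any production pipeline) using the midpoint of the closest approach between two rays:

```python
def triangulate_midpoint(o1, d1, o2, d2):
    """Approximate a 3D point seen from two camera positions: find the
    midpoint of the shortest segment between the two viewing rays.
    o1/o2 are camera origins, d1/d2 are ray directions toward the
    feature, all given as (x, y, z) tuples."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def add(a, b): return tuple(x + y for x, y in zip(a, b))
    def mul(a, s): return tuple(x * s for x in a)

    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b  # zero when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(o1, mul(d1, s))  # closest point on ray 1
    p2 = add(o2, mul(d2, t))  # closest point on ray 2
    return mul(add(p1, p2), 0.5)

# Two cameras 2 m apart, both seeing the same feature at (1, 0, 5)
point = triangulate_midpoint((0, 0, 0), (1, 0, 5), (2, 0, 0), (-1, 0, 5))
```

Real photogrammetry software does this for thousands of matched features across dozens of photos, then fits a mesh over the resulting point cloud.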

  • Virtual Scout (or VR Scouting)

As Lucasfilm & ILMxLAB have shown here, working with VR and an iPad, virtual scouting can create a 3D environment that proves useful for several departments. From TTF:

“Virtual Scouting is the process by which the director and production designer virtually scout a 3D environment. This can help determine blocking, help a cinematographer compose shots by taking ‘snapshots’ of the scene, and especially help a production designer put themselves in a virtually accurate representation of the filming location.”


Finally, the process of putting the finished look together can be described as postvis, where the editor and VFX editor assemble the live action with the previs and virtual production elements. Previewed together, these elements become the magical experience of the finished film! For an interesting explanation, check out this detailed account of The Third Floor’s work on The Last Jedi.

Going back to the Planet of the Apes saga, here's a side-by-side look at previs and postvis.

See all of our coverage of SXSW 2018.

Header image of Third Floor using OptiTrack Tech courtesy OptiTrack.