Two cars race each other on a winding, scenic highway, one slowly overtaking the other. Standard car commercial? Not quite. It's being edited in real time, and only one of the cars is actually there.

When we recently marveled at the announcement of the Blackbird, a "mule" car from VFX company The Mill that can act as a stand-in for any virtual vehicle, it seemed like a futuristic invention whose real-world applications were still years away. Little did we know that the company would hit the ground running. Using Epic's Unreal Engine, which game developers rely on to render visual assets in real time, The Mill shot a short film that pushes the boundaries of both technologies, merging real-time VFX and live-action storytelling.

Using this new system, called The Mill Cyclops, the short film's director was able to make complex VFX edits in real time. With the press of a button, the director could swap out the entire body of the car and alter its physical properties, transforming it from a red 1950s Chevy into a blue Volkswagen. It is, in effect, a real-time generated movie.
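
For a concrete sense of what "pressing a button" involves on the engine side, here is a minimal sketch in Unreal Engine C++ of an actor whose body mesh can be cycled at runtime. The class, property names, and in-editor setup are hypothetical illustrations, not The Mill's actual Cyclops code.

```cpp
// MuleCar.h -- hypothetical sketch, not The Mill's Cyclops code.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "MuleCar.generated.h"

UCLASS()
class AMuleCar : public AActor
{
    GENERATED_BODY()

public:
    AMuleCar()
    {
        // The visible "skin" rendered in place of the physical mule car.
        BodyMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("BodyMesh"));
        RootComponent = BodyMesh;
    }

    // Candidate bodies (e.g. the Chevy and the Volkswagen), assigned in-editor.
    UPROPERTY(EditAnywhere)
    TArray<UStaticMesh*> BodyVariants;

    // Bound to a button press: swap the car's entire body to the next variant.
    void SwapBody()
    {
        if (BodyVariants.Num() == 0)
        {
            return;
        }
        CurrentVariant = (CurrentVariant + 1) % BodyVariants.Num();
        BodyMesh->SetStaticMesh(BodyVariants[CurrentVariant]);
    }

private:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* BodyMesh;

    int32 CurrentVariant = 0;
};
```

The same idea extends to surface properties: in Unreal, a dynamic material instance can repaint that body from red to blue without touching the geometry at all.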

"We created a virtual production toolkit to visualize what you see in the film—a virtual car," Boo Wong, global director of emerging technology at The Mill, told FastCoDesign. "But that can be extended to any character, prop, etc. From a visual effects point of view, that’s super exciting."

By way of example, watch The Human Race, in which the 2017 Chevrolet Camaro ZL1 competes with the Chevrolet FNR autonomous concept car.

How it works

How do you combine CGI elements with the real world in real time to create cinematic augmented reality?

It all starts with what's on set. The Blackbird carries data capture equipment, including tracking markers on appendages that protrude from the roof, depth-sensing LIDAR (the same technology self-driving cars use to perceive their surroundings), and 4K 360-degree RED cameras, all of which help VFX artists map the movement and orientation of the car. The artists then superimpose a CGI "skin," a photorealistic image of a car, onto the geometry mapped from set. The software analyzes the footage, recognizes the position of the sun, and builds a matching lighting setup for the CGI car's reflective surface.
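
In engine terms, each frame of that tracking data reduces to a pose for the car and an estimated sun direction for the lights. The sketch below, again in Unreal Engine C++, shows how such a per-frame solve might be applied; the FTrackingSolve struct and function name are assumptions for illustration, not Cyclops's real interface.

```cpp
// Hypothetical per-frame update. FTrackingSolve stands in for whatever
// the tracking pipeline actually produces; it is not Cyclops's API.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/DirectionalLightComponent.h"

struct FTrackingSolve
{
    FTransform CarPose;      // Blackbird's position/orientation, solved from markers and LIDAR
    FVector    SunDirection; // unit vector toward the sun, estimated from the 360-degree footage
    float      SunIntensity; // estimated brightness for the virtual sun
};

void ApplyTrackingSolve(AActor* CgiCar,
                        UDirectionalLightComponent* SunLight,
                        const FTrackingSolve& Solve)
{
    // Pin the photoreal "skin" to the real car's tracked transform.
    CgiCar->SetActorTransform(Solve.CarPose);

    // Aim the virtual sun so highlights and shadows on the CGI body
    // match the light in the live plate.
    SunLight->SetWorldRotation(Solve.SunDirection.Rotation());
    SunLight->SetIntensity(Solve.SunIntensity);
}
```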

On set, the director views a live video feed on a preview monitor, where the Blackbird is re-skinned in real time with the final pixels that will appear in the commercial, rendered live at 24 frames per second.
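
Conceptually, what that monitor shows is a per-frame composite: the engine's rendered car, complete with an alpha channel, laid over the live plate from the camera. A self-contained sketch of that standard "over" operation (plain C++, not The Mill's pipeline):

```cpp
#include <cstddef>
#include <vector>

// One linear-light RGBA pixel; the CGI frame uses premultiplied alpha.
struct Pixel { float r, g, b, a; };

// Porter-Duff "over": lay the rendered CGI frame on top of the live
// plate, in place. Both buffers are assumed to be the same resolution.
void CompositeOver(const std::vector<Pixel>& cgi, std::vector<Pixel>& plate)
{
    for (std::size_t i = 0; i < plate.size(); ++i)
    {
        const Pixel& fg = cgi[i];
        Pixel& bg = plate[i];
        bg.r = fg.r + (1.0f - fg.a) * bg.r;
        bg.g = fg.g + (1.0f - fg.a) * bg.g;
        bg.b = fg.b + (1.0f - fg.a) * bg.b;
        bg.a = 1.0f; // the camera plate is fully opaque
    }
}
```

At 24 frames per second, that composite has to finish well inside a roughly 42-millisecond frame budget, which is exactly the kind of deadline a game engine is built to hit.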

The fact that everything can be rendered live on set is a milestone achievement. Previously, VFX artists had to match lighting by hand in post-production. Now, Cyclops handles it in real time: if the DP moves a light, the virtual elements (a car, a character, or otherwise) respond to that light change.
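
One way to picture how that responsiveness could work: if the physical fixture is itself tracked, the engine can keep a matching virtual light pinned to it every frame. The sketch below is a hypothetical illustration in Unreal Engine C++; a tracked-light feed is an assumption on our part, not a documented Cyclops feature.

```cpp
// Hypothetical sketch: a tracked on-set fixture drives a matching
// virtual light, so CGI elements react the moment the DP moves it.
// FTrackedLight is an illustrative assumption, not a real interface.
#include "CoreMinimal.h"
#include "Components/PointLightComponent.h"

struct FTrackedLight
{
    FVector Location;  // solved position of the physical fixture
    float   Intensity; // measured or estimated output
};

void MirrorSetLight(UPointLightComponent* VirtualLight, const FTrackedLight& SetLight)
{
    // Called once per frame with the latest tracking solve.
    VirtualLight->SetWorldLocation(SetLight.Location);
    VirtualLight->SetIntensity(SetLight.Intensity);
}
```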

What does this mean for the movies?

In the short term, this technology is a game-changer for the commercial industry. But with time, it could upend the Hollywood VFX industry. It is, effectively, our first portal into a marriage between cinema and augmented reality.

As Variety notes, studios would be wise to use this technology for on-set visualization as long as traditional, after-the-fact rendering remains the industry convention. But once Epic releases Hollywood-friendly support for Cyclops later this year, filmmakers should expect a sea change in the VFX process.