UNREAL ENGINE represents the pinnacle of video game development. With incredibly realistic levels of detail and sharpness, gamers are experiencing something close to reality in virtual reality. But the next step beyond that is adjusting an environment in real time: using Unreal Engine to drive a virtual space in the studio, serving as an ultra-realistic backdrop for filmmaking. And then controlling it all from an iPad.
Anybody would take a look at it and say 'oh that's a disco box, it's a dance floor.' But it wasn't. It quickly became a friend that would help you do your job. - Sandra Bullock
Using the TouchDesigner Light Box while filming Gravity. Image credit: Derivative
We got a taste of where cinematography was headed with Gravity, the space thriller starring Sandra Bullock and directed by Alfonso Cuarón. Using a technique called "The Light Box," built with Derivative's TouchDesigner, director of photography Emmanuel Lubezki was able to cast realistic light onto Bullock as her character tumbles away from the damaged Space Shuttle after a catastrophic debris strike. It was frightening, it was incredibly realistic and immersive, and it was completely manipulated in real time...sort of.
The Light Box not only provided visual references for the actors, who saw a representation of what they were looking at, but it also cast the proper light, at the right angle, to keep the actor, and the audience, immersed in the scene. And because it was computer-controlled, digital gaffers could adjust the intensity and color temperature of that light on the fly as needed. Then, in post, the image could be replaced with CGI while still using the same metadata to maintain that sense of realism.
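To make the "intensity and color temperature" dials concrete, here is a minimal sketch of how a digital gaffer's console might map a Kelvin temperature and a 0-1 intensity fader to RGB values for an LED panel. The helper names are hypothetical, and it uses the well-known Tanner Helland blackbody-to-sRGB curve fit rather than anything from the actual Light Box rig:

```python
import math

def kelvin_to_rgb(temp_k: float) -> tuple[int, int, int]:
    """Approximate an sRGB color for a blackbody temperature in Kelvin
    (Tanner Helland's curve fit, valid roughly 1000-40000 K)."""
    t = max(1000.0, min(40000.0, temp_k)) / 100.0

    # Red channel: saturated below ~6600 K, falls off above
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green channel: logarithmic rise, then gentle falloff
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue channel: absent in very warm light, saturated in cool light
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(max(0, min(255, round(x))))
    return clamp(r), clamp(g), clamp(b)

def panel_color(temp_k: float, intensity: float) -> tuple[int, int, int]:
    """Scale the Kelvin-derived color by a 0-1 intensity fader."""
    r, g, b = kelvin_to_rgb(temp_k)
    return tuple(int(c * intensity) for c in (r, g, b))
```

Dialing `panel_color(2700, 0.8)` would give a dim tungsten-warm glow, while `panel_color(6500, 1.0)` approximates full daylight white; on a real stage these values would be pushed out to the panels dozens of times per second.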
But that was in 2012, and a lot has happened since "Gravity" took the Oscars for Visual Effects and Cinematography as part of its seven Academy Awards. Now there's really little need to replace that computer-generated background. Where The Light Box used 1080p video screens to wrap an actor in a visual environment, today's set backgrounds are powered by the video game tool Unreal Engine and displayed on high-resolution 8K LED video walls, like those built by Lux Machina, which can fool the camera into seeing a realistic background.
We're on the cusp of this new revolution of how we tell stories, and 20 years from now we'll look back and look in awe of what we've created together. - Donald Mustard, Worldwide Creative Director, Epic Games
Lighting changes that used to take an hour to set up for a new shot can now be made in milliseconds from a computer or tablet. Speaking at SIGGRAPH 2019, Epic Games Chief Technology Officer Kim Libreri took to the stage 20 years after presenting the Bullet Time effect from "The Matrix" to talk about how digital effects, lighting, rigging, and rendering are coming together using ray tracing to create real-time visual effects.
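That tablet-driven workflow is plausible because recent versions of Unreal Engine expose editor properties over HTTP through the Remote Control API. As a hedged sketch only, and with the object path, property name, and `access` field shown here being illustrative examples rather than values from any real production, a tablet app might build a property-write payload like this and PUT it to the editor host:

```python
import json

# Hypothetical sketch: a tablet app changing a virtual light's intensity
# by sending JSON to Unreal's Remote Control HTTP endpoint, e.g.
# PUT http://<editor-host>:30010/remote/object/property
# The object path below is an invented example.

def set_property_payload(object_path: str, prop: str, value) -> str:
    """Build the JSON body for a Remote Control property write."""
    return json.dumps({
        "objectPath": object_path,
        "propertyName": prop,
        "propertyValue": {prop: value},
        "access": "WRITE_ACCESS",
    })

payload = set_property_payload(
    "/Game/Stage.Stage:PersistentLevel.KeyLight.LightComponent0",
    "Intensity",
    10.0,
)
```

The point is not the exact field names but the latency model: a few hundred bytes over the network replace an hour of physically re-rigging lamps.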
The game development term for this level of realism is "final pixels," and Libreri says that real-time final-pixel visual effects are not only within reach but inevitable. And we see that in the simple shots Matt Workman talks about in the shooting of a commercial for Indian Motorcycle.
A virtual technician can now go into virtual reality to adjust the background image, changing just about anything, from a mountaintop to the sunlight and even the shading, to fine-tune the image to the cinematographer's liking. Workman has also harnessed the power of Unreal Engine to create a previs tool called CineTracer to plan out shots in virtual space before he even gets to the set.
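Repositioning "the sunlight" in a virtual scene boils down to a small piece of geometry: turning a solar elevation and azimuth into the direction a directional light should shine. The function below is a hypothetical helper, assuming an east-north-up coordinate convention rather than Unreal's own axes:

```python
import math

def sun_direction(elevation_deg: float, azimuth_deg: float) -> tuple[float, float, float]:
    """Unit vector pointing from the sun toward the scene origin, given
    solar elevation (degrees above the horizon) and azimuth (degrees
    clockwise from north). Uses x=east, y=north, z=up axes."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    # Direction *to* the sun in east-north-up coordinates
    x = math.cos(el) * math.sin(az)
    y = math.cos(el) * math.cos(az)
    z = math.sin(el)
    # The light travels the opposite way
    return (-x, -y, -z)
```

Dragging a time-of-day slider in VR would just re-evaluate elevation and azimuth and feed the resulting vector to the scene's sun light each frame.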
It's a bold new world. Check out the entire SIGGRAPH 2019 presentation here.