As filmmakers, we're constant problem solvers, identifying ways to present our story. At No Film School, our indie DIY attitude has found us making our own ring light from a bucket of fried chicken, using spray paint to create gels, and building jib arms from scratch.
When it comes to visual effects, nothing beats in-camera techniques. They carry a photorealistic quality that's hard to replicate with software alone. A recent example is First Man, where Oscar-winning VFX supervisor Paul Lambert and his team leaned on LED screens and high-resolution projections, rather than traditional green screen, to backdrop the flights into space. The results produced lifelike reflections and glares during liftoff that Lambert admits would have been impossible to recreate in post.
It's not the first time this type of pipeline has been used; the Oscar-nominated Solo: A Star Wars Story also used projection for its lightspeed travel. NFS contributor Charles Haine collaborated with Indy Mogul on that very subject of live projection. Lighting has a magical way of immersing us in story.
On a project I worked on, we needed to figure out a way to push video content to multiple screens and smartphones all at once. We've all seen it in movies: the villain "takes over the airwaves" and lists his demands, leaving everyone in shock and awe. There are a few ways to approach this:

1. Preload the devices with the material for playback.
2. Use VFX to replace a blank screen with the content.
3. Send the content wirelessly.
In our scene, we had a couple of things working against us. One: the sequence took place at night. Another: it jumped between locations, showing reactions from dozens of actors that would be intercut later. On top of that, the director didn't want the actors pressing Play on their smartphones, nor did she want them to see the material beforehand, so she could record organic reactions. All playback needed to happen simultaneously.
Since it took place at night, we wanted to capture the glare emitted from the smartphones to add authenticity. Knowing we didn't have the budget to recreate that look in post, we looked to a wireless video solution. We hit all the usual "Teradek-esque" suspects but found an answer in Crowdbeamer, tech that's actually made for education and presentations. Ya, the PowerPoint kind.
We specifically used the Crowdbeamer Go. Basically, it's a device with built-in WiFi that connects to a laptop (VGA/HDMI inputs) and shares content in real time to smartphones, tablets, and other screens. Out of the box, 25 people can view the content, and it's expandable to 75. After downloading the Crowdbeamer app on each of the smartphones, we were able to trigger and control the content for everyone in the scene, all simultaneously.
This allowed us to record the reactions and, at the same time, capture the exact light and video reflections produced on their faces. To display the content on TV screens, we used the HDMI output on the Crowdbeamer Go to plug directly into the monitors. We also didn't need to upgrade past 25 people; instead, we changed the camera angle and handed the Crowdbeamer-enabled smartphones to the new set of actors. If you need a similar solution for more people, there is an option for up to 500 devices.
We're always looking for ways to solve our problems, and maybe this one will help you out in the future. If you've found a unique workflow solution of your own, tell us about it in the comments section.