When you think of a film movement, what comes to mind? Political or social rebellion? Self-expression? Freedom? Rarely will someone say "strict and rigid rules," but that is exactly the structure Danish directors Lars von Trier and Thomas Vinterberg put in place when they formed Dogme 95. A controversial but influential movement, Dogme 95 gave birth to some incredibly important films, like Festen and Idioterne. If you'd like to learn more about Dogme, what better way than from one of its founders? Watch director Sophie Fiennes' short documentary Lars from 1-10 and find out how less freedom can mean more creative films.
Von Trier and Vinterberg started Dogme 95 in 1995 with one idea in mind: to return to "traditional values of story, acting, and theme, and excluding the use of elaborate special effects or technology." At the time, big-budget Hollywood films reigned supreme, and von Trier and Vinterberg wanted to show that budgets don't define quality. Like Dogme 95, movements such as Italian Neorealism and the French New Wave carried political implications, but where postwar Italian and French filmmakers zigged, Dogme 95 filmmakers zagged. While the Italians and the French embraced creative freedom, the Dogme 95 founders wrote a manifesto and imposed a set of ten rules known as the "Vow of Chastity":
1. Shooting must be done on location. Props and sets must not be brought in (if a particular prop is necessary for the story, a location must be chosen where this prop is to be found).
2. The sound must never be produced apart from the images or vice versa. (Music must not be used unless it occurs where the scene is being shot.)
3. The camera must be hand-held. Any movement or immobility attainable in the hand is permitted.
4. The film must be in colour. Special lighting is not acceptable. (If there is too little light for exposure, the scene must be cut or a single lamp be attached to the camera.)
5. Optical work and filters are forbidden.
6. The film must not contain superficial action. (Murders, weapons, etc. must not occur.)
7. Temporal and geographical alienation are forbidden. (That is to say that the film takes place here and now.)
8. Genre movies are not acceptable.
9. The film format must be Academy 35mm.
10. The director must not be credited.
Dogme 95 ended in 2005, once its founders decided the Vow was producing formulaic films, but its impact can't be denied. Sometimes giving yourself complete freedom means paralyzing your creativity with unlimited possibilities. But put that creativity inside a box, and it'll find a way to break free.
What do you think of Dogme 95? Would you ever try imposing a set of rules like this on yourself? Let us know in the comments below.
Whether you call him a fixer, a generalist, or your own personal strike team, Real-Time Producer Bryce Cohen is the guy you want if you're in the middle of a virtual production shoot and need to deliver high-quality content fast. That's his primary role at Area of Effect Studios (AOE), a full-stack VFX and virtual production studio specializing in innovative visualization.
Bryce Cohen (Credit: Bryce Cohen)
AOE has set its sights on driving a VFX renaissance in feature film and television production by using cutting-edge gaming technology. It's also the creative sandbox for Academy Award-winning VFX Supervisor Rob Legato, an executive at the studio, and the virtual production arm of game development studio Rogue Initiative, where director Michael Bay is a Creative Partner.
In its mission to change the way content creators think about the entire pre- and post-production process, Area of Effect grabbed the opportunity to work on the HP Z8 desktop workstation, provided by HP and NVIDIA. Supporting up to 56 CPU cores and high-end GPUs in a single system, the workstation delivers the faster frame rates and overall visual quality that make new in-engine workflows palatable to experienced filmmakers. The Z8 at AOE is equipped with two NVIDIA RTX™ A6000 graphics cards.
As Cohen says, “The Z8 helps remove some of the roadblocks that established creatives in the industry would initially have toward emerging real-time pipelines for creating the content they've been making forever.”
Bryce Cohen and Case-Y (Credit: Bryce Cohen)
Accelerating in-engine work for film, television, and immersive productions
One re-imagined workflow that AOE now executes routinely is in-engine VFX previsualization using Epic Games' Unreal Engine. The studio applied the technique on the recent Apple TV+ film Emancipation, starring Will Smith, among other projects.
“There's a great argument to using Unreal for previs,” said Cohen. “You can see how big an explosion will be in-frame in-engine quickly and not have to hire a whole team of people to make sure no one gets hurt. But while the technology is capable of doing that, if we put a VR headset on an A-list director to do a virtual location scout and they’re getting five frames a second, they’re going to throw the headset off because it’ll make them sick. It's not a worthwhile venture for them and they’ll never want to do it again.”
Having the Z8 handle the task changes the equation. Said Cohen, “The Z8 can pretty much crunch whatever I throw at it. I will unplug the Z8 and cart it over to the director, plug it in, and they’ll be ready to go. Whether the project's performing or not, the Z8 will be.”
AOE has also produced real-time facial and body motion capture in realistic environments to create high-quality digital humans for a client demonstration, as well as real-time musical performances in which a live artist performs onstage alongside a virtual character. The studio continually produces proofs of concept and game demos for investors that depend on the advanced performance of the Z8.
“For those projects where there’s a high-pressure environment and nothing can go wrong, we take that HP Z8 workstation with us,” added Cohen.
Beyond films, episodics, and games, AOE invests time in interesting side projects where they get to test their tools and production workflows on a small scale, then apply that learning to their long-form work. They have created several in-engine music videos and experimental projects for artists represented by Loud Robot, a joint venture between RCA and JJ Abrams’ production company Bad Robot.
One of these projects involved producing content around a virtual character for electronic duo Milkblood. The character, Case-Y, was part of a world-building narrative around the duo. For the music video, AOE created the character's movement using a motion capture suit, then placed it into different virtual environments in Unreal Engine. The videos were delivered across platforms like YouTube and TikTok.
After the project was finished, with the permission of Milkblood and Loud Robot, Cohen launched an experiment. “Having Case-Y living on the computer in my garage gave me an idea. It was impossible for me to bring him into my actual space, but what if I could bring my world to him in his virtual space? I had been looking for an excuse to learn NeRFs, and this was it.”
NeRFs, or neural radiance fields, are fully connected neural networks that learn to build a 3D representation of an object or scene from a set of 2D images. They're an emerging technology that enables real-world objects and locations to be recreated in a digital space.
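To make that concrete, here's a minimal toy sketch of the core NeRF mechanism in PyTorch: a small fully connected network maps a 3D point to a color and a density, and a volume renderer integrates those values along each camera ray to produce a pixel. This is an illustration of the idea, not the model Cohen trained; the names (TinyNeRF, render_ray) and sizes are ours.

```python
import torch
import torch.nn as nn

def positional_encoding(x, num_freqs=6):
    """Map 3D coordinates to sin/cos features so the MLP can learn fine detail."""
    feats = [x]
    for i in range(num_freqs):
        for fn in (torch.sin, torch.cos):
            feats.append(fn((2.0 ** i) * torch.pi * x))
    return torch.cat(feats, dim=-1)

class TinyNeRF(nn.Module):
    """A toy radiance field: 3D point -> (RGB color, volume density).

    A full NeRF also conditions color on viewing direction; omitted for brevity.
    """
    def __init__(self, num_freqs=6, hidden=128):
        super().__init__()
        in_dim = 3 * (1 + 2 * num_freqs)  # raw coords + sin/cos per frequency band
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 3 color channels + 1 density
        )

    def forward(self, points):
        out = self.mlp(positional_encoding(points))
        rgb = torch.sigmoid(out[..., :3])  # colors constrained to [0, 1]
        sigma = torch.relu(out[..., 3])    # density must be non-negative
        return rgb, sigma

def render_ray(model, origin, direction, near=0.0, far=4.0, n_samples=64):
    """Classic volume rendering: sample along the ray and alpha-composite."""
    t = torch.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction           # (n_samples, 3)
    rgb, sigma = model(points)
    delta = t[1] - t[0]                                # spacing between samples
    alpha = 1.0 - torch.exp(-sigma * delta)            # opacity of each sample
    # Transmittance: how much light survives to reach each sample.
    trans = torch.cumprod(torch.cat([torch.ones(1), 1.0 - alpha + 1e-10]), dim=0)[:-1]
    weights = alpha * trans                            # each sample's contribution
    return (weights[:, None] * rgb).sum(dim=0)         # final pixel color

# Training would render rays for every pixel of the input photos and minimize
# the error against the real pixel colors; here we just query an untrained net.
model = TinyNeRF()
pixel = render_ray(model, torch.zeros(3), torch.tensor([0.0, 0.0, 1.0]))
print(pixel)  # a 3-vector of RGB values
```

The expensive part is that every pixel of every frame requires dozens of network queries along its ray, which is why real-time playback of a NeRF leans so hard on GPU horsepower.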
Cohen explained, "Pre-built digital environments have to be purchased, and the realistic-looking ones take up a ton of storage on your computer. While we were working on the project, I wondered, 'What if I didn't have to pay for environments and could get them to look uncannily real? What if they were also places people might recognize?'"
To test his idea, Cohen learned how to make a NeRF from a YouTube tutorial: he walked around the space taking pictures with his iPhone, uploaded them to NeRFStudio, where they were processed, and exported the NeRF into Unreal Engine. Once he had everything on the Z8 workstation, both the captured environment and the digital Case-Y, he was ready to launch the experiment.
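The article doesn't record his exact steps, but a minimal sketch of that capture, process, train, and export pipeline, driving nerfstudio's documented CLI from Python, might look like the following. The directory names are invented for illustration, flags can differ between nerfstudio versions, and getting the trained NeRF into Unreal typically requires an engine-side plugin on top of this.

```python
import subprocess
from pathlib import Path

# Hypothetical paths for illustration only.
PHOTOS = Path("captures/garage_photos")   # stills from walking around the space
WORKDIR = Path("nerf/garage")

def run(cmd):
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Estimate a camera pose for every photo (nerfstudio wraps COLMAP here).
run(["ns-process-data", "images",
     "--data", str(PHOTOS),
     "--output-dir", str(WORKDIR / "processed")])

# 2. Train a NeRF on the posed images (nerfacto is nerfstudio's default model).
run(["ns-train", "nerfacto", "--data", str(WORKDIR / "processed")])

# 3. Export geometry for downstream tools. ns-train prints the real config
#    path; the one below is a placeholder. Getting the NeRF itself into Unreal
#    usually goes through an engine-side plugin rather than this export.
run(["ns-export", "pointcloud",
     "--load-config", str(WORKDIR / "config.yml"),
     "--output-dir", str(WORKDIR / "export")])
```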
A power glitch in the area delayed the project, so he took the files to a consumer-level workstation in another location. "I could see that the environment was definitely there, but it was getting maybe two frames per second maximum. I'd put an input for a movement on the computer, wait three seconds, and then see the movement happen. It looked good but was totally unusable."
Comparison of Cohen's view and Case-Y's NeRF world.
Bringing the project to the Z8 workstation was a completely different experience. "Everything ran smoothly. I knew what this character was doing. I knew where he was in the scene. I was able to experiment with camera movement, making it feel real, and blending the character into the real world because I was getting about 40 frames per second. I would have gotten more, but as it was my first time creating a NeRF, I had made it much bigger than what was advised."
To blend the character with the world, Cohen walked around the physical space and recorded his movements on his phone using Glassbox Dragonfly, which takes that location data, brings it into Unreal Engine, and lets the user see what the phone is seeing. He mounted a GoPro camera to the front of the phone to capture what his movement looked like in the real world, recorded that movement, and put it into a virtual copy of the real world to build the scene.
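Dragonfly does this work inside Unreal, so the following is purely a hypothetical illustration of the underlying data problem: phone tracking samples arrive at irregular timestamps and must be resampled to the engine's fixed frame rate before they can drive a virtual camera. Everything here (the Pose type, the resample function) is our own invention, not Glassbox's API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """One tracking sample. A real pipeline would also carry rotation
    (a quaternion, interpolated with slerp rather than lerp)."""
    t: float                              # timestamp in seconds
    position: tuple[float, float, float]  # meters in the tracking space

def lerp(a, b, u):
    return tuple(av + (bv - av) * u for av, bv in zip(a, b))

def resample(poses: list[Pose], fps: float) -> list[Pose]:
    """Resample irregular phone tracking samples onto a fixed engine frame rate."""
    assert len(poses) >= 2, "need at least two samples to interpolate"
    frames, i = [], 0
    t, end = poses[0].t, poses[-1].t
    while t <= end:
        while poses[i + 1].t < t:         # advance to the bracketing pair
            i += 1
        a, b = poses[i], poses[i + 1]
        u = (t - a.t) / (b.t - a.t)
        frames.append(Pose(t, lerp(a.position, b.position, u)))
        t += 1.0 / fps
    return frames

# Example: three irregular samples resampled to 24 fps.
samples = [Pose(0.00, (0, 0, 0)), Pose(0.07, (0, 0, 1)), Pose(0.18, (0, 0, 2))]
for frame in resample(samples, fps=24.0):
    print(f"{frame.t:.3f}s -> {frame.position}")
```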
“If you're trying to use a 3D environment of a NeRF, or if you're trying to import a NeRF into Unreal, and then use it to get real-time feedback of even just your movement inside that environment, you're going to need a really powerful computer. You could scale down the NeRF's quality, but the whole point of using a NeRF is to make something that feels realistic. It just wouldn't have if I didn't have the Z8 workstation.”
That kind of bold experimentation is what AOE is all about -- driving content creation forward by helping filmmakers make discoveries and use technology to bridge the physical and virtual worlds. And if they find themselves in a tight spot, well, they can always call on Cohen. He’ll bring his Z8 workstation.