DYNAMIC VIRTUAL ENVIRONMENT

When a returning client from The Post contacted us about an action sequence on a subway train, we very nearly talked ourselves out of a job. The request was to film our traditional array plates, but from a New York City MTA subway. After discussing the nature of the scene, and the fact that the action never needed to leave the tunnels, we recommended against shooting live plates in favor of creating a dynamic virtual environment. We suggested working from animated, transparent panoramic stills to build a “2.5D” world to project onto the large-scale LED video walls that the production had already sourced. The result is that we did, in fact, lose the plate job, but we were instead tasked with creating what may have been a first-of-its-kind animated world in a media server. In close collaboration with media server programmer Stephan Hambsch, we pushed the capabilities of the Disguise D3 media server to just shy of its breaking point.

Proof of concept render of 2.5D Parallax Effect

The technique we devised for creating a dynamic parallax effect centered on mapping a speed multiplier variable to a version of a particle emitter in the D3 media server. We divided the three-dimensional space into a series of five layers on either side of the set. Each layer contained a large-scale stitched panoramic still that we created from location photography, with architectural elements appropriate to its distance from the train and the remainder of the panorama left as alpha-channel transparency. Each layer’s emitter speed property was tied to a simple multiplier variable driven by a MIDI fader. By raising or lowering the fader, we altered the value of the variable, and the multiplier caused each layer to speed up or slow down in proportion to its position relative to the set: the closer the layer, the faster it moves. We also tied the horizontal motion blur to the same multiplier, so the faster a layer moved, the more blur it exhibited.
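To make that relationship concrete, here is a minimal sketch of the idea in Python rather than in the D3 media server itself: one fader-driven master multiplier scales the scroll speed of each parallax layer, closer layers move faster than distant ones, and horizontal blur rides on the same value. The layer names, distances, speeds, and mapping function are illustrative assumptions, not values from the production.

```python
# Illustrative sketch only -- not the Disguise D3 API. It models one master
# multiplier (driven here by a 0-127 MIDI fader value) scaling the scroll
# speed and horizontal blur of several parallax layers.

from dataclasses import dataclass


@dataclass
class ParallaxLayer:
    name: str
    distance_m: float      # assumed distance of the layer from the set
    base_speed_px: float   # scroll speed in px/frame at multiplier = 1.0

    def scroll_speed(self, multiplier: float) -> float:
        # Apparent speed falls off with distance: closer layers move faster.
        return self.base_speed_px * multiplier / self.distance_m

    def motion_blur(self, multiplier: float, blur_per_px: float = 0.5) -> float:
        # Horizontal blur tied to the same multiplier via the layer's speed.
        return self.scroll_speed(multiplier) * blur_per_px


def fader_to_multiplier(midi_value: int, max_multiplier: float = 4.0) -> float:
    """Map a 0-127 MIDI fader position to a 0..max speed multiplier."""
    return (midi_value / 127.0) * max_multiplier


# Five layers per side, nearest to farthest (names and distances invented).
layers = [
    ParallaxLayer("columns",   distance_m=2.0,  base_speed_px=120.0),
    ParallaxLayer("near wall", distance_m=4.0,  base_speed_px=120.0),
    ParallaxLayer("conduit",   distance_m=7.0,  base_speed_px=120.0),
    ParallaxLayer("far wall",  distance_m=12.0, base_speed_px=120.0),
    ParallaxLayer("signals",   distance_m=20.0, base_speed_px=120.0),
]

multiplier = fader_to_multiplier(96)  # fader pushed roughly 75% of the way up
for layer in layers:
    print(f"{layer.name:9s}  speed={layer.scroll_speed(multiplier):7.2f} px/f"
          f"  blur={layer.motion_blur(multiplier):6.2f}")
```

Because every layer reads the same multiplier, a single fader move keeps the whole stack of panoramas in proportional motion, which is what sells the parallax.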

“The ability to control the subway environments on-the-fly in real-time was very impressive.”
— Stephen Ramsey - Gaffer

Dynamic interactive light with only the LED walls active, before we activated motion blur.

Working in this way, we were able to achieve an effectively infinite length of tunnel and station, including three different station looks. These environments could then be cued as needed to suit the timing of the actors and stunt performers: speeding up or slowing down the train, pulling into a station and stopping, or allowing other trains to pass by.
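As a rough illustration of how such cues might be sequenced, the sketch below (again plain Python, not the media server’s cueing system) ramps the master multiplier toward a target over a set duration. The cue names, speeds, and ramp times are invented for the example.

```python
# Illustrative sketch only -- cue names, speeds, and ramp times are assumptions,
# not production values. It shows how environment cues could ramp the master
# multiplier over time so the operator can match the timing of the performers.

CRUISE = 3.0  # nominal multiplier while the train is at speed (assumed)

# Each cue: (target multiplier, ramp duration in seconds)
cues = {
    "depart station":   (CRUISE, 10.0),  # accelerate out of a station
    "cruise tunnel":    (CRUISE, 0.0),   # hold speed through endless tunnel
    "slow for station": (1.0, 6.0),      # ease off as a platform appears
    "stop at platform": (0.0, 4.0),      # ramp to a full stop
}


def ramp(start: float, target: float, duration_s: float, fps: float = 24.0):
    """Yield a per-frame multiplier value, linearly ramping start -> target."""
    frames = max(1, int(duration_s * fps))
    for f in range(frames + 1):
        yield start + (target - start) * (f / frames)


# Example: ease the train to a stop from cruising speed.
target, duration = cues["stop at platform"]
values = list(ramp(CRUISE, target, duration))
print([round(v, 2) for v in values[::24]])  # one sample per second of the ramp
```

In practice each frame’s value would feed the layers’ emitter speeds (and, through them, the motion blur), so a single cue brings the whole environment to a stop at the platform.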

Gaffer Stephen Ramsey was able to synchronize his DMX-controlled interior and supplemental exterior lighting to accentuate the interactive lighting provided by the dynamic content on the LED walls.

After a successful two-day shoot on the LED stage, we were asked back by VFX Supervisor Edwin Rivera to shoot our traditional driving array and acquire bus plates on the Queensboro Bridge for another sequence in the film, which was shot against blue screen.

Gaffer Stephen Ramsey sits in for a quick lighting test for DP Lawrence Sher.

From The DP’s Perspective

Cinematographer Lawrence Sher sat down for an extensive interview about his process and experience on Joker.

The following video is the portion where he discusses this pivotal scene.
