Planet X FX

'the crash' uses in-camera vfx as a first in the netherlands
Watch our exclusive breakdown of the ICVFX used for The Crash

In The Crash, a considerable number of scenes (and a good deal of screen time) take place in the Dutch parliament, as the investigation into the cause of the crash and the mysterious cargo inside the plane is an important part of the narrative. The two main spaces where the Dutch government resides, the First Chamber (Eerste Kamer, or 'senate') and the Second Chamber (Tweede Kamer, or 'house of representatives'), are not accessible to film crews, nor do they still look the way they did in the period 1992 - 1998, when the story is set.

Reference material of the Dutch parliament (senate and house of representatives, or First and Second Chamber) as they appeared during the timespan of the series.

As constructing the sets in full was not an option budget-wise, and building them only partially would mean countless shots relying on VFX set extensions (not to mention the technicalities of shooting in a chroma key setup), the choice was made to solve these scenes with large L.E.D. screens, a technique these days synonymous with the term ‘virtual production’. The precise term for our use of this type of virtual production is ‘ICVFX’, which stands for 'In-Camera Visual Effects'.

The Dutch parliament's 'Tweede Kamer' was built as a game level within Unreal Engine to serve as an in-camera background covering the many scenes that take place in this 'impossible to produce' location.

The benefit of using ICVFX and creating digital environments of the parliament spaces is that once these assets are created and optimised for use on the L.E.D. walls, the crew can -within some boundaries- shoot whatever angle they want, with takes of any length, directly and live on set while evaluating the in-camera results. The light emitted by the L.E.D. wall also helps integrate foreground, midground and background: the colour and brightness of the content displayed on the wall actually ‘lights’ the actors and sets in front of it.

The decision was made to set up the parliament environments in Unreal Engine, essentially creating a full 3D game level for both the First and the Second Chamber. Assets for the furniture and other objects were modelled from photo references and archival source footage. The modular character of these environments allowed for quick repetition, after which details could be tweaked and some randomness added.
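Purely as an illustration of how such modular repetition can be scripted, here is a minimal Unreal Editor (UE5) Python sketch; the asset path, grid spacing and jitter values are hypothetical placeholders, and the actual set dressing for the series was of course far more involved.

```python
# Minimal Unreal Editor (UE5) Python sketch: repeat a modular bench mesh on a
# grid with small random offsets and yaw jitter to break up the uniformity.
# The asset path, spacing and jitter values are hypothetical placeholders.
import random
import unreal

BENCH_ASSET = "/Game/Parliament/Meshes/SM_Bench"   # hypothetical asset path
ROWS, COLS = 6, 10
SPACING_X, SPACING_Y = 180.0, 120.0                # grid spacing in cm

bench = unreal.EditorAssetLibrary.load_asset(BENCH_ASSET)
actors = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

for row in range(ROWS):
    for col in range(COLS):
        # Regular grid position plus a little per-instance randomness.
        location = unreal.Vector(
            col * SPACING_X + random.uniform(-3.0, 3.0),
            row * SPACING_Y + random.uniform(-3.0, 3.0),
            0.0,
        )
        rotation = unreal.Rotator(0.0, 0.0, random.uniform(-1.5, 1.5))  # roll, pitch, yaw
        actors.spawn_actor_from_object(bench, location, rotation)
```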

Virtual environment in Unreal Engine of the Dutch 'senate'
Virtual environment of the 'house of representatives'
why game engines?

Game engines are built to display whole 3D worlds in realtime for gaming purposes, as a game cannot predict beforehand how players will interact with it. To provide this seamless experience, the engine must be prepared to render the game world in full, 360 degrees, with a photorealistic appearance (or as photorealistic as possible within the limits of realtime computing). These techniques make game engines very usable for film production as well, since the combination of camera angle, camera dynamics and talent moving in front of the screen is not fully predictable either. Realtime, continuous updates of the background content on the LED wall ensure that at any given moment the camera angle is matched with the right perspective, parallax and defocus, selling the in-camera image as a seamless result.
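To make the perspective/parallax point concrete, here is a toy calculation (not taken from the production pipeline) of where a virtual point behind the wall must be drawn so it looks correct from the tracked camera position; the wall is assumed to be the flat plane y = 0 and all coordinates are made up.

```python
# Toy parallax sketch (not production code): given the tracked camera position,
# find where a virtual point "behind" the LED wall must be drawn on the wall
# so it lines up correctly for that camera. The wall is the plane y = 0, the
# physical set sits in front of it (y < 0), the virtual environment behind it.
import numpy as np

def project_onto_wall(camera_pos, virtual_point, wall_y=0.0):
    """Intersect the ray camera -> virtual_point with the wall plane y = wall_y."""
    cam = np.asarray(camera_pos, dtype=float)
    vp = np.asarray(virtual_point, dtype=float)
    direction = vp - cam
    t = (wall_y - cam[1]) / direction[1]   # parameter where the ray hits the wall
    return cam + t * direction             # point on the wall to display

# A virtual chandelier 6 m behind the wall, seen from two camera positions:
chandelier = (1.0, 6.0, 3.0)
print(project_onto_wall((0.0, -4.0, 1.7), chandelier))   # camera straight on
print(project_onto_wall((2.5, -4.0, 1.7), chandelier))   # camera moved sideways
# The drawn position shifts with the camera: that shift is the parallax the
# realtime engine reproduces every frame from the tracking data.
```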

This clip clearly shows the realtime tracking of the camera shooting the scene, with the background on the L.E.D. wall updating to match its movement, perspective/parallax and focus. The rectangular plane moving over the screen is the inner camera frustum, the area the camera actually 'sees'. Within this plane the render power of the engine is concentrated to optimise performance; outside it, a lower-resolution version of the scene is displayed so the set still benefits from its light and reflections. The inset bottom right shows the camera feed and the in-camera result.
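As a rough sketch of what the inner frustum amounts to geometrically: the camera's four field-of-view corner rays are intersected with the wall plane, giving the quad inside which the full-quality, camera-matched render is needed. The camera pose, field of view and wall placement below are example values, not values from the shoot.

```python
# Rough sketch of locating the inner frustum on the wall: shoot the four corner
# rays of the camera's field of view and intersect them with the wall plane
# (assumed flat at y = 0). Inside the resulting quad the engine renders the
# high-quality, camera-matched view; outside it a lower-resolution version is
# enough for light and reflections. All values below are made-up examples.
import numpy as np

def frustum_on_wall(cam_pos, cam_forward, cam_right, cam_up,
                    h_fov_deg, aspect, wall_y=0.0):
    half_h = np.tan(np.radians(h_fov_deg) / 2.0)   # horizontal half-extent at depth 1
    half_v = half_h / aspect                        # vertical half-extent at depth 1
    corners = []
    for sx, sy in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        ray = cam_forward + sx * half_h * cam_right + sy * half_v * cam_up
        t = (wall_y - cam_pos[1]) / ray[1]          # ray-plane intersection
        corners.append(cam_pos + t * ray)
    return np.array(corners)                        # quad on the wall (x, y, z)

quad = frustum_on_wall(cam_pos=np.array([0.0, -5.0, 1.6]),
                       cam_forward=np.array([0.0, 1.0, 0.0]),
                       cam_right=np.array([1.0, 0.0, 0.0]),
                       cam_up=np.array([0.0, 0.0, 1.0]),
                       h_fov_deg=39.6, aspect=16 / 9)
print(quad)  # as the camera pans or dollies, this quad slides across the wall
```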
realtime tracking

ReadySet Studios, located in Amsterdam and co-founded by Planet X, features a Stype inside-out realtime tracking solution that measures camera position, rotation and lens information in realtime and sends that data into Unreal’s virtual camera. That way, the virtual environment displayed on the L.E.D. wall is updated in realtime, aligned with however the camera and DOP in front of the wall are moving.
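Purely to illustrate the shape of that data flow (the real pipeline relies on Stype's own protocol and Unreal's Live Link plugin, not on this code), a toy listener like the one below could receive per-frame position, rotation and lens values and hand them to a virtual camera; the packet layout and port are made up.

```python
# Toy illustration of the tracking data flow only: a UDP listener that unpacks
# per-frame camera position, rotation and lens values and passes them to a
# virtual-camera callback. The packet layout and port are hypothetical; the
# production setup uses Stype's own protocol and Unreal's Live Link plugin.
import socket
import struct

PACKET_FMT = "<9f"   # hypothetical: pos x/y/z, rot pan/tilt/roll, focal, focus, zoom
PORT = 6301          # hypothetical port

def listen(apply_to_virtual_camera):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    while True:
        data, _ = sock.recvfrom(struct.calcsize(PACKET_FMT))
        px, py, pz, pan, tilt, roll, focal, focus, zoom = struct.unpack(PACKET_FMT, data)
        # Every tracked frame updates the virtual camera, so the wall content
        # re-renders from exactly where the physical camera is looking.
        apply_to_virtual_camera(position=(px, py, pz),
                                rotation=(pan, tilt, roll),
                                lens=(focal, focus, zoom))

if __name__ == "__main__":
    listen(lambda **frame: print(frame))   # print incoming tracking frames
```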