Planet X FX

recreating the apple store, without stepping inside

iHostage, a Netflix Original produced by Horizon Film and directed by Bobby Boermans, dramatizes the hostage event that took place at the Amsterdam Apple Store on February 22, 2022.

With over 60 percent of the film taking place in and around the store, the filmmakers faced a significant challenge: how to convincingly portray a location that was completely inaccessible.

From the outset, it was clear that filming inside the actual Apple Store was not an option. While the production was granted official permission to shoot exterior scenes at Leidseplein from various locations and angles, the store itself could not be entered or modified in any way. Lighting, visibility, and even the ability to place actors near the windows were all outside meaningful production control. A solution was needed that would not only bypass these limitations but deliver a final result indistinguishable from reality.

building a fully controllable virtual set

The creative response was ambitious and meticulously executed. A full-scale replica of the Apple Store’s interior was constructed inside an airplane hangar. Surrounding the set was a 60-meter-wide, 6-meter-high curved LED wall displaying a real-time virtual replica of Leidseplein. This environment was built from scratch using LiDAR scans, drone footage, and high-resolution photography, creating an interactive digital twin of the iconic square.

The 60m x 6m LED-wall pop-up studio by ReadySet Studios.
The resulting scan is a fully 3D, real-time environment that allows live adjustments to camera angles, lighting, weather, and individual elements such as vehicles, street furniture, and virtual crowds.

With this setup, the crew had complete environmental control. They could transition seamlessly from day to night, adjust lighting and weather conditions, and direct tram traffic and background extras—all live, during the shoot. Crucially, the LED wall acted as a dynamic light source, casting accurate reflections and ambient light onto the set’s high-gloss surfaces such as glass panels, polished floors, and device screens. The level of realism achieved was only possible through the direct interplay between real-time virtual imagery and physical set design.
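As an illustration of what “calling” virtual elements live might look like at the software level, here is a minimal Python sketch. The class and method names are hypothetical, not the production’s actual Unreal Engine tooling:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of live environment control on an LED-wall set.
# EnvironmentState, set_time, and cue_tram are illustrative names only.

@dataclass
class EnvironmentState:
    time_of_day: float = 12.0            # hours on a 0-24 clock
    weather: str = "clear"
    active_cues: list = field(default_factory=list)

    def set_time(self, hours: float) -> None:
        # Wrap into a 0-24h clock so a day-to-night transition is one call.
        self.time_of_day = hours % 24.0

    def cue_tram(self, track: str, delay_s: float) -> dict:
        # A timing cue the director can call like a physical extra.
        cue = {"element": "tram", "track": track, "delay_s": delay_s}
        self.active_cues.append(cue)
        return cue

env = EnvironmentState()
env.set_time(22.5)             # shift the square to a night look
env.weather = "rain"
env.cue_tram("line_2", 3.0)    # tram enters frame 3 seconds after the call
```

The point of the sketch is only that every environmental parameter is a live, scriptable value rather than a fixed property of a real location.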

creating a digital twin for virtual production

The scan of Leidseplein served as the foundation for both the Virtual Production and later the visual effects work. For on-set use, the real-time version of the environment was optimized for performance without sacrificing believability. While it often sat slightly out of focus in the background, it still needed to deliver spatial accuracy, architectural coherence, and proper light behavior.
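Real-time optimization of this kind typically trades geometric detail for frame rate, for instance with distance-based levels of detail (LOD). The article does not describe the actual in-engine optimizations, so the following is a purely illustrative sketch with hypothetical thresholds:

```python
# Illustrative LOD selection for a real-time digital twin: geometry far
# from the camera (and likely out of focus) gets a cheaper mesh.
# Threshold distances are hypothetical, not production values.

def pick_lod(distance_m: float, thresholds=(10.0, 40.0, 120.0)) -> int:
    """Return 0 (full detail) up to len(thresholds) (lowest detail)."""
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)

# Nearby storefront geometry stays at full detail; distant buildings
# across the square can drop several levels without being noticed.
levels = [pick_lod(d) for d in (5.0, 50.0, 500.0)]
```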

Pre-production was critical in making this work. The production team, in close collaboration with Planet X and ReadySet Studios, used the proprietary VR previs tool DeepSpace to simulate the entire setup well in advance. This allowed the creative leads to test lens choices, block scenes, estimate lighting conditions, and define LED wall coverage virtually, long before the physical set was constructed. Without access to the real Apple Store, DeepSpace became a vital tool in bridging the gap between concept and execution.
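The lens planning described above rests on simple lens geometry. As a hedged illustration (DeepSpace itself is proprietary; the function names and the full-frame 36 mm sensor width below are assumptions), the horizontal angle of view of a lens, and how much wall it sees at a given distance, can be estimated like this:

```python
import math

# Illustrative previs-style lens math: how much of a flat LED wall a
# given lens covers at a given camera-to-wall distance, using a simple
# rectilinear lens model.

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal angle of view in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def wall_coverage_m(focal_length_mm: float, distance_m: float,
                    sensor_width_mm: float = 36.0) -> float:
    """Width of wall in frame at the given distance."""
    half_angle = math.atan(sensor_width_mm / (2 * focal_length_mm))
    return 2 * distance_m * math.tan(half_angle)

# A 24mm lens on a full-frame sensor, 10m from the wall:
fov = horizontal_fov_deg(24)        # roughly 74 degrees
width = wall_coverage_m(24, 10)     # roughly 15 m of a 60 m wall in frame
```

Running numbers like these for each planned lens and camera position is one way to define LED wall coverage before any physical construction begins.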

precision on set with real-time control

Virtual Production Supervisor Robert Okker led the Virtual Art Department, coordinating closely with director Bobby Boermans, DOP Daan Nieuwenhuis, and First AD Evert van Setten. Operating from a dedicated space just off the set, the VAD team used live camera feeds and witness cams to monitor the impact of their environment updates on the LED wall in real time.

Digital trams, pedestrians, and traffic patterns were choreographed with timing cues, giving the director the ability to call virtual elements with the same ease as physical extras. One of the key collaborative wins came from the partnership with gaffer Gideon van Essen, whose intimate knowledge of the real Leidseplein lighting conditions allowed him to shape the virtual environment to feel authentic, consistent, and cinematic.

Camera tracking was handled by a dual Stype RedSpy system, with infrared markers subtly integrated into the ceiling of the set. These provided real-time position, orientation, and lens metadata directly to Unreal Engine, supporting a dual-frustum setup that gave both the A and B camera units full access to the virtual scene with minimal latency.
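The per-frame data such a tracking system streams can be pictured as a small record of pose and lens state, stamped with a capture time so latency stays measurable. This sketch is illustrative only; the field names are assumptions, not the actual Stype or Unreal Engine protocol:

```python
from dataclasses import dataclass

# Illustrative shape of a camera-tracking packet. Field names are
# hypothetical, not the real Stype RedSpy wire format.

@dataclass
class TrackingFrame:
    camera_id: str             # "A" or "B" in a dual-frustum setup
    position: tuple            # (x, y, z) in metres, set-relative
    rotation: tuple            # (pan, tilt, roll) in degrees
    focal_length_mm: float     # live lens metadata
    focus_distance_m: float
    timestamp: float           # capture time in seconds

def latency_ms(frame: TrackingFrame, now: float) -> float:
    """Age of a tracking frame when it reaches the renderer."""
    return (now - frame.timestamp) * 1000.0

frame = TrackingFrame("A", (1.2, 0.0, 1.6), (12.0, -3.0, 0.0),
                      35.0, 4.5, timestamp=100.000)
delay = latency_ms(frame, now=100.012)   # about 12 ms in this example
```

Keeping this delay small is what makes the inner frustum track the physical camera move convincingly.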

a forward-looking production approach

Virtual Production was not just a workaround; it became a defining asset in the visual storytelling of iHostage. With so much of the film dependent on a location that could not be used, VP allowed the production to work smarter, faster, and more collaboratively. It brought final-pixel confidence into pre-production and enabled the director and cinematographer to craft scenes in real time, on set, with a clear view of the final result.

In doing so, iHostage exemplifies how Virtual Production can be deployed not only as a high-end visual tool, but as a foundational part of a film’s creative strategy, especially when the location in question is both iconic and off-limits.