
Start-up: Lux Machina, US, 2013

This month’s issue looks at a company pioneering flexible LED backdrops as an alternative to VFX

While Jon Favreau’s pioneering The Lion King demonstrates how a whole film can be set inside a gaming engine, there are many other ways in which Hollywood is utilising immersive environments to create real-time, in-camera VFX.

One company pushing the envelope is Lux Machina – a design, management and technical consultancy cofounded by Philip Galler (also the company’s CTO) with the aim of bridging the gap between equipment vendors and productions wanting to try out new technologies.

Drawing on Galler's background as a production coordinator at PRG Lighting – where he handled large, complex shows – the team built up the fledgling company's reputation by designing virtual production systems for live events and TV shows with a heavy LED screen and graphics presence. These included the American Music Awards, the Golden Globes and the NFL Honors.

The company first started to integrate Unreal Engine into its workflow for live events and TV shows four years ago, initially for previsualisation, then as a tool for projection, rendering and in-camera VFX.

Sets appeal: The team at Lux Machina uses giant LED screens to create highly realistic sets and backgrounds for events and films.

“The larger group of us has been looking at or doing something with rendering for most of our professional lives. We knew that as soon as real-time rendering became photorealistic, the savings and the amount of work that you could get done in a visual way would be undeniable,” Galler says. 

Around this time, forward-thinking Hollywood productions were starting to look at ways they could use large LED walls and gaming engines to create in-camera VFX as a means of getting organic, realistic reflections on sets without going on location. The tech also provides realistic scenes and backgrounds on set using screens that can be quickly and easily adjusted (from night to day, say, or interior to exterior). 

In this context, Lux Machina’s work can be seen on Rogue One: A Star Wars Story where it used large LED wall set-ups to provide lighting and reflections, such as those seen on Darth Vader’s helmet. 

The firm – which structures its pricing on a workflow-as-a-service model – then upped the ante on the set of Solo: A Star Wars Story, creating over 90 minutes of visual effects for the 2018 film's speeder chase scenes, as well as scenes around the Millennium Falcon.

Work on the Ron Howard film also included assisting in the development of a giant on-set projection system that enabled actors to see and react to pre-designed animations of flying, and of entering hyperspace. 

In-camera VFX are also proving popular for driving scenes, as seen in Martin Scorsese's vehicle-heavy period production The Irishman, where LED screens displaying projected backgrounds were wrapped around cars to avoid location-based issues such as permits, New York's inclement weather, 21st-century anachronisms and the labour involved in dressing a set.

Right now, LED screens aren’t a cheap alternative, and Galler admits that his company “leans towards higher-end productions with budgets that let us use the latest tech, and are willing to take a risk on a possible unknown.”

However, as one of the recipients of Epic Games' Epic MegaGrants, the firm is now looking at ways it can build custom tools and hopefully expedite an 'easy' avenue for productions that want to set up in-camera VFX workflows and virtual production.


Galler admits that there are challenges. The tech is still a niche area, he explains, and scaling is extremely difficult – it can be hard to find the right people with the very specific skillsets required to operate in this field. 

However, he remains buoyed by future technology advancements that promise to improve workflows. 

“5G will hopefully modernise how we move video, and video over IP will become a great methodology in broadcasting,” he says.

“We will also continue to work with HDR and high frame rates. Towards the end of the year, we hope to be working to 120fps with full HDR capability, and we’re also looking at volumetric displays and holographic technology. It’s in its very early infancy, but it’s fascinating.”

This article first appeared in the April 2020 issue of FEED magazine.