Everything About Virtual Production

A Virtual Production & Real-Time Animation News Website

Uniting physical and virtual lighting to create compelling concert experiences

Article by Addison Herr, Magnopus, September 30, 2022

Live concerts consistently wow audiences with increasingly immersive, jaw-dropping moments. Incentivizing users to attend a virtual show means delivering some extra visual magic that is only possible through game engine technology. For us, that means playing to the strengths of the medium – immersing users in hypnotic, evolving worlds that are grounded in reality but free of practical constraints such as physics and the budgets for physical fixtures and effects.

At least for now, an XR concert may lack the intensity of a stack of Funktion-One subwoofers or the electrifying reaction of a crowd after a pyrotechnic blast, but we can surely compensate for those in other ways.  

In a virtual production context, this philosophy requires a suite of lighting tech that allows designers to rapidly iterate on virtual cues that simultaneously drive physical fixtures on stage. This matters because, while filming on an LED volume gives some ambient light level to the talent, it isn't sufficient to provide the plausible, high-intensity, interactive lighting that physical fixtures deliver. Virtual lighting pushes the fantastical for the pixels on screen, while physical lighting unifies the image and keeps one foot grounded in reality.

Dynamic procedural lighting in Sebastian Yatrá’s performance of Traicionera, shot at the Verizon 5G Lab in Los Angeles. Credit: Verizon

Lighting design with math

Many of the resources for virtual lighting design rely on recording the DMX output – the standard data protocol for lighting hardware – from a dedicated console. This is a fantastic method for accurately simulating realistic fixtures (perfect for previs!) but it pushes the design and iteration phase upstream to the console. In a collaborative version-controlled studio environment, this isn’t always desirable. Also, once frame-by-frame DMX keys are recorded into Unreal Engine, it’s difficult to make adjustments. Attempting to retime or adjust the animations becomes cumbersome when working with hundreds of densely baked keyframes. 

Since we were prioritizing rapid iteration, we built a system within Unreal Engine to procedurally animate large groups of lighting fixtures using template sequences. Rather than keyframes, fixture groups have custom parameters – speed, wavelength, offset, etc. – which plug into a suite of purpose-built math function libraries.


Magnopus: https://www.magnopus.com/blog/uniting-physical-and-virtual-lighting-to-create-compelling-concert-experiences
