Ripple Effect is a short film produced by the Entertainment Technology Center (ETC) at the University of Southern California. The project was commissioned to exercise virtual production and remote workflows, emphasizing COVID-19 safety and the acquisition of “final pixels.” It serves as a case study of the critical steps for restarting and continuing film productions under COVID-19 shelter-in-place orders. This paper analyzes Ripple Effect, sharing findings, technology highlights, best practices, and opportunities for standardization of the film’s production workflows.
At its core, virtual production is filmmaking combined with real-time rendering, using game-engine technology to facilitate the process. The term, however, describes an incredibly varied range of evolving production techniques: anything from using a game engine for pitch-vis, basic shot-blocking, or VFX previsualization and planning, to something akin to a photoreal multiplayer game (in VR or not) where the game is “make a movie,” and everything in between. In the last decade, graphics processing units (GPUs), real-time rendering, and game-engine technology have become capable of rendering worlds, including virtual representations of the real world, at the level of fidelity we would expect from a traditional visual effects pipeline. Shots that once took hundreds of hours to render can now render in real time, at 60+ frames per second.