The Incredible Machines Bringing XR Stages to Life

Simple explanations for complex virtual production systems

When you’re annihilating evil forces in a video game world, a single computer or console can get the job done. When you’re creating an entire extended reality environment on a stage using a game engine, you need something a little more powerful. OK, a lot more powerful. Enter machines like Silverdraft’s Demons and Devils, and people like Hardie Tankersley, Silverdraft’s VP of Visualization Solutions. Here Tankersley explains the mighty hardware components at the aptly named XR Stage in Los Angeles, where ICVR’s short film Away and many other creative visions have come to life.

The XR Stage for ETC’s short film Ripple Effect

Setting the Stage for Extended Reality

Walk onto an XR stage in action and you might find yourself anywhere in the world — the real one or an imaginary one. In fact, one of the biggest advantages of shooting on an XR stage, whether you’re filming a major motion picture or a 30-second commercial, is that you can create whatever setting you like without travel time and expense. And you can change that setting in a fraction of the time it would traditionally take.

How all of those 3D assets are rendered and filmed to look so realistic involves, as you might imagine, not just powerful machines but an advanced architecture that allows them all to work together. Let’s start with the physical components.

An LED volume, the heart of the XR stage, has a single curved wall and a ceiling. The wall and ceiling are composed of segments. Each of those segments may have as many as a hundred LED panels and is rendered by a single computer, called a render node, which looks like a rack-mounted server. All of the computers for all of the segments are precisely synced to work together. The LED volume for Away had four segments on the wall and two on the ceiling, for a total of six render nodes.
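
To make that layout concrete, here is a minimal sketch (in Python, purely illustrative) of one segment-to-node mapping for a six-segment volume like the one used on Away. The one-node-per-segment rule follows the description above; the names are hypothetical and the real stage configuration may differ.

```python
# Illustrative only: one way to describe a six-segment LED volume --
# four wall segments plus two ceiling segments, each driven by its own
# render node. Names are hypothetical, not the actual XR Stage setup.
SEGMENTS = [
    ("wall-1",    "curved wall", "render-node-01"),
    ("wall-2",    "curved wall", "render-node-02"),
    ("wall-3",    "curved wall", "render-node-03"),
    ("wall-4",    "curved wall", "render-node-04"),
    ("ceiling-1", "ceiling",     "render-node-05"),
    ("ceiling-2", "ceiling",     "render-node-06"),
]

# One render node per segment, and all six must stay in lockstep.
assert len({node for _, _, node in SEGMENTS}) == len(SEGMENTS)
for name, location, node in SEGMENTS:
    print(f"{node} drives {name} ({location})")
```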

A server rack with six Silverdraft render nodes. Their liquid cooling system allows stacking without overheating.

Need for Speed

“It’s like a real-time render farm,” Tankersley says, referring to all the computers working together to render the graphics. “In a traditional production, it’s an offline process. You have your models, whether it’s a spaceship or a monster or whatever imaginary thing you’re trying to create in the computer, and the computer can draw it overnight. But on an LED stage, the computer has to draw it 60 times a second, because it’s live.”
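
The arithmetic behind that difference is stark. A rough comparison, with an assumed overnight render window, looks like this:

```python
# At 60 fps, every frame must be drawn in about 16.7 milliseconds, all day
# long. An offline render of the same frame could take hours overnight.
# The overnight window below is an assumption for illustration.
FPS = 60
realtime_budget_ms = 1000 / FPS                  # ~16.67 ms per frame
overnight_window_hours = 10                      # hypothetical offline render time
offline_budget_ms = overnight_window_hours * 3600 * 1000

print(f"Real-time budget per frame: {realtime_budget_ms:.2f} ms")
print(f"Offline budget for one frame: {offline_budget_ms:,.0f} ms "
      f"({offline_budget_ms / realtime_budget_ms:,.0f}x longer)")
```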

So why can’t you just hook up a bunch of gaming PCs with big GPUs to the LED wall segments? “Those machines aren’t built to do this kind of job,” Tankersley says. “And they’ll overheat or melt down. You need a machine that can run at full speed all day long and have reliability and redundancy.”

The back of an LED wall with ROE panels and Brompton LED processors. The panels are connected to render nodes, among other hardware components.

Getting in Sync

Render nodes are just one piece of the virtual production puzzle. Silverdraft built a new high-performance rendering architecture that feeds every piece of data into a cohesive whole. This includes the motion-capture system (which tracks the camera and anything else physically on the stage) and wall-geometry mapping in addition to the 3D assets and LED segments. Information from all of these components has to be processed in real time and rendered as the scene plays out, pixel by pixel.
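
As a rough sketch of that per-frame flow (Python, with placeholder names rather than any real stage API), every render node receives the same frame number and the same tracked camera pose, so its slice of the wall stays consistent with all the others:

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple          # (x, y, z) in stage space, from the tracking system
    rotation: tuple          # (pan, tilt, roll) in degrees

@dataclass
class WallSegment:
    name: str                # e.g. "wall-1"
    render_node: str         # the machine responsible for this segment

def dispatch_frame(frame_index: int, pose: CameraPose, segments: list) -> list:
    # Every segment gets the same frame number and camera pose, so all render
    # nodes draw a consistent view of the virtual world for that frame.
    return [
        f"frame {frame_index}: {seg.render_node} renders {seg.name} "
        f"for camera at {pose.position}"
        for seg in segments
    ]

if __name__ == "__main__":
    segments = [WallSegment("wall-1", "node-01"), WallSegment("ceiling-1", "node-05")]
    pose = CameraPose(position=(1.2, 1.8, -3.0), rotation=(15.0, -5.0, 0.0))
    for line in dispatch_frame(0, pose, segments):
        print(line)
```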

As if this doesn’t sound impressive enough, consider that on a production set with a real crew and actors, there’s very little room for error. “You can’t afford to drop a frame,” Tankersley says. “You can’t afford to slow down the render. You can’t afford to shut down the production because the computer isn’t working or because there’s a little frame drop when you pass by a particular object…. You really have to design the computer architecture for the stage to be as fast as possible and never slow down. It has to be running at full speed all the time.”

ICVR’s Chief of Product, Chris Swiatek, works on a render node to match the pace of the game engine environment to the physical environment.

But we’re not done with all the machines yet. First, any change in a scene means recalculating the lighting, which is called baking, so there’s often a computer just for the light-baking system. For Away, ICVR also used a DMX lighting controller to match the standalone lighting with the LED wall lighting in real time. Second, any changes to a data set — for instance, adding detail to a 3D asset — can require work by several people, which means several artist workstations are needed. And third, you need cameras and a camera tracking system.
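
As a hypothetical illustration of the DMX idea (not ICVR's actual pipeline), a controller could sample the wall content near a practical light and convert the average color into DMX channel levels. The send_dmx() function below is a placeholder, not a real library call.

```python
# Hypothetical illustration only: derive DMX levels for a practical light from
# the average color of nearby LED wall pixels, so the physical lighting tracks
# the virtual environment. send_dmx() stands in for whatever output the
# lighting controller actually provides.

def average_color(pixels):
    # pixels: iterable of (r, g, b) tuples in the 0-255 range
    totals = [0, 0, 0]
    count = 0
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        count += 1
    return tuple(total // max(count, 1) for total in totals)

def wall_color_to_dmx(pixels, dimmer=1.0):
    # Many RGB fixtures expose one 0-255 DMX channel per color component.
    return [int(channel * dimmer) for channel in average_color(pixels)]

def send_dmx(universe, start_channel, values):
    # Placeholder: a real controller would transmit these levels over DMX.
    print(f"universe {universe}, channel {start_channel}: {values}")

sample_pixels = [(200, 120, 60), (190, 110, 70), (210, 130, 50)]  # warm wall content
send_dmx(universe=1, start_channel=1, values=wall_color_to_dmx(sample_pixels))
```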

Shooting for the Stars

Virtual productions in LED volumes rely on two types of cameras. One is a virtual camera inside the game engine, which handles location scouting inside the virtual world, in-engine shot blocking, previsualization (aka “previs”) and techvis. The other is an on-set camera that can genlock, meaning sync to a common timing source with all the other components; examples include ARRI’s Alexa Mini LF and the Sony Venice.

Fine-tuning a scene in Ripple Effect with the on-set camera

Then you need a camera tracking system connected to all of the other components (on-set camera, LED wall, render nodes and so on). For Away, ICVR used RedSpy by Stype.

Here Comes Troubleshooting

It’s rare for even a nonvirtual film shoot to go off without a hitch; issues can crop up related to everything from equipment breakdown to actor meltdown. Since virtual sets have all that extra technological complexity, troubleshooting is to be expected. Here are three common issues.

Note the visible seam at back right, one of the most common issues on a virtual set. This one was caused by inaccurate pixel mapping, and ICVR devised a solution.

Seams. Remember how the LED wall is made up of multiple segments, and how all the computers have to work together to project a single scene onto them? “Each of those segments has a different geometry, because the wall is typically curved,” Tankersley says. “So the computers also have to understand the specific curve of the wall and where the segments meet.” If the software picks up any errors in the geometry, “you’ll see some weird artifacts on the wall.” Here, “artifacts” means any unwanted visual glitch, most visibly a seam where two segments meet.
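
A rough sketch of the geometry problem, with made-up numbers: on a cylindrical wall, every LED pixel column's position depends on the wall radius and each segment's angular span, so the last column of one segment has to land exactly where the next segment's first column begins.

```python
import math

# Illustrative only: mapping an LED pixel column to a position on a
# cylindrical wall. If the radius or segment angles the software assumes are
# slightly wrong, neighboring segments no longer line up and a seam appears.
# All numbers are made up.

def column_position(radius_m, start_deg, span_deg, columns, column_index):
    # Angle of this pixel column along the arc, then its (x, z) on the curve.
    angle = math.radians(start_deg + span_deg * column_index / columns)
    return (radius_m * math.sin(angle), radius_m * math.cos(angle))

# The right edge of segment A should coincide with the left edge of segment B.
edge_a = column_position(9.0, 0.0, 30.0, 1000, 1000)
edge_b = column_position(9.0, 30.0, 30.0, 1000, 0)
print("matched geometry:", edge_a, edge_b)

# If one render node assumes a slightly different radius, the edges diverge,
# and that mismatch shows up on camera as a seam.
print("mismatched radius:", column_position(9.05, 0.0, 30.0, 1000, 1000), edge_b)
```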

Syncing. “All of the screens have to be synced to the [camera] frame so that the camera and the wall are in sync for every frame, so you don’t get any flicker,” Tankersley says. “If the sensor on the camera is running on a slightly different clock to the wall, you’ll see artifacts.” How precisely do all the sensors and clocks have to be synced? “To the microsecond.”
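
Some quick, back-of-the-envelope arithmetic (with an assumed clock error) shows why the devices can't simply run on their own clocks:

```python
# If the camera sensor and the wall ran on separate clocks that disagree by
# even a few parts per million, they would drift far past a microsecond
# within a single take. The error value below is an assumption.
clock_error_ppm = 5            # hypothetical difference between two free-running clocks
take_length_s = 60             # one minute of shooting

drift_microseconds = clock_error_ppm * take_length_s   # 5 ppm = 5 us of drift per second
print(f"Drift after {take_length_s}s: {drift_microseconds} microseconds")

# Hence genlock: every device follows one shared timing reference instead.
```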

Camera angles and proximity. “LEDs have a color shift on different angles…you see this with any TV,” Tankersley says. “If you look at it straight on, the colors are right, and if you look at an angle, they’re off. So you have to pay attention to the angle of the camera to the wall.” In fact, it’s one of the reasons the wall is curved — to keep the camera as close to perpendicular to the wall as possible. Also, if the camera is too close to the wall, you’ll start to see the individual pixels. Color matching can be an issue as well.
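
For the proximity issue, some simple pinhole-camera arithmetic (with assumed pixel-pitch, lens and sensor numbers) shows how quickly individual LEDs start to resolve as the camera moves in:

```python
# Illustrative numbers only: the projected size of a single LED on the camera
# sensor is roughly pixel_pitch * focal_length / distance. Once that rivals
# the size of a sensor photosite, the wall's pixel grid becomes visible.
pixel_pitch_mm = 2.6          # assumed spacing between LEDs on the wall
focal_length_mm = 35.0        # assumed lens
sensor_pixel_mm = 0.008       # assumed ~8-micron photosite

def led_image_size_mm(distance_m):
    return pixel_pitch_mm * focal_length_mm / (distance_m * 1000)

for distance_m in (1.0, 2.0, 4.0, 8.0):
    size_mm = led_image_size_mm(distance_m)
    print(f"{distance_m:4.1f} m from the wall: one LED covers "
          f"~{size_mm / sensor_pixel_mm:.1f} sensor pixels")
```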

Troubleshooting any of these issues might take some time, of course. But overall, the time saved by shooting a production virtually rather than traditionally more than makes up for it. Plus, when it comes to setting, not even the sky is the limit on an XR stage. Thanks to powerful computers like Silverdraft’s and advanced tools and technologies like ICVR’s, your location can be anything from inside an atom to the farthest reaches of the imagination.

Short Film ‘Away’ Shows XR’s Vast Potential

ICVR’s trailblazing game engine tools launch worlds of possibilities — in filmmaking and beyond

Imagine filming in different locations without ever leaving the studio — and in a fraction of the usual time. Imagine creating entire worlds on set that look exactly as you picture them, without physically building every component. Now step out of your imagination and into the reality of interactive content creator ICVR.

ICVR used its groundbreaking new XR (extended reality) tools to create the award-winning four-minute film “Away” in one day in a single studio. Here’s how the team, including Gro Creative (on-set production) and ETC (production equipment), pulled it off.

XR Stages: Full 3D Environments

Combining physical and virtual elements on a film set isn’t new, of course. Green screens and editing software allow anyone with a computer to make movie magic these days. But XR stages go way beyond a screen rolling in the background. “One of the biggest misunderstandings about XR is the confusion between shooting 3D virtual environments on an LED wall versus flat plates,” says Chris Swiatek, virtual production producer and cofounder of ICVR. “These are not flat plates being shot and captured in camera. These are full 3D environments.”

Artistry meets technology in these 3D environments, also called LED volumes. The result: photorealistic scenes without the need to create all the elements physically or add extensive VFX (visual effects) in postproduction. And ICVR’s tools, built on Epic Games’ real-time 3D creation platform Unreal Engine, allow a level of detail never before possible. The technique is still in its early stages but shows immense potential for countless applications beyond filmmaking.

The Process

Preproduction

The “Away” team flipped the traditional moviemaking script on its head, starting with virtual environments and building the narrative to fit them. “Content development starts with the creative vision,” Swiatek says. “What exactly is this environment going to be, and what’s it going to look like? What purpose is it going to serve?” The film’s post-apocalyptic setting led to two virtual environments: a large forest with varying terrain, and a multilevel cyberpunk city. 

Next up: building 3D assets to fill the environments, as well as pulling from existing asset libraries. The 3D assets for “Away” included trees, a wrecked plane and neon signage. The only physical assets brought into XR Stage’s Los Angeles studio, where the film was shot, were minimal: leaf piles and a few trees.

Following asset creation came blocking, or figuring out roughly where to place those assets. Then the team fine-tuned the virtual set based on the actual shooting points. This involved adding umpteen details for realism and testing how everything looked using the actual hardware and LED walls.

Previsualization came into play here too — “the process of going into the scene that you’ve created and starting to use virtual cameras as well as stand-in characters,” Swiatek says, “to start planning your shot setups and your camera positions.”

Shooting

As you might guess, creating environments in a game engine on a small screen is one thing; shooting realistically using an LED volume is quite another. Challenges included making the camera perspective realistic, showing the movement of the character’s journey on a fixed stage, and hitting the standard 24 fps (frames per second) film frame rate with such highly detailed 3D assets. Bring on the creative thinking and technical know-how.

First, the team used inside-out camera tracking to shoot. “A camera is attached to the camera that’s looking up at a constellation of infrared markers, which give it its placement on the stage,” says Jay Spriggs, XR systems integrator. “And then we’re also feeding zoom and focus data into that camera so that the Unreal Engine can understand where it is, what the lens characteristics are, and be able to reproduce that.”
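
A hypothetical sketch of the data that quote describes: each frame, the engine receives where the camera is, how it is oriented and what the lens is doing. The names below are placeholders, not Unreal Engine or RedSpy APIs.

```python
from dataclasses import dataclass

@dataclass
class TrackedCameraSample:
    position: tuple      # (x, y, z) stage-space position in meters
    rotation: tuple      # (pan, tilt, roll) in degrees
    zoom_mm: float       # current focal length reported by the lens encoder
    focus_m: float       # current focus distance

def apply_to_virtual_camera(sample: TrackedCameraSample) -> str:
    # In a real pipeline this would update the in-engine camera every frame so
    # the rendered background matches the physical lens; here it just reports.
    return (f"camera at {sample.position}, rotation {sample.rotation}, "
            f"{sample.zoom_mm}mm lens focused at {sample.focus_m}m")

print(apply_to_virtual_camera(
    TrackedCameraSample((0.5, 1.7, -2.0), (10.0, -3.0, 0.0), 35.0, 3.2)))
```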

While that might sound dauntingly technical, the solution for conveying the journey’s movement on a fixed stage was surprisingly not: The actor walked on a treadmill while the virtual set traveled a path through the city. The team matched the speeds of the two actions to create an effect similar to using a camera dolly.
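
The underlying arithmetic is simple; with an assumed walking pace, it comes down to moving the virtual set the same distance the treadmill moves the actor each frame:

```python
# Match the virtual set's travel to the treadmill: at walking speed, the set
# needs to slide past the camera by speed / frame rate every frame. The pace
# below is an assumption for illustration.
walk_speed_m_per_s = 1.4      # assumed comfortable walking pace
fps = 24                      # standard film frame rate

set_move_per_frame_m = walk_speed_m_per_s / fps
print(f"Move the virtual set {set_move_per_frame_m * 100:.1f} cm per frame")  # ~5.8 cm
```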

As for hitting the 24 fps rate, swapping in a photo sphere for some of the 3D environment did the trick, allowing the LED processors to perform optimally.
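
A rough sketch of why a photo sphere is so much cheaper to draw: instead of rendering full 3D geometry, each view direction becomes a simple lookup into a pre-rendered equirectangular image. One common direction-to-UV mapping (conventions vary by engine) looks like this:

```python
import math

# Illustrative only: convert a view direction into (u, v) coordinates on an
# equirectangular photo sphere. No geometry or lighting has to be computed,
# which is why this lightens the real-time rendering load.
def direction_to_uv(x, y, z):
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)   # longitude around the sphere
    v = 0.5 - math.asin(y) / math.pi              # latitude from bottom to top
    return u, v

print(direction_to_uv(0.0, 0.0, -1.0))   # straight ahead -> middle of the image
print(direction_to_uv(0.0, 1.0, 0.0))    # straight up    -> top edge
```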

While those challenges required creative solutions, they’re dwarfed by the advantages of working on an XR set.

Advantages of XR Sets

  • Quick scene changes. Traditional scene changes in filmmaking can take a lot of time, but “it only took a matter of moments for us to flip production from the cyberpunk scene to the forest,” says Scott Kelley, director of “Away” and cofounder of Gro Creative. 
  • Endless possibilities in one space. “Everything is in one location,” Kelley says, “and within that location you can change the worlds any way you want it…. The options are limitless.”
  • LED screens supply lighting. The screens provide many leverage points to light naturally — no separate, extensive lighting system needed.

And these advantages extend well beyond filmmaking. “It is really hard to conceptualize the vastness and fidelity that these LED volumes offer until you see it in person,” says Madeline Donegan, executive producer of “Away” and cofounder of Gro Creative. “This technology is going into the mainstream pretty soon, and it has a variety of use cases across many, many industries.” Commercials, music videos, news and talk shows, interactive theater and location-based entertainment are just a few possibilities.

“Away” has won five Telly Awards, honoring excellence in video and television, and is being screened at film festivals in the U.S. and abroad. Check out the full four-minute video on YouTube and watch the behind-the-scenes film here.

The film hasn’t just won honors, though — it has opened the door to a new world of visionary possibilities. It “really is such a powerful example of what you can do with virtual production,” Swiatek says.