21 February 2020

How ‘The Mandalorian’ and ILM invisibly reinvented film and TV production


“The Mandalorian” was a pretty good show. On that most people seem to agree. But while a successful live-action Star Wars TV series is important in its own right, the way this particular show was made represents a far greater change, perhaps the most important since the green screen. The cutting-edge tech (literally) behind “The Mandalorian” creates a new standard and paradigm for media — and the audience will be none the wiser.

What is this magical new technology? It’s an evolution of a technique that’s been in use for nearly a century in one form or another: displaying a live image behind the actors. But the advance is not in the idea but the execution: a confluence of technologies that redefines “virtual production” and will empower a new generation of creators.

As detailed in an extensive report in American Cinematographer Magazine (I’ve been chasing this story for some time but suspected this venerable trade publication would get the drop on me), the production process of “The Mandalorian” is completely unlike any before, and it’s hard to imagine any major film production not using the technology going forward.

“So what the hell is it?” I hear you asking.

Meet “The Volume.”

Formally called Stagecraft, it’s 20 feet tall, 270 degrees around, and 75 feet across — the largest and most sophisticated virtual filmmaking environment yet made. ILM just today publicly released a behind-the-scenes video of the system in use as well as a number of new details about it.

It’s not easy being green

In filmmaking terms, a “volume” generally refers to a space where motion capture and compositing take place. Some volumes are big and built into sets, as you might have seen in behind-the-scenes footage of Marvel or Star Wars movies. Some are smaller, plainer affairs where the actors whose movements will drive CG characters play out their roles.

But they generally have one thing in common: They’re static. Giant, bright green, blank expanses.

Does that look like fun to shoot in?

One of the most difficult things for an actor in modern filmmaking is getting into character while surrounded by green walls, foam blocks indicating obstacles to be painted in later, and people with mocap dots on their faces and ping-pong balls attached to their suits. Not to mention everything has green reflections that need to be lit or colored out.

Advances some time ago (think prequels-era Star Wars) enabled cameras to display a rough pre-visualization of what the final film would look like, instantly substituting CG backgrounds and characters onto monitors. Sure, that helps with composition and camera movement, but the world of the film isn’t there, the way it is with practical sets and on-site shoots.

Practical effects were a deliberate choice for “The Child” (AKA Baby Yoda) as well.

What’s more, because of the limitations in rendering CG content, the movements of the camera are often restricted to a dolly track or a few pre-selected shots for which the content (and lighting, as we’ll see) has been prepared.

This particular volume, called Stagecraft by ILM, the company that put it together, is not static. The background is a set of enormous LED screens such as you might have seen on stage at conferences and concerts. The Stagecraft volume is bigger than any of those — but more importantly, it’s smarter.

See, it’s not enough to just show an image behind the actors. Filmmakers have been doing that with projected backgrounds since the silent era! And that’s fine if you just want to have a fake view out of a studio window or fake a location behind a static shot. The problem arises when you want to do anything more fancy than that, like move the camera. Because when the camera moves, it immediately becomes clear that the background is a flat image.
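
To put rough numbers on why that flat image falls apart, here is a back-of-the-envelope sketch in Python. The distances and the six-meter wall are made up for illustration, not taken from the production; the point is simply that objects at different depths should shift by different amounts when the camera moves, while everything on a static backdrop shifts as if it sat at the backdrop’s distance.

```python
# Illustrative numbers only, not measurements from the production.
import math

def parallax_deg(camera_shift_m: float, distance_m: float) -> float:
    """Apparent angular shift of a point when the camera moves sideways."""
    return math.degrees(math.atan2(camera_shift_m, distance_m))

WALL_DISTANCE_M = 6.0   # hypothetical distance from the camera to a flat backdrop
CAMERA_SHIFT_M = 1.0    # the camera dollies one meter to the side

for label, depicted_distance_m in [
    ("crate 5 m away", 5.0),
    ("dune 200 m away", 200.0),
    ("mountain 5 km away", 5000.0),
]:
    real_shift = parallax_deg(CAMERA_SHIFT_M, depicted_distance_m)
    backdrop_shift = parallax_deg(CAMERA_SHIFT_M, WALL_DISTANCE_M)
    print(f"{label:20s} should shift {real_shift:6.3f} deg; "
          f"painted on the backdrop it shifts {backdrop_shift:6.3f} deg")
```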

The innovation in Stagecraft and other, smaller LED walls (the more general term for these backgrounds) is not only that the image shown is generated live in photorealistic 3D by powerful GPUs, but that the 3D scene is directly affected by the movements and settings of the camera. If the camera moves to the right, the image alters just as if it were a real scene.

This is remarkably hard to achieve. In order for it to work, the camera must send its real-time position and orientation to what is essentially a beast of a gaming PC (this and other setups like it generally run on the Unreal engine), which must take that movement and render it exactly in the 3D environment, with attendant changes to perspective, lighting, distortion, depth of field and so on — all fast enough that those changes can be shown on the giant wall a fraction of a second later. After all, if the movement lagged by even a few frames, it would be noticeable to even the most naive viewer.
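
As a mental model only (the function names below are invented stand-ins, not ILM’s actual pipeline or Unreal’s API), the core loop looks something like this: read the tracked camera pose, re-render the scene from exactly that viewpoint, and push the frame to the wall, all inside a single frame’s time budget.

```python
# A minimal sketch of the core loop of a camera-tracked LED wall.
# Every function here is an invented stand-in; this is the shape of the
# problem, not ILM's implementation.
import time
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple        # (x, y, z) of the physical camera on the stage
    orientation: tuple     # rotation as a quaternion (w, x, y, z)
    focal_length_mm: float

FRAME_BUDGET_S = 1 / 24    # roughly 41.7 ms per frame at 24 fps

def read_tracker() -> CameraPose:
    """Stand-in for the tracking system reporting the camera's pose."""
    return CameraPose((0.0, 1.8, 4.0), (1.0, 0.0, 0.0, 0.0), 35.0)

def render_from(pose: CameraPose) -> bytes:
    """Stand-in for the engine re-rendering the 3D scene from that viewpoint,
    with matching perspective, lighting and depth of field."""
    return b"frame"

def push_to_wall(frame: bytes) -> None:
    """Stand-in for sending the finished frame to the LED processors."""

for _ in range(24):                      # one second of footage at 24 fps
    start = time.perf_counter()
    pose = read_tracker()                # where the camera is right now
    frame = render_from(pose)            # the world as seen from exactly there
    push_to_wall(frame)                  # displayed behind the actors
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET_S:         # lag of even a few frames would read as fake
        print(f"frame overran its budget by {(elapsed - FRAME_BUDGET_S) * 1000:.1f} ms")
    time.sleep(max(0.0, FRAME_BUDGET_S - elapsed))
```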

Yet fully half of the scenes in The Mandalorian were shot within Stagecraft, and my guess is no one had any idea. Interior, exterior, alien worlds or spaceship cockpits, all used this giant volume for one purpose or another.


There are innumerable technological advances that have contributed to this; The Mandalorian could not have been made as it was five years ago. The walls weren’t ready; the rendering tech wasn’t ready; the tracking wasn’t ready — nothing was ready. But it’s ready now.

It must be mentioned that Jon Favreau has been a driving force behind this filmmaking method for years now; films like his remake of The Lion King were in some ways tech tryouts for The Mandalorian. Combined with advances made by James Cameron in virtual filmmaking and of course the indefatigable Andy Serkis’s work in motion capture, this kind of production is only just now becoming realistic due to a confluence of circumstances.

Not just for SFX

Of course Stagecraft is probably also one of the most expensive and complex production environments ever used. But what it adds in technological overhead (and there’s a lot) it more than pays back in all kinds of benefits.

For one thing, it nearly eliminates on-location shooting, which is phenomenally expensive and time-consuming. Instead of going to Tunisia to get those wide-open desert shots, you can build a sandy set and put a photorealistic desert behind the actors. You can even combine these ideas for the best of both worlds: Send a team to scout locations in Tunisia and capture them in high-definition 3D to be used as a virtual background.

This last option produces an amazing secondary benefit: Reshoots are way easier. If you filmed at a bar in Santa Monica and changes to the dialogue mean you have to shoot the scene over again, there’s no need to wrangle permits and painstakingly light the bar a second time. Instead, on the first visit you carefully capture the whole scene, with the exact lighting and props you used, and later shoot against that capture as a virtual background.

The fact that many effects and backgrounds can be rendered ahead of time and shot in-camera rather than composited in later saves a lot of time and money. It also streamlines the creative process, with decisions able to be made on the spot by the filmmakers and actors, since the volume is reactive to their needs, not vice versa.

Lighting is another thing that is vastly simplified, in some ways at least, by something like Stagecraft. The bright LED wall can provide a ton of illumination, and because it actually represents the scene, that illumination is accurate to the needs of that scene. A red-lit space station interior, with the usual falling sparks and so on, casts red onto the actors’ faces and of course onto the highly reflective helmet of the Mandalorian himself. Yet the team can also tweak it, for instance placing a bright white line high on the LED wall, out of the camera’s view, that creates a pleasing highlight on the helmet.

Naturally there are some trade-offs. At 20 feet tall, the volume is large, but wide shots can still catch the top of it, above which you’d see cameras and a different type of LED (the ceiling is also a display, though not as powerful). This necessitates some rotoscoping and post-production work, or limits the angles and lenses one can shoot with — but that’s true of any soundstage or volume.

A shot like this would need a little massaging in post, obviously.

The size of the LEDs, that is, of the pixels themselves, also limits how close the camera can get to them, and of course you can’t zoom in on a background object for closer inspection. If you’re not careful you’ll end up with moiré patterns, those stripes you often see in images of screens.
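
For a rough sense of that limit, here is a simple geometric sketch; the wall pitch, lens and sensor numbers are assumptions for illustration, not Stagecraft’s specs. The closer the camera gets, the larger each LED pixel lands on the sensor, and once the camera can resolve the wall’s pixel grid, that grid can interfere with the sensor’s own grid and show up as moiré unless the wall is kept well out of focus.

```python
# Back-of-the-envelope only; the wall pitch, lens and sensor values below
# are assumptions for illustration, not Stagecraft's specifications.
def led_pixel_on_sensor_um(led_pitch_mm: float, focal_mm: float, distance_m: float) -> float:
    """Approximate size of one LED-wall pixel as imaged on the sensor, in
    microns (thin-lens magnification: image size ~= object size * focal / distance)."""
    return led_pitch_mm * 1000.0 * focal_mm / (distance_m * 1000.0)

LED_PITCH_MM = 2.8       # assumed pixel pitch of the wall
SENSOR_PITCH_UM = 8.0    # assumed photosite pitch of a large-format cine sensor
FOCAL_MM = 50.0          # assumed lens focal length

for distance_m in (2.0, 4.0, 8.0, 16.0, 32.0):
    size_um = led_pixel_on_sensor_um(LED_PITCH_MM, FOCAL_MM, distance_m)
    if size_um > SENSOR_PITCH_UM:
        note = "grid resolvable: moire risk if the wall is sharp"
    else:
        note = "grid below the sensor's sampling: blends smoothly"
    print(f"camera at {distance_m:4.1f} m: one LED pixel spans ~{size_um:5.1f} um on sensor ({note})")
```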

Stagecraft is not the first application of LED walls — they’ve been used for years at smaller scales — but it is certainly by far the most high-profile and The Mandalorian is the first real demonstration of what’s possible using this technology. And believe me, it’s not a one-off.

I’ve been told that nearly every production house is building or experimenting with LED walls of various sizes and types — the benefits are that obvious. TV productions can save money but look just as good. Movies can be shot on more flexible schedules. Actors who hate working in front of green screens may find this more palatable. And you better believe commercials are going to find a way to use these as well.

In short, a few years from now it’s going to be uncommon to find a production that doesn’t use an LED wall in some form or another. This is the new standard.

This is only a general overview of the technology that ILM, Disney, and their many partners and suppliers are working on. In a follow-up article I’ll be sharing more detailed technical information directly from the production team and technologists who created Stagecraft and its attendant systems.

