Star Wars' The Mandalorian: The Rise of the Game Engine

24 Mar 20

In case you hadn’t noticed, Disney+ has just been added to the choice of bingeing platforms available to us here in the UK. As well as hosting (almost) every classic Disney cartoon and every Marvel superhero movie, it will also be your new platform for all things Star Wars.

And they are adding new stories to the galaxy of a long time ago and far, far away. Your latest obsession will be the eight-part series The Mandalorian. This show represents a convergence between video games and movie-making like never before.

The show brings a breezy, ‘wild west’ feel to the Star Wars universe as it jumps from desert planets to exotic cities to ice planets. What you might not appreciate is the range of technological innovations going on behind the scenes that allowed the show to happen.

For decades, there has been a growing overlap between the film and video game industries. Big movies’ dependence on CGI special effects has resulted in them looking ever more like games, while top-flight video games have been striving to look ever more photoreal, casting recognisable Hollywood actors in major roles.

Well, Disney’s The Mandalorian represents the moment where Hollywood reached out and embraced video-game technology.

Creator (and occasional superhero chauffeur) Jon Favreau brought several innovations to this production. Primarily, it was shot in a ‘virtual environment’. This is the culmination of the innovations George Lucas first brought to the Star Wars universe twenty years ago with his prequels, which notoriously saw actors standing in featureless green rooms, talking to tennis balls on sticks – because literally everything else would be added later, in post.

In the twenty years since then, a lot of work has gone into making that process seem less fake, and less of a challenge for the actors. It all comes down to the technology allowing the storytellers more freedom to do their work. It is only fitting, therefore, that the latest Star Wars story should feature the culmination of that process which began with The Phantom Menace.

Back in the pre-digital days, one of the most common techniques in film-making was ‘back projection’, where the actors stood in front of a screen with an image projected onto it from behind. This method was used when a film didn’t have the time or the budget to ship the cast and crew to a location. The problem was that it was patently obvious the actors were standing in front of a screen. The lighting, the focus, the perspectives, even the image quality all gave away the trick.

Fast forward to 2020, and the newest virtual environment uses the same basic idea, but gets over all the traditional problems by employing the Unreal Engine and giant LED screens – ILM’s ‘StageCraft’ stage, nicknamed ‘the Volume’ – to produce an effect that is all but invisible.

That’s the ideal of technology in the creative industries – for the kit to empower the artists, and then to get out of the way!

Unreal’s revolutionary virtual environment technology offers unprecedented flexibility to film-makers. Amazing effects can now be achieved live, on stage. The actors can see what the camera sees, which is exactly what the viewer at home will see. There’ll be no more need to fix it in post!

Once this toolkit is widely adopted, it will speed up production, reduce costs and ensure that even modestly budgeted movies can achieve spectacular results.


This Unreal innovation allows for spectacular in-camera effects. It lets a production visit several locations without ever leaving the comfort of the studio. The Force really is with the Unreal Engine developers at Epic Games.
