The ‘Matrix Resurrections’ dojo was the first time DNEG had used Unreal Engine-rendered scenes for a sequence in a feature film

How the studio crafted the Morpheus and Neo fight scene.

At one point in Lana Wachowski’s The Matrix Resurrections, Morpheus (Yahya Abdul-Mateen II) takes Neo (Keanu Reeves) to an idyllic dojo location to reacquaint him with his kung-fu abilities.

The sequence, while incorporating live-action actors and a practically built set piece, also made use of Epic Games' Unreal Engine for rendering out the environments. This was the first time DNEG had used the game engine for such a purpose on a full sequence in a feature film. DNEG visual effects supervisor Huw Evans describes the process for befores & afters.

b&a: What can you tell me about the design for the whole area of the dojo environment?

Huw Evans: There’s a bridge in Germany which inspired it. It’s called the Devil’s Bridge (Rakotzbrücke) – it’s this beautiful curved bridge that, when you look at it from a certain angle, forms a full circle with its reflection in the water. Lana really loved that, so that was where the design idea started from. The design was tweaked to have three interconnecting lakes joined by these two arched bridges. That was first worked out between Lana and Epic just to block out the environment.

Originally, there was a thought to use LED screens beyond the dojo set piece, and for various reasons that didn’t work out, but we still wanted to try and keep this sequence in Unreal. It kind of fits with the world of the Matrix. It’s an artificial construct. It’s got that slightly different feel to it which I think worked really well. It started off with Epic creating the base of the environment, then we had an open dialogue with them about porting that over to us to carry on the work.

So we picked up the environment from them and took it to our own creative team at DNEG to finish it off, add extra details and make it work in shots. That in itself was tricky because at the time Epic were using their bleeding-edge engine that hadn’t been released, so we were taking builds of their engine and trying to build it DNEG-side. It was all hot off the press. We’d have an open dialogue between our incredible CG supe Roel Couke and their tech teams, and we’d be upgrading constantly just to try and get everything as fresh as possible – things like OCIO colour support and layered rendering, which are all in place now but at the time were all new, so that was fun.
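The OCIO support Evans mentions lets an engine or renderer load the same colour configuration the rest of a facility uses, so images match from lighting through to comp. As a rough illustration only – this is a hypothetical, stripped-down OpenColorIO v1 config fragment, not DNEG's actual colour setup, and the LUT filename is invented:

```yaml
# Hypothetical minimal OCIO config of the kind an engine build might
# load so real-time renders match the facility's colour pipeline.
ocio_profile_version: 1

roles:
  scene_linear: linear
  color_picking: sRGB

colorspaces:
  - !<ColorSpace>
    name: linear
    description: Scene-linear working space for rendering and comp
    bitdepth: 32f

  - !<ColorSpace>
    name: sRGB
    description: Display-referred sRGB for review
    bitdepth: 8ui
    # srgb.spi1d is an illustrative placeholder LUT file
    from_reference: !<FileTransform> {src: srgb.spi1d, interpolation: linear}
```

A real production config would also define displays, views and many more colorspaces; the point is simply that the same file can drive colour in Unreal, Nuke and the rest of the pipeline.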

b&a: So, it wasn’t a matter of taking their game engine, cameras or environments and importing them into Maya. It was keeping everything inside Unreal?

Huw Evans: Everything was done in Unreal. So we took from Epic the environment, the Unreal scene. We opened everything in Unreal – all the shaders, all of the geometry, all living in Unreal. Lighting was done in Unreal. For the cameras, we did do some of those in Maya. Our camera and layout team are used to working in Maya, so we blocked out a bunch of cameras, did some back and forth with Lana and production visual effects supervisor Dan Glass, and just made sure they were happy with the camera moves. We exported that, brought it into Unreal and rendered it from there.

This meant that all of the content was created in Unreal (other than the CG dojo itself, which remained in Clarisse), and then the tricky part was figuring out how to render it as we would in our normal pipeline. That was a big challenge. We were on UE 4.25 when we started, and of course we’re now coming up to the version 5 release. So in 4.25, there was a lot missing that you can get now off the shelf.

There was a lot of back and forth, figuring out things like splitting up render passes, making sure we could render out stuff that could go to comp like we normally would, rather than just having one output that you can’t do too much with. You can stick a grade over the top, but we wanted to break out individual elements and treat it how we would a normal comp. Robin Beard, our DFX supe for that sequence, would feed back anything he needed to Roel, who in turn would feed things back to Epic; they’d update some settings and kick it back to us. So it was really nice having that open dialogue and feeling like we could move things forward, so we could use Unreal in more of the traditional way that VFX companies are used to.
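The pass breakout Evans describes is standard AOV (arbitrary output variable) workflow: instead of one baked-down image, the renderer writes separate lighting components that comp can grade independently and then recombine. A toy sketch of additive AOV recombination, using NumPy arrays as stand-ins for EXR layers (illustrative only, not DNEG's pipeline):

```python
import numpy as np

# Toy 2x2 "render" split into additive lighting passes (AOVs).
# In a real pipeline these would be separate layers in an EXR file.
diffuse = np.array([[[0.2, 0.1, 0.05]] * 2] * 2)
specular = np.array([[[0.1, 0.1, 0.1]] * 2] * 2)
emission = np.array([[[0.0, 0.0, 0.3]] * 2] * 2)

# Comp can grade each pass independently before recombining,
# e.g. warming the diffuse without touching the spec highlights.
graded_diffuse = diffuse * np.array([1.1, 1.0, 0.9])

# The recombined "beauty" is just the sum of the additive passes.
beauty = graded_diffuse + specular + emission
```

With a single flattened output, that per-pass grade would be impossible; this is why DNEG pushed to split Unreal's output into layers that behave like a normal comp.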

b&a: So, when you had to incorporate it with live action, did the Unreal parts effectively become a render pass?

Huw Evans: Well, other than the wide CG shots, the dojo was a live-action set with greenscreens behind it. So, normally we would get our cameras all tracked and do a CG background. We did effectively do that, but just through Unreal. So, rather than taking it into Clarisse or RenderMan, it all just lived in Unreal. We had one artist, with support, who would block out the lighting for the whole sequence as soon as we had the cameras there. It was great for spitting out iterations and doing updates, and quickly blocking things out with a minimal crew, which was obviously a huge benefit.

b&a: I did a story recently on Nuke’s UnrealReader node and it sort of seems like the optimum thing for this.

Huw Evans: We would’ve loved that! I saw that as soon as it came out and I was like, ‘Oh, that would’ve been cool.’ But we didn’t have that.

As a way of working, it required a different skillset and it was quite exciting, because you could quickly see results – not just a gray, blocky playblast, but nice images straight away. It did mean we had to figure out the lighting direction quite early on. The way it was shot, Lana was very keen on getting nicely three-quarter backlit characters, which meant the sun direction didn’t always remain true. We couldn’t just use one setup and roll it out through the sequence. But having the freedom to move the sun around in real-time in Unreal made it a very fluid way to work.
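The per-shot sun cheat Evans describes can be thought of as deriving the key-light direction from each camera: a three-quarter backlight sits roughly 135° around from the camera's view direction, raised above the subject. A hypothetical sketch of that relationship in plain Python – the function name, angles and Y-up convention are all illustrative assumptions, not the production setup:

```python
import math

def backlight_direction(camera_yaw_deg, offset_deg=135.0, elevation_deg=35.0):
    """Unit vector pointing from the subject toward the sun, for a
    three-quarter backlight relative to a camera whose view direction
    has the given yaw (degrees, Y-up world)."""
    yaw = math.radians(camera_yaw_deg + offset_deg)
    elev = math.radians(elevation_deg)
    x = math.cos(elev) * math.sin(yaw)
    y = math.sin(elev)
    z = math.cos(elev) * math.cos(yaw)
    return (x, y, z)

# Two shots with different camera orientations get different sun
# directions, so the characters stay three-quarter backlit in both --
# which is why one fixed sun couldn't serve the whole sequence.
sun_a = backlight_direction(0.0)
sun_b = backlight_direction(90.0)
```

In practice the adjustment was done by eye, moving Unreal's sun in real-time per shot rather than computing it; the sketch just shows why the sun ends up in a different place for every camera.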