This VFX studio has made a standalone AI app for de-aging
- January 29, 2023
For many years, beauty work and de-aging VFX in films and television series were kept very secret (sometimes they still are). More recently, though, aging, de-aging and cosmetic fixes have become much more front and center, often because of particular story points that need to be told, or because of various production challenges.
Visual effects studios tend to have their own 2D and 3D workflows for this kind of VFX, with machine learning and AI also becoming part of the process. Still, it’s meticulous work, often time-consuming and expensive.
That’s where MARZ (Monsters Aliens Robots Zombies) wants to change things. They’ve just launched Vanity AI, a standalone app that utilizes generative AI and other approaches to enable what they say is super-fast aging and de-aging (plus cosmetic, wig and prosthetic fixes).
MARZ has already used the tech on more than 20 productions, with the overall idea being to save time (and money) on these often complex shots. The intention is to speed up the workflow by letting the AI automate much of the process, and enable artists to make fast adjustments via sliders, rather than having to go through the usual time-consuming match-move, tracking and comp process.
befores & afters received a demo of Vanity AI from MARZ co-founder and COO Matt Panousis. Here are a few quick takeaways. And check out MARZ’s demo video here, too.
1. What the tool actually is – “It’s a standalone app,” Panousis told befores & afters. “You ingest the plates in the same sort of normal way as you might another tool. There’s pre-processing done, and then you just get into the project. You establish your regions of interest, but only have to do so once for the entire project, as opposed to frame by frame for every shot. You establish whether you want this to be an age or a de-age, and you dial all that in, with dials.”
2. How it works – “You mask your regions of interest in the app. But unlike the current workflow, you don’t have to mask shot-by-shot and frame-by-frame. You build that mask once per identity for the project, and then for each shot, the system understands where the mask should be on that face. You don’t select a frame. The system tells you what frame to work with, and it takes the mask that you set up at the start of the project and applies it. The magic then happens in the background. You wait three and a half hours for the processing time to take place and you’ve got your moving shot.”
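The mask-once-per-identity idea described above can be sketched in a few lines. This is purely an illustrative stand-in, not MARZ's actual implementation: the `Project` and `IdentityMask` names and the region list are hypothetical, and the real system would re-locate the mask on the face in each shot rather than simply returning a template.

```python
from dataclasses import dataclass, field

@dataclass
class IdentityMask:
    """A region-of-interest mask defined once per identity for a project."""
    identity: str
    regions: list  # hypothetical region names, e.g. ["forehead", "eye_bags"]

@dataclass
class Project:
    masks: dict = field(default_factory=dict)  # identity -> IdentityMask

    def register_mask(self, identity, regions):
        # Defined once for the whole project, not per shot or per frame.
        self.masks[identity] = IdentityMask(identity, list(regions))

    def mask_for_shot(self, identity, shot_id):
        # In the real tool, the system would find where this mask sits on
        # the face in each new shot; here we just return the template.
        mask = self.masks[identity]
        return {"shot": shot_id, "identity": identity, "regions": mask.regions}

project = Project()
project.register_mask("actor_a", ["forehead", "eye_bags", "neck"])
print(project.mask_for_shot("actor_a", "sh010")["regions"])
```

The design point is simply that the mask is keyed to an identity, not a shot, so the per-shot and per-frame masking labor drops out of the workflow.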
3. The AI part of Vanity AI – Panousis explains: “There’s a number of different neural nets, different AI functions, in this one piece of software. The most prominent function is the ability to edit (age or de-age) the look in real-time simply by moving sliders on a frame, as well as the ability to automatically transform that frame into a full moving shot. There’s a ton of pre-training but then there’s also some training that takes place on the shot level. So, the model’s been trained, that’s why you’re able to de-age with the dials. And then there’s what we call some overfitting taking place with some training on the specific shot.”
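The combination of heavy pre-training plus per-shot "overfitting" that Panousis describes is a familiar fine-tuning pattern. The toy sketch below illustrates the concept only, with a single scalar parameter standing in for a neural network; none of these function names or numbers come from Vanity AI.

```python
def pretrain(samples):
    # Stand-in for large-scale pre-training: learn one global parameter
    # (here, just the mean of the training samples).
    return sum(samples) / len(samples)

def overfit_on_shot(pretrained_param, shot_frames, steps=100, lr=0.1):
    # Per-shot "overfitting": nudge the pre-trained parameter toward this
    # specific shot so the edit matches its particulars (lighting, grain).
    shot_target = sum(shot_frames) / len(shot_frames)
    param = pretrained_param
    for _ in range(steps):
        grad = param - shot_target  # gradient of 0.5 * (param - target)^2
        param -= lr * grad
    return param

base = pretrain([1.0, 2.0, 3.0])          # global model: 2.0
tuned = overfit_on_shot(base, [10.0, 10.0])  # pulled toward this shot
```

The pre-trained model is what makes the real-time slider edits possible at all; the per-shot pass then specializes it, which is why some processing time per shot is still required.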
4. How it deals with ‘tough shots’, like bad lighting or extreme poses – “The software usually picks a single frame to work from, but in the case of an extreme pose, instead of one frame, you’ll get a frame of that extreme pose. If there’s an extreme lighting change, it’s going to give you a frame there as well.”
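One plausible way to read this is that the system adds extra reference frames whenever pose or lighting departs sharply from the last reference. The sketch below is a guess at that selection logic, not MARZ's method; the thresholds and the `(pose_deg, brightness)` frame representation are invented for illustration.

```python
def pick_reference_frames(frames, pose_thresh=30.0, light_thresh=0.25):
    """frames: list of (pose_deg, brightness) tuples, one per frame.
    Returns indices of frames to use as references: the first frame, plus
    any frame where pose or lighting jumps past a threshold."""
    refs = [0]
    last_pose, last_light = frames[0]
    for i, (pose, light) in enumerate(frames[1:], start=1):
        if abs(pose - last_pose) > pose_thresh or abs(light - last_light) > light_thresh:
            refs.append(i)
            last_pose, last_light = pose, light
    return refs

# A head turn at frame 2, then a lighting change at frame 3,
# each triggers an extra reference frame.
shots = [(0.0, 0.5), (5.0, 0.5), (40.0, 0.5), (42.0, 0.1)]
print(pick_reference_frames(shots))  # → [0, 2, 3]
```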
5. Can anyone use it? – MARZ says it will complete beta testing of Vanity AI midway through 2023, and in the meantime is using the tool itself for this kind of work. After the beta testing process, Vanity AI will become available for license. “The software can be operated by trained VFX artists, but is also simple enough to be used by editors, the DI, and even pro-sumer content creators,” Panousis says. “That’s part of what the democratization of VFX entails. It’s about lowering barriers to entry and making these VFX capabilities accessible to those that haven’t had the budget or formal training needed in the past.”