
Avengers: Infinity War and the CG effects behind Thanos

Why does Thanos look different in Infinity War, and how did a machine learning algorithm soup up Josh Brolin's performance?

Richard Trenholm Former Movie and TV Senior Editor
Richard Trenholm was CNET's film and TV editor, covering the big screen, small screen and streaming. A member of the Film Critic's Circle, he's covered technology and culture from London's tech scene to Europe's refugee camps to the Sundance film festival.

Avengers: Infinity War is packed with dozens of beloved characters we've watched and loved over 10 years of Marvel movies. But one character stands -- literally -- head and shoulders over everyone else: Thanos.

Spoiler warning: We've steered clear of discussing the story in detail, but we do reference characters and scenes near the end of the movie. If you haven't seen it, bookmark this and come back later!


To find out how the filmmakers created possibly the best computer-generated villain in movie history, I spoke to Kelly Port, visual effects supervisor at effects company Digital Domain. Apart from the scenes on Titan -- which were handled by Weta -- Port's team at Digital Domain had the job of turning actor Josh Brolin into the towering purple tyrant through the magic of computer-generated imagery (CGI).

As with most big blockbusters nowadays, various effects vendors were recruited to create different parts of the movie. Digital Domain built Thanos using performance capture: Brolin's face and body were covered in tiny dots that let the team record his movements and facial expressions. This produced a low-resolution "mesh" that could be mapped onto a high-resolution mesh of Thanos' face, so the final CG character's expressions and movements matched the subtle details of the actor's performance even though their faces have different shapes.
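
The mapping idea can be sketched in miniature. The function below transfers a handful of sparse marker displacements onto dense mesh vertices using inverse-distance weighting; this scheme, and all the names in it, are illustrative assumptions for the concept, not Digital Domain's actual pipeline, which fits the motion to a different face shape rather than simply interpolating.

```python
import math

def transfer_motion(markers, displacements, vertices, power=2.0):
    """Move each dense-mesh vertex by an inverse-distance-weighted
    blend of the sparse marker displacements.

    Illustrative sketch only -- real retargeting maps motion onto a
    differently shaped high-res face, which needs far more than
    distance weighting.
    """
    moved = []
    for v in vertices:
        weights, num = [], [0.0, 0.0, 0.0]
        for m, d in zip(markers, displacements):
            dist = math.dist(v, m)
            if dist < 1e-9:              # vertex sits exactly on a marker
                num, weights = list(d), None
                break
            w = 1.0 / dist ** power      # nearer markers dominate
            weights.append(w)
            for i in range(3):
                num[i] += w * d[i]
        if weights is not None:
            total = sum(weights)
            num = [c / total for c in num]
        moved.append(tuple(v[i] + num[i] for i in range(3)))
    return moved
```

A vertex halfway between two markers simply averages their displacements; a vertex on a marker copies it exactly, which is the behavior you want from any motion-transfer scheme.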

What technical advances went into creating Thanos?
Port: We used both still scans and motion scans, using technology developed at Disney Research Zurich. We took the helmet-cam data, which is quite low resolution, to create a facial motion mesh. Then you feed that through a machine-learning algorithm, which finds the appropriate fit with the high-resolution shape. With each iteration, you teach it -- you feed it information or make shape adjustments so it reaches a more correct solution. Over time, the algorithm improves the ultimate results by basically learning. We were able to keep so much of the incredibly subtle facial performance detail -- little twitches in the face muscles, for example, and little quivers. Subtle things that aren't really obvious but are critical for the dramatic performance.
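
The core of "finding the appropriate fit with the high-resolution shape" is an optimization: choose rig parameters so the posed high-res face best matches the captured markers. The toy solver below makes that concrete for a blendshape-style rig with one coordinate per vertex; the rig, loss, and gradient-descent solver are assumptions chosen for clarity, not the studio's actual algorithm.

```python
def fit_weights(neutral, blendshapes, captured, steps=500, lr=0.05):
    """Gradient descent on squared error between the posed mesh
    (neutral + weighted sum of blendshape deltas) and the capture.

    Toy 1-D-per-vertex sketch of fitting rig parameters to captured
    data -- not a production face solver.
    """
    w = [0.0] * len(blendshapes)
    n = len(neutral)
    for _ in range(steps):
        # Current posed vertex positions under weights w.
        posed = [neutral[i] + sum(w[k] * blendshapes[k][i]
                                  for k in range(len(w)))
                 for i in range(n)]
        # Gradient of 0.5 * sum((posed - captured)^2) w.r.t. each weight.
        for k in range(len(w)):
            g = sum((posed[i] - captured[i]) * blendshapes[k][i]
                    for i in range(n))
            w[k] -= lr * g
    return w
```

The "teaching" Port describes corresponds loosely to adding artist-corrected shapes and examples to the training data, so later solves start from a better-informed model rather than from scratch.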

Was the whole thing created by the computer?
As we do a particular shot, if it needs to be finessed even more, we go in there and keyframe animate. The animation team would adjust things on a per-shot basis. Then we have a whole post-animation modeling team that would go into very specific things to clean up the geometry, especially around the eyes, where you're not getting a lot from the capture simply because you can't put many dots in the eye area. Or around the mouth, or lip compression. Those are just so critical to extract every little drop of Brolin's performance.

How many dots were on Josh Brolin's face when he was being filmed?
Not that many, actually -- I would say around 100 to 150. It's a pretty small amount, nowhere near the resolution we ultimately ended up with.


You're gonna need more Avengers.

Marvel Studios

On previous films like Beauty and the Beast, on-set movement and close-up facial capture had to be recorded separately -- meaning performance capture actors had to essentially perform their role twice. Is that still the case?
Previously we would have to do [facial capture] as a separate session. You would typically sit down and do the dialogue and get a high-res facial mesh. The downside of that is that you're in a separate room not with the other actors, one step removed from them.

The directors of Infinity War thought it was really important to keep the actors together. You've got so much odd visual-effects technology around them -- blue screens, people wearing motion-capture suits and helmet cameras -- but at least they were together, able to act off each other right there among the actual constructed set. And we were able to capture that relatively low-res face and apply that motion to a high-res face with basically as much fidelity as if we had done it in a sit-down session using the straight-up higher-resolution scanning.

Did the technology have any bearing on the changes to the way Thanos looks compared to his brief appearances in earlier Marvel movies?
It was more of a creative choice. They wanted to keep the essence of the original design, but there was some tweaking of the overall Thanos design to bring it a little closer to Brolin. The overall proportions of his face are not Brolin's, but let's say they're closer to Brolin's than they were before.

Having rendered the hirsute Beast in Beauty and the Beast, was it a bit of a relief to work on a character who's bald?
[Laughs] Yes, oh my god, yes.

Thanos does have hair -- he's bald on top, but he's got hair on his arms and a little bit of peach fuzz on his head and face. But yes, absolutely true, hair grooming can be a real pain, so that helped a lot.
