All the Tech Breakthroughs of Avatar — and What They Mean for The Way of Water
James Cameron’s 2009 blockbuster “Avatar” didn’t just shatter previous box-office records — the film that Steven Spielberg once termed an “emotional spectacle” changed the way movies were made and shown. The cultural impact of the sci-fi epic (or lack thereof) remains a topic of debate, but its influence on virtual production and 3D viewing is undeniable. Before StageCraft wrapped Pedro Pascal and Grogu in immersive, reactive “Star Wars” landscapes, before a performance-captured Andy Serkis brought a whole new level of expressiveness to the “Planet of the Apes” franchise, before Ang Lee’s mad-science frame-rate experiments in “Billy Lynn’s Long Halftime Walk” and “Gemini Man,” there was Pandora, the Na’vi, and the “director-centric” workflow developed for “Avatar” by Oscar winner Rob Legato (“The Jungle Book,” “Hugo,” “Titanic”).
On “Avatar,” Cameron could shoot his actors in the volume as if it were live action, using Glenn Derry’s Simul-Cam virtual camera to observe low-res versions of their avatars in Pandoran environments on an LCD monitor in real time. What Cameron saw spoke to him in the moment and changed how he moved the camera and blocked the action. Then it was up to the wizards of Weta Digital (now Wētā FX) to make it look real. Under the leadership of Oscar-winning senior visual effects supervisor Joe Letteri (“Avatar,” “King Kong,” “The Lord of the Rings: The Two Towers,” and “The Lord of the Rings: The Return of the King”), they rewrote the VFX playbook for performance capture, animation, lighting, and rendering.
Letteri and Wētā have come a long way since “Avatar,” leveraging innovations from the “Planet of the Apes” franchise, “Alita: Battle Angel,” and “Gemini Man.” With the upcoming sequel, “Avatar: The Way of Water,” they explore new frontiers of Pandora, particularly the sweeping oceans, which take up a large portion of the film. Before “The Way of Water” comes to theaters on December 16, let’s review some of the tech advancements of “Avatar” and those that followed in its wake.
Performance Capture
How “Avatar” changed the game: Wētā advanced its performance capture capability for shooting in the volume in several ways. The New Zealand studio put a lot of effort into facial solves and tracking to overcome the limitations of a single point of view from one head-mounted camera. Multiple cameras would’ve added weight, slowed production down with drive changes, and made the rig cumbersome for the actors.
In addition, the FACS (Facial Action Coding System) approach was overhauled with a more comprehensive solver for more accurate translation to the facial rig. Wētā also created a new optical solver to track the eyes, and animators paid close attention to eye movement to compensate for whatever the solver couldn’t capture.
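Wētā’s solver is proprietary, but the core idea of a FACS-style facial solve can be sketched in a few lines: given tracked marker positions from the head-rig camera, find the mix of action-unit shapes that best reproduces them, typically as a least-squares fit. The function name and data shapes below are illustrative assumptions, not Wētā’s code.

```python
import numpy as np

def solve_facs_weights(markers, neutral, shape_deltas):
    """Hypothetical sketch: fit per-frame action-unit weights by least squares.

    markers:      (M, 3) tracked marker positions for one frame
    neutral:      (M, 3) the same markers on the neutral face
    shape_deltas: (K, M, 3) how each of K action units displaces each marker
    """
    # Each action unit becomes one column of a linear system A @ w ~= b.
    A = shape_deltas.reshape(len(shape_deltas), -1).T  # (3M, K)
    b = (markers - neutral).ravel()                    # (3M,)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(w, 0.0, 1.0)  # FACS-style activations live in [0, 1]
```

The solved weights then drive the facial rig, which is why the quality of the solver matters more than the raw camera count.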
What’s happened since? With “Apes,” Wētā advanced its performance capture techniques, particularly on the two sequels directed by Matt Reeves (“Dawn of the Planet of the Apes” and “War for the Planet of the Apes”), which moved capture on location into harsh rain and snow. The studio established a smaller footprint with better markers, higher-resolution and more mobile cameras, and improved facial capture, including faster, more interactive models and real-time facial animation tools for instant feedback.
Then, on “Alita,” they advanced further: for the first time, the studio fitted actors with two lightweight HD head cams to capture greater detail. Producer Jon Landau told IndieWire that this will definitely benefit the “Avatar” sequel: “It’s a much more direct 1:1 correlation, so that we are really lighting with their package, at a lower-res, a lower proxy, but it will save them work downstream.”
“Alita: Battle Angel”
20th Century Fox Licensing/Merchandising / Everett Collection
Wētā also advanced its facial capture system by using two CG puppets (one of actress Rosa Salazar and another of her character) and retargeting one onto the other to achieve closer unity. The eyes alone took a year of work, with Wētā simulating the iris’s fibers for the first time (modeled on a baby’s eyes). As an illustration of how far the studio has come: there’s more detail in those eyes than in the whole Gollum character Wētā created for “The Lord of the Rings.”
The oceans of “The Way of Water” feature innovative water simulations, including first-time underwater performance capture, for which Wētā developed a new system combining underwater filming with hundreds of cameras and markers in a 900,000-gallon tank. Beyond training the actors to hold their breath for minutes at a time, there were several obstacles to overcome, including shielding the tank from light above by floating small white balls on the surface. The teams also had to keep the water’s surface, which behaves like a moving mirror, from reflecting the tracking dots and markers and confusing the capture system.
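Wētā hasn’t published the underwater system’s internals, but the geometric heart of any marker-based optical capture — recovering a marker’s 3D position from many camera views — can be sketched. This is a simplified assumption: real systems must also work out which 2D detection in each camera belongs to which marker, which is exactly what stray mirror reflections threaten.

```python
import numpy as np

def triangulate_marker(origins, directions):
    """Least-squares 3D position of one marker from N camera rays.

    origins:    (N, 3) camera positions
    directions: (N, 3) unit rays from each camera toward the marker
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to the ray
        A += P                          # accumulate the normal equations
        b += P @ o
    return np.linalg.solve(A, b)        # the point minimizing distance to all rays

# Two cameras sighting a marker at roughly (0, 0, 2):
cams = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
rays = np.array([[1.0, 0.0, 2.0], [-1.0, 0.0, 2.0]])
rays /= np.linalg.norm(rays, axis=1, keepdims=True)
print(triangulate_marker(cams, rays))   # ~[0, 0, 2]
```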
Behind the scenes of “Avatar: The Way of Water”
©Walt Disney Co./Courtesy Everett Collection
Animation
How “Avatar” changed the game: “Avatar” required a whole new level of character and environment building. There were seven main and 14 secondary speaking parts, plus more than 200 Na’vi with no dialogue who still had to be expressive. The trick for the land creatures was working out a believable six-legged walk and run cycle; the flying creatures had four wings, so Wētā had to figure out how to make them fly without the wings getting in each other’s way.
The biggest character animation breakthrough involved crossing the Uncanny Valley, particularly with the two heroes: Jake (Sam Worthington) and Neytiri (Zoë Saldaña). A new facial rig was developed in Maya with a blend-shape system that used muscles as the basis for the controls. Wētā also developed improved skin texturing for the Na’vi.
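The production rig is far richer than this, but the math a blend-shape system rests on is simple: the final face is the neutral mesh plus a weighted sum of sculpted shape offsets, here imagined as one shape per muscle action. A minimal sketch with hypothetical names and shapes:

```python
import numpy as np

def apply_blend_shapes(neutral, shape_deltas, weights):
    """Deform a neutral mesh by a weighted sum of sculpted shape offsets.

    neutral:      (V, 3) rest-pose vertex positions
    shape_deltas: (K, V, 3) offset of every vertex for each of K muscle shapes
    weights:      (K,) activation of each shape, driven by the solved capture
    """
    return neutral + np.tensordot(weights, shape_deltas, axes=1)

# e.g., 30% of the first shape plus 80% of the second (illustrative only):
# face = apply_blend_shapes(neutral_verts, deltas, np.array([0.3, 0.8]))
```

Basing the shapes on muscles rather than arbitrary expressions is what keeps the combinations anatomically plausible when many weights fire at once.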
Meanwhile, the lush jungles of Pandora, with daytime plants that turn bioluminescent at night, necessitated a new virtual workflow. But growing plants and trees was difficult: Wētā had hoped to at least create the plants procedurally, but ended up hand painting everything to ensure the highest quality and consistency in 3D space. One technique used biospheres and domes, placing a virtual camera inside a scene that appeared to extend to infinity; proprietary tools could also render a 360-degree spherical view at a given radius out to a prescribed distance.
What’s happened since? On “Gemini Man,” Wētā developed a few new facial animation techniques: they created a system for growing pores on the face using flow maps; they developed a much more naturalistic skin solution that treats the blend shapes of the face as different depths of the skin; and, for the eyes, they modeled the conjunctiva for greater realism.
In addition, it appears that Jake and the Na’vi (including the oceanic Metkayina clan of Pandora’s reefs) benefit from improved skin texturing, while the new flying-fish/crocodile hybrids and whale-like creatures look very impressive.
In terms of depicting more of Pandora’s jungles and new plant life below water, Wētā can utilize Totara, the organic tree-growth tool it developed for “War for the Planet of the Apes.” Totara lets the studio grow full natural environments, surrounding ecosystems included, with physical accuracy.
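Totara itself is proprietary, but rule-driven organic growth has a classic, minimal illustration in L-systems: a few rewrite rules, applied repeatedly, turn a one-character seed into a branching structure. A toy sketch of the principle (not Totara’s actual approach):

```python
def grow(axiom, rules, generations):
    """Expand an L-system: rewrite every symbol by its rule each generation."""
    for _ in range(generations):
        axiom = "".join(rules.get(ch, ch) for ch in axiom)
    return axiom

# A textbook bracketed L-system for a small plant:
# "F" draws a segment, "+"/"-" turn, "[" and "]" push/pop a branch.
tree_rules = {"F": "F[+F]F[-F]F"}
print(grow("F", tree_rules, 2))  # two generations already branch richly
```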
Lighting
How “Avatar” changed the game: Wētā also adopted a global illumination system for lighting. It started from image-based lights, then converted the whole system to spherical harmonics, which meant pre-computing all the lighting contributions in a given scene so that characters and environments could be placed into that lighting and the lights moved around. The results were impressive with subsurface scattering: light would bounce off the surface of a leaf, for example, but also transmit through the leaf and emerge as green on the backside.
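To see why spherical harmonics make pre-computation pay off, consider a toy version: the scene’s lights are projected into a handful of coefficients once, and shading any surface normal afterward reduces to a dot product. Wētā’s production system used more bands and full irradiance; everything below, including the example light values, is illustrative only.

```python
import numpy as np

def sh_basis(n):
    """Evaluate the first four real spherical-harmonic basis functions
    (bands l=0 and l=1) at unit direction n."""
    x, y, z = n
    return np.array([0.282095,          # l=0 constant term
                     0.488603 * y,      # l=1, m=-1
                     0.488603 * z,      # l=1, m=0
                     0.488603 * x])     # l=1, m=1

def precompute_lighting(directions, colors):
    """Project a set of directional lights into SH coefficients, once."""
    coeffs = np.zeros((4, 3))
    for d, c in zip(directions, colors):
        coeffs += np.outer(sh_basis(d / np.linalg.norm(d)), np.asarray(c))
    return coeffs

def shade(normal, coeffs):
    """Shading any point is now just a dot product with stored coefficients."""
    return sh_basis(normal / np.linalg.norm(normal)) @ coeffs

# A warm key light plus a green "bounce" from below-left:
light = precompute_lighting([(0, 1, 1), (-1, -1, 0)],
                            [(1.0, 0.9, 0.8), (0.1, 0.4, 0.1)])
print(shade(np.array([0.0, 0.0, 1.0]), light))
```

Because the expensive projection step happens once per scene, the lights and characters can then be moved around interactively, which is exactly the workflow benefit described above.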
For the Na’vi, they also developed subsurface scattering, which provided transmission through the ears and the nasal cartilage. The blue skin, however, became problematic to light, especially after sweat and oil sheens were added to its surface. To overcome the plastic look, the team went to a Hawaiian rainforest and studied how light reflected off plants and how a face takes light from the sky, then used green bounce light in conjunction with white to render the faces convincingly.
What’s happened since? In addition to its improved global illumination system for environments and characters, Wētā can make use of PhysLight, a lighting system that simulates on-set lighting. It, too, was developed on “War for the Planet of the Apes,” and has recently been used on “The Batman” and “Black Panther: Wakanda Forever.”
Rendering
How “Avatar” changed the game: For “Avatar,” Wētā’s supercomputers had to render up to 1.4 million tasks per day using RenderMan, processing 8 GB of data per second, 24 hours a day, for more than a month. Each of the film’s frames often took several hours to render. But however computationally intensive and nightmarish the process was, the harder Wētā strove for photorealism, the more real the result looked.
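Those throughput figures are easier to appreciate as arithmetic. Taking the reported numbers at face value (8 GB every second, around the clock, for a month of 31 days):

```python
# Back-of-the-envelope math on the reported "Avatar" render-farm load.
GB_PER_SECOND = 8
SECONDS_PER_DAY = 24 * 60 * 60

tb_per_day = GB_PER_SECOND * SECONDS_PER_DAY / 1000   # ~691 TB each day
pb_per_month = tb_per_day * 31 / 1000                 # ~21 PB over a month
print(f"~{tb_per_day:,.0f} TB/day, ~{pb_per_month:.1f} PB over 31 days")
```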
What’s happened since? Since “Dawn of the Planet of the Apes,” Wētā has increasingly relied on its own physically based production renderer, Manuka, which handles the complexity of battle action and wide establishing shots with speed and efficiency. It is paired with Gazebo, a fast, likewise physically based pre-lighting tool that lets imagery move consistently from one renderer to the other.
“Avatar: The Way of Water”
20th Century Studios
Stereoscopic Cinematography and Projection
How “Avatar” changed the game: Vince Pace of Pace Technologies created the 3D Fusion stereo camera for “Avatar,” consisting of two Sony F950 cameras mounted on a special rig with two J-cam optical blocks to keep the unit as small as possible. The Fusion system took seven years to develop, and what made the camera unique was that it was built to match-move the performance-captured CG characters for compositing into establishing shots.
Very little attention was paid to stereoscopic planning, though, because of the tight schedule, with 1,000 shots coming in the door in the last couple of months. Planning for 3D space was instead handled early in the workflow, as part of the template built with unfinished assets for the virtual camera; this went downstream to Wētā for the high-resolution assets, fully rigged models, and rendering.
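For a sense of what stereoscopic planning actually computes, the standard back-of-the-envelope quantity is screen parallax: how far apart a point’s left- and right-eye images land on the screen. A simplified sketch for a parallel rig converged by horizontal image shift; the function and all parameter values are illustrative assumptions, not production numbers.

```python
def screen_parallax_mm(focal_mm, interaxial_mm, conv_dist_m, subject_dist_m,
                       sensor_width_mm, screen_width_mm):
    """Approximate on-screen parallax for a parallel stereo rig converged
    by image shift. Positive values appear behind the screen plane."""
    # Disparity captured on the sensor: f * b * (1/C - 1/Z)
    sensor_disparity_mm = (focal_mm * interaxial_mm / 1000.0
                           * (1.0 / conv_dist_m - 1.0 / subject_dist_m))
    # Magnify by the screen-to-sensor width ratio to get on-screen parallax.
    return sensor_disparity_mm * (screen_width_mm / sensor_width_mm)

# e.g., 35mm lens, 15mm interaxial, converged at 3m, subject at 10m,
# a ~24.9mm-wide sensor shown on a 10m-wide theater screen:
print(screen_parallax_mm(35, 15, 3.0, 10.0, 24.9, 10_000))
# ~49mm: comfortably under the ~65mm eye-separation ceiling for comfort
```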
For the immersive RealD 3D experience of the original “Avatar,” special attention was paid to the post-production process for the best possible viewing experience at the time. This was recently upgraded for the “Avatar” 3D reissue, which was remastered in 4K with select scenes at 48 fps.
What’s happened since? Cameron has reinvented stereoscopic cinematography with DP Russell Carpenter. They rigged Sony Venice cameras to a specially made 3D stereoscopic beam-splitter system, utilizing the Rialto extension unit also used on “Top Gun: Maverick.” (The system is called the Sony CineAlta Venice 3D.) The camera was used for both underwater and select flying sequences, and the sequel will be offered in an unprecedented number of formats, including 4K and 3D at a high frame rate of 48 fps.
After viewing Lee’s use of 120 fps for “Gemini Man” and “Billy Lynn’s Long Halftime Walk,” Cameron found the hyperrealism too jarring for non-action scenes; therefore, he alternated between shooting “The Way of Water” in 48 fps and the traditional frame-rate standard of 24 fps. So how will that work in theaters, where a projector can’t switch frame rates mid-screening? “They just run it at 48fps,” Cameron said during a panel at the Busan International Film Festival. “In any part of the scene that we want at 24fps, we just double the frames. And so, they actually show the same frame twice, but the viewer doesn’t see it that way.”
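Mechanically, the trick Cameron describes is simple to express: the print runs at a constant 48fps, and 24fps material just repeats each frame. A sketch of the idea (not the actual mastering pipeline):

```python
def master_at_48fps(shots):
    """Build one constant-48fps frame stream from mixed-rate shots.

    shots: list of (frames, native_rate) pairs, native_rate 24 or 48.
    24fps shots get each frame emitted twice, exactly as Cameron describes.
    """
    stream = []
    for frames, native_rate in shots:
        repeat = 48 // native_rate   # 2 for 24fps material, 1 for 48fps
        for frame in frames:
            stream.extend([frame] * repeat)
    return stream

# A 24fps dialogue shot followed by a 48fps action shot:
print(master_at_48fps([(["a", "b"], 24), (["c", "d"], 48)]))
# -> ['a', 'a', 'b', 'b', 'c', 'd']
```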