Inside the virtual production of ‘The Lion King’
The new Lion King movie is visually spectacular. A two-hour remake that frequently defies belief. Pride Rock is, of course, a fictional place but in director Jon Favreau’s film it feels eerily real. The creatures, too, move just like they would on the plains of Africa. They’re utterly believable until they talk and sing to one another about family, responsibility and the circle of life.
Disney has been remaking its animated classics for some time now. But The Lion King, which hit theaters on July 19th, sets a new benchmark for what’s possible with computer-generated animation. The secret behind it all? An experimental form of filmmaking that, through VR, allows studios to shoot virtual sets with old-fashioned direction and analog camerawork.
The workflow of The Lion King was vastly different from The Jungle Book's. MPC worked on ‘master scenes’ in London with industry-standard tools such as Maya. It then developed an “asset management system” that could translate each scene into something compatible with Unity, a popular game engine. The Unity scenes contained low-resolution versions of the final assets and “fairly long clips of animation” that could be triggered on command.
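The article doesn't describe MPC's internal tooling in detail, but the idea of low-resolution proxies carrying pre-baked, triggerable animation clips can be sketched roughly like this. Every name here (class, file paths, clip names) is purely illustrative, not MPC's actual system:

```python
# Hypothetical sketch of the translation layer described above: a master
# scene authored in Maya is mirrored in the game engine by low-resolution
# proxy assets, each carrying named animation clips that can be triggered
# on command during a take. All identifiers are illustrative.

class ProxyAsset:
    def __init__(self, name, proxy_mesh, clips):
        self.name = name
        self.proxy_mesh = proxy_mesh  # path to low-res stand-in geometry
        self.clips = clips            # clip name -> duration in seconds
        self.playing = None           # clip currently running, if any

    def trigger(self, clip_name):
        """Start a pre-baked animation clip on demand; return its duration."""
        if clip_name not in self.clips:
            raise KeyError(f"no clip named {clip_name!r}")
        self.playing = clip_name
        return self.clips[clip_name]

# A master scene becomes a set of proxies the crew can 'shoot' around.
simba = ProxyAsset("simba", "proxies/simba_lowres.fbx",
                   {"walk_cycle": 12.0, "pride_rock_climb": 45.0})
duration = simba.trigger("pride_rock_climb")
print(simba.playing, duration)  # pride_rock_climb 45.0
```

The key property is repeatability: because the clips are fixed, the same action plays back identically on every take, which is what let the crew shoot around it freely.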
The rest of The Lion King team worked out of a production facility in Playa Vista, Los Angeles. Favreau, cinematographer Caleb Deschanel, Legato and other key crew members could slip on HTC Vive headsets and then explore the virtual world like it was a real set. They would talk about the placement of the sun, where the cameras should be set up and the best lenses to capture a particular shot. As Favreau told reporters during a set visit in 2017: “A lot of doing a movie is just walking around, talking.” The VR headsets allowed him to enter a similar mental state and trigger the parts of his brain “that fire on a real movie.”
Samuel Maniscalco, then-lead lighting artist for MPC, lit the world based on a number of factors including the scene’s location, time of day and placement within the larger story. The team also adjusted the position of the sun — a small detail that made every shot unique — and chose one of 350 pre-made skies.
Then, the team started shooting in a 25-foot-square area called the Volume. Favreau and his crew would take a camera — or rather, something representing a camera — and physically move it around the stage. The three-dimensional flight path was tracked and reflected inside the virtual world, allowing the team to ‘shoot’ the master scenes in Unity.
Disney isn’t the first to do this. Ninja Theory, a video game developer based in Cambridge, England, used a similar process to shoot cutscenes in Hellblade: Senua’s Sacrifice.
As American Cinematographer reports, The Lion King team had a custom setup that combined the Vive with OptiTrack’s sensor system and US Digital encoders. It meant the crew could use and, more importantly, track conventional camera equipment including Steadicam stabilizers, cranes and wheeled dollies. A drone pilot was even hired to operate a virtual quadcopter and capture aerial shots in a believable manner.
The team wanted its camerawork to be as authentic as possible. Some shots, however, were impossible without manipulating the large but nevertheless restrictive Volume. The crew could change its scale, for instance, so that moving the camera a meter equated to two or four meters inside the world. Oftentimes, though, multiplying or exaggerating the operator’s movements looked unnatural. For particularly long, sweeping shots, the team would move the entire set through the virtual world like a magic carpet. In other instances, the crane and dolly equipment would be coded so that a horizontal movement became a vertical or diagonal one inside the virtual world, and vice versa.
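The scale changes and axis remapping described above amount to a simple transform on the operator's tracked movement. The function below is a minimal sketch of that idea, assuming movement arrives as (x, y, z) deltas in meters; it is not the production system's actual code:

```python
# Hypothetical sketch of remapping tracked camera motion in the Volume:
# a scale multiplier (1 m of physical movement becomes 2 m or 4 m in the
# virtual world) and an optional axis swap (e.g. a horizontal dolly move
# driving a vertical virtual move).

def remap_motion(delta, scale=1.0, axis_map=(0, 1, 2)):
    """Map a physical (x, y, z) movement to a virtual-world movement.

    delta    -- physical movement in meters, as an (x, y, z) tuple
    scale    -- multiplier applied to the operator's movement
    axis_map -- which physical axis drives each virtual axis
    """
    return tuple(delta[axis_map[i]] * scale for i in range(3))

# One meter forward in the Volume becomes four meters in the world.
print(remap_motion((0.0, 0.0, 1.0), scale=4.0))           # (0.0, 0.0, 4.0)

# A horizontal (x) dolly move remapped to drive vertical (y) motion.
print(remap_motion((1.0, 0.0, 0.0), axis_map=(1, 0, 2)))  # (0.0, 1.0, 0.0)
```

As the article notes, large scale multipliers tended to look unnatural, which is why the crew sometimes moved the whole set past a stationary camera instead.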
The animation inside the master scenes was repeatable — and always identical — which meant The Lion King crew could shoot around the action as many times as they liked. These became ‘takes,’ which were scrutinized by editorial and filtered down into ‘shots’ that could be sent back to MPC’s team in London.
The animators in London would receive “3D scene files,” according to Newman, that contained the three-dimensional camera moves. MPC’s task was to translate that camerawork, which had been shot in the Unity game engine, into a version of the movie with final, production-quality assets and animation. It was a new workflow for the company, but one that provided some advantages. The camera tracking had already been handled, for instance. “The virtual production kind of took that away from us,” Newman said. And unlike The Jungle Book, the team didn’t have to shoot around any live-action performances. Everything was CG.
That still gave the company a staggering amount of work to do, though. “MPC did all the visual effects,” Newman said. “We did all the animation. We did all the rendering, all the compositing, all the lighting. The whole thing was made by MPC.”
“The reality is when you’re on location, sometimes you’re shooting things that don’t look that great.”
Save for one shot, the entire movie was keyframe animated. Favreau and his team didn’t use any motion capture because, if possible, they wanted to avoid placing markers on live animals. “I think that’s a nice next step for movies,” Favreau told Collider, “to leave the animals alone.” The crew knew it needed some real-world touchstones, though, to make the virtual characters believable. Some of the crew visited Kenya, an obvious parallel to the Pridelands, and plenty of reference footage was shot at the Animal Kingdom theme park in Walt Disney World Resort, Florida.
Some of the actors were filmed in a soundproof room, too, that looked like a motion capture stage. They didn’t wear suits covered in ping pong balls, though. Instead, they were simply asked to perform together and, when appropriate, improvise a bit. The crew stood back and simply recorded them with microphones and — for the benefit of the MPC animators later — long-lens video cameras.
All of this footage helped MPC to realize the film’s ultra-realistic, documentary style. For better or worse, The Lion King remake looks like an episode of the BBC’s beloved Planet Earth docu-series. The team based its virtual camera around the Arri Alexa 65 body and Panavision 70 Series cinema lenses that were used on the reference trip in Africa.
MPC also chose skies that were deliberately muted: “[Favreau] would always kind of take us back and say, ‘No, I don’t want to see a beautiful sky in every shot,'” Newman said. “Even though we can do that, that’s kind of like a CG gag that you want to steer clear of. Because the reality is when you’re on location, sometimes you’re shooting things that don’t look that great. There is a realism to that.”
You can see this commitment in the characters, too. They’re emotive, but not to the same extent as the original movie. “If anyone had actually seen us over-animate these characters or over-design the worlds and the characters within them… I think it’s a very fine line and you produce something that just looks very strange if you push any one aspect of it too far,” Newman said. That doesn’t mean the animals are static, though. After the opening number, for instance, the camera lingers on a mouse that scratches its cheeks and scrambles up a rock — first unsuccessfully, then in a moment of understated triumph. It serves no narrative purpose — the sequence is merely visual fodder to sell you on the live-action illusion.
Other pieces of animation show the characters’ personalities. Scar slinks around to reflect his devious mind, for example. Pumbaa, on the other hand, trots around with a visible spring in his step — a clear nod to the character’s blissfully ignorant and happy-go-lucky attitude.
But there’s restraint. MPC trusted that the film’s documentary style could still engage the audience and make them feel for the creatures on screen. When you watch Planet Earth, Newman explained, there’s a ‘story’ that’s conveyed through careful editing and narration. That, combined with the beautiful cinematography, makes you care for the animals and their fate. A good example, he said, is the viral sequence from Planet Earth II that shows snakes chasing iguanas. “You’re still connected to that character,” he said. “You feel like you want that lizard to make it. You don’t want to see it [get] eaten by the snake. The lizard is not running around with a sad face or a panicked face. It’s still a lizard, but you’re still very much engaged, emotionally, in following the story.”
To make it work, the team needed large and incredibly detailed environments. Every location, including the Elephant Graveyard and Wildebeest Valley, was painstakingly realized and, more importantly, placed within a larger world map. “We always said we didn’t want to use back paintings in the background,” Newman said. “We wanted to try and ask: ‘How far can we push the environment into the background? How far can we extend the world?’ So, a lot of what we did was 3D, it was a full 3D environment. Just a lot of asset building. A lot of trees and plants and grass models were built, and we extended the world really far.”
Some viewers have criticized the lack of emotion in the film. They miss the bright colors and surreal visuals that accompanied musical numbers like “I Just Can’t Wait to Be King” in the 1994 original. Others, like British film critic Mark Kermode, have questioned whether live action — or rather, the illusion of live action — is the right medium for a story that is, ultimately, fantastical and filled with singing.
Traditional animation allows characters to have a special elasticity. They can grin, twist and jump in ways that wouldn’t be possible in real life. Disney and Pixar have, of course, made films with stylized CG animation. Favreau had a vision for The Lion King, however, and ‘over-animating’ would have betrayed the movie’s documentary style. “As soon as you see an animator use a brow up shape to convey sadness, it’s like, ‘No, it doesn’t work. It just doesn’t work for what this movie is. Let’s dial it down. You don’t need it,'” Newman explained. “It’s a strong sort of storytelling and a lot of that was conveyed through more subtle animation that was in keeping with the documentary style and the realism that we were going for.”
For some, the original will always reign supreme. Others will prefer the Broadway version. But many have appreciated, and will continue to appreciate, the new Lion King for its stunning visuals and commitment to a single, distinctive style. It is, if nothing else, a technical triumph. And whether or not you like the end result, it will go down in history as one of the first ‘virtual productions’ with a Hollywood budget. One that proves the potential of VR as a way to make movies.
The Lion King wouldn’t have been possible without The Jungle Book, a 2016 remake by Favreau and Robert Legato, a visual effects supervisor best known for his work on Star Trek: The Next Generation, Titanic and Avatar. It was a special movie that cleverly surrounded Mowgli, played by Neel Sethi, in a breathtaking world filled with pixel-perfect animals.
Almost all of the animation was handled by MPC Film, a division of Technicolor that specializes in visual effects. The team had worked with Disney before on blockbuster films including Maleficent, Into the Woods and Cinderella.
“We did the majority of [The Jungle Book],” Elliot Newman, a visual effects supervisor at MPC, told Engadget. “We did something like 1,300 shots.” The only scenes it didn’t touch — with King Louie, played by Christopher Walken, and his tribe of monkey minions — were handled by WETA Digital in New Zealand. MPC won both an Academy Award (Best Visual Effects) and a BAFTA (Best Achievement in Special Visual Effects) for its work on the film.
“It really did feel like ‘The Jungle Book’ was the test, in a way.”
Toward the end of production, MPC discussed VR as a filmmaking technique. At the time, the consumer versions of both the HTC Vive and Oculus Rift were nearing release, and developers were excited about their potential for consuming and creating content. “We were already talking towards the end of The Jungle Book about new technology, and ‘should we use VR?'” Newman explained. “And the idea of ‘virtual production’ was discussed.” A ‘virtual production’ meant creating a digital world with characters, or ‘actors,’ that could be ‘shot’ while wearing a VR headset. It would be like a video game, but with the goal of creating a movie.
These conversations were put on hold until roughly one year later, when MPC started talking to Disney about The Lion King. The company was keen to be reunited with Favreau and other key members of The Jungle Book team. “It really did feel like The Jungle Book was the test, in a way,” Newman said. “It was like ‘We know that we can achieve this. How do we improve upon that?’ We were all keen to go even further than what we were expected to produce. Everyone was just so enthusiastic to work on [the movie].”
MPC produced a brief animation test of Simba walking through the jungle. It was enough to win the studio the project, which started with a second piece of test footage shown at Disney’s D23 Expo in 2017. “That was a minute and a half of the opening of the movie,” Newman recalled. Thankfully, the crowd loved the team’s interpretation of “The Circle of Life.” “I was lucky enough to be in the audience and I think that was a really clever move,” Newman said. “It was something Jon [Favreau] really pushed for, to create some sort of a buzz around this thing.”