An invading army of about 10,000 marched through a storm to regroup before a fortress built into the mountainside.
From a distance, the warriors looked as small as ants, yet their disciplined formation radiated a menace that bordered on terror. All raised their spears and growled through their fangs, and when lightning struck, their full numbers were finally revealed. As the battle began, arrows flew and swords found their way to shields. Bodies on both sides fell to the ground.
This bloody image is from the battle at Helm’s Deep in the movie The Lord of the Rings: The Two Towers. It was one of the first on-screen battles powered by Massive, a piece of software designed specifically for the Lord of the Rings series. Its role is to create computerized armies, using artificial intelligence to simulate realistic battles on a terrifying scale.
Just as movies like Star Wars once revolutionized special effects and studios like Pixar shaped the use of computers to create masterpieces, Massive, developed by engineer Stephen Regelous, has changed the way the modern film industry operates. The software he created powers the visual effects behind the epic battles audiences have come to expect on the big screen. Regelous himself received an Academy Award for Scientific and Technical Achievement for this work.
While advanced effects don’t always hold up over time, 20 years later the Uruk-hai Orcs of the battle at Helm’s Deep still look lifelike and menacing.
And over the past few years, Massive has continued to help create and perfect a number of iconic fight scenes, including the final season of Game of Thrones and the blockbuster Avengers: Endgame.
“You can see its impact just by looking at the movies that have used the software over the past 20 years. That’s a pretty big list of popular movies and TV shows,” said Bob Thompson, founding director of the Bleier Center for Television and Popular Culture at Syracuse University.
Of course, Massive now faces countless competitors in a far more technologically advanced world of visual effects. It must hold its own against other crowd-simulation tools while trying to meet ever-growing expectations for special effects.
But every time the fate of the world – in a movie script, at least – hangs in the balance, Massive marches into battle again.
Before the Orcs and Elves could start attacking each other in Middle-earth, Massive was really just a dream.
In 1993, Regelous said, he dreamed that he walked into his office and saw a group of people watching a simulated forest on a computer. Inside this computer-generated forest were trees, animals, and weather, all coexisting as in real life and running in real time.
The people in that room, whom he assumed were aliens, explained to Regelous how it all worked: a system of interconnected nodes drove the behavior of every creature on the screen.
The dream stayed with the computer-graphics software engineer from New Zealand. So when director Peter Jackson asked Regelous to code crowd-simulation software for the closing party scene of the horror comedy The Frighteners, Regelous wondered how it might work. Although he initially turned Jackson down, he ended up holed up in his apartment, even programming a stopwatch to track whether he really worked a full eight hours a day on the project.
Two years later, Massive was officially used to simulate its first battle, though nothing as detailed or realistic as Helm’s Deep. That was when Regelous arrived at Weta Digital, a visual effects studio then still housed in a small house in Wellington, New Zealand. There, he used the software to stage a clash between 1,000 silver soldiers and 1,000 gold soldiers. Regelous calls each character in Massive an agent.
Massive wowed the visual effects professionals there. At one point, several soldiers appeared to be fleeing the battle, and the initial assumption was that these agents were smart enough not to want to get involved in a deadly conflict. In another scene, a member of the Weta Digital team pointed at a pair of fighting soldiers and said he thought he had just seen one trying to avenge a comrade who had been killed.
“That’s not actually what happened,” Regelous said, “but people project it onto the simulation, seeing things that aren’t really there.”
Suddenly, a piercing shriek sounded overhead. The terrified soldiers looked up to see winged beasts swooping down toward them. The monsters plucked up a few men as easily as picking candy from a bowl, flew into the sky, and dropped them to their deaths. The bodies crumpled as they hit the roofs of the city of Minas Tirith below.
If anyone thought the battle at Helm’s Deep was intense, the battle on the Pelennor Fields in The Lord of the Rings: The Return of the King was even more frightening and terrible.
During his two years focused on programming the software, Regelous made some important decisions about how Massive would create a scene, decisions that made everything afterward possible.
The centerpiece is what researchers call “fuzzy logic”. Where traditional logic holds that something is either true or false, fuzzy logic allows for degrees in between. In Massive’s case, this means that if an Orc steps in front of an Elf, there are multiple options for how those two agents will fight, based on logic rules written to guide how agents interact. The software then multiplies this by thousands, and no two interactions on the battlefield are ever repeated.
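Massive’s actual rule system is proprietary, but the idea described above can be sketched in a few lines of Python. Everything here is invented for illustration: the distance thresholds, the “courage” trait, and the action names are assumptions, not Massive’s real parameters. The point is that a fuzzy membership value between 0 and 1, blended with per-agent traits, weights several possible actions instead of forcing one true/false branch:

```python
import random

def membership_close(distance, near=2.0, far=10.0):
    """Fuzzy membership: 1.0 when the enemy is very near, 0.0 when far,
    sliding linearly in between instead of a hard true/false cutoff."""
    if distance <= near:
        return 1.0
    if distance >= far:
        return 0.0
    return (far - distance) / (far - near)

def choose_action(distance, courage, rng):
    """Blend fuzzy 'closeness' with a per-agent courage trait to weight
    the candidate actions, then sample one -- so two agents in the same
    situation need not make the same choice."""
    close = membership_close(distance)
    weights = {
        "attack":  close * courage,
        "block":   close * (1.0 - courage),
        "advance": (1.0 - close),
    }
    # Weighted random draw over the fuzzy action weights.
    total = sum(weights.values())
    r = rng.random() * total
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return "advance"
```

Run across thousands of agents, each with slightly different traits and its own random stream, this kind of weighting is what keeps any two encounters from playing out identically.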
Martin Hill, Weta Digital’s visual effects supervisor, said: “The human eye is very good at recognizing copies.”
Using fuzzy logic not only produces unique interactions but also offers more flexibility than neural networks, according to Regelous. Usually, when you hear about artificial intelligence, you hear about so-called artificial neural networks, a workhorse of deep learning first proposed in 1944. A basic example is an object-recognition system: you would need to show it thousands of images of different items before it could figure out what an object is. If Jackson decided that a group of Orcs needed to be more aggressive in a scene, a neural network could take months of retraining to make the AI characters more aggressive. Massive allows those changes to be made quickly.
In this case, the fuzzy logic applies only to a living agent. Once that agent dies, another important part of Massive kicks in: rigid body dynamics.
The basic idea is this: an Orc is hit by an arrow, dies, and falls off a cliff, or a soldier of Minas Tirith slams into a roof. Rigid body dynamics takes over to simulate the body going limp and falling; from that moment, the inactive agent no longer obeys its fuzzy logic rules. Because it’s dead.
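That handoff, from behavioral rules while alive to pure physics once dead, can be sketched as a tiny state machine. This is a toy one-dimensional illustration, not Massive’s solver: the class name, the gravity-only physics, and the simple Euler integration step are all assumptions made for clarity.

```python
GRAVITY = -9.8  # m/s^2

class Agent:
    """Toy agent that runs 'brain' logic while alive, then hands its body
    over to a physics step once it dies -- mirroring the idea that a dead
    agent no longer obeys its fuzzy-logic rules."""
    def __init__(self, height):
        self.alive = True
        self.y = height   # height above the ground, in meters
        self.vy = 0.0     # vertical velocity

    def update(self, dt):
        if self.alive:
            # While alive, the behavioral (fuzzy-logic) rules would run here.
            return "fuzzy-logic step"
        # Dead: only rigid-body physics governs the body now.
        self.vy += GRAVITY * dt
        self.y = max(0.0, self.y + self.vy * dt)  # stop at the ground
        return "physics step"

agent = Agent(height=20.0)
agent.alive = False          # struck by an arrow mid-battle
for _ in range(300):
    agent.update(0.01)       # the body falls under gravity until it lands
```

A real system would simulate full articulated bodies with collisions, but the control-flow idea is the same: death flips a switch from decision-making to dynamics.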
But that was a piece of code Regelous couldn’t write on his own. Fortunately, he found a college student who had written code for exactly this problem and let him use it. “That allows us to have all this beautiful physical interaction, where you would believe there were thousands of people in the scene,” Regelous said.
And when you watch the movie, do you remember the shrieks of the beasts as they swooped menacingly down on the soldiers? Agents are designed to react to that too. They respond not only to combat situations but also to sounds in their virtual world. Weta Digital found a way to integrate audio into Massive: the team represents a sound as a signal in the software, then designates which agents can hear it in battle. In other words, if agents hear something, they can look up, perhaps just in time to see their end.
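A minimal sketch of that sound-as-signal idea: a sound event is broadcast from a point, and any agent within earshot switches its behavior. The radius model, the dictionary-based agents, and the “look_up” state are invented for illustration; Weta’s actual signal mechanism is not public.

```python
import math

def emit_sound(source, radius, agents):
    """Broadcast a sound event: every agent within `radius` of the source
    hears it and switches to a 'look up' reaction; the rest keep fighting.
    Returns the names of the agents that heard the sound."""
    heard = []
    for agent in agents:
        dx = agent["pos"][0] - source[0]
        dy = agent["pos"][1] - source[1]
        if math.hypot(dx, dy) <= radius:   # within earshot
            agent["state"] = "look_up"
            heard.append(agent["name"])
    return heard

soldiers = [
    {"name": "s1", "pos": (0, 0),   "state": "fight"},
    {"name": "s2", "pos": (3, 4),   "state": "fight"},   # 5 m from the source
    {"name": "s3", "pos": (40, 40), "state": "fight"},   # too far to hear
]
heard = emit_sound(source=(0, 0), radius=10.0, agents=soldiers)
```

Here only the two nearby soldiers react; the distant one fights on, oblivious to what is diving at his comrades.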
For the great battle of Winterfell in the final season of Game of Thrones, the expectations of the filmmakers and the VFX team were extremely high. Viewers may have been disappointed with the season as a whole, but the battle itself is one of the most iconic in pop culture history.
In this case, Massive must animate what the viewer cannot see.
At the start of the battle, the Dothraki army charged out from the castle toward the waiting army of White Walkers and the undead. But then, one by one, their flames went dark. The Dothraki, once a band of brave, seemingly unstoppable warriors, died as their fires were snuffed out.
Everything on screen is tiny and hard to see. But realistically depicting those burning, dying flames was a challenge for the artists at Weta Digital. Animating them by hand would have been difficult, time-consuming, and unconvincing in the quality of the movement. Instead, they used Massive.
“There’s a whole battle going on there, inside Massive,” Martin Hill said.
When the Dothraki warriors clashed with the enemy, they fell from their horses and dropped their weapons, everything you would expect from a massacre. The difference is that every element of the battle except the flames is black. The audience can’t see what’s happening in the scene; all that’s visible are the flaming swords, their light quickly extinguished as each wielder dies in the dark at the hands of the White Walkers.
Rigid body dynamics also comes into play in this episode, in another iconic moment of the battle. After Arya Stark killed the Night King, his army fell to pieces. Weta Digital’s artists used a digital signal in the software to free the zombie army from its fuzzy logic rules and let the bodies literally disintegrate.
When planning the battle of Winterfell, director Miguel Sapochnik wanted to draw inspiration from the battle of Helm’s Deep. Luckily for him, the artists who worked on the battle back then were still at Weta Digital.
“The DNA of Helm’s Deep was transmitted through the Battle of Winterfell”, Hill said.
If the Battle of Winterfell decided the fate of the continent of Westeros, popular culture holds another battle even more epic and consequential: the final battle in Avengers: Endgame. The stakes there were half the population of the entire universe.
Just when hope seemed lost and Captain America looked finished, the gates suddenly opened and reinforcements poured in: a force of Wakandans, Asgardians, Ravagers … appearing to confront Thanos and his minions one last time. All the characters then lunged at one another. In the background of various shots, viewers can see them running over piles of debris, or stricken bodies hurtling through the air.
While the audience may not realize it, they are watching AI-generated agents, with different fighting styles and weapons. Some wield swords, some carry shields and spears, some fly through the air with sparks emanating from their hands.
To create combat models specifically for different groups, Weta’s team relied on motion capture, which allowed Massive to provide each agent with an arsenal of weapons to choose from.
Weta Digital visual effects supervisor Matt Aitken said Endgame was one of the biggest projects the company had worked on. Recording the specific fighting styles was a three-day process, preceded by a period of researching other films to study established moves. Motion capture artists created a variety of combat vignettes that agents could learn from.
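Conceptually, pairing agents with recorded combat vignettes amounts to a lookup: each motion-capture clip is tagged by the weapon style it was recorded for, and an agent picks a matching clip at run time. The library contents, clip names, and fallback rule below are entirely hypothetical, a sketch of the idea rather than Weta’s pipeline:

```python
import random

# Hypothetical clip library: each motion-capture vignette is tagged with
# the weapon style it was recorded for (all names invented for illustration).
CLIP_LIBRARY = {
    "sword":  ["sword_slash_01", "sword_parry_02"],
    "shield": ["shield_bash_01", "shield_brace_02"],
    "spear":  ["spear_thrust_01", "spear_sweep_02"],
}

def pick_clip(weapon, rng):
    """Choose a combat vignette matching the agent's weapon, falling back
    to the sword set when a weapon has no dedicated clips."""
    clips = CLIP_LIBRARY.get(weapon, CLIP_LIBRARY["sword"])
    return rng.choice(clips)
```

With enough tagged vignettes per style, each agent in a crowd shot ends up fighting in a way that fits its equipment without an animator touching it individually.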
Not only are all fighting styles different, but the agents themselves are also different. Back in the days of The Lord of the Rings, Jon Allitt, crowd leader at Weta Digital, created a tool in Massive called the Orc Builder. It can randomly generate different variants of Orcs based on characteristics such as height and limb length. The Orc Builder is now called Body Page and it works the same way in Endgame.
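Randomizing body traits the way the Orc Builder does can be sketched very simply: sample each proportion from a plausible range so that no two generated agents share exactly the same build. The trait names and numeric ranges here are invented for illustration; the real tool’s parameters are not public.

```python
import random

def build_orc(rng):
    """Randomly vary an orc's proportions, in the spirit of the Orc Builder:
    each body trait is sampled inside a plausible range (all trait names and
    ranges are hypothetical)."""
    return {
        "height_m":   round(rng.uniform(1.6, 2.1), 2),
        "arm_length": round(rng.uniform(0.65, 0.90), 2),
        "leg_length": round(rng.uniform(0.80, 1.10), 2),
        "shoulder_w": round(rng.uniform(0.45, 0.65), 2),
    }

rng = random.Random(42)
horde = [build_orc(rng) for _ in range(500)]  # a crowd of distinct bodies
```

Even this crude sampler makes exact duplicates rare in a 500-strong horde, which is the whole point: as Allitt says, the eye should never catch two identical bodies.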
“We don’t want two people to look alike,” Allitt shared during a DVD presentation about the visual effects behind The Lord of the Rings: The Two Towers.
Massive is not the only crowd-creation software. Since Regelous spun Massive off into its own company in 2003, many other programs capable of simulating crowds have emerged. And filmmakers have some unique tactics of their own for filling arenas, stadiums, and the like.
Gray Marshall, an industry veteran and dean of the visual effects department at the Savannah College of Art and Design, says: “Visual effects, at their core, are not a perfect simulation of reality. It’s about getting the viewer to believe it.”
Visual effects artists can solve the problem with compositing. They film a group of 20 to 30 people, have them change clothes, move them to another spot, film again, and repeat the process until, combined, the footage fills a stadium, the steps of a building, or anything like that.
Another method involves filmmakers essentially placing digital green screens on the seats of a stadium, and projecting people onto it.
Once, Marshall had to fill 90,000 seats in Wembley Stadium. He did it with flesh-colored digital grass fluttering in the wind, because the shot demanded so little fine detail.
Why would anyone decide to choose one technique over another? That depends on the needs of the film, the complexity of the crowd, the budget, and the preferences of the visual effects artists themselves.
Jon Allitt says Weta Digital doesn’t always use Massive to simulate crowds. Alongside it, they have other effects programs such as the 3D animation software Houdini, which goes beyond crowd simulation to handle compositing, modeling, and lighting. Other options are Miarmy and Golaem Crowd, both plug-ins for Maya, Autodesk’s 3D computer graphics program.
In Marshall’s view, while there is obvious overlap, they are all somewhat different, with different uses in different situations. “It’s like comparing Ferrari to Toyota,” he said.
Compared with everything Massive has created for film and television, its own story has relatively little drama. For Regelous and the folks at Weta Digital, what they remember are sometimes just moments like having only a 64GB workstation to work with, or designing a new file format that made the system run more smoothly.
Regelous remembers racing to finish a mock-up in time for Easter, wrapping up just before dinner. Aitken remembers running simulations through the night, hoping they wouldn’t crash.
Today at Weta Digital, the team in charge of creating crowds consists of just five people.
“I don’t know if they realize the visual power they’re using,” Aitken said.
Regelous, who still runs the company, said he’s always trying to figure out how to keep moving forward. That means giving filmmakers the ability to see effects almost instantly during production. The Mandalorian, for example, made headlines at its 2019 premiere by using Unreal Engine’s real-time rendering to create immersive computer-generated sets: instead of compositing a simulated environment over green screen later, everything happens during filming itself.
A new version of Massive – Massive 9.0 – is expected to be released, with new capabilities such as better compatibility with other software programs like Autodesk’s Maya.
Hill thinks it’s unusual for a piece of software to stay in use for so long. “It has evolved and gotten better, but the core elements are still what they were 20 years ago,” he said.
But after hundreds of movies and TV shows, and a few Emmys and Oscars, Regelous is still focused on pushing Massive forward. It is clearly off to a pretty good start.
“It’s still good for now,” Allitt said. “Massive was 20 years ahead of its time.”