20th Century Fox
The Na'vi, the blue-skinned species in the 2009 movie "Avatar," were created with the help of computational physics.
Remember the Na'vi – the blue-striped humanoid species with pointy ears and a powerful bond with nature in last year's biggest sci-fi epic, "Avatar"? They were created in a physics lab.
In fact, the entire movie "stands out for the amount of physics that was involved," Robert Bridson, a computer scientist at the University of British Columbia in Vancouver, told me in an e-mail. "A lot of the environments, and of course the characters, were completely computer-generated."
Bridson is an expert in the physics of computer-generated animation and co-author with Christopher Batty of a review of the state of the art and the challenges facing the field, published this week in the journal Science.
"Compared with more traditional animation methods that rely chiefly on artists' efforts, numerical solutions to the equations of physics allow computers to calculate realistic motion, such as that of smoke, fire, explosions, water, rubble, clothing, hair, muscles and skin," they write.
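To make that idea concrete, here is a toy sketch of my own (not Weta's or Bridson's actual code): a hanging chain of point masses joined by damped springs, stepped forward in time with semi-implicit Euler integration. This is, at vastly smaller scale, the same basic recipe — write down the forces, then numerically integrate the equations of motion — that underlies simulated hair, cloth and rubble.

```python
# Toy mass-spring chain: a crude stand-in for how animation software
# numerically integrates the equations of physics to produce motion.
# All parameter values here are illustrative, not from any studio tool.

def simulate_chain(n=5, rest=1.0, k=50.0, damping=2.0, gravity=-9.8,
                   dt=0.001, steps=20000):
    """Simulate n unit masses hanging in a vertical chain under gravity.

    Mass 0 is pinned at y = 0; each pair of neighbors is linked by a
    spring of rest length `rest` and stiffness `k`. Returns the final
    y-positions after `steps` semi-implicit Euler steps of size `dt`.
    """
    ys = [0.0] * n          # vertical positions (mass 0 stays pinned)
    vs = [0.0] * n          # vertical velocities
    for _ in range(steps):
        forces = [0.0] * n
        for i in range(1, n):
            # Hooke's law for the spring between mass i-1 and mass i:
            # positive stretch pulls mass i up, mass i-1 down.
            stretch = (ys[i - 1] - ys[i]) - rest
            f = k * stretch
            forces[i] += f + gravity - damping * vs[i]
            if i - 1 > 0:                 # reaction on the mass above
                forces[i - 1] -= f        # (ignored for the pinned mass)
        for i in range(1, n):             # semi-implicit Euler step:
            vs[i] += forces[i] * dt       # update velocity first,
            ys[i] += vs[i] * dt           # then position with new velocity
    return ys

if __name__ == "__main__":
    for i, y in enumerate(simulate_chain()):
        print(f"mass {i}: y = {y:.3f}")
```

Because of the damping term, the chain settles toward the equilibrium you can work out by hand — each spring stretches just enough to carry the weight hanging below it. Production simulators solve the same kind of system with far better integrators, millions of elements and collisions, which is where the research challenges Bridson describes come in.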
This, in turn, results in animated films with amazingly realistic scenes. For example, when "Avatar" was made, New Zealand-based Weta Digital used physics to simulate how the Na'vi's muscles and skin worked, how their clothing moved, and how the trees and plants of Pandora swayed, Bridson noted.
"I also helped write the Naiad software they used to simulate a lot of the water in the film, from the river Jake Sully falls into at the start to the ocean waves pounding the coast near the end, and the water drops in the leaf that Neytiri drinks from," he said.
Routines and challenges
Water, Bridson noted, is one of the biggest current challenges in computer animation, given the complex geometry involved.
Some aspects of computer animation are becoming routine, such as making clothing ruffle, flames flicker and smoke billow realistically. (You can check out animations on Bridson's website.) But really large-scale explosions, as well as scenes with multiple size scales — such as a boat on stormy seas — remain a challenge.
And then there's hair.
"Long hair or curly hair is still a huge problem," Bridson said. "It's difficult enough to get hair to behave in real life for action scenes."
Other challenges include getting computers to work fast enough to appease demanding directors. Filmmakers are also looking for methods to judge the quality of an animation. These are largely problems in transferring the technology from the lab to the movie studio.
"The scale at which studios want to do things is almost always a lot greater than what academics can tackle," Bridson noted.
He noted that most audiences may not even be aware how much of the action in the movies they see is already computer-generated. For example, rather than getting a yacht, camera crew and actors out in a real ocean to shoot a scene, it's actually cheaper for a studio to build a model of the boat, simulate the ocean around it, and then drop in actors who were shot in front of a green screen.
In the coming years, advances in the algorithms used for computer-generated animation should lead to subtle improvements in the quality and creativity of visuals, and allow directors to achieve their vision with shrinking budgets, Bridson noted.
"'District 9' from 2009 is an important harbinger of what's going to be coming," he said. "With a relatively modest budget of $30 million, and from outside the traditional Hollywood blockbuster network, they made a gorgeous, visually compelling work of art with a bit of a risky and very intriguing idea."
Just don't get your hopes up for feature-length films made by amateurs, a la YouTube, he added.
"Even if all you were to need on the technical side was a computer — no cameras, no sets, no actors — the artistic challenge of creating a film remains, and will still require a huge amount of dedication, time and talent."
That sentiment echoes what Ohio State University computer scientist Rick Parent told me in November when we chatted about the computer technology used to allow Jeff Bridges to play a version of himself nearly 30 years younger in "Tron: Legacy," the sequel to the 1982 blockbuster.
Bridges told the Daily Mail that he could imagine a day when he could appear in movies without really acting by simply leasing studios his image. Parent said maybe, but not anytime soon.
"With removing the actor completely, now you've got a whole different problem of building those body motions, those facial motions, the speech — which is a whole other problem. Building that essentially from scratch … that's a whole other level of complexity, and we are not there at all."
Check out the links below for more stories on computer animation.
- Brave new world for virtual actors
- 'Avatar' technology raises Oscar question
- Alien plants get new twist in world of 'Avatar'
- Scientists visualize virus on the attack
- Moving closer to a 'Matrix'-style virtual world
John Roach is a contributing writer for msnbc.com. Connect with the Cosmic Log community by hitting the "like" button on the Cosmic Log Facebook page or following msnbc.com's science editor, Alan Boyle, on Twitter (@b0yle).