How special effects transformed the movies
From Westworld to WALL-E, CGI has revolutionised cinema
By Tom Dennis
To create movies of the quality we now expect, special effects houses have to use every trick in the book, from classic green-screen technologies to the creation of full artificial intelligence systems. It's no wonder that names like Industrial Light and Magic are as important in Hollywood as any producer's or director's.
While there's no question that you need advanced software techniques on your side to produce Hollywood effects, most of what's needed comes down to raw processing power.
Off-the-shelf applications are fine for the basics, but the larger effects houses spend as much time on software engineering as they do on the artistic side, writing custom code to fix specific problems and bring new effects to life. Sometimes these tools become products in their own right, as happened with Pixar's RenderMan, the engine behind not only the company's own films, such as Ratatouille and WALL-E, but also most major Hollywood blockbusters, including Harry Potter and I Am Legend.
Mental Ray is another common industry render engine, and it's used on all manner of Hollywood blockbusters. Essentially acting as an API, Mental Ray allows batch mode rendering within common software environments. This means that designers can render their output via their favourite software package, be it Maya, 3DS Max, Softimage XSI or Side Effects Software's Houdini.
The advantage of this is that designers and artists can use a common rendering file format, the '.mi' scene file, across different applications, while still using each app's own shading methods, procedural textures, bump and displacement maps, atmosphere and volume effects, environments, camera lenses and light sources. The level of complexity involved is closer to an engineering project than a standard artistic one, but it's wasted if the artistic side falls flat.
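To make the idea of a renderer-neutral scene file concrete, here is a minimal sketch in Python. It is purely illustrative: the toy format below is invented for this example and is not real '.mi' syntax, and the class and function names are hypothetical. The point it demonstrates is that any authoring application can serialise its scene to one shared format that a single render engine then consumes.

```python
from dataclasses import dataclass

# Illustrative only: a toy scene description showing how a shared,
# renderer-neutral file format decouples authoring tools from the
# renderer. This is NOT real '.mi' syntax; all names are invented.

@dataclass
class Mesh:
    name: str
    shader: str       # each app can still plug in its own shading method

@dataclass
class Light:
    name: str
    intensity: float

def export_scene(meshes, lights):
    """Serialise the scene to a simple text format any renderer could parse."""
    lines = ["scene 1.0"]
    for m in meshes:
        lines.append(f"mesh {m.name} shader={m.shader}")
    for lt in lights:
        lines.append(f"light {lt.name} intensity={lt.intensity}")
    return "\n".join(lines)

# Two different "authoring apps" could each emit this same format,
# then hand the file to one batch renderer.
scene_text = export_scene([Mesh("teapot", "phong")], [Light("key", 1.5)])
print(scene_text)
```

In a real pipeline the scene file also carries cameras, textures and volume effects, but the decoupling principle is the same: the artist's choice of Maya, 3DS Max, XSI or Houdini stops mattering once the scene is exported.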
Pixar is a great demonstration of the two working side by side. When Toy Story came out, the relatively primitive state of 3D graphics didn't allow for the complex effects we're now used to seeing: cloth simulation, convincing human animation and photorealistic backgrounds, for example. So the company focused on the type of effects it could pull off: rigid-body toys, where any weaknesses would simply contribute to the charm.
Each subsequent release followed a similar pattern, introducing more realistic animation in A Bug's Life, mastering fur in Monsters Inc and coming up with the cartoon humans that made The Incredibles so much fun to watch. Every movie raised the stakes. Every movie was a hit.
The State of the Art
The history of CGI in live-action films hasn't always been smooth. The earliest practical application of CGI is generally agreed to be the point-of-view sequences of Yul Brynner's robot gunslinger in the 1973 futuristic western Westworld. The producers employed 2D computer-generated animation to simulate the robot's vision. For the 1976 follow-up Futureworld, the producers went one stage further and introduced 3D elements via rendered polygonal models, a technique which has now become standard.
Not all the effects of the time were so complicated. In many cases, it was easier to cheat. The TV version of Douglas Adams' The Hitchhiker's Guide To The Galaxy (1981) appeared to use computer graphics for the pages of the Guide, but in fact these were hand-drawn scenes created to mimic the style of contemporary computer animation.
Other artists found that the technology available simply wasn't able to produce what they wanted.
The Japanese anime film Golgo 13 (1983) was one of the first movies of its kind to introduce proper computer animation alongside traditional techniques, leading to an unintentionally hysterical sequence in which the action keeps cutting between the cel-animated main character and a blocky, untextured helicopter gunship.
It's therefore not surprising that the first truly legendary CGI-heavy film was, like Pixar's films, designed to play to the technology's weaknesses as well as its strengths.
1982 saw the release of Tron, complete with real actors and the first fully computer-engineered 3D scenes. "One of the difficult tasks on Tron was to create a unified look for both the real world and the electronic world," said producer Donald Kushner several years after the film's release.
"Like in The Wizard Of Oz, there are two worlds. The difficult part was integrating both of them. We used computer simulation, we used backlit techniques and we used conventional live action. The challenge was to make it all look cohesive."
After Tron, a variety of watershed films employed ever-more impressive CGI advancements, from Indiana Jones and the Last Crusade featuring the first all-composite scene to Terminator 2: Judgment Day's startling visuals of the T-1000. The latter marked the first use of natural human motion for a computer-sculpted character.
Its liquid metal effects, particularly in conjunction with the then-revolutionary morphing technology that would soon take over every film and commercial in sight, were a particular eye-opener, giving us a villain that combined the best technology from both 1991 and a post-apocalyptic 2029.
It was Toy Story, though, that really cemented CGI's place in the industry. While producing the film, Pixar grew from just 10 people to 150, an unheard-of number for a computer graphics project.
Between 50 and 70 people were on the technical team, working under technical director Bill Reeves and animator John Lasseter. They were tasked with producing the software that would become RenderMan.
"If you have a good story and good characters, you can use CGI to create a movie that does $200 million at the box office and accolades up the wazoo," Reeves said, noting the importance of choosing the right project instead of just relying on effects. As for Lasseter, it's tough to argue with his recent description of his field: "Computer animation's an art form that grew out of a science."
While the likes of Pixar and Disney transformed animation using CGI principles and George Lucas and Steven Spielberg leveraged ever-more believable CGI and compositing work into their films, it was Peter Jackson who truly propelled the technology into its next stage by incorporating artificial intelligence into the huge battle scenes that featured throughout The Lord of the Rings trilogy.
Jackson wanted software that would allow hundreds of humans and orcs to interact naturally without the need to animate each character individually. Each soldier had to fight the right enemy and behave as a character would in battle.
The answer came via a developer named Stephen Regelous, who created Massive. This program allowed developers to quickly create thousands of individual characters, each of which responded differently to its surroundings.
The reactions of every character affected other characters in turn, changing how they acted and allowing motion-captured animations to create a realistic scene. Without Massive, the battle scenes of Middle Earth would have been near-impossible to create.
An apocryphal tale recalls how Massive's AI was so sharp that, when confronted with thousands of baying orcs, the armies of Middle Earth quite sensibly turned tail and fled in terror. In reality the behaviour was down to a bug in Massive, but it makes an excellent anecdote.
Blending fact and fiction
Most will cite the likes of Toy Story, Shrek and The Lord of the Rings trilogy when thinking about CGI, but the real skill of special effects studios is blending real-life footage with elements of CGI for a rich, believable tapestry. Green-screen compositing and rotoscoping effects as seen in Sin City, 300 and A Scanner Darkly have blurred the line between reality and fiction.
In some cases, the effects have been purposefully 'virtual'; for example, A Scanner Darkly used the Rotoshop digital technique to achieve a classic look. The recent blockbuster Iron Man clearly needed to use huge elements of CGI, but for certain passages of the film it's hard to tell what is composited and what is real life footage.
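Green-screen compositing itself reduces to a simple principle: any foreground pixel close enough to the key colour is replaced by the matching pixel from the background plate. The sketch below is a toy illustration of that hard-key idea, with images represented as plain lists of RGB tuples; real pipelines use soft mattes, edge blending and spill suppression, none of which is modelled here.

```python
# Toy sketch of green-screen (chroma-key) compositing: pixels close to
# the key colour are swapped for the background plate. Real compositing
# uses soft mattes and spill suppression; this hard key is illustrative.

def chroma_key(foreground, background, key=(0, 255, 0), tolerance=100):
    """Composite two equal-sized images given as lists of (r, g, b) pixels."""
    out = []
    for fg, bg in zip(foreground, background):
        # Squared distance from the key colour decides matte coverage.
        dist2 = sum((c - k) ** 2 for c, k in zip(fg, key))
        out.append(bg if dist2 < tolerance ** 2 else fg)
    return out

actor = [(200, 50, 40), (0, 255, 0), (10, 250, 20)]   # costume pixel + green screen
plate = [(5, 5, 80), (5, 5, 80), (5, 5, 80)]          # blue-ish city background
print(chroma_key(actor, plate))
# → [(200, 50, 40), (5, 5, 80), (5, 5, 80)]
```

The hard part in films like Iron Man is everything this sketch ignores: matching lighting, grain and motion blur so that the join between plate and CG element becomes invisible.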
Iron Man's cybernetic suits started life as hardware: Stan Winston Studio built practical versions of all of them, along with the suit worn by Iron Man's nemesis, Iron Monger.
These suits were used during the live shoot, but the physical props were replaced with digital versions when needed. Visual effects giant Industrial Light and Magic (ILM) did the bulk of the compositing and animation for the film, including building the virtual suit.
Because ILM was also responsible for Transformers, it had experience of rendering metal objects. But Iron Man was a different kind of movie: "The designs needed to be believable," says digital model supervisor Bruce Holcomb, who moved on to Iron Man from Transformers. "For Transformers, we constructed alien robots with lots of parts, and the visual confusion added to the enjoyment. Iron Man was more about design. The suits didn't have ambiguous parts moving for no reason."
Creating by removing
Just as it can add elements, CGI can also remove them. The 2007 release I Am Legend needed to depict a post-apocalyptic New York City. Director Francis Lawrence didn't want an obviously computer-created set, so he blended CGI with motion capture. Special effects were used to dilapidate NYC, remove stray New Yorkers from windows and stall moving cars seen in wide shots. "We didn't want to make an apocalyptic movie where the landscape felt apocalyptic," said Lawrence. "There's something magical about an empty city, as opposed to it being dark and scary."
Whether adding fantastical characters and scenery, removing human elements or simulating epic battles, CGI is now a staple element of modern movie making. It may seem odd that the common tools of the trade are commercially available software packages, but this only goes to show that the real skill of special effects lies in the artistic expression used rather than the sheer processing power of the technology. The Best Visual Effects Oscar, established properly in 1996, sets out to reward "the artistry, skill and fidelity with which the visual illusions are achieved". As a result, even in today's world of super-powerful computers, the award is still won by skilled creatives rather than nimble programmers.