
Industrial Light & Magic: 40 years of Lucas's pioneering FX-wing

The roots of multithreaded rendering software

Star Wars New Hope @ 40

In the 40 years since the release of the original Star Wars, special effects have changed beyond recognition.

Computers – critical to today's generation of Star Wars successors – were relatively rare during the mid-1970s. Expensive and huge, they were the preserve of governments, universities and big corporations. Personal computers as we know them – GUI and mouse – were still some way off.

In the 1970s there were no FX workshops up to the task of realising his forthcoming film, so writer and director George Lucas acquired an empty lot next to California's Van Nuys Airport, which would become the first home of Industrial Light & Magic (ILM), the special effects and computer graphics division of Lucasfilm.

The individual who helped Lucas found this unwieldy but ingenious project was John Dykstra. He applied some aggressive problem solving and prototyping to devise new technology and realise Lucas's vision.

Model miniatures of spaceships were filmed on specially designed motion-controlled cameras with a basic computer memory that could store and repeat complicated moves. This enabled whole fleets of miniature spacecraft to interact in fly-bys and space battles.
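In software terms the idea is easy to sketch. The toy Python snippet below is purely illustrative – none of the class or variable names correspond to any real ILM system – but it captures the essence of motion control: record a camera move as a sequence of axis positions, then replay it identically for every photographic pass so separately shot elements line up frame for frame.

```python
# Purely illustrative sketch: a toy motion-control rig that records a camera
# move once and replays it identically for each photographic pass.
from dataclasses import dataclass
from typing import List

@dataclass
class AxisFrame:
    """One stored frame of the move: the position of each motorised axis."""
    track: float  # dolly position along the track (arbitrary units)
    pan: float    # degrees
    tilt: float   # degrees
    roll: float   # degrees

class MotionControlRig:
    def __init__(self):
        self.move: List[AxisFrame] = []  # the memorised camera move

    def record(self, frames: List[AxisFrame]) -> None:
        """Store a hand-programmed move in the rig's memory."""
        self.move = list(frames)

    def replay(self, pass_name: str) -> None:
        """Step through the identical move for another element/pass."""
        for frame_no, axes in enumerate(self.move):
            # A real rig would drive servo motors here and fire the camera
            # shutter once every axis has settled on its target position.
            print(f"{pass_name}, frame {frame_no}: {axes}")

rig = MotionControlRig()
rig.record([AxisFrame(track=0.5 * i, pan=2.0 * i, tilt=0.0, roll=0.0)
            for i in range(3)])
rig.replay("model element")       # first pass: the spacecraft miniature
rig.replay("background element")  # second pass: the star field, same move
```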

Based on matched moves, film of two objects could be composited together using blue screens and protection mattes. To achieve this, equipment was invented from scratch and manned by what at the time must have seemed like visual-effects renegades. This caused a fair amount of anxiety among the producers and financiers of 20th Century Fox.
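In digital terms a matte is just a per-pixel weight that says how much of the foreground to keep and how much of the background to let through. The NumPy sketch below is not ILM's optical process – it's the modern "over" arithmetic that eventually replaced it – but the underlying blend is the same idea.

```python
# Minimal sketch of matte compositing: foreground over background,
# weighted per pixel by a matte (1.0 = keep foreground, 0.0 = show background).
import numpy as np

def composite_over(foreground: np.ndarray,
                   background: np.ndarray,
                   matte: np.ndarray) -> np.ndarray:
    """Blend two images using a matte: out = fg*a + bg*(1 - a)."""
    alpha = matte[..., np.newaxis]  # broadcast the matte over the RGB channels
    return foreground * alpha + background * (1.0 - alpha)

# Tiny 2x2 example: a white "model" element over a black "star field".
fg = np.ones((2, 2, 3))
bg = np.zeros((2, 2, 3))
matte = np.array([[1.0, 0.5],
                  [0.0, 1.0]])
print(composite_over(fg, bg, matte))
```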

ILM created a digitally controlled camera known as the Dykstraflex, which performed complex and repeatable motions around fixed spaceship models, enabling separately filmed elements – backgrounds and spacecraft, for example – to be coordinated more accurately with each other.

But it was optical printing that was at the heart of the process.

Filming with large-format VistaVision cameras, the VFX team managed to minimise the grain build-up that would inevitably occur during the many compositing stages. These seemingly ad hoc processes allowed Lucas to create enough dynamic screen wizardry for A New Hope to influence a new generation of imagineers and animators.

For The Empire Strikes Back, ILM moved north to San Francisco and built on traditional Ray Harryhausen-style stop-motion animation, combined with blue-screen compositing, to create the iconic and innovative scene of the AT-ATs attacking the Rebel base on Hoth.

The use of Frank Oz and his puppets to populate the planet of Dagobah and bring Yoda to life was another practical old-school technique that arguably gave the Jedi master more gravitas than the somersaulting CGI goblin of the prequels.

Return Of The Jedi completed the initial trilogy, and the stop-motion techniques were revamped and renamed Go Motion. ILM constructed a camera rig named the Technirama – a high-speed, servo-driven track system capable of very fast moves combined with digitally added motion blur. It was used for footage of speeder bikes zooming through the towering forests of Endor, and meant ILM – once again – managed to convey a sense of flexibility and dynamic camera movement not seen before.

Sixteen years later, by the time of The Phantom Menace, computers had long since become mainstream. The film introduced us to a Star Wars universe almost completely realised using CGI, building on the accomplishments in character rendering seen in films such as The Mummy and Terminator 2. ILM was now constructing spaceships, aliens and automatons almost exclusively in CGI, with the film boasting 2,000 CGI shots and new CGI characters who could interact almost seamlessly with live-action actors. One might argue that ILM went too far and Lucas over-indulged, leading to the creation of filler characters such as the despised Jar Jar Binks.

Things have changed. In trying to avoid the mistakes made with some of the visually invasive FX of the prequels, recent directors have set out to streamline and focus the use of CGI. Aiming for a more touchy-feely aesthetic to make us all nostalgic for the first trilogy, JJ Abrams leaned towards practical effects in The Force Awakens, boasting a Luddite-friendly 28 CGI shots out of 357 scenes, the return of huge sets and big (and miniature) models, and the use of matte paintings instead of relying so heavily on green/blue-screen technology.

The use of 35mm and IMAX cameras (the Panaflex Millennium XL2 and the IMAX MSM 9802) and anamorphic Panavision lenses lent The Force Awakens a warm, fuzzy analogue feel. For all the talk of practical effects work, every single spacecraft seen in the trailer was digitally modelled and rendered with ILM's preferred palette of software tools: Autodesk's animation and modelling software Maya, Pixar's 3D-rendering software RenderMan, and Arnold – ray-tracing rendering software from Solid Angle, developed in collaboration with Sony Pictures Imageworks.

ILM also employed Plume, a GPU-based simulation and volume renderer of its own, which uses a fully 3D Eulerian solver with a volume ray-marching renderer.
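Volume ray marching itself is simple to describe: step a ray through a density grid, accumulating brightness while attenuating the remaining transmittance. The generic Python sketch below illustrates that loop only – Plume does this on the GPU, fed by its Eulerian fluid solver, and none of this code is ILM's.

```python
# Illustrative volume ray-marching loop: accumulate emission and attenuate
# transmittance while stepping a ray through a 3D density grid.
import numpy as np

def march_ray(density: np.ndarray, origin: np.ndarray, direction: np.ndarray,
              step: float = 0.5, absorption: float = 0.8) -> float:
    """Return the accumulated brightness seen along one ray."""
    pos = origin.astype(float)
    transmittance = 1.0
    radiance = 0.0
    for _ in range(200):                        # fixed step count for brevity
        i, j, k = (int(round(c)) for c in pos)
        if not (0 <= i < density.shape[0] and
                0 <= j < density.shape[1] and
                0 <= k < density.shape[2]):
            break                               # the ray has left the volume
        sigma = density[i, j, k] * absorption * step
        radiance += transmittance * sigma       # emission weighted by visibility
        transmittance *= np.exp(-sigma)         # Beer-Lambert attenuation
        if transmittance < 1e-3:
            break                               # effectively opaque already
        pos += direction * step
    return radiance

# A small blob of "smoke" in the middle of a 32^3 grid.
grid = np.zeros((32, 32, 32))
grid[12:20, 12:20, 12:20] = 0.5
print(march_ray(grid, origin=np.array([16.0, 16.0, 0.0]),
                direction=np.array([0.0, 0.0, 1.0])))
```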

Rogue One went for a much grittier, warlike aesthetic, and director Gareth Edwards and John Knoll – who has for the last 10 years been ILM's chief creative officer and a senior visual effects supervisor – decided that at least half of this Rebel-based narrative should recreate the feel of handheld coverage.

The Death Star combat scenes are the show-stopper in Rogue One. For the first time, audiences were shown the devastating effect of the WMD from the point of view of characters on the surface of the target planet, Jedha. This was a huge creative opportunity for Knoll: a shockwave of earth and rock rolling slowly out from the impact point made for a suitably devastating and dynamic effect.

Digital creatures like the android K-2SO, as well as the space battles and a host of AT-ATs and AT-STs, were divided up among ILM's four studios – San Francisco, Vancouver, Singapore and London – based on production requirements and what made sense in terms of workload, all under the supervision of Hal Hickel.

But it was reanimating the late Peter Cushing as Grand Moff Tarkin and a youthful Carrie Fisher as Princess Leia that posed the biggest challenge: avoiding the queasiness of the uncanny valley.

The animation and modelling employed in Warcraft – the quality of the facial motion-capture, the hair rendering and grooming – proved close-up human work was possible.

ILM used FLUX, developed by DreamWorks Animation. It's multithreaded software that uses the computer's CPU cores rather than the GPU for improved rendering and more efficient processing. DreamWorks' software chews through shape and expression models built from 3D data encoded from old footage, along with a wealth of micro-displacement maps, roto and motion capture. ILM also employed RenderMan for plausible shading and – overall – succeeded in delivering a final effect that left jaws on floors.
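In broad strokes, shape and expression models of this kind work like blendshapes: a neutral mesh plus a library of expression deltas, mixed by per-frame weights derived from the captured performance, with frames farmed out across CPU workers. The Python sketch below is a hypothetical illustration of that general pattern, not FLUX itself; every name in it is made up.

```python
# Hypothetical blendshape-style evaluation: a neutral mesh plus weighted
# expression deltas, evaluated for many frames across CPU worker threads.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

rng = np.random.default_rng(0)
NEUTRAL = rng.standard_normal((5000, 3))             # neutral face vertices
DELTAS = rng.standard_normal((20, 5000, 3)) * 0.01   # 20 expression shapes

def evaluate_frame(weights: np.ndarray) -> np.ndarray:
    """Mesh for one frame: neutral + sum_i weight_i * delta_i."""
    return NEUTRAL + np.tensordot(weights, DELTAS, axes=1)

# Per-frame weights would come from the captured/encoded performance;
# random values stand in for them here.
frame_weights = rng.uniform(0.0, 1.0, size=(48, 20))

with ThreadPoolExecutor(max_workers=4) as pool:
    meshes = list(pool.map(evaluate_frame, frame_weights))

print(len(meshes), meshes[0].shape)   # 48 frames of (5000, 3) vertex data
```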

Created for the Star Wars franchise, ILM has explored, fashioned and fused classic optical effects with digital processes and delivery techniques.

ILM walks a fine line around the point at which technology can essentially replace actual photography. It unsettled audiences by tipping the balance too far into the digital realm with the prequels, but tipped it back in Rogue One. In trying to match the subtle lighting and gritty realism of A New Hope, Rogue One somehow manages to create its own distinct aesthetic.

With Star Wars: Episode VIII – The Last Jedi coming out at Christmas and Star Wars "Red Cup", which explores Han Solo's background, scheduled for early 2018, we can expect further boundary pushing and line walking. ®
