A group of cinematographers assess the digital methods they adopted to render the fantastic scenes and settings in Star Wars: Episode II.

In 1895 and 1896, respectively, Alfred Clark and Georges Méliès first utilized simple visual effects on motion-picture film. In the years since, the cameras used to film miniatures, matte paintings, process screens and other elements have been consistently and creatively modified to suit the needs of visual-effects cinematographers. Visual effects have typically been shot with the oldest motion-picture technology, including pin-registered silent film cameras such as Fries-Mitchells and Rackovers.

George Lucas' mandate to capture every element of Star Wars: Episode II digitally put a group of experienced visual-effects cinematographers at Industrial Light & Magic on filmmaking's new frontier. Working under visual-effects supervisors John Knoll, Pablo Helman, Ben Snow and Dennis Muren, ASC, cinematographers Pat Sweeney, Marty Rosenberg, Carl Miller and others used Sony HDC-F950 24p high-definition video cameras and Fujinon lenses to capture Episode II's extensive model and miniature effects work. "We had the opportunity to photograph the effects for Episode II with the latest in 24p digital technology, which was exciting," says Rosenberg, whose credits include The Right Stuff (see AC Nov. '83), Backdraft (AC May '91) and A.I.: Artificial Intelligence (AC July and August '01).

ILM's inventiveness and perseverance have consistently made the effects house a pioneer in the field, beginning with the first Star Wars film. Because they are often among the first to push a technology in new directions, ILM artists are accustomed to trial-and-error experimentation and learning curves. Sweeney, Rosenberg and Miller's collective experience on Episode II was no different; to achieve what Lucas wanted, they had to marry a time-honored tradition with a very new technology. "We decided to shoot practical models with digital cameras, but there wasn't an immediate consensus that we could do it," says Sweeney, whose credits range from the original Star Wars trilogy to last year's A.I. "We were trying a new format, and we were concerned. We saw the upside and downside and were a little nervous about the risks, but John Knoll said we should just go for it. Overall, it was very successful."

According to Miller (whose credits include Back to the Future II, Terminator 2: Judgment Day and Deep Impact), one significant plus of using HD was the format's ability to offer immediate visual feedback. "We could see an effects shot immediately at full resolution on the big, high-res monitors that were kept inside a black tent on stage," he says. "That allowed us to more accurately assess whether our shots were working, so we didn't have to keep tweaking them to death. Also, rather than pulling out a light meter and reading values of exposure, you can actually look at the model you're shooting digitally and build the shot until it feels right. Unlike film, where we have to calculate the exposure and trust our intuition, with HD we can actually open or close the lens, then look at and ‘feel' the image, so we can now be much more aesthetic about it.

"We could also do more accurate onstage composites because the images from the HD signal were better-registered and clearer, compared to the normal videotap images we're used to getting with film," Miller adds. "If we had to add someone into a miniature shot, we could make sure his feet were tracking perfectly to the floor of the background we were shooting. And we didn't need to shoot wedges. For example, in lighting our model spaceships in the past, we had to shoot wedges to determine whether the exposure for the model's practical lights looked good with the lighting on the model. With HD we can actually see how it all looks together on the monitor."

Sweeney emphasizes that a large, properly calibrated monitor is crucial for effective evaluation. "You really need a decent-size monitor when you're looking at digital images; you can't really judge the image or the focus on a small one. From the start we knew we wanted the best monitors we could get. Ours were 24 inches across, and their size helped to minimize discrepancies between a stage monitor and the DLP projection system we used."

Part of the learning curve included the way colors photographed. According to Rosenberg, the cinematographers often found that "when going from stage monitors to the DLP projector in the dailies theater, certain colors shifted slightly. We attempted to calibrate the digital cameras and monitors to a certain standard, but we learned that every color responds differently, just like with film. I photographed elements of the 'droid factory sequence, which had a lot of red, and red is a color that ‘pops' a little too much in digital, just like it does with certain film stocks. I had to be very careful to step on it a little bit, otherwise I'd get images that were too red – not just redder than I wanted, but redder than it actually was. But we quickly learned the nuances of the medium and adjusted to them."


www.theasc.com

© 2002 American Society of Cinematographers.