The solution was to electronically simulate the look of longer exposures. "We took advantage of the fact that there's no minimum or maximum shutter angle in digital and used the equivalent of a 360-degree shutter, which allowed us to accumulate frames sequentially for motion-control work," says ILM HD engineer Fred Meyers, who tackled capture issues both on set and in postproduction. "By synchronizing our motion-control rig to the digital camera, we could lengthen the effective exposure time by slowing down the camera moves, and we could increase or decrease the effective frames per second through a process of merging and accumulating exposure. So if we ran a move at half speed and threw out half the frames, we got the equivalent of something running at normal speed with a 180-degree shutter, which allowed us to adjust 'motion blur' in post. We used that formula to reduce 24p to a given camera speed so we could move the rig or the model at the speed needed for the effect. Between merging and skipping frames, we could get a good simulation of any frame rate we needed, which allowed us to use digital cameras throughout the effects process. We were shooting things all the way down to effective one- or half-second exposures pretty regularly."
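
For readers who want to see the bookkeeping, the merge-and-skip arithmetic Meyers describes can be sketched in a few lines of Python. This is an illustration of the math only, not ILM's actual pipeline; the function name, the frame-array layout and the `slowdown` parameter (how many times slower than real time the motion-control move ran) are assumptions:

```python
import numpy as np

def simulate_shutter(frames: np.ndarray, slowdown: int,
                     shutter_deg: float) -> np.ndarray:
    """frames: (N, H, W, C) array shot at 24p with a 360-degree shutter while
    the move ran `slowdown` times slower than real time. Returns frames at
    the effective normal speed with the requested shutter angle."""
    # Number of captured 360-degree exposures to accumulate per output frame.
    merge = max(1, round(slowdown * shutter_deg / 360.0))
    out = []
    # Step through the capture `slowdown` frames at a time, one output frame per step.
    for start in range(0, len(frames) - merge + 1, slowdown):
        # Averaging successive full-frame exposures approximates one longer exposure.
        out.append(frames[start:start + merge].mean(axis=0))
    return np.stack(out)

# Meyers' example: a move run at half speed (slowdown=2) with half the frames
# thrown out (merge=1, stride=2) plays back as a normal-speed move with an
# effective 180-degree shutter.
# clip = simulate_shutter(captured, slowdown=2, shutter_deg=180)
```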

That method enabled Rosenberg, Sweeney and Miller to do relatively simple calculations, but Meyers acknowledges that "it doesn't work in all cases. For example, it doesn't allow you to stop down to a T32 or something really closed down, like T64, which you can sometimes get away with in film by using a really long exposure."

Shooting motion control in HD and merging frames meant that the moves the cameramen viewed on the HD monitors ran considerably longer than they would in the finished film. "A 100-frame motion-control move of a spaceship flying by, shot at 1 fps, translates into approximately four seconds of screen time," Rosenberg notes. "But on the monitors on stage, we saw exactly what we shot in real time; in one instance, we were watching a move that was traveling eight times slower than it was meant to be. We used the HD monitors to judge color, lighting and art direction, and we used our digital disk recorders, which record a single image for each motion-control frame, to judge pace."
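
The timing math is simple enough to check. The sketch below is illustrative; the 3 fps capture rate in the last line is an assumption chosen to match the "eight times slower" example, which the article does not specify:

```python
PLAYBACK_FPS = 24.0  # 24p playback rate

def screen_seconds(frames: int) -> float:
    """How long a move lasts in the finished film."""
    return frames / PLAYBACK_FPS

def stage_seconds(frames: int, capture_fps: float) -> float:
    """How long the same move takes to shoot on stage."""
    return frames / capture_fps

print(screen_seconds(100))        # 100 frames -> ~4.2 s of screen time
print(stage_seconds(100, 1.0))    # shot at 1 fps, the move takes 100 s on stage
print(PLAYBACK_FPS / 3.0)         # a 3 fps capture runs 8x slower than playback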

Another concern for cinematographers capturing visual effects is how to use lighting to help give miniatures the proper scale. "Depending on the scale, we sometimes had to use a 1K key light instead of a 20K," Sweeney recalls. "Focus became a real challenge because we needed to achieve greater depth of field with less light. Sometimes we had to deal with multiple focus planes."
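
The numbers behind that bind follow from the standard close-focus depth-of-field approximation, DOF ≈ 2Ncu²/f². The sketch below is generic optics arithmetic, not data from the shoot; the circle-of-confusion value is an assumed figure commonly quoted for 2/3-inch HD sensors:

```python
def dof_meters(focal_mm: float, f_number: float, focus_m: float,
               coc_mm: float = 0.011) -> float:
    """Total depth of field in meters, via DOF ~= 2*N*c*u^2 / f^2.
    Valid when the focus distance is well short of the hyperfocal distance.
    0.011 mm is a commonly cited circle of confusion for 2/3-inch HD chips."""
    f = focal_mm / 1000.0   # focal length in meters
    c = coc_mm / 1000.0     # circle of confusion in meters
    return 2.0 * f_number * c * focus_m ** 2 / f ** 2

# Doubling the f-number (two stops, e.g. T8 -> T16) doubles the depth of
# field but costs four times the light -- the tradeoff Sweeney describes.
print(dof_meters(25, 8, 1.5))    # ~0.63 m deep at T8 on a 25mm lens, 1.5 m focus
print(dof_meters(25, 16, 1.5))   # ~1.27 m at T16
```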

This was especially true of setups in which the camera was very close to the foreground plane while shooting all the way to the deep background of the miniature. In a few cases, Sweeney found himself shooting as many as three different planes on a single miniature in the desert sequences; Rosenberg shot multiple layers in the 'droid factory; and Miller repeatedly captured the miniature bar exterior in the city of Coruscant. "When Obi-Wan and Anakin were at the door to the bar, the camera was close to the ground plane and you could see Coruscant way off in the distance," Miller notes. "We had to shoot multiple planes because we couldn't stop down enough to hold the focus throughout, even though HD has greater depth of field than 35mm film. That meant that instead of shooting the model once, I had to shoot it three times: once at close focus, again at mid focus and finally at far focus. Then our CG artists joined the three image planes together to create a single sharp image."
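
What the CG artists did with those passes is, in spirit, what is now commonly called focus stacking: for each pixel, keep whichever pass is locally sharpest. The following is a minimal sketch of the general technique under that assumption; the article does not describe ILM's actual compositing code:

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(passes: list[np.ndarray]) -> np.ndarray:
    """passes: grayscale images (H, W) focused at near, mid and far planes.
    Returns one image built from the sharpest pass at each pixel."""
    # Local sharpness: smoothed magnitude of the Laplacian (a focus measure).
    sharpness = np.stack([uniform_filter(np.abs(laplace(p.astype(float))), size=9)
                          for p in passes])
    best = sharpness.argmax(axis=0)      # index of the sharpest pass per pixel
    stacked = np.stack(passes).astype(float)
    return np.take_along_axis(stacked, best[None], axis=0)[0]

# combined = focus_stack([near_pass, mid_pass, far_pass])
```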

Miller, Sweeney and Rosenberg were essentially shooting the digital equivalent of an old cel-animation technique: multiplane backgrounds. "When you shoot effects, you love to make it as complete as you can because you're shooting elements of a larger shot that someone else will complete later," Rosenberg observes. "On this show, when I needed to shoot three or four focus passes, we'd use blue cards behind foreground set pieces and then remove the foreground elements as we worked away from the camera. Once you put a blue card behind something, it just isn't the same. To overcome that, I'd shoot the whole miniature at the start of the shoot, even though it wasn't in focus, as a reference for the compositor and also to see how the lighting and sets worked emotionally."

Based on their groundbreaking experiences on Episode II, Rosenberg, Sweeney and Miller look forward to the day when HD manufacturers turn their attention to the unique demands of specialized cinematography. They agree that the format's efficacy for visual effects would be greatly improved by the development of a camera that can shoot motion control at traditional motion-control speeds and F-stops, using smaller lighting instruments. "HD development has been focusing on the live-action market," Sweeney observes, "so they're not [yet] as worried about [effects cinematography]. But after our experiences with the technology on Episode II, we're very excited about the future."


