Coraline
Pete Kozachik, ASC, details his approach to the 3-D digital stop-motion feature Coraline, whose heroine discovers a sinister world behind the walls of her new home.


Unit photography by Galvin Collins
Additional photos by Pete Kozachik, ASC

Exciting events tend to happen as soon as conditions are right, and Henry Selick’s stop-motion feature Coraline, based on Neil Gaiman’s supernatural novella, rides in on a host of innovations, including advanced machine-vision cameras and the emergence of practical 3-D. Most instrumental was the birth of Laika Entertainment, Phil Knight’s startup animation company in Oregon, fresh and eager to try something new.

I made it a priority to line up talented and experienced cameramen early. Leading the three-man camera units were cinematographers John Ashlee, Paul Gentry, Mark Stewart, Peter Sorg, Chris Peterson, Brian Van’t Hul, Peter Williams and Frank Passingham. Most of the camera assistants and electricians had shooting experience of their own, making the camera department pretty well bulletproof. With more than 55 setups working at the same time, we needed guys who were quick, organized and versatile.

From the beginning, we knew the two worlds Coraline inhabits — the drab “Real World” and the fantastic “Other World” — would be distorted mirror images of each other, as different in tone as Kansas and Oz. Camera and art departments would create the differences, keeping the emphasis on Coraline’s feelings. Among the closest film references for the supernatural Other World were the exaggerated color schemes in Amélie, which we used when the Other Mother is enticing Coraline to stay with her. The Shining and The Orphanage provided good reference for interiors when things go awry.    

Image banks such as flickr.com were a good source for reference pics, and including those shots in my lighting and camera notes helped jump-start crews on new sequences. Artist Tadahiro Uesugi supplied a valuable influence for the show; his work has a graphic simplicity, like fashion art from the Fifties, with minimal modeling but an awareness of light. In spirit, it helped guide us away from the excess gingerbread that is typical of both art and lighting in stop-motion animation. Our film has plenty of interesting things to look at, but we did our best to make every bit of eye candy contribute to the main story. Uesugi’s handheld stylus gives lines a slightly wavy edge, which the art department used to weave more life into the architecture.

Before hiring on, I sought a way to improve on the limitations of the digital SLRs we had encountered on Corpse Bride (AC Oct. ’05). On that show, fuzzy video-tap images were the animators’ most common complaint. Ideally, they would have been working from superior images fed by a production camera, not fuzzy ground-glass images. When testing cameras for Coraline, I discovered that the most promising of the contenders was the MegaPlus EC11000, a machine-vision camera based on a 4K Kodak CCD sensor. It sported these features:

•    Able to double as its own tap, outputting sharp 1K or 2K mono at fast frame rates
•    Thermoelectric sensor cooling for low noise in long exposures
•    Physically large 36x24mm sensor
•    Among the cameras tested, its response curves were most similar to film
•    Rugged, machinable aluminum body
•    Nikkor F mount
•    Sensor housed in a dust-free, inert gas-filled chamber
•    Software-development documentation for custom user applications

Unlike dSLRs, each camera had to be tethered to a smokin’ fast PC (running our custom application) that grabbed production frames and served as a higher-res animator guide and color display for checking lighting. The company’s R&D team worked hard on the ambitiously spec’d software, delivering a workable beta version before moving on to other projects. It was an exciting step forward with a lot of features, including 3-D diagnostics and two-way serial communication with Kuper motion control. When production reached full speed, we were shooting with 38 MegaPlus EC11000s and eight Nikon D80s.
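
Our actual application isn’t something I can reproduce here, but as a rough sketch of the tethered-capture idea described above — with a hypothetical vendor call (sdk_grab_raw) and made-up file paths standing in for the real SDK and pipeline — the per-frame logic might look something like this in Python:

```python
# Illustrative sketch only; the real Laika capture application is not public.
# sdk_grab_raw() stands in for the camera vendor's full-resolution capture call.
import numpy as np
from pathlib import Path
from PIL import Image

FULL_RES = (2672, 4008)    # rows x cols, roughly an 11-megapixel CCD (assumption)
PROXY_RES = (2048, 1365)   # sharp 2K mono guide for the animator's on-stage display

def sdk_grab_raw():
    """Hypothetical stand-in for the vendor SDK; returns a 16-bit mono frame."""
    return np.zeros(FULL_RES, dtype=np.uint16)

def capture_frame(shot, frame_no, out_dir="frames"):
    Path(out_dir).mkdir(exist_ok=True)
    raw = sdk_grab_raw()                                  # full-res production frame
    full = Image.fromarray((raw >> 8).astype(np.uint8))   # 8-bit here purely for brevity
    full.save(f"{out_dir}/{shot}_{frame_no:04d}_full.png")

    # The same grab doubles as the video tap: a downsampled mono guide frame.
    proxy = full.resize(PROXY_RES, Image.LANCZOS)
    proxy.save(f"{out_dir}/{shot}_{frame_no:04d}_proxy.png")
    return proxy
```

The point is simply that one full-resolution grab feeds both the archived production frame and the crisp guide image the animators work from, rather than relying on a separate, fuzzy video tap.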

With all the talk about small image sensors making it difficult to soften backgrounds, plus the traditional use of economical Nikkors for model shoots, the big sensors were a great find. But the large size also had a limitation: each pixel in a CCD array sits at the bottom of a tiny well, so if a ray of light arrives at an oblique angle, less of it gets captured. The worst case is a wide-angle lens imaging onto the outer edges of the sensor; this produced a significant vignette.
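
For a rough sense of the numbers — not our measured figures — the familiar cos^4 falloff plus an extra, made-up term for pixel-well losses already shows why short lenses suffer most at the corners of a 36x24mm chip:

```python
# Back-of-the-envelope model only: illumination falls off roughly with cos^4 of
# the off-axis ray angle, and deep CCD pixel wells reject oblique rays on top of
# that. The well-loss exponent is a placeholder assumption, not measured data.
import math

SENSOR_HALF_DIAG = math.hypot(36, 24) / 2   # mm, half-diagonal of a 36x24mm sensor

def corner_falloff(focal_length_mm, well_loss_exp=1.0):
    theta = math.atan(SENSOR_HALF_DIAG / focal_length_mm)  # corner ray angle
    natural = math.cos(theta) ** 4                          # classic cos^4 falloff
    well = math.cos(theta) ** well_loss_exp                 # extra loss from pixel wells
    return natural * well

for f in (17, 35, 70, 90):
    stops = math.log2(1 / corner_falloff(f))
    print(f"{f}mm lens: ~{stops:.1f} stops down at the corner (rough model)")
```

Even as a crude model, it makes the pattern clear: a 17mm lens gives up several stops in the corners while the 90mm barely notices.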

Redlake, the original camera manufacturer, gave us some useful tips on new glass designed to minimize the effect by projecting a more parallel bundle of light rays. Luckily, two of the three recommended lenses were zooms, giving us a collective range from 17mm to 70mm, plus a 90mm prime, all made by Tamron. They were fitted with focus-ring gears by one of our camera techs, Chris Andrews. With 46 cameras on the floor, we had to duplicate everything, lenses included. I got over my bias against zooms (in terms of sharpness) as soon as we shot tests, but as shooting progressed, we all wished for tighter, cine-style construction. We eventually augmented the set with various Sigma lenses for their close-focus capabilities. As the show progressed, we used primes as short as 14mm, deciding the slight vignetting was more of a production value than a defect.

Another concern in technical prep was outfitting the studio with enough motion-control capability to shoot a feature. Prior to coming on the show, I worked with a Wisconsin-based machine shop via the Internet, developing a motion-control rig spec’d for stop-motion. It was a bit surreal; my co-designer and I had never met. We used his CAD models as our means of design and approval. This was a departure from my past approach of making wooden scale models and being on-site with the builder. CAD files were a natural for long-distance collaboration and quick turnaround, though I missed being there to make real-world checks on design details. Indeed, we ended up modifying several details later for just that reason.

The rig’s design was as basic and economical as possible, with the intent to service many setups with motion-control at the same time. I ordered 11 of them, which raised some eyebrows, but I knew how much Henry likes to use camera movement in his storytelling. Sure enough, those rigs were swallowed up along with the existing half-dozen and were never idle for the entire shoot.   

For all the reasons that anyone who has depended on video dailies can appreciate, we were fortunate to have the resources to project dailies with a full-fledged D-Cinema projector. It was one of three display devices we used, and all of them had strengths and weaknesses. The most important uses of projection were checking the depth of the 3-D and checking exposure as affected by the 3-D polarizing filters. It was also useful for seeing full-size tests of performance and dressing.

Though we projected only under stereo conditions, the Flipbook monitors displayed a D-Cinema look and brightness. That supported our strategy of making the image as good-looking as possible and then trying to duplicate it in stereo.

The Avid display used in editing was far brighter, lifted to see more into the shadows, and thus unsuitable for assessing anything beyond basic composition. Of course, the convenience of the edit room held an irresistible temptation to make lighting decisions there. As a reality check, our pre-launch procedure included checking lighting continuity against surrounding shots on stage monitors, followed by a “launch frame” in the screening room. Such methodical testing is typical in stop-motion because most shots take several days to animate; second takes are very unpopular.

Having the studio set up with an intranet enabled many important capabilities; as we shot, each frame traveled to a server, allowing Henry to see animation-in-progress cut in or projected. Cameramen could do the same, and we enjoyed easy access to projected tests whenever we felt the need. And we could call up other shots on the studio floor, a valuable timesaver in matching lighting. Our only seriously missed target was capture speed, which I hope to revisit with new data-transfer technology.  
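
The studio’s pipeline isn’t something I can lay out in detail, but the “every frame travels to a server” idea can be sketched in a few lines, assuming hypothetical local and share paths in place of the real ones:

```python
# Minimal sketch of pushing newly captured frames to an intranet server so
# editorial and the stage crews can call them up; paths are assumptions.
import shutil
import time
from pathlib import Path

LOCAL = Path("frames")                  # stage PC capture folder (assumption)
SERVER = Path("//server/coraline")      # studio intranet share (assumption)

seen = set()

def push_new_frames():
    for frame in sorted(LOCAL.glob("*.png")):
        if frame.name not in seen:
            shutil.copy2(frame, SERVER / frame.name)   # visible studio-wide
            seen.add(frame.name)

while True:
    push_new_frames()
    time.sleep(5)   # poll every few seconds; capture speed was our real bottleneck
```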
 
