A new technique ensures that, for the first time, the sound heard in the theater is identical to that heard by the director during the mix


Dolby Laboratories is best known to the general public for its B-type noise reduction system, which is manufactured under license by most high fidelity companies and incorporated in the majority of cassette recorders and in an increasing number of receivers and tuners. The Dolby A-type (professional) system is found in most music recording studios, in many broadcast organizations, and in connection with motion picture sound tracks.

Dolby first became involved in film sound in 1971, when A Clockwork Orange was dubbed in London. Kubrick's film used the Dolby system throughout the magnetic generations (up to five in some cases, depending on premixes), and there was a significant improvement in noise level even on the optical track, which had been conventionally recorded without noise reduction.

What's Wrong with Motion Picture Sound

The hiss that comes from magnetic recording generations, though, is only one of the problems that beset conventional film sound quality. High distortion, limited frequency response, optical noise, and a lack of intelligibility are all major quality constraints. A conventional optical track, heard in the theater, will have a high frequency response little better than 4kHz (about the same as a telephone receiver), compared with the 12kHz or more expected from a home high fidelity system. No single problem in the recording and playback chain is responsible for the high frequency losses; rather, a series of effects, including filters during recording and playback, printer resolution in the laboratory, and loudspeaker and screen limitations in the theater, creates a cumulative loss. Although the frequency response seems inadequate today, it is worth remembering that in the late twenties and early thirties, when many of the standards for film sound were established, the quality of sound in the motion picture was better than could be heard in the home: it is only the development of consumer high fidelity equipment that by comparison reveals the inadequacies of movie sound.
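For readers who like to see the arithmetic, the cumulative-loss argument can be sketched in a few lines of Python. The per-stage figures below are hypothetical, chosen only to show how individually tolerable losses add up along the chain:

```python
# Why no single stage is to blame for the high frequency loss.
# Each value is a hypothetical attenuation, in dB, at one example
# high frequency (say 8kHz) -- illustrative numbers, not measurements.
stages = {
    "recording filter":   -1.5,
    "playback filter":    -1.5,
    "printer resolution": -3.0,
    "loudspeaker/screen": -4.0,
}

# Gains multiply linearly, so losses expressed in dB simply add.
total_db = sum(stages.values())

for name, loss in stages.items():
    print(f"{name:20s} {loss:+5.1f} dB")
print(f"{'cumulative loss':20s} {total_db:+5.1f} dB")
```

Each stage on its own would pass unnoticed; the cascade of all four is what reduces the track to telephone quality.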

In 1972, Dolby started an in-depth investigation into how film sound could be improved. Central to this was the idea that noise reduction could be used as a tool to effect other changes. If the playback frequency response were improved in the theater, the noise (hiss and sparkle) from the sound track would sound much worse, as if the treble control had been raised on a hi-fi set while playing an old record. Use of noise reduction on the optical track, though, allows the response to be improved without the noise build-up that would otherwise occur. An associated benefit is lower distortion: the mixers making the sound track do not need to boost the highs so much if they know that the playback response will be extended.
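The companding principle behind noise reduction can be sketched numerically. The simple 2:1 wideband law and all the levels below are illustrative assumptions only; the actual Dolby A-type processor works in several frequency bands and acts only on low-level signals:

```python
# Toy level-domain sketch of companding noise reduction.
# Levels are in dB relative to a hypothetical 0 dB reference.
REF_DB = 0.0

def encode(level_db):
    # Compress dynamics 2:1 toward the reference before recording,
    # lifting quiet passages well clear of the medium's hiss.
    return REF_DB + (level_db - REF_DB) / 2

def decode(level_db):
    # Mirror-image 2:1 expansion on playback.
    return REF_DB + (level_db - REF_DB) * 2

signal = -60.0   # a quiet passage
hiss = -50.0     # noise added by the film medium itself

restored = decode(encode(signal))   # the signal returns unchanged
heard_hiss = decode(hiss)           # hiss is pushed much further down
print(restored, heard_hiss)
```

The signal round-trips through encode and decode untouched, while hiss introduced between the two, never having been boosted, is expanded downward on playback.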

These ideas have been applied to nearly a dozen monaural optical releases between 1973 and today, including such films as Steppenwolf and Stardust. These films exhibited a significant improvement in sound quality over conventional optical tracks when played back in theaters that had installed the appropriate decoding equipment. The quality improvements apply equally to dialogue, music, and effects; the general subjective response is typically that the track has a greater reality, much in the way that a picture seems more real in color than in black and white.

Stereophonic Sound

Despite the improvements achieved with Dolby encoded monaural optical sound tracks, stereophonic sound represents a further quantum jump in subjective sound reality. Until recently, stereophonic sound in the theater could only be achieved by the use of magnetic striping on the release print, with three, four, or (on 70mm film) six tracks. The fidelity constraints described above for optical tracks, in the areas of frequency response, noise, and distortion, apply also to the magnetic release print, though perhaps to a lesser extent. A few films, including The Little Prince and Nashville, have been released on magnetic stereo prints using the same Dolby techniques as were applied to the mono optical track.

Magnetic tracks have other problems, though, in addition to those mentioned. Significant among these are price (magnetic striping and recording can add as much as 50% to the cost of a release print) and the maintenance problems caused by excessive playback head wear in the projector. So although magnetic stripe can have a frequency response flat to 12kHz, that response is rarely achieved in the theater.

Stereo Optical Sound Tracks

In 1973, Eastman Kodak and RCA collaborated on the design and development of a two-channel stereophonic optical system, based loosely on proposals dating as far back as the late thirties. Stereo optical tracks have advantages over magnetic tracks: prints cost no more than conventional optical releases, and none of the maintenance problems associated with magnetic stripe are present. The Kodak/RCA proposal was to place two tracks side by side in the area normally occupied by the single optical track, with the additional advantage that an unconverted projector can play the track and obtain a mono-compatible signal, in the same way as a stereo phonograph record played in mono.
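The compatibility argument can be illustrated in a few lines. The sample values are invented, and the unconverted projector's wide scanning slit, which reads light from both tracks at once, is modeled simply as the average of the two channels:

```python
# Mono compatibility of a two-track stereo optical sound track.
# Sample values are arbitrary illustrative signal amplitudes.
left  = [0.2, 0.5, -0.1, 0.0]
right = [0.1, 0.3,  0.2, -0.4]

# A converted projector reads left and right separately; an
# unconverted one effectively hears their sum, here normalized
# as (L + R) / 2 -- just like a stereo record played in mono.
mono = [(l + r) / 2 for l, r in zip(left, right)]
print(mono)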

Dolby Laboratories joined this project in 1973, applying the techniques it had previously used on monaural optical tracks to upgrade the fidelity of the stereo optical system. Current specifications call for a signal-to-noise ratio in excess of 62dB and a frequency response to approximately 12kHz.

