Others have professed that "the film look" is not really softer, but instead is a consequence of the construction of film. Film grain is random, they remind us, different from frame to frame, 24 frames per second. Even in a fine-grain film, they postulate, the film image has a kind of frame-to-frame shimmer that makes it look a little more ambiguous than the inherently grainless, dynamically stable electronic image.
As for me, I've been given the impression that NTSC video has a color palette the equivalent of about 8 bits in depth, while film reaches 36 or 48 bits of depth, maybe more. This wouldn't make film look "softer," except to the extent that the transitions between color shadings would be less abrupt, more rounded and richer. That would be a definition of "the film look" that no amount of filtering could bestow upon a videotape.
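The bit-depth figures above translate into shade counts that grow fast. As a rough back-of-the-envelope check (using the column's quoted depths, not measured values), each added bit doubles the number of distinguishable shades:

```python
# Shade counts for the overall color depths mentioned above.
# 8-bit here means the rough NTSC palette estimate; 24/36/48-bit
# are the digicam and film figures quoted in the text.
for bits in (8, 24, 36, 48):
    print(f"{bits}-bit color: {2**bits:,} shades")
# 8-bit color: 256 shades
# 24-bit color: 16,777,216 shades
# 36-bit color: 68,719,476,736 shades
# 48-bit color: 281,474,976,710,656 shades
```

The jump from 8 bits to 36 or 48 is why transitions between shadings on film could plausibly look less abrupt than on NTSC video.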
What Is "The Digital Look"?
Regardless of the source of "the film look" compared to "the video look," everyone seems to agree upon its existence. The fact that it's attributed to so many origins reveals that there are many technical reasons why pictures made by cameras on film look different than those on tape. What's not so clear, however, are the technical reasons why a "digital look" would be so drastically different from film.
Most consumer digicams deliver an image 24 bits of color deep. Some of the professional models reach 36 bits and maybe even 48. Film, at its best, might deliver an image with an even broader color range (it's not so easy to make direct comparisons), but one thing's for sure: the very best digicam picture is a lot closer to the very best film picture than the very best NTSC videotape picture is.
But regardless of whether the picture is captured at 36 bits or 48 bits, it'll finally be dumbed down to the 24-bit range that is the current standard for display. Having that extra bit depth in the original provides that much more to work with when a particular outcome is desired in the 24-bit final. But the final will be 24-bit, with 16.7 million color shadings between black and white.
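That dumbing-down is just a quantization step. A minimal sketch of the idea, assuming a 12-bit-per-channel capture (one channel of a 36-bit image) reduced to the 8-bit-per-channel display standard by simple truncation (one of several possible rounding strategies):

```python
def to_8bit(value_12bit):
    """Quantize a 12-bit channel value (0-4095) to 8 bits (0-255)
    by dropping the 4 least-significant bits."""
    return value_12bit >> 4

# Sixteen distinct 12-bit values collapse into each 8-bit shade,
# so subtle gradations in the capture vanish in the final.
assert to_8bit(0) == 0
assert to_8bit(15) == 0    # same displayed shade as 0
assert to_8bit(16) == 1    # first value that survives as distinct
assert to_8bit(4095) == 255
```

The extra headroom matters before this step: tonal adjustments made in 12-bit space pick which 256 of the 4,096 levels survive, rather than stretching an already-quantized 8-bit image.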
So regardless of whether it's captured on film or digitally, the picture starts off on a fairly even footing (assuming a high-end digicam) and is compromised almost equally when squooshed down to 24-bit display standards.
So if it's the display that's the weakest link, how can there be a "digital look?" The essential distinctions between film and digital have been trimmed away. What we're left with is neither "the film look" nor "the digital look." It's "the monitor look."
The Proprietary Look
Since the dawn of photographic recording, all sorts of folks have had all sorts of opinions on what constitutes the "best look." Among current film stocks, you can get richly saturated or pastelish colors, high-contrast and low, and they all have supporters.
Digicams follow suit. Olympus touts its TruePic technology, Fujifilm's Super CCD gives rich saturation. These are among the promoted features of those two lines, but other manufacturers have their tendencies and trends, too. How many digicams seem to think everything in the world should be blue?
Some digicams use RGB filters, some use CYM. All subject their resulting pictures to algorithms which, in effect, interpret the colors both individually and in relation to one another. They make their own decisions about contrast and sharpness as well. There's a lot of "manipulation" in-camera, behind the scenes, between the time the image enters the lens and the time it gets written to memory.
Arguably, no two digital cameras reproduce colors and contrasts the same. They could be instructed to, if somebody so desired, but nobody's instructed them to (except, possibly, as a component of a larger scheme, such as Epson's well-received PIM, or Print Image Matching, system, widely adopted though not universal). So where "the film look" currently is restricted to, say, a couple dozen interpretations (one for each brand of film), the "digital look" could potentially be as diverse as every digicam model on the market.
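A toy sketch of why two cameras can disagree: feed the same raw sensor values through two different tunings and you get two different pictures. The gains and values here are invented for illustration; real in-camera processing involves demosaicing, full color matrices, and tone curves, not just per-channel gains.

```python
# Hypothetical per-channel gains for two made-up camera tunings.
TUNING_A = (1.00, 1.05, 1.20)  # a camera that leans blue
TUNING_B = (1.10, 1.00, 0.95)  # a camera that leans warm

def render(raw_rgb, gains):
    """Apply per-channel gain and clip to the 8-bit display range."""
    return tuple(min(255, round(c * g)) for c, g in zip(raw_rgb, gains))

raw = (120, 120, 120)  # identical sensor data into both pipelines
print(render(raw, TUNING_A))  # (120, 126, 144) -- bluish gray
print(render(raw, TUNING_B))  # (132, 120, 114) -- warmish gray
```

Same light, same scene, two "looks" before the photographer touches anything.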
So when someone cites "the digital look," exactly what does he have in mind?
And if his reference is to something so diffuse, so elusive of definition, how come the Digital Dude was getting so snappish when the editor said he liked film better?
It must have been the Campari.
Meanwhile, Back At The Computer ...
As a columnist for PTN, the Digital Dude is as ecumenical about capture devices as the editor is, and the guy repping Nikon is. There's plenty to be said for film in the digital era.
But having said it, I'd also have to say that in the past two years, every photo I've taken as a pro has been taken with a digital SLR.