page created: 6/24/08

Yellow Layer Failure, Vinegar Syndrome and Miscellaneous Musings by Robert A. Harris

Robert A. Harris - Main Page

DNR... and Other Things That Go Bump in the Night *

Blu-ray has won the high definition sweepstakes, with marketing that trumpeted the reproduction of the look of cinema. To that end, it has a huge amount of disc capacity to back up its abilities, which are many.

But within months of its taking over the high definition market, problems have set in -- problems caused by, and affecting, many people both inside and outside the industry.

The problems?

On one side: bloggers, critics and "reviewers," some with an affinity for the look of video games, HD nature channels and HD sports, who have been led to their position by shelves of HD monitors in stores, all with the same glossy, clean image, and who consequently now want "eye candy." Let's call them the "eye candy folks."

And on the other, studio executives who know about film, but who may be wincing in fear of each and every comment online that accuses their releases of being "grainy," lest the precious discs not sell in high numbers.

Quality and Quality Control have been a problem since the laserdisc days.

Generally on VHS you couldn't see problems if they existed.

DVD changed all that.

Early on it was recognized that masters prepared for laserdiscs weren't going to cut it, and the studios did the correct thing by going back and preparing new masters for the less tolerant DVD playback system.

But even new masters didn't solve all of the problems. One major concern was film grain, and the havoc it played upon compression.

Vendors began to hang out signs for mastering, compression and other skills.

Some with all of the requisite bells, whistles and tech speak, but not much more experience than one might find at Ray & Irwin's Garage.

Studio asset protection staff would create a quality master, occasionally from newly minted film elements, only to have work destroyed in the end by vendors contracted to do a job without the requisite skills.

Two of the more notorious that come to mind are the standard definition DVD releases of Cold Mountain and Gangs of New York. Both digital abominations from quality masters.

Which brings us back to The Great Grain Debate.

Film Grain has been with us now for well over a century and a half, and was never looked upon as "the enemy" until a few "reviewers" and bloggers got it into their heads that it was somehow in the way -- keeping them from seeing the true image.

Grain is an inherent part of the film image.

Remove it at your peril, as layers of real problems may then arise.

Do the DVD buying public and studio executives truly believe that Chaplin, Keaton, Ford, Wellman, Welles, Hitchcock, Lean, Mamoulian and others were a bunch of hacks?

Is it possible that Bitzer, Burks, Young and Toland had not a clue, and need some lowly digital tech to clean up their errors?

Have generations of scientists in Rochester had no clue about the horror that they were handing down to us?

I don't think so.

Film Grain has been a known entity and a part of the design of film from the beginning of photography. It has merely gotten smaller and less obvious over the decades.

Everyone involved, from the scientists who created the emulsions, to the production designers who created the sets, the make-up artists who gave the actors their "look," the costume designers who dressed them, the cinematographers and camera operators who exposed the raw stock, the processing labs, effects experts, optical camera workers and finally post-production executives who put it all together...

really did know then, and still know today, what they were and are doing.

We have a film history that goes back 104 years.

Why, all of a sudden, is it left to people who wouldn't know which side of a camera to point toward an actor to totally re-write the history and look of our cinema?

And this is precisely what is at stake.

Don't get me wrong.

I'm more than aware that a certain amount of grain reduction is occasionally needed for acceptable compression in standard definition DVDs. Grain can be digitally manipulated to form a more cohesive image, especially when dealing with a myriad of elements sometimes needed to create a quality master from old or damaged film elements.

And then came Blu-ray, and along with it, a great piece of hardware -- the popular price leader -- PS3.

Early on, one of the major selling points for the Blu-ray system was a higher capacity disc as compared to HD-DVD. This meant not only that more data could be encoded to a disc, but that compression could be less severe -- and by that I mean more bits per second making their way through the pipeline.
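
To put rough numbers on that pipeline -- and this is back-of-the-envelope only, with the capacity, runtime and audio/extras budget below all assumed round figures, not specs from any actual release:

```python
# Back-of-the-envelope average-bitrate arithmetic.
# All three inputs are illustrative assumptions, not disc specifications.
DISC_BYTES = 50e9        # dual-layer BD-50 capacity, ~50 GB
RUNTIME_S = 170 * 60     # a ~170-minute feature, roughly Patton's length
AUDIO_EXTRAS = 10e9      # assumed budget for audio tracks, subtitles, extras

video_bits = (DISC_BYTES - AUDIO_EXTRAS) * 8
avg_mbps = video_bits / RUNTIME_S / 1e6
print(f"average video bitrate: {avg_mbps:.1f} Mbps")  # ~31 Mbps
```

Even with a generous chunk reserved for audio and extras, the video stream averages several times the bitrate a standard definition DVD could manage -- exactly the headroom that should allow grain to survive compression.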

There have been some glorious Blu-ray discs produced of catalog titles. Discs that allowed the full cinematic beauty and grain to survive the transition.

Think Bullitt from WB, The Sting from Universal (released on HD-DVD), and Reds from Paramount, and more recently The Sand Pebbles from Fox and The Professionals from Columbia (Sony). All of these films were shot beautifully, with fully exposed negatives. Add to these more problematic productions like Columbia's Baron Munchausen and Dracula, films with a coarser grain structure and a slightly softer feel to their imagery.

Some of these beautifully replicated films were hit by the backlash of cinematically less sophisticated innocents.

Feelings are made known, and the message goes through the studio food chain like wildfire.

Which is the long way 'round in finally bringing us to one of the newest and most discussed Blu-ray releases of 2008.

And it is being discussed for all the wrong reasons.

A film, photographed on Eastman Color Negative 5254 on 65mm 5 perf, through some of the finest glass imaginable by a great cinematographer.

The Best Picture of 1970.

Franklin J. Schaffner's Patton.

Allow me to state up front that this isn't really about Patton, and it isn't about Fox. Nor is it about any of the vendors who worked toward bringing Patton to Blu-ray. It is about film in general, and how our heritage may be viewed and survive in the future.

Patton is merely the current poster child for how it should not be handled.

This is a difficult piece to write, as there must be a balance between passion and compassion.

Passion toward the concern that our film heritage is in jeopardy, and compassion toward those individuals involved on both sides, in a virtual tug of war.

The studio people making decisions are intelligent. They want to create top quality software, service the needs of the public, and have that software jump off store shelves.

No one wants to create a problematic disc.

The folks on line and in print are just as concerned about what they see and what they may perceive to be a problem. And they too are an intelligent bunch.

Fact: The apparent grain structure of Patton is approximately 40% of that of a normal 35mm 4 perf production. It should not need to be touched in preparation for porting it over to Blu-ray.

Fact: We now have extremely high quality means of extracting an image from a piece of 65mm element, be it scanning in 8k or capturing the image directly to HD.
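
That first fact follows from simple magnification arithmetic. As a sketch -- the camera-aperture widths below are approximate values I'm assuming for illustration, not figures from this column:

```python
# Apparent-grain arithmetic. The aperture widths are assumed approximate
# values for 35mm 4-perf and 65mm 5-perf camera apertures.
W_35MM_4PERF = 24.9  # mm
W_65MM_5PERF = 52.5  # mm

# To fill a screen of fixed width, the 65mm frame needs less magnification,
# so grain of the same physical size on the negative appears smaller by the
# ratio of the two linear magnifications.
apparent_grain_ratio = W_35MM_4PERF / W_65MM_5PERF
print(f"apparent grain vs. 35mm 4 perf: {apparent_grain_ratio:.0%}")  # 47%
```

The geometry alone lands in the neighborhood of that 40% figure.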

And this all should have come together to yield what many hoped would be one of the finest Blu-ray releases of 2008.

Some believe that it is.

Anyone recall my mentioning that a major selling point of Blu-ray was its ability to reproduce the look of cinema in the home theater?

Patton didn't.

Blu-ray, as a system, failed the acid test.

But why did it fail?

Was it the fault of Blu-ray?

In part.

There seems to be no protective mechanism in place to weed out questionable work, which at this early stage of life will make Blu-ray look like gaming software, rather than software designed to reproduce motion pictures.

Think an entity like THX or TAP.

As an organization, Blu-ray has failed because it allows software that does not deliver its promise of quality to hit the marketplace. A safeguard of that kind should have been in place since Day One.

Is this the fault of the "grain reduction" vendor?

Not really. Someone hired them to do a specific job and approved their work.

Is it the fault of the home video division?

Perhaps.

But this is where it gets difficult. Let me re-state a fact.

No one sets out to create a problematic Blu-ray release.

Everyone is trying their utmost to create a superior product.

However, the wrong people may be adding their thoughts to what film should look like. This becomes a problem only if they have no idea what it should look like.

As cinematographer Gordon Willis said: "The film has already been made."

"The job is to reproduce it."

And that should be simple, except that there may be too many involved in the creation of the final result.

Some executives are fearful that the disc they release might not be a hit, or might get poor notices from bloggers. Because of this, some have begun to listen to people who have no idea what they are preaching.

This isn't about what anyone likes or doesn't like.

It is about the intentions of the filmmakers.

So what's the upshot?

Someone at the studio makes the decision that "our films can't have grain."

A well-meant decision…

but wrong.

What is grain anyway? What are they thinking?

Grain, they feel, must be something that is obviously not a part of the exposed image; that it somehow makes its way onto the film as it is either duplicated or ages.

But, if the grain grows as the film emulsion ages, then it should (must) be removed so that the crystalline clarity of the original can shine through on guess what…

Blu-ray!

And hence, the perceived grain "problem."

Can you scrape it off?

Tried that.

Doesn't work. Somehow it has attached itself to the image. Think Facehugger.

Can you wash it off?

Tried that too. Doesn't work either.

What if we print it to a dupe from the other side of the film?


But then things don't look quite as they used to.

What's the answer then?

Remove it digitally.

The concept makes the rounds with executives making possibly correct decisions, but based upon a flawed set of facts.

I'm not suggesting these aren't good executives, just bad facts.

There are digital facilities willing to remove the unwelcome grain.

These facilities are all over the world.

Some are extremely capable, others less so, and some -- not at all.

And this is where perfectly reliable and sometimes capable facilities do the bidding of concerned and wary studio executives, and...

They remove the grain.

Let's get down to the basics.

Is it easy to remove grain?

Very.

At its most basic, even a child can do it.

For those who have ever projected film or slides, the answer is simple.

You get rid of grain by throwing the image out of focus.

Not blatantly out of focus, but marginally… ever so slightly.

Then you add a bit of digital sharpening, a touch of gamma, and a bit of basil.

The final product?

Grain reduced… or gone.

Occasionally pretty, and if no one compares it to the original, quite acceptable.
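
That recipe can even be faked in a few lines of numpy -- a toy sketch for illustration only, nothing like any vendor's actual tools, with function names and settings invented for the purpose:

```python
import numpy as np

def box_blur(img, radius=1):
    """Separable box blur with edge padding -- a stand-in for the slight defocus."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    padded = np.pad(img, radius, mode="edge")
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, out)
    return out

def naive_degrain(img, amount=0.5):
    """Crude 'degrain': soften slightly, then unsharp-mask to fake detail back."""
    softened = box_blur(img, radius=1)  # the slight defocus that kills the grain
    # unsharp mask: add back a scaled copy of what the second blur removes
    return softened + amount * (softened - box_blur(softened, radius=1))

rng = np.random.default_rng(0)
frame = 0.5 + 0.1 * rng.standard_normal((64, 64))  # flat gray plus synthetic "grain"
print(frame.std() > naive_degrain(frame).std())    # True: noise (and detail) reduced
```

The out-of-focus pass is what kills the grain; the unsharp mask papers over the loss. Any real detail at the grain's scale is averaged away just as thoroughly, which is the whole problem.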

Although every digital facility promises grain removal, and some offer a quality product in incremental stages, I've personally seen the work of only one facility that, to my eye, has the capability to remove or reduce grain without affecting resolution -- that is, without stripping away a large chunk of high frequency information along with the offending grain.

Which brings us back to Patton.

An element is scanned and captured digitally, and an edict goes out from someone at the studio to "remove the grain."

The vendor goes about the work of removal, and at some point someone needs to inspect and approve the work, but here comes the next problem.

Professional monitors are notoriously small -- generally 40 inches or smaller.

I viewed Patton on a 30" Sony HD XBR CRT, and the image looked glorious. The information was so compacted that it was difficult to tell anything was missing.

Only later, when I viewed it on a larger screen, did it become apparent that all was not well.

Faces were waxy, background detail was gone; clothing, walls, the dirt on Jeeps -- all were missing high frequency information, and the image appeared dead, much like a video game.

This is where it gets more complex.

Because this isn't really just about grain or the removal of grain.

One can reduce grain and have a perfectly acceptable Blu-ray --

just with less grain.

The problem is grain reduction gone wrong.

The removal of all high frequency information, and the destruction of the image.

This is the problem with Patton.

Had grain been reduced, rather than removed, and had it been done properly, without the loss of detail, all would have been well.

But someone approving the work would have had to recognize the problem.

Patton has arrived on Blu-ray.

And reviews, across the board, have been generally excellent.

Great for the studio.

Less so for the film.

Why?

Because some people are "reviewing" discs on systems inappropriate to the task, and without the necessary background or reference to even have a proper opinion.

While it's perfectly fine for someone to say to a friend "Gee, that looks pretty," it can become problematic for that individual to broadcast their thoughts…

"GEE THAT LOOKS PRETTY!" around the world…

and onto the computer screens of concerned executives who, quite rightly, want to keep their jobs.

The people who made these films cannot be happy.

At worst, it makes it appear that cinematographers, camera operators and focus pullers are incapable of doing their jobs.

Where do we go from here?

A few suggestions:

The folks behind Blu-ray need to take a position.

Is their system to be used as promised, to give the home theater enthusiast the cinema experience?

Or will our film heritage henceforth look like video games?

Studio executives need to be educated about grain -- about everything that makes up an image and how it gets to Blu-ray -- or else sit back and allow someone else to deal with the technical end of things.

I can tell you as an absolute that every studio has someone in place who can do this.

Those who call themselves reviewers, whether they be bloggers or work for high-end magazines and newspapers, also need to be educated as to what film and video can and should look like.

Let's be quite honest about this.

Ray and Irwin's Garage & Blu-ray Disc Review is probably a combination that won't work in the real world, especially when tied in to the Internet.

Can this be fixed?

Yes.

But only if people are willing to listen, to turn to their own technical people and understand that the thousands of people responsible for the films in their libraries really did know what they were doing.

Can the studio fix Patton?

Since this is now old news, I would bet that they're already working on it.

They have great people in place, and the ability to correct, perfect, recall and replace.

No harm done.

But the point is to not do it again. That is the concern. As I said, this isn't about Patton, and this is not about Fox.

This is about harmful and improper grain and high frequency removal that can have a horrific effect on catalog titles from every studio and copyright holder across the board.

That is the concern.

Not Patton.

As an aside, I'm pleased to find in recent days, as discs get to consumers, that I'm not a lone voice in the quest for film looking like film. More and more people who "get it" are adding their thoughts to web forums and blogs, including some on-line reviewers.

I can hear it now. So he likes the bloggers that agree with him.

Of course. Because I'm right.

A recently posted comment from "Xylon" on the AVS Forum states a clear position:

"The Blu-ray release of Patton may give us a glimpse of what could/has happen[ed] when studios cater to the masses. A revisionist piece of cow dung (!) that only they could like. This is not cinema. This transfer is not Patton. This is not the same movie I watched.

The Blu-ray format with all of its 50GB disc space and bandwidth is useless if the movie put on it is not representative of what was shown at the theaters. If you really have to use DNR and EE to cater to the lowest common denominators… put them on the players. Let them switch it on. As for film lovers, that means us, you know the early adopters?, the ones that spend thousand of dollars on your hardware and software. Take care of us. Restore the movie according to the filmmaker's intent.

To those people who have been asking me in recent days if it's worth the purchase, I will say no. Don't reward the studios with this release. Renting it is the best I can recommend. -- Xylon"

Can we just do nothing and allow a negative trend to continue?

I would hope not.

The point that the folks who hate grain need to understand is that while it can be reduced if absolutely necessary to create a quality piece of software, it need not be eliminated completely, and most important, high frequency information must remain inviolable.

More people are likely to see our cinema heritage on Blu-ray than in theaters in the future. It is precisely that heritage that is now in jeopardy.

Once again, cinematographer Gordon Willis said it best:

"When people see things they don't understand, they become frightened, and the concept of what [a film] is -- or was -- still eludes some people. We all tend to reduce or expand things to a level we understand, and it can be fatal to a film… if what someone understands is Petticoat Junction.

The film has already been made."

To those who might wonder why I choose to make Patton the poster-child for poorly used DNR on a Blu-ray disc, the answer is simple.

With all of the classic or catalog titles thus far released via the Blu-ray format, Patton, because of its great quality as a film and huge technical benefits, had the most to gain.

It could easily have been a superb example for showing off the Blu-ray system.

In place of what might have been, we have a disc that, in terms of its raw potential, could have been extraordinary, but is far from it.

And this is very, very sad for any number of reasons, inclusive of the ultimate marketing damage to Blu-ray as a system for the dissemination of quality cinema in the home.

What we need is a level of quality control within the Blu-ray organization -- a standard that lets consumers purchase a Blu-ray disc knowing that the quality will be of the highest order.

Ultimately, and most easily, the grain situation might come down to reading the directions. Want less obvious grain?

Turn down the sharpness on the Blu player or monitor. The controls are there waiting to be used.


* By DNR I refer not only to Digital Noise Reduction, but to any and all digital means of maneuvering, changing, cleaning, de-graining or in any way taking the image originally captured on film and modifying it on its way to release on home video.

CLICK HERE to discuss the DNR issue with Robert and other Blu-ray and film enthusiasts at The Home Theater Forum.

The Mona Lisa... with and without DNR


© 1997-2015 The Digital Bits, Inc., All Rights Reserved.