A debate raged at the dawn of digital photography. At that time, we all knew digital lacked the fidelity of film, but the key question was, “How much better is film (in terms of quality, not convenience), and how long will it take for advances in digital technology to make it as good or better?” I’m writing this article as another year ends, and I just wondered where things stood today.
To a large extent, this question is no longer pondered. Digital has become commonplace, while film photography lingers in the hands of die-hard fans, camera collectors, and some professionals working with large formats. In everyday life there appears to be widespread acceptance that we have reached and passed the turning point where digital exceeds the capabilities of film, and that digital's dominance is indisputable.
To answer the question, many commentators took a very academic approach, seeking comparable measures of detail sharpness and color resolution.
For film, resolution is quantified as the number of lines per millimeter it can record; for a digital sensor, it is the number of pixels and their size. Film records the finest of details, which means it captures both coarse and fine textures. Digital sensors respond strongly to medium-level detail but are less sensitive to the finest detail; the lack of fine detail is compensated for by boosting local contrast, which produces the illusion of a sharpened image.
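To illustrate that last point, here is a minimal sketch of how boosting local contrast can fake sharpness, using a basic unsharp mask. It is only an illustration of the principle; the box blur, kernel size, and strength are assumptions for the example, not any camera maker's actual processing.

```python
import numpy as np

def unsharp_mask(image, radius=2, amount=1.0):
    """Boost local contrast by adding back the detail a blur removes.

    image:  2-D numpy array of brightness values in the range 0.0-1.0
    radius: size of the blur neighbourhood in pixels
    amount: strength of the contrast boost
    """
    # Build a simple box-blur kernel of the requested radius.
    size = 2 * radius + 1
    kernel = np.ones((size, size)) / (size * size)

    # Blur by averaging each pixel with its neighbours (naive convolution).
    padded = np.pad(image, radius, mode="edge")
    blurred = np.zeros_like(image, dtype=float)
    for dy in range(size):
        for dx in range(size):
            blurred += kernel[dy, dx] * padded[dy:dy + image.shape[0],
                                               dx:dx + image.shape[1]]

    # The "detail" is whatever the blur removed; adding it back exaggerates
    # edge contrast and creates the impression of extra sharpness.
    detail = image - blurred
    return np.clip(image + amount * detail, 0.0, 1.0)
```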
With film, there is full red, green, and blue resolution at every point, and it can record continuous color information across the whole frame. Most digital sensors, on the other hand, are effectively monochrome, covered with a mosaic of red, green, and blue filters, so each photosite records only one of the three colors (in the common Bayer pattern, green occupies half the sites and red and blue a quarter each). The missing color data is reconstructed by Bayer interpolation, also known as demosaicing, which estimates the absent values from neighboring pixels, so color transitions are smoothed over. For this reason, with regard to color resolution, most camera manufacturers' megapixel counts are considerably exaggerated.
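To make the interpolation idea concrete, here is a deliberately simplified sketch of bilinear demosaicing for an assumed RGGB Bayer layout. Real camera firmware uses far more sophisticated algorithms; this only shows how missing color samples can be estimated by averaging neighbors, which is precisely the smoothing-over described above.

```python
import numpy as np

def demosaic_bilinear(raw):
    """Very rough bilinear demosaic of an RGGB Bayer mosaic.

    raw: 2-D array in which each photosite holds a single colour sample,
         laid out as repeating 2x2 blocks of R G / G B.
    Returns an (H, W, 3) RGB image with missing samples estimated by
    averaging whichever same-colour neighbours are available.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))

    # Masks marking which photosites actually recorded each colour.
    ys, xs = np.mgrid[0:h, 0:w]
    masks = {
        0: (ys % 2 == 0) & (xs % 2 == 0),   # red: a quarter of the sites
        1: (ys % 2) != (xs % 2),            # green: half of the sites
        2: (ys % 2 == 1) & (xs % 2 == 1),   # blue: a quarter of the sites
    }

    for channel, mask in masks.items():
        # Keep the real samples, then fill the gaps with the local average
        # of known samples in a 3x3 neighbourhood: the "smoothing over".
        samples = np.where(mask, raw, 0.0)
        counts = mask.astype(float)
        padded_s = np.pad(samples, 1, mode="edge")
        padded_c = np.pad(counts, 1, mode="edge")
        sum_s = np.zeros((h, w))
        sum_c = np.zeros((h, w))
        for dy in range(3):
            for dx in range(3):
                sum_s += padded_s[dy:dy + h, dx:dx + w]
                sum_c += padded_c[dy:dy + h, dx:dx + w]
        rgb[..., channel] = np.where(mask, raw, sum_s / np.maximum(sum_c, 1.0))

    return rgb
```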
The mathematical solution was a convenient but flawed comparison of lines with pixels; the details are unimportant here, but the calculations nevertheless produced ballpark figures of widely varying magnitudes.
The common answer was that a top-quality 35mm exposure holds the equivalent of an estimated 20 million “quality” pixels. This assessment came with many caveats attached, all describing perfect conditions: a shot taken on a tripod with the mirror locked up, decent light, a top-quality lens, the finest-grained film, and, let’s not forget, an optimum aperture and spot-on focus.
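For the curious, here is a back-of-the-envelope version of that calculation. The resolving-power figure is an assumption chosen for illustration (roughly what a fine-grained film under ideal conditions might achieve); the point is the order of magnitude, which lands in the neighborhood of the 20 million figure quoted above.

```python
# Back-of-the-envelope film-to-pixel conversion. The resolving-power figure
# is an assumption for illustration; real estimates vary widely with film
# stock, lens, and technique, which is why published figures differ so much.

FRAME_W_MM, FRAME_H_MM = 36.0, 24.0   # standard 35mm frame
LINE_PAIRS_PER_MM = 80.0              # assumed: fine-grained film, ideal conditions

# Sampling theory needs roughly two pixels to represent one line pair.
pixels_per_mm = 2 * LINE_PAIRS_PER_MM

width_px = FRAME_W_MM * pixels_per_mm    # 5760
height_px = FRAME_H_MM * pixels_per_mm   # 3840
megapixels = width_px * height_px / 1e6

print(f"{width_px:.0f} x {height_px:.0f} pixels, about {megapixels:.1f} MP")
# -> about 22.1 MP under these assumed ideal conditions
```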
In less than perfect conditions, the estimate fell to about 12 million for a more typical good shot, and as low as 4 million for a handheld exposure with a point-and-shoot camera.
Yet all the same caveats apply to any shot taken with a digital camera. For example, a 12-megapixel camera does not guarantee optimum-quality image capture at every click of the button, because of all the variables of conditions and user competence. Additionally, just because a model has a sensor with a theoretical capability of “x” megapixels, it doesn’t automatically follow that the standard lens supplied with the camera has equal potential. Most significantly, in later years the megapixel count has become a powerful marketing tool, and is subject to exaggeration. I know from personal experience that many cameras boasting a far higher pixel count than my ancient 3.9-megapixel digital Leica don’t actually hold a candle to it in terms of overall picture quality.
In reality, the “extreme potential” argument doesn’t apply to the day-to-day camera user. Most of the time, all we want is an image to display on a tablet or mobile phone. When we do commit to print, hard copies are rarely enlarged to sizes large enough to expose noise or grain defects. Most low-end digital cameras can meet these everyday demands, as could the humble Olympus Trip 35 back in 1967.
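To put rough numbers on that claim, here is a quick check of how many pixels ordinary print sizes actually need, assuming the common rule of thumb of about 300 pixels per inch for a print viewed at normal distance.

```python
# How many pixels does an ordinary print need? Assumes the common rule of
# thumb of roughly 300 pixels per inch for a print viewed at arm's length.

PPI = 300

def megapixels_needed(width_in, height_in, ppi=PPI):
    """Pixels required to print at the given size and density, in megapixels."""
    return (width_in * ppi) * (height_in * ppi) / 1e6

for width_in, height_in in [(6, 4), (7, 5), (10, 8), (16, 12)]:
    mp = megapixels_needed(width_in, height_in)
    print(f"{width_in}x{height_in} inch print: about {mp:.1f} MP")
# 6x4 needs ~2.2 MP, 10x8 ~7.2 MP, 16x12 ~17.3 MP
```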
Clearly, we have reached the point where digital photography can rival and exceed the performance of film in measurable ways. Yet film, despite the near cessation of its development, remains as good as digital when exposed with decent equipment and good technique. However, while the price of new gear climbs ever higher, the cost of classic film cameras continues to plummet, making the highest quality models of their day very affordable.
The bonus with film is that, even for the casual user, it retains fidelity of color and contrast that is almost indefinable, yet somehow perceptible. It is the direct counterpart of the vinyl versus MP3 debate in music.
During my research for this article, I stumbled upon the 2014 results of a comparative test between the 36.3-megapixel Nikon D800E and a Mamiya 7 medium format film camera, published by PetaPixel. At best, it was a pretty close call.
So here’s where I stand. If a typical good film camera can capture the equivalent of an unexaggerated 12 million pixels, that’s a damn fine performance, and it will likely remain so for many years to come, until imaging technology for the masses takes another giant leap forward. But why should it? The drivers for change seem to be about doing things smarter, which is not the same as doing them better.
Today we don’t need to ask, “How good is digital?” but rather, “How good is film?”
About the Author:
John A Burton is a film photographer.