This topic keeps coming up, so I'm republishing this article from 2017.
See my Mac wish list.
In my mention of the LG 5K display, I wrote that “the pixel density is way too high for that type of detail work”, which generated at least two reader emails, below.
But first, the flip side: being able to see 14.7 megapixels (5K) or 33.2 megapixels (8K) is a huge boon in image assessment—overall assessment. But high pixel density is not good for assessing fine detail, and that’s a problem for anyone shooting a burst of frames (focus may be subtly better on one frame of several), comparing lens performance, determining whether an f/9 or f/11 shot is better (competing interests of DoF vs diffraction dulling), assessing how much to sharpen, etc.
What I did not make clear in that statement are the conditions under which it is true; it could be false for someone 25 or 30 or 35 years old with perfect 20/20 vision (I have no way of knowing that directly). By “true” I mean that from direct experience I know what works and what does not work for me, that is, what leads to errors in evaluation and what does not.
I’m not young anymore; I am in my sixth decade, which means that presbyopia has become an annoying issue (one reason that the lack of an EVF option is going to drive me away from DSLRs entirely within a few years).
My eyes need +10 diopters of correction, so eyeglasses are a marginal solution (they introduce chromatic errors of their own, among other issues). I wear contact lenses, and when my eyes are not tired or irritated, correction is excellent at 20/20, with a slight astigmatism (which is why I focus cameras left-eye only). I also have a limited close-focus range with contact lenses. So I CANNOT peer a little closer at a computer display; I cannot focus there.
My sense is that many of my readers are not spring chickens either, and may have similar or worse vision limitations. That said, I am not claiming “proof” of anything here as a general principle, only that Retina displays of 220 dpi or more make it extremely difficult for me to evaluate images for critical sharpness.
The bottom line here is “try it yourself”. I think most users are fooling themselves about image sharpness if all they do is view at 100% pixels on an iMac 5K (or LG 5K or Retina display). Those “sharp” images often are not quite sharp.
Displays with optimal or acceptable pixel density
My workhorse display, the NEC PA302W, has been discontinued but can still be purchased. It is by far the best choice for image evaluation, both for its pixel density and for its GB-R LED backlighting, which is far more neutral in grayscale than most IPS displays; many of those have a faint but visible magenta tint even when the measurement device falsely claims otherwise. This can readily be seen side by side!
Stefan D writes:
"Pixel density way too high" for assessing sharpness? Could you please elaborate on this in your article a little bit more. I would think more density = easier to assess sharpness. Thank You!
DIGLLOYD: an iMac 5K (or LG 5K) has a pixel density of about 220 dpi = ~4.3 line pairs/mm. Without peering closely at the display, the pixels disappear. If the eye cannot resolve these pixels, how can one be sure of sharpness differences? Many an image that is not quite sharp still looks great at 220 dpi, and yet the same image at 101 dpi on my NEC PA302W is obviously less than fully sharp. I’ve seen that over and over, so I’m on my guard if an image looks sharp on my MacBook Pro Retina and I cannot tell f/2 from f/5.6 without going to 200%.
Consider a slightly blurred image printed at 6 X 4": it looks really sharp at that size (because the print is at 300 or even 600 dpi), but printed at 13 X 19" it is obviously less than fully sharp.
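The arithmetic behind the print analogy is simple: the same pixels spread over a larger print yield fewer dots per inch. A minimal sketch, using a hypothetical 24-megapixel (6000 X 4000) file chosen purely for illustration:

```python
# Effective print resolution: pixel count divided by print dimension.
# The 24 MP file (6000 x 4000 pixels) is a hypothetical example.
def print_dpi(pixels: int, inches: float) -> float:
    return pixels / inches

long_px = 6000  # long edge of the hypothetical image

small = print_dpi(long_px, 6)    # long edge of a 6 x 4" print
large = print_dpi(long_px, 19)   # long edge of a 13 x 19" print
print(f'6 x 4": {small:.0f} dpi   13 x 19": {large:.0f} dpi')
# The big print has ~3X less resolution per inch, so a slight blur
# that hid at 1000 dpi becomes visible at ~316 dpi.
```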
How can I tell if my image is fully sharp, or sharper than another similar frame?
At pixel densities over 160 dpi, it becomes too hard to reliably distinguish critically sharp from almost sharp. The exception might be young eyes viewing the image at very close range; neither applies to me.
Digital displays started out at a nominal 72 dpi (dots per inch), more or less. As larger screens emerged, pixel density rose to as high as 110 dpi or so. With the advent of Retina and HiDPI displays, dpi became very high.
- NEC PA302W 2560 X 1600, 29.8" diagonal ≈ 25.3" wide = 101 dpi = ~2 line pairs/mm
- MacBook Pro Retina 2880 X 1800, 13 inches wide = 220 dpi = ~4.3 line pairs/mm
- Apple iMac 5K 5120 X 2880, 23.375 inches wide = 220 dpi = ~4.3 line pairs/mm
- Dell UltraSharp 32 Ultra HD 8K 7680 X 4320, 31.5" diagonal ≈ 27.5" wide = ~280 dpi = ~5.5 lp/mm
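The dpi and lp/mm figures above can be cross-checked with a trivial conversion: divide dpi by 25.4 (mm per inch) and then by 2 (two pixels per line pair). A quick sketch, using approximate panel widths as assumptions:

```python
# Convert pixel density (dpi) to line pairs per mm:
# 25.4 mm per inch, and two pixels make one line pair.
def dpi_to_lpmm(dpi: float) -> float:
    return dpi / 25.4 / 2

def density(h_pixels: int, width_inches: float) -> float:
    return h_pixels / width_inches

# Approximate panel widths in inches (assumed values for this sketch)
displays = {
    "NEC PA302W":         (2560, 25.3),
    "MacBook Pro Retina": (2880, 13.1),
    "iMac 5K":            (5120, 23.375),
}

for name, (px, width) in displays.items():
    d = density(px, width)
    print(f"{name}: {d:.0f} dpi = {dpi_to_lpmm(d):.1f} lp/mm")
```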
It is far easier to assess image sharpness at 101 dpi than at 220 dpi, and still higher densities make it impossible. Zooming to 200% is a possibility, but problematic for reasons discussed further below. Note that I am not talking about thin clean lines from vector graphics, but complex image details.
My closest comfortable focusing distance under relatively dim indoor lighting is 18 inches. That means I should be able to resolve at best about 3.5 lp/mm (a rough estimate based on Norman Koren’s analysis), assuming my eyes are working perfectly (often not the case!). So right off the bat, most human eyes cannot resolve the 4.3 lp/mm of the iMac 5K display without peering closely, say 12" away—which is absurdly close for a 27" display (not really usable) and a serious ergonomic problem to boot. And of course there are all sorts of human perceptual issues involved that make it much more complex than that, and I’m not evaluating black and white line pairs here, but real images with complex detail and color.
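That estimate can be sketched numerically. Assuming an eye that resolves about 30 cycles (line pairs) per degree (a common textbook figure, used here only as an assumption, not a number from this article), the resolvable detail at a given viewing distance works out as:

```python
import math

# Resolvable detail vs viewing distance, assuming the eye resolves
# about 30 cycles (line pairs) per degree -- a common textbook
# figure used here only as an assumption, not a measured value.
CYCLES_PER_DEGREE = 30.0

def resolvable_lpmm(distance_mm: float) -> float:
    # Millimeters subtended by one degree at this distance
    mm_per_degree = distance_mm * math.tan(math.radians(1.0))
    return CYCLES_PER_DEGREE / mm_per_degree

print(f"18 in (457 mm): {resolvable_lpmm(457):.1f} lp/mm")  # ~3.8
print(f"12 in (305 mm): {resolvable_lpmm(305):.1f} lp/mm")  # ~5.6
```

This crude model lands near the ~3.5 lp/mm estimate above; the small difference is just the assumed acuity figure.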
For my work, I have to evaluate sharpness correctly, all the time, for my readers, so a Retina or HiDPI display is problematic. It is one of several reasons that I evaluate images on the NEC PA302W (2560 X 1600, 30" display = 101 dpi), and why I am reluctant to do lens assessments in the field with my MacBook Pro Retina. It is hard enough to shoot and compare lenses fairly without pixel density also hiding subtle differences.
There are other reasons too: when doing fine detail work, assessing the amount of sharpening to apply, etc., the high pixel density makes it difficult to assess any nuances. This forces working at 200%, where each image pixel becomes a 2 X 2 block of screen pixels, which raises yet more issues (more on that below).
Ed A writes:
I was interested to read your review of the LG 5k monitor and the hint about the upcoming 8k from Dell. I've been using HiDPI displays for several years now, starting with the old IBM T221 and now with Dell 5k screens.
But I was surprised that you said the higher resolution display was not recommended for evaluating image sharpness.
Why not? Surely if you need to view individual pixels you can just view the image at 200% magnification and effectively have about 100 chunky pixels to the inch. Or even 400% magnification, where each pixel on the image becomes a block of sixteen on screen. Then you can check the raw image sharpness without having to squint.
However, I can guess one possible reason. Often when viewing an image at 200% magnification it is scaled up with some kind of 'smart' resizing which, rather than simply mapping one pixel to a block of four, applies some kind of blurring. When looking at a whole photograph this does give a more pleasing result than pixel-doubling. But it is infuriating for pixel-level work like you mentioned. A similar defect applies to monitors themselves: typically a 4k monitor run at plain old HD resolution won't just display blocks of four, but will blur the image too. Great for video games, not so great for still images and text.
Back in simpler times, image viewing software would just scale up naively to a block of pixels, and monitors would too (the T221 does it right). It is frustrating that things have gotten worse, at least for some software.
Does your favourite viewer or application for pixel-level work allow you to zoom in to 2x, 3x, or 4x scaling and cleanly distinguish the individual pixels? If not, then really the fault is with the software rather than the HiDPI monitor. On the other hand, if the software can do it right, surely a 27 inch 5k display is very nearly as good as your preferred 32 inch PA302W?
DIGLLOYD: it was no review, just a mention from the show.
I use Adobe Photoshop CC 2017. Using 200% is problematic for my purposes and 300% or 400% serves no useful purpose at any DPI, particularly given the false detail present from Bayer matrix demosaicing. Even 200% is problematic that way.
Hugely enlarging an image is looking at twigs on trees, not the forest. I am not a “pixel peeper”, and I consider it a pejorative. So the last thing I want to do is use 200%. For a good example of the wanton foolishness of MTF charts or other pixel peeper favorites vs real world behavior, see Sigma 12-24mm f/4 DG HSM Art: Two Aspen.
- Perception matters, acutance in particular. A blurry image at 200% loses acutance, and acutance is a key feature of the very best lenses. So 200% actually makes it worse for comparing to another lens, or another frame, by degrading both and thus reducing the apparent differences.
- Sharpness is not about some pinpoint spot; I need to see sufficient context for proper evaluation. It is a mistake in methodology to zero in on a small area for checking sharpness. Zooming to 200% shows an area 1/4 as large as at 100%, reducing the context greatly while showing a blown-up version lacking the original acutance.
- At 200%, one image pixel becomes a 2 X 2 block of screen pixels. Acutance is lost; the image looks soft and blurry. It is visually annoying and frustrating to work that way (and time-wasting to zoom in/out constantly). I do this in the field when I must, but it is tedious. Scaling always has to do something: harsh edges with no smoothing, or some kind of smoothing. The best solution, if one is going to scale, is to resample and sharpen with algorithms that one has determined to work well for assessing sharpness differences; but there is no option to force the GPU to do that. So maybe a solution is possible that has fewer negatives.
- GPUs often scale pixels in undesirable ways that do not preserve acutance and/or smooth things, etc. See Photoshop and GPU: Blurry Image Scaling Damages Image Assessment Workflow, which shows that simply changing a setting can affect image display dramatically, but the behavior can change as the image size changes! This might not be a problem for 200%, but it shows that scaling problems do exist.
- “Cleanly distinguish the individual pixels” is a mistaken idea. Any interpolation introduces its own problems, as can be seen directly by trying various resampling algorithms, all of which produce different results. Once the original image is resampled (at 200% or whatever), it is no longer the original.
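The scaling trade-off can be illustrated with a toy 1-D example: naive pixel-doubling preserves a hard edge exactly, while even the simplest linear interpolation inserts intermediate values and softens it. This is a sketch only; real GPU and Photoshop resampling is far more elaborate:

```python
# Toy 1-D "image": a hard edge between dark (10) and bright (200).
row = [10, 10, 10, 200, 200, 200]

def nearest_2x(pixels):
    """Naive 200%: each pixel becomes two identical pixels.
    The edge stays a hard step -- acutance is preserved."""
    return [p for p in pixels for _ in range(2)]

def linear_2x(pixels):
    """Smoothed 200%: new samples interpolate between neighbors.
    The edge gains intermediate values -- it looks softer."""
    out = []
    for a, b in zip(pixels, pixels[1:]):
        out.extend([a, (a + b) // 2])
    out.extend([pixels[-1], pixels[-1]])
    return out

print(nearest_2x(row))  # [10, 10, 10, 10, 10, 10, 200, 200, ...]
print(linear_2x(row))   # ..., 10, 105, 200, ... -- a blurred edge
```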
Similar issues apply for workflow, such as how much to sharpen. This generally sorts itself out; a skilled operator can make tweaks to an established scaling and sharpening regimen known to be ideal for a particular printer, image size, etc. But in general, a too-fine pixel density hides errors, such as excessive sharpening.
Patrick L writes:
I totally agree.
Some years ago I needed a laptop (Mac) for shooting tethered on location with a Hasselblad H4D 50MP. It was almost impossible to see if the subject was sharp on the MacBook Pro Retina screen, compared to what I was used to seeing on an Eizo monitor in the studio.
So I used a MacBook Air with a non-Retina display, with much better results. I have not used any higher pixel density monitor for some years for evaluating sharpness, but I believe it would be even more difficult with 4K and higher. Thank you for all your work and interesting articles.
DIGLLOYD: the Apple Retina displays are all about the same impossibly high resolution of 220 ppi—gorgeous for viewing, horrible for evaluation.