All Too Often: the Garbage Image Quality of Camera Phones with HEIC or JPEG
What I refer to as computational photography is all the good stuff that improves image quality (resolution, noise, whatever) without losing anything. For example, Adobe Camera Raw Enhance Details, frame averaging, focus stacking, deconvolution (blur removal), etc.
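Frame averaging is the easiest of these to demonstrate: averaging N independent exposures of the same scene cuts random sensor noise by roughly a factor of √N while preserving every bit of real detail. A minimal Python sketch (NumPy only; the scene and noise model are synthetic stand-ins for real exposures):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "scene": a smooth gradient standing in for the true image.
scene = np.linspace(0.2, 0.8, 256).reshape(16, 16)

def noisy_frame(scene, sigma=0.05):
    """One exposure: the true scene plus independent Gaussian sensor noise."""
    return scene + rng.normal(0.0, sigma, scene.shape)

def frame_average(scene, n_frames, sigma=0.05):
    """Average n_frames exposures; random noise falls ~1/sqrt(n_frames)."""
    return np.mean([noisy_frame(scene, sigma) for _ in range(n_frames)], axis=0)

single = noisy_frame(scene)
stacked = frame_average(scene, n_frames=16)

# Residual noise, measured as RMS error against the true scene.
err_single = np.sqrt(np.mean((single - scene) ** 2))
err_stacked = np.sqrt(np.mean((stacked - scene) ** 2))
print(f"single frame RMS noise: {err_single:.4f}")
print(f"16-frame average RMS:   {err_stacked:.4f}")  # roughly 4x lower
```

Nothing is guessed and nothing is thrown away; the noise reduction comes purely from having more samples of the same scene.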
The article below perfectly demonstrates why I keep stating that iPhone image quality (HEIC or JPEG) is total freaking garbage. Unless you shoot RAW*. Or unless you pretend it’s fine by viewing it on a small screen (e.g. an iPhone or iPad) whose extreme pixel density hides reality from you.
I keep trying to get my kids to shoot RAW, but since they only ever view photos on their iPhones, they keep shooting crap-quality JPEGs and do not believe me that the photos suck in terms of detail.
For Apple (and presumably other phone vendors), it’s all about minimizing image size while making images look good to the eye at a very crude/coarse level (e.g. on a tiny display).
HEIC is the newest gaslighting joke: it does next to nothing for quality vs JPEG, saving a little more space while producing the same trash. For the most part, all fine details are discarded, and this is plain to see for anyone who actually looks. There are exceptions that happen to meet the compression model’s expectations, but skin, cloth, any kind of landscape, etc. are all mutilated, typically cutting real resolution by a factor of 4X to 10X.
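The mechanism behind the discarded detail is not mysterious. JPEG works block by block: transform each 8×8 tile into frequency space with a DCT, divide the coefficients by a quantization table, and round to integers. At aggressive settings the high-frequency coefficients, which carry fine texture (skin, cloth, foliage), round to zero and are gone for good; only coarse structure survives. A toy Python sketch of that quantization step (the quantization table below is illustrative, not any real codec’s):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (the per-block transform JPEG uses)."""
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0, :] *= 1 / np.sqrt(2)
    return m * np.sqrt(2 / n)

D = dct_matrix()

# An 8x8 block with fine texture: a checkerboard pattern on a gradient.
x, y = np.meshgrid(np.arange(8), np.arange(8))
block = 128 + 8 * ((x + y) % 2) + 4 * x

# Illustrative quantization table: step size grows with spatial frequency,
# mimicking how lossy codecs spend few bits on fine detail.
q = 1 + 10 * (x + y)

coeffs = D @ (block - 128) @ D.T             # forward DCT
quantized = np.round(coeffs / q)             # the lossy step: coarse rounding
restored = D.T @ (quantized * q) @ D + 128   # decode

# The highest-frequency coefficient held the checkerboard texture;
# after quantization it rounds to zero and is unrecoverable.
print("texture coeff before quantization:", round(float(coeffs[7, 7]), 1))
print("texture survives quantization?", quantized[7, 7] != 0)
```

In this sketch the coarse content (overall brightness, the gradient) survives nearly untouched while the fine texture is rounded away entirely, which is exactly the "looks fine at a glance, mutilated up close" behavior described above.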
* TIP: You can shoot your camera phone using RAW. This sidesteps all the issues discussed here. There is nothing inherently bad about a camera phone sensor; all the nasty badness comes in the processing pipeline to HEIC/JPEG. This need not be so; it is possible to create very high quality JPEG files.
The Limits of "Computational Photography"
Jan 2023, by Will Yager
I was recently discussing laser etching with an engineer/font-designer friend of mine, and I wanted to show him a picture of some really good laser etching on a particular piece of optical equipment.
No problem, I figured - I would just snap a quick picture on my phone and send him a text. Unfortunately, I ran into an unexpected problem; my phone simply could not manage to take a picture of the text. This article is a bit of a rant spurred by this annoyance, so please forgive any hand-waving and epistemic sloppiness I’m engaging in so that I can pound this out before I stop being annoyed.
Every time I tried to take a picture of the engraved text, the picture on my phone looked terrible! It looked like someone had sloppily drawn the text with a paint marker. What was going on? Was my vision somehow faulty, failing to see the rough edges and sloppy linework that my iPhone seemed to be picking up?
What is going on here? Well, I noticed that when I first take the picture on my iPhone, for a split second the image looks fine. Then, after some processing completes, it’s replaced with the absolute garbage you see here. Something in the iPhone’s image processing pipeline is taking a perfectly intelligible and representative (if perhaps slightly blurry) image and replacing it with an “improved” image that looks like crap.
Significantly more objectionable are approaches that impose a complex prior on the contents of the image. This is the type of process that produces the trash-tier results you see in my example photos. Basically, the image processing software has some kind of internal model that encodes what it “expects” to see in photos. This model could be very explicit (like the fake moon thing), an “embodied” model that makes relatively simple assumptions (e.g. about the physical dynamics of objects in the image), or a model with a very complex implicit prior, such as a neural network trained on image upscaling. In any case, the camera is just guessing what’s in your image. If your image is “out-of-band”, that is, not something the software is trained to guess, any attempts to computationally “improve” your image are just going to royally trash it up.
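This failure mode is easy to reproduce with even the crudest prior. A "denoiser" that assumes images are smooth helps on content that matches the assumption and mutilates content that does not, such as the sharp strokes of engraved text. A toy Python illustration (NumPy only; the moving average here stands in for any prior-driven enhancement, not any vendor's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def smooth_prior_denoise(signal, width=5):
    """'Enhance' by assuming the true signal is smooth: a moving average."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

# In-band content: a smooth gradient (matches the smoothness prior).
gradient = np.linspace(0.0, 1.0, 200)
# Out-of-band content: sharp text-like strokes (violates the prior).
strokes = np.zeros(200)
strokes[50:55] = 1.0
strokes[100:105] = 1.0

noise = rng.normal(0.0, 0.05, 200)
interior = slice(5, -5)  # ignore convolution edge effects

results = {}
for name, truth in [("gradient", gradient), ("strokes", strokes)]:
    enhanced = smooth_prior_denoise(truth + noise)
    rms_before = np.sqrt(np.mean(noise[interior] ** 2))
    rms_after = np.sqrt(np.mean((enhanced - truth)[interior] ** 2))
    results[name] = (rms_before, rms_after)
    print(f"{name}: error {rms_before:.3f} -> {rms_after:.3f}")
```

On the gradient the prior genuinely reduces error; on the strokes it makes the "enhanced" result worse than the raw noisy input, which is the one-dimensional version of engraved text coming out looking like sloppy paint-marker work.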
DIGLLOYD: the example shown is obvious in its failures, though it’s a little unfair in comparing images of different size/resolution. Still, the total mutilation of all detail except the coarsest edges is self-evident in the phone picture, and it is precisely what the iPhone does to just about everything.
Real life is far worse: an iPhone picture of my skin makes me look like I have some nasty disease, skies are banded/stepped, textural detail of just about everything is smeared to oblivion.
Anon MD writes:
Your comments about crappy iPhone pics have an analogy in the world of medical photography, past and present.
With the iPhone and similar devices now in the hands of gazillions of customers, combined with platforms like Instagram pushing the populace to generate gazillions more mostly useless pictures, it all comes down to quantity over quality. And if all you are doing is looking at images on your phone, who cares if they look OK there but like shit on anything bigger than a 3x4” screen?
When I started practice in 1984, the ophthalmic images we took were taken either by the MD or by a trained professional medical photographer. You had to know what you were taking a picture of, what details you wanted to enhance, and how to adjust focus and exposure to capture the relevant details of a three-dimensional subject. To do this required years of experience. Also, we were limited to one roll of film or less, so you had to make every shot count. Plus, film images took up a lot of physical storage.
As digital supplanted film and digital storage got cheaper and cheaper, it all changed to quantity over quality. Now the “technicians”, who for the most part are feckless morons (medical photographers having gone the way of the dodo except at huge medical institutions), have the ability to take hundreds of pictures of one patient in the literal blink of an eye and screw focus, exposure, or framing of the desired portion of the image, etc., cuz that’s for losers. It’s not really the “technicians’” fault - they are just mimicking the photographic behavior of their personal smartphone lives.
I’m not sure there is a moral to this story - just a recognition that if you want a quality image it will take time, experience, patience, and quality equipment and software.
The photographer George Lepp used to run a digital photographic training institute in Los Osos years ago. One of his published portfolios was on the California poppy. I heard that once after a photo shoot comprising some 900 images, he kept only one image and deleted the rest. I guess this is halfway between the requirements of the film world and the advantages of the digital world. But George was certainly no iPhone photographer.
DIGLLOYD: I really enjoy emails like this.