MKBHD's new video about how the iPhone's camera system lags behind the competition strikes a nerve I've felt for the past few years. I consider myself an amateur photographer, which is the only reason I've ever needed to get the new Pro-level iPhone every year. The camera upgrades are usually worth it.
But I remember receiving the iPhone 12 Pro on launch day and taking a few test shots against my trusty 11 Pro — a camera system I truly loved — and noticing something that threw me off. A lot of the reviews I'd read highlighted improvements in the hardware, but it seemed to me that the machine-learning photo-processing was overdoing it. I thought I must be wrong, and wrote as much in my review at the time:
The cameras feel about on par with the last generation of phones — I personally haven't noticed their supposedly being better when taking photos in everyday situations, but I'd believe they were better if you or some other tech columnist told me so.
Should have trusted your gut, Bigley!
Over time, though, it's become more and more evident that the software side of iOS has been mangling what should be great images taken with a great sensor and superbly crafted lenses. To be clear: the RAW files produced by this system in apps like Halide are stunning. But something is lost in translation in the stock Camera app and the way it handles images in everyday use.
This is not a phenomenon that has flown under the radar of non-techie users, either. Earlier this year I saw a TikTok (which I unfortunately didn't save and I'm sure will never see again) showing an iPhone "hack": take a photo, open the preview, and take a screenshot before the processing can kick in, producing a softer, more natural look than the sharpened, HDR-laden final image. On my end, I've gone further still.
I like the speed the stock Camera app provides, especially with it accessible from my lock screen. To curb the over-corrections, I've tuned my own version of the Rich Contrast photographic style and turned on Live Photos. To my eyes, the two together reduce the amount of work the iPhone's chip does to "improve" images, producing a more natural photograph when all is said and done.
Either way, ever since the iPhone 12 lineup hit, I've gotten into the habit of throwing photos into an app like Huji or DazzCam to dirty them up. But I'd rather not! I'd rather share a barely altered, high-resolution image if possible, as I used to do when I was a Google Pixel user. For now, it seems like fake film grain and Live Photos are the way to go.
I'm hopeful Apple will course correct in time.
They usually do!