What is it like to live in a world that ignores your existence? To live in a future that has no “now” for you?
Them: “Keep waiting, we’ll get to you eventually.”
Automation. Artificial intelligence. In its early development, it won’t be for many of you. Not Google search and facial recognition. If you’re Black, it will mistake you for a gorilla. Not automated soap dispensers or faucets. The viral video below, posted by a Facebook employee in Nigeria, features a Nigerian man and a white man and shows what happens when companies likely don’t test their sensors on melanated hands before bringing a product to market.
“If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video” pic.twitter.com/ZJ1Je1C4NW (August 16, 2017)
Optical sensors have long had a love/hate relationship with pigmented flesh, depending on the color of your skin. A lack of representation in tech has meant that, if you’re not white, smartphone snaps often leave you looking like a wax figure at Madame Tussauds or wash your skin tone out completely.
Finally, though, it’s our turn? Maybe.
Do you know about the “Shirley card”? I’ll take a long story and give you the tl;dr: in the mid-1950s, a Caucasian American woman by the name of Shirley Page was a model for Kodak. Her image was used to set the standard for how the color temperature of a photograph should be processed. Because Kodak was the largest supplier of film and printers for small finishing labs, she became, for many years, the standard for making sure processed negatives looked “normal.” You dropped off your photo negatives, the processing guy or gal developed your images using hers as a guide, et voilà. You received your exposures back, processed perfectly. If you looked like Shirley.
Apple and others have sought to right that wrong with differing technologies and methods for attacking the problem of processing darker-skinned human beings. Apple has given us Deep Fusion and, with the launch of the iPhone 14 devices, the Photonic Engine. The sample images it has shown during its launch events and media briefings are absolutely gorgeous. People of all hues look beautiful. Dark to light. Midnight to morning. A variety of skin tones and lighting conditions, all rendering people’s faces as they should look. Their natural best.
But what do my dark-skinned fellow human beings look like outside of immaculately choreographed presentations? In real users’ hands, in real life?
Marques Brownlee of MKBHD fame reviewed the iPhone 14 Pro Max and found the experience mostly positive, but was still left wanting when it came to how the phone rendered his skin when taking pictures alongside people of much lighter skin tones. He demonstrated this in a tweet (which I unfortunately can no longer find): a photo of him, a dark-skinned Black man, next to a white friend.
I endeavored to see how this would work because my photos generally come out fine, but I’m more of a caramel than a dark chocolate complexion. To that end, I enlisted the help of a family friend who is closer to Brownlee’s complexion. And, to have something to compare the iPhone’s output to, I also brought along Google’s new Pixel 7 Pro. Why? Because with the release of its Tensor chip, Google made a big deal of its new Real Tone technology, which is meant to address the specific problems dark-skinned human beings encounter with smartphone camera AI.
Google learned from its terrible primate faux pas and actually trained its AI on massive data sets of melanated individuals so that its Pixel cameras would process broad ranges of hues appropriately. Showing them in their best light, even if the lighting wasn’t actually the best.

I took the images on a tripod with a double head, both phones mounted, and with the three-second timer turned on to eliminate any blur. I then presented these images to two photographers I highly respect: Ant Pruitt, host of TWiT’s Hands-On Photography, and Juan Carlos Bagnell, who literally wrote the book on smartphone photography. This was a blind test for Ant. He received only 44 numbered photos and didn’t know which came from the iPhone 14 Pro Max or the Pixel 7 Pro. As he references the images, Series 1 is what I captured with Google’s Pixel 7 Pro; Series 2, the iPhone 14 Pro Max.
Ant Pruitt’s thoughts
Here’s what Ant had to say about the nighttime photos, taken with any night modes deactivated:
“On series 1, there’s a boost in saturation that isn’t needed. This negatively affects darker skin tones. Series 2 doesn’t boost the saturation, so darker skin tones are more accurate. On series 2, there’s an apparent boost in warmth for fairer skin. This doesn’t always work. It’s overdone most of the time.
“BUT on series 1, fairer skin doesn’t seem to have the same boost in saturation, nor does it have good color balance. There’s a slight green tint on fairer skin which is in contrast to the slight warmth and magenta seen in series 2 for fairer skin. I like that series 2 didn’t do so much auto-boosting in saturation in comparison to series 1. But, phone manufacturers know images are going to social media first and foremost, therefore a boost in saturation will more than likely lead to someone stopping to view your images vs scrolling past.”
And the daytime photos, taken on an overcast afternoon:
“Maaaaan. I swear it seems like series 1 is trying to work hard algorithmically. There’s still a boost in saturation, just not as much as before. In image #20, the gent at the front of the frame looks like the algo decided an exposure lift was needed in series 1. This unnecessarily cranked the specular highlights on his face. It’s not bad, but not necessary, in my opinion. Series 2 looks way more natural. The only time the algo’s exposure correction looks beneficial to me is in #18 of series 1. But series 2 has enough data to increase in post. The diffused lighting outside really made both of these cameras look good, though.”
Juan Carlos Bagnell’s thoughts
And what did the guy who wrote the book on smartphone photography have to say? A lot, actually, but I’ve condensed it for the purposes of this article. First, it’s important to note that I’d originally planned to have him shoot many of the photos, but our schedules didn’t work out. He was going to shoot with the Pixel, so he knew which phones I was using, unlike Ant, whose analysis was blind.
On the nighttime photos, Juan had this to say:
“I know this isn’t supposed to be a ‘contest’ per se, but immediately on image one, your shots here BEAUTIFULLY demonstrate my biggest issues with iPhone camera processing.
“I don’t know exactly what skin color and texture your models have, and shooting under fluorescent lights is a tricky challenge, but I can’t believe for a second that the fairer skinned model has that much pink in her cheeks. The Pixel drawing out more olive tones feels truer.
“Your darker skinned model shows this disparity even more dramatically.
“The iPhone seems to be trying to maximize exposure, at the expense of blowing out all the richer hues of the model on the right. She looks pale, and ashy, and a little ill. Apple’s color processing fails to properly represent her look and, again, almost adds a pink hue to her cheek and jawline.
“This woman’s makeup is on point, and it’s clearer in the Pixel photo. That touch of gold eyeshadow is really hot against her natural chocolate and bronze complexion. On our first image, the Pixel is absolutely delivering a bolder and richer image which feels more ‘honest’. It’s not hyped HDR or overblown exposure.
“Image 4 helps highlight the issue I have personally with iPhones. I’m Hispanic, but pale, and photos of me from iPhones routinely feel off for the highlights and yellow/pink accents. Like your model here, I think the iPhone image again is washed out. The Pixel image might be a touch warm (again for not having met your model IRL), but is certainly more flattering. Seeing both models together, the iPhone version makes both seem a bit ‘sickly’.
“Image 9 is another challenging shadowed shot for sure, but the iPhone just refuses to lean into yellow, gold, or bronze for skin. Seeing it side by side like this, it’s such a drab look for a variety of different faces.”
Coming into the daytime photos, Juan looked more favorably on the iPhone’s captures.
“Image 17 and 18 are nice! Some daylight shots. BOO photographer for missing focus on the Pixel on 17!
“Here the iPhone is competing better. HDR photos drawing light and color out of the scene better. The exposure is MUCH more flattering for both phones. Again, the iPhone seems to highlight pink tones, and seems to avoid bronze and gold highlights. I’m not GREAT at describing this effect, but the iPhone photos just look a duller shade of ‘dark brown’, where the Pixel is richer.
“With image 19, this is the main image of the whole group where I feel the iPhone lands a similarly flattering exposure of the darker complexions in this scene compared to the Pixel, but unfortunately our paler companion looks washed out by comparison. Again, not the point of the exercise, but I like the brighter gradient of the blue sky better on the iPhone, but prefer the texture and detail in clothing better on the Pixel.”
Conclusion
I think the analysis here really proves how much of photography is both art and science. Both photographers are experts I highly respect, and both walked away with differing conclusions. Ant favored the iPhone 14 Pro Max’s captures; Juan, not a fan, preferred the Pixel 7 Pro.
Ultimately, I think the future looks bright for consumers of all shades, with manufacturers taking note of these previous inequities in how technology has treated darker-complexioned consumers. I’ll end with a quote from Juan, which I thought was poignant given the topic here: “…better accuracy for ‘some’ is really better accuracy for all. At the end of the day, we just want more flattering memories of our family and friends.”
To hear more of what Ant Pruitt has to say and teach about photography, check him out at antpruitt.com/prints, and increase your photography IQ by checking out his podcast at twit.tv/hop. Juan’s book, Take Better Photos: Smartphone Photography for Noobs!, can be found on Amazon.com, and you can chat with him on Twitter at https://www.twitter.com/somegadgetguy. And a special shout-out to Jessica Schrody, who helped me find models for the shoots!