As smartphone cameras mature and improve in their ability to capture and process more cinematic images, they still face one very important challenge: how to accurately capture the different skin tones that make up the varied communities of our world.

Companies like Google have made a commitment to be more inclusive in their imaging technology, but it turns out that it may not be as simple as a few tweaks to the camera’s algorithm.

The topic has been an evolving discussion for a few years now, but the problem of accurately capturing these different skin tones is very much a scientific one. To approach a solution effectively, we have to look at a bit of photographic history.

Photo from an iPhone 14 Pro Max. Credit: Apple

Those Who Do Not Learn From History...

Our history is laden with landmines of difficult moments, terrible atrocities, and inequalities that we still feel today, even in film photography. How we see ourselves can be a direct representation of how we think about ourselves.

Back in the days of film, Kodak stocks were developed specifically for light skin tones, which served as the chemical baseline for the technology. But this left out a very crucial demographic: those with darker skin tones.

Kodak's Multiracial Shirley Card, North America (1995). Credit: Dr. Lorna Roth, Concordia University, Montreal, Canada

There were difficult lessons to learn and even more difficult ones to unlearn. Photography tools like the Shirley card eventually evolved to include subjects of different skin colors, but sadly those changes weren't implemented at scale before digital photography took over in the ‘90s.

As we reached the digital age, the lessons Kodak learned in the analog realm were forgotten when digital cameras came on the scene. Manufacturers made the mistake of training camera algorithms on lighter skin tones, making it more difficult to accurately capture and process darker skin tones.

Teaching An Old Dog New Tricks

Now, companies like Google and Apple are making a commitment to retraining these image-processing algorithms with a far wider range of skin tones that more accurately reflects the people who will use their smartphones to capture the moments of their lives. Google’s “Real Tone” camera modifications came after the company added over 10,000 new images of people of color to its datasets, expanding the image processor’s ability to render accurate depictions of darker skin tones.

“Over the past year, we’ve added more than 10,000 images to the data sets used to tune Pixel’s camera,” said Shenaz Zack, director of product management at Google. “Through that work, we’ve tuned exposure and brightness to better represent darker skin tones in low light situations.”
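
Google hasn’t published the internals of Real Tone, but the general idea of tuning exposure around a subject’s face can be sketched in a few lines. The Python snippet below is a hypothetical illustration only: the function name, target luminance, and simplified face mask are assumptions made for the example, not Google’s actual pipeline.

```python
import numpy as np

def exposure_compensation(image, face_mask, target_luma=0.45, max_ev=1.5):
    """Toy sketch of face-aware exposure tuning (illustrative only).

    image: linear RGB frame as floats in [0, 1], shape (H, W, 3).
    face_mask: boolean array, shape (H, W), True over the detected face.
    Returns an exposure adjustment in stops (EV) that nudges the face
    region toward target_luma instead of metering the whole frame.
    """
    # Rec. 709 luma, then average only the pixels inside the face region.
    luma = image @ np.array([0.2126, 0.7152, 0.0722])
    face_luma = luma[face_mask].mean()

    # Exposure is multiplicative, so the correction lives in log2 space
    # (stops). Clamp it so a single frame never swings more than max_ev.
    return float(np.clip(np.log2(target_luma / max(face_luma, 1e-6)), -max_ev, max_ev))

# Example: a dim synthetic frame with an even darker "face" patch in the middle.
frame = np.full((480, 640, 3), 0.10)
mask = np.zeros((480, 640), dtype=bool)
mask[180:300, 260:380] = True
frame[mask] = 0.06

print(f"Suggested exposure adjustment: {exposure_compensation(frame, mask):+.2f} EV")
```

In a real camera pipeline, the same idea would be folded into the auto-exposure and tone-mapping stages and tuned against thousands of reference images, which is where those expanded datasets come in.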

However, it isn’t an overnight fix. The Washington Post compared several modern smartphone models, pitting the Google Pixel 7 Pro against the Apple iPhone 14 Pro Max and the Samsung Galaxy S22. What they found was that each camera had made some improvement in depicting accurate skin tones, but in the process had lost resolution.

Direct lighting enabled a camera to pick up a better image of a subject. But if the light moved behind the subject, or if the conditions were poor or simply dim, that improvement was lost in the shadows. At times, colors ended up washed out when they should have been bolder.

Improving skin tone representation. Credit: Google

How We See Ourselves

Beauty (or, in this case, accuracy) is in the eye of the beholder. While one person may prefer the image results from one smartphone, another may prefer those from a different one. How subjects perceive themselves also plays a crucial part in how they view a picture that reflects them. That is why these initiatives are important.

How we are represented greatly affects how we feel about not only ourselves, but our place in the world. 

There is agreement that improvements have been made, but there is still a long way to go. As camera designs become more integrated with artificial intelligence, the accuracy with which a camera’s image processor can interpret light reflecting off the subject will certainly improve, and with it will come a better representation of all skin colors.

The commitment is there, but users will have to wait for results to improve before they get accurate skin tones right out of the box.

Source: The Washington Post