Image via NYU.
If you are a STEM employer, ensure that you are hiring people of color for the development of new technology.
Buy technology from companies that are actively working to develop more inclusive hardware and software.
The word inclusivity may not immediately come to mind when we think about camera design. After all, cameras do the same job they have done for years: they capture the scene in front of them so that we can keep a piece of the moment. However, if you have noticed that it is often harder to take good photos of more melanated individuals, you might be onto something. Google and Snapchat both recently announced that they are redesigning their cameras to be more inclusive of individuals who have darker skin (The Verge, Muse). But what does this mean?
Cameras have historically been calibrated for lighter skin. When color film was developed, the first model to pose for camera calibration in photo labs was a woman named Shirley Page. After that, all color calibration cards were nicknamed “Shirley cards.” For decades, the “Shirley cards” featured only white women and were labeled “normal.” It wasn’t until the 1970s that Kodak started testing cards with Black women (NPR). Kodak later released Gold Max, a film advertised as being able to photograph “a dark horse in low light” – a thinly veiled promise of being able to capture subjects of color in a flattering way (NYTimes).
Although digital photography has led to some advancements, like dual skin-tone color balancing, it can still be challenging to photograph individuals with darker skin tones in artificial light. Cinematographers and photographers use special tricks to shoot darker skin despite these technological limitations, such as applying a reflective moisturizer (NYTimes). Snapchat’s camera filters have been criticized as “whitewashed,” with Black individuals pointing out that the Snapchat camera makes their faces look lighter (The Cut). Snapchat has also released culturally insensitive camera filters, including a Juneteenth filter encouraging users to “break the chains” and a Bob Marley filter that amounted to digital blackface (Axios).
After taking heat for digital whitewashing, Snapchat has enlisted Hollywood directors of photography to build what it is calling an “inclusive camera.” The effort, led by software engineer Bertrand Saint-Preaux, aims to ease the dysphoria that Black users may feel after taking selfies through the app; it includes adjusting camera flash and illumination handling to produce a more realistic portrait of users of color (Muse). Similarly, Google is changing the auto-white-balance and related camera algorithms for the Pixel, and creating a more accurate depth map for curly and wavy hair types (The Verge). Apple started down this path when it developed the iPhone X in 2017 (Engadget).
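To see why auto-white-balance matters here, consider the classic “gray-world” approach, one of the simplest auto-white-balance techniques (this is an illustrative sketch, not Google’s or Snapchat’s actual pipeline): it scales each color channel so the scene averages out to gray. Because a single global correction is tuned to the scene average, it can skew the rendering of faces that differ from that average — one reason more sophisticated, skin-tone-aware tuning is needed.

```python
import numpy as np

def gray_world_balance(image: np.ndarray) -> np.ndarray:
    """Gray-world auto-white-balance: scale each channel so all
    channel means match the overall mean. A single global gain like
    this follows the scene average, not the faces in the frame."""
    image = image.astype(np.float64)
    channel_means = image.reshape(-1, 3).mean(axis=0)  # mean R, G, B
    gray = channel_means.mean()                        # target gray level
    gains = gray / channel_means                       # per-channel gains
    balanced = (image * gains).round()
    return np.clip(balanced, 0, 255).astype(np.uint8)

# A bluish test patch: gray-world pulls all three channel means to 100.
patch = np.full((4, 4, 3), (80, 90, 130), dtype=np.uint8)
out = gray_world_balance(patch)
```

The limitation is visible in the math: the correction depends only on the global average, so two portraits with very different skin tones in the same scene receive the same gains.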
It’s not just the quality of photography that needs to change. We must also consider bias in the way that AI analyzes images. Twitter’s “saliency algorithm” came under fire for racial bias in its preview crops of photos: in tests, the automatic crop favored white faces regardless of where each face was placed in the image. In response, Twitter is planning to remove algorithmic cropping from the site entirely (BBC).
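The mechanics of saliency cropping help explain how the bias propagates. A simplified sketch (not Twitter’s actual system, whose model and crop logic are more involved): the crop is simply the region the saliency model scores highest, so if the model systematically scores one group of faces higher than another, the crop inherits that bias.

```python
import numpy as np

def crop_by_saliency(image: np.ndarray, saliency: np.ndarray,
                     crop_h: int) -> np.ndarray:
    """Keep the horizontal band of height crop_h with the highest total
    saliency. The crop is only as fair as the saliency scores feeding it."""
    row_scores = saliency.sum(axis=1)
    # Sliding-window sum over crop_h consecutive rows.
    window = np.convolve(row_scores, np.ones(crop_h), mode="valid")
    top = int(window.argmax())
    return image[top:top + crop_h]

# Toy 6-row image whose (hypothetical) saliency map favors the bottom rows,
# so the preview crop keeps only that region.
image = np.arange(6 * 4 * 3).reshape(6, 4, 3)
saliency = np.zeros((6, 4))
saliency[4:6] = 5.0
crop = crop_by_saliency(image, saliency, crop_h=2)
```

Fixing such a system means auditing the saliency model itself, not the cropping step — which is why Twitter’s choice to drop algorithmic cropping altogether sidesteps the problem rather than solving it.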
This is not the first time a company has simply removed an AI’s ability to recognize an image instead of redeveloping the AI to be more inclusive. In 2015, users pointed out that Google Photos was labeling Black individuals as “gorillas.” Instead of fixing the underlying model, Google simply removed gorillas from its recognition software. When Wired followed up in 2018 by testing photos of animals, Google Photos could reliably identify many types of animals but returned no search results at all for “gorillas,” “chimps,” “chimpanzees,” or “monkeys” (Wired). Less than 1% of Google’s technical workforce is Black (NBC News).
Since photography is almost exclusively digital at this point, companies will hopefully take more initiative to develop cameras that capture people of color accurately and flatteringly. We also need to adopt inclusive AI practices to ensure everyone is treated equally on social media. When we seek to develop inclusive tech, people of color need a seat at the table to help ensure that both the software and the hardware we use are not racially biased.
Since film photography was developed, cameras have historically favored white individuals.
Currently, tech companies are working to develop more inclusive cameras after criticism from people of color.
The way we consume photography is also biased by the way algorithms and AI show us photographs through social media.