Google told Reuters this week that it is developing an alternative to the industry's standard way of grading skin tones. A growing group of technology researchers and dermatologists says that standard is inadequate for assessing whether products are biased against people of color.
The standard is the Fitzpatrick Skin Type (FST) scale, a six-category system that dermatologists have used since the 1970s. Tech companies rely on it to classify people and to measure whether products such as facial recognition systems or smartwatch heart-rate sensors work equally well across skin tones.
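The kind of per-category comparison described above can be sketched as follows. This is a minimal, illustrative example only, not any company's actual evaluation pipeline; the function name and the sample data are invented for the sketch.

```python
from collections import defaultdict

def accuracy_by_skin_type(records):
    """Compute accuracy per Fitzpatrick (FST) category.

    `records` is a list of (fst_category, correct) pairs, where
    fst_category is one of the scale's six types ("I" to "VI") and
    correct is a bool for a single product test. Illustrative only.
    """
    totals = defaultdict(int)
    hits = defaultdict(int)
    for category, correct in records:
        totals[category] += 1
        hits[category] += bool(correct)
    return {c: hits[c] / totals[c] for c in totals}

# Hypothetical test outcomes for a face-matching system
results = [("I", True), ("I", True), ("VI", True), ("VI", False)]
print(accuracy_by_skin_type(results))  # {'I': 1.0, 'VI': 0.5}
```

A gap between the per-category numbers is what critics mean by bias: the product works better for some skin tones than others. The criticism of the FST is that six categories, four of them for white skin, are too coarse for the groups being compared.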
Critics say the FST ignores the diversity among people with darker skin: the scale includes four categories for white skin but only one each for brown and black. Last October, researchers at the US Department of Homeland Security recommended abandoning the FST for assessing facial recognition, because it poorly represents the color range of diverse populations.
Google said it is seeking better measures and is working on alternative, more comprehensive approaches that could be useful in developing its products, in collaboration with scientific and medical experts as well as groups that work with communities of color.
The debate is part of a larger reckoning over racism and diversity in the tech industry, where the workforce is whiter than in most other sectors.
Google develops a new standard for skin tone
Ensuring that technology works well for all skin tones, as well as across ages and genders, is increasingly important as new products, often powered by artificial intelligence, expand into sensitive areas such as healthcare and law enforcement. Companies know that their products can fail for groups that are underrepresented in their research and test data.
When Google announced in February that cameras on some Android phones could measure pulse rates through a fingertip, it said the readings had an average error of 1.8 percent regardless of whether users had light or dark skin.
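A figure like that average error can be checked per group with a mean absolute percentage error. The sketch below is an assumption about how such a comparison might be run, with invented reference and measured readings; it is not Google's method or data.

```python
def mean_abs_pct_error(reference_bpm, measured_bpm):
    """Mean absolute percentage error between reference pulse rates
    (e.g. from a medical-grade sensor) and device readings."""
    errors = [abs(m - r) / r * 100 for r, m in zip(reference_bpm, measured_bpm)]
    return sum(errors) / len(errors)

# Hypothetical readings for two user groups (reference vs. phone camera)
light_error = mean_abs_pct_error([60, 80, 100], [61, 79, 102])
dark_error = mean_abs_pct_error([60, 80, 100], [58, 82, 97])
print(round(light_error, 2), round(dark_error, 2))  # 1.64 2.94
```

Reporting one overall average can hide a gap like the one above, which is why researchers ask for the error broken out by skin tone rather than a single aggregate number.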
The company has also pledged that skin type will not affect the results of the background-filter feature in Google Meet, nor those of an upcoming web tool for identifying skin conditions, informally called Derm Assist. Until recently, technology companies paid little attention to such issues.
The Unicode Consortium, which oversees emoji, cited the FST in 2014 as the basis for adopting five skin-tone options beyond yellow. And a 2018 study titled "Gender Shades" found that facial analysis systems frequently misclassified people with darker skin.
In an April study testing artificial intelligence for detecting deepfakes, Facebook researchers wrote that the FST clearly fails to capture the diversity within brown and black skin tones.