
Google unveils new 10-shade skin tone scale to test AI for bias

Monk Skin Tone Scale. — Image via skintone.google

OAKLAND, Calif. — Alphabet Inc.’s Google on Wednesday unveiled a palette of 10 skin tones that it described as a step forward in making gadgets and apps that better serve people of color.

The company said its new Monk Skin Tone Scale replaces a flawed six-color standard known as the Fitzpatrick Skin Type, which the tech industry had widely adopted to assess whether smartwatch heart-rate sensors, artificial intelligence (AI) systems such as facial recognition, and other offerings show color bias.

Tech researchers acknowledged that Fitzpatrick underrepresented people with darker skin. Reuters exclusively reported last year that Google was developing an alternative.

The company partnered with Harvard University sociologist Ellis Monk, who studies colorism and had felt dehumanized by cameras that failed to detect his face and reflect his skin tone.

Mr. Monk said Fitzpatrick is great for classifying differences among lighter skin. But most people are darker, so he wanted a scale that “does a better job for the majority of the world,” he said.

Using Photoshop and other digital art tools, Mr. Monk curated 10 tones — a manageable number for people who help train and assess AI systems. He and Google surveyed around 3,000 people across the United States and found that a significant number said a 10-point scale matched their skin as well as a 40-shade palette did.

Tulsee Doshi, head of product for Google’s responsible AI team, called the Monk scale “a good balance between being representative and being tractable.”

Google is already applying the scale. Beauty-related Google Images searches such as “bridal makeup looks” now allow users to filter results based on the Monk scale. Image searches such as “cute babies” now show photos with varying skin tones.

The Monk scale is also being deployed to ensure that a range of people are satisfied with filter options in Google Photos and that the company’s face-matching software is not biased.

Still, Ms. Doshi said problems could seep into products if companies do not have enough data on each of the tones, or if the people or tools used to classify others’ skin are biased by lighting differences or personal perceptions. — Reuters
