
Does Google Really Need to Track People’s Skin Tones to Make AI Unbiased?

Google has turned its focus to skin tone in its effort to build unbiased artificial intelligence.

The connection between AI and skin tone has been a complicated and sensitive issue in the digital world. In a number of cases it has been shown beyond doubt that algorithms carry bias in how they recognize, or fail to recognize, different skin tones. There is gross favoritism toward fair and white tones, while darker shades, especially Black skin, are put at the receiving end. Not surprisingly, this issue has been associated with racism, and AI has been cast as the classic villain, accused of driving programs ingrained with exclusionary bias against darker skin. Facial recognition software has been of particular concern, though there are other cases, such as apps meant for cancer detection. As a remedial measure, various tech giants are adopting ways to track people's skin tones with the goal of making AI unbiased. But does this serve the purpose? Google is trying to address the issue.

Google has candidly acknowledged that products built using today's AI and ML technologies "can perpetuate unfair biases and not work well for people with darker skin tones." In response, it has open-sourced a 10-point scale known as the Monk Skin Tone (MST) Scale, devised by Harvard University sociologist Ellis Monk. The MST Scale claims to be responsive enough to discard the undesirable tone bias embedded in earlier scales, and it goes a step further by addressing the issue not only with more tone variations but with those variations designed around people with darker skin tones. On the face of it, this is a welcome move: the scale not only has more points than the hitherto dominant 6-point Fitzpatrick Scale but is also consciously designed to be more inclusive by giving due weight to darker shades. Yet questions are being asked whether Google can solve so grave a problem by this measure alone.
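To see why the number of points on a scale matters, consider a toy sketch of tone bucketing. The tone values and the uniform bucket boundaries below are purely illustrative assumptions, not Google's actual MST swatches or method; the point is only that two distinct darker tones can collapse into one bucket on a coarse 6-point scale while remaining distinguishable on a 10-point scale.

```python
def bucket(value, num_points):
    """Map a tone estimate in [0.0, 1.0] (0 = lightest) to a
    1-based bucket index on a uniform scale with `num_points` levels."""
    index = min(int(value * num_points), num_points - 1)
    return index + 1

# Two noticeably different darker tones (hypothetical values)...
tone_a, tone_b = 0.68, 0.78

# ...fall into the same bucket on a 6-point scale,
# but into different buckets on a 10-point scale.
print(bucket(tone_a, 6), bucket(tone_b, 6))    # same bucket
print(bucket(tone_a, 10), bucket(tone_b, 10))  # different buckets
```

Finer granularity at the darker end of a scale is precisely what the MST Scale's design is meant to provide, though the real scale is built from curated swatches rather than uniform numeric bins.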

The feedback on the MST Scale overwhelmingly suggests that it is one thing to create a new tone scale and quite another to apply it in reality with credibility. The concern largely stems from the fact that Google has not always fulfilled its promises of greater inclusion and diversity to a satisfactory degree. It has been particularly criticized for sacking two of its top AI ethics researchers, a move followed by a spate of resignations among responsible-AI researchers. Citing a specific instance, skeptics point out that the company failed to properly fix the Google Photos algorithm that 'equated' Black people with gorillas and chimpanzees, a dismal failure to amend an unjust practice. On a broader note, the AI industry as a whole has been criticized for its inconsistency on justice, fairness, and ethics where the color issue is concerned.

Under such circumstances, the introduction of the Monk Skin Tone Scale, however well-meaning, needs solid backing in the form of Google's proclaimed and unwavering commitment to inclusion and diversity, beyond what otherwise looks like a mere act of technical tweaking.