Google's AI tool will no longer label people in photographs as male or female
According to a Business Insider report, Google's Cloud Vision API will no longer label pictures of people as 'man' or 'woman'. Cloud Vision API is an AI tool that allows developers to identify the components of a picture.
Google sent an e-mail to Cloud Vision API clients saying that the tool will no longer attach gender labels to images of people. In the email, Google explained that it decided to stop using gender labels because "you cannot deduce someone's gender from their appearance alone", and that avoiding these labels helps curb unethical uses of AI. Google also said that a photo of an individual will now be tagged only as "person".
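For context, the change affects the labels that Cloud Vision's label-detection feature returns for an image. Below is a minimal sketch of how a developer might build a label-detection request for the public v1 REST endpoint (`https://vision.googleapis.com/v1/images:annotate`); the image URL is a placeholder and no request is actually sent:

```python
import json

def build_label_request(image_uri: str, max_results: int = 10) -> str:
    """Build the JSON body for a Cloud Vision v1 label-detection request.

    The field names ("requests", "image", "source", "imageUri",
    "features", "type", "maxResults") follow the public v1 REST API;
    the image URI here is only an illustrative placeholder.
    """
    payload = {
        "requests": [
            {
                "image": {"source": {"imageUri": image_uri}},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results}
                ],
            }
        ]
    }
    return json.dumps(payload)

body = build_label_request("https://example.com/photo.jpg")
# After this policy change, a photo of a person is expected to come back
# with a generic "Person" label rather than "Man" or "Woman".
print(body)
```

Sending this body (with an API key) to the `images:annotate` endpoint would return a list of labels for the image; under the new policy, gendered labels are simply absent from that list.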
Frederike Kaltheuner, an expert on AI bias, told Business Insider the change was "very positive": "Classifying people as man or woman assumes that gender is binary. Anyone who doesn't fit will automatically be misclassified and misgendered. So this is more than just a bias issue: a person's gender cannot be inferred from appearance alone. Any AI system that tried to do that will inevitably misgender people."
Google also said in the email that it intends to continue developing its AI to ensure that people are not discriminated against on the grounds of sex, race, ethnicity, income, or religion.