Wednesday, November 28, 2018

Google’s gender pronouns and AI’s bias

According to recent news, Google is removing all gender pronouns from Gmail's autocomplete feature to avoid machine bias. In May, Alphabet's Google introduced a slick feature for its mail service, Smart Compose, that automatically completes sentences for users as they type. Now, however, Google is steering well clear of gender bias.
   Google will not suggest gender-based pronouns because the risk is too high that its "Smart Compose" technology might predict someone's sex or gender identity incorrectly and offend users, product leaders told Reuters in interviews.
   At the same time, this means Google is drawing on your personal data to help you compose a mail, which suggests your privacy is not as safe as you might think.
   For example, a company research scientist discovered that the suggestions were not neutral: when he typed "I am meeting an investor next week," Smart Compose suggested the follow-up "Do you want to meet him?" rather than "her", as reported in a post on the MIT blog.
   But the article does not say whether that user had previously exchanged mails with a male investor; if he had, Google's machine learning may simply have drawn a correct inference from his own correspondence. In any case, after trying several workarounds that failed to fix the issue, the product team decided to ban all gender pronouns from the tool's suggestions entirely, along the lines of the sketch below.
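   A minimal sketch of such a ban, assuming the feature produces a ranked list of candidate completions as plain strings (Google has not published Smart Compose's internals, so the names and data below are purely illustrative):

```python
# Hypothetical post-processing step: drop any candidate completion
# that contains a gendered pronoun. Illustrative only; not Google's code.

GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestions(candidates):
    """Return only the candidates that contain no gendered pronoun."""
    safe = []
    for text in candidates:
        words = {w.strip(".,?!").lower() for w in text.split()}
        if words.isdisjoint(GENDERED_PRONOUNS):
            safe.append(text)
    return safe

candidates = [
    "Do you want to meet him?",
    "Do you want to meet her?",
    "Do you want to meet them?",
]
print(filter_suggestions(candidates))  # ['Do you want to meet them?']
```

   A blunt filter like this sidesteps the prediction problem entirely, which is exactly the trade-off Google chose: no pronoun suggestions at all rather than occasionally wrong ones.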
   The way machine learning uses our data to make suggestions is not perfect yet. Artificial intelligence feeds on our privacy and personal data in order to personalize what it offers us, and its predictions can only mirror the patterns in that data. If we want technology to help us, we must help it by sharing our actual interests and desires.
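   As a toy illustration of how skewed data becomes skewed suggestions (this is not Google's model, and the training sentences are invented), a simple count-based predictor will suggest "he" after "investor said" purely because "he" occurred there more often:

```python
# Toy count-based next-word predictor. Invented data; illustrative only.
from collections import Counter

training_sentences = [
    "the investor said he would call",   # imagine many examples like this
    "the investor said he agreed",
    "the investor said she would call",  # and fewer like this
]

# Count which word follows the context "investor said" in the data.
counts = Counter()
for sentence in training_sentences:
    words = sentence.split()
    for i in range(len(words) - 2):
        if words[i:i + 2] == ["investor", "said"]:
            counts[words[i + 2]] += 1

# The "suggestion" is just the most frequent continuation.
print(counts.most_common(1)[0][0])  # -> 'he'
```

   The model is not malicious; it is faithful to its data, which is precisely why skewed data produces skewed suggestions.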
Δρ. Κωνσταντίνος Μάντζαρης, Dr. Konstantinos Mantzaris, Economistmk

Thanks for reading! Have a Creative Day!