Google Translate Decreases Bias
In the past, if you entered “o bir doktor” in Turkish into Google Translate, you would get the result: “He is a doctor.” In a blog post, the company explained that translations were based on common usage, so “it would skew masculine for words like strong or doctor, and feminine for other words, like nurse or beautiful.”
Now, Google Translate will offer both a masculine and a feminine possible translation. The company plans more changes: “We're already thinking about how to address non-binary gender in translations, though it’s not part of this initial launch.”
A Gmail product manager identified the gender-bias problem in the Smart Compose technology, which is used to predict what users will type. Computer-generated follow-up questions to “I am meeting an investor next week” included “Do you want to meet him?”
Gendered pronouns are one issue AI programmers want to address to improve natural language generation (NLG), the technology that finishes our sentences for us.
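The mechanism behind this skew can be illustrated with a minimal sketch. The corpus, function name, and word pairs below are all hypothetical, not Google’s actual system: a predictor that simply picks the most frequent pronoun following a word will reproduce whatever imbalance exists in its training data.

```python
from collections import Counter

# Hypothetical toy corpus of (context word, following pronoun) pairs.
# Real training data is vastly larger; this sketch only shows how
# frequency-based prediction inherits the skew of its corpus.
corpus = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
]

def predict_pronoun(context_word):
    """Return the pronoun that most often follows context_word in the corpus."""
    counts = Counter(p for w, p in corpus if w == context_word)
    return counts.most_common(1)[0][0]

print(predict_pronoun("doctor"))  # "he"  -- the 2-to-1 corpus skew wins
print(predict_pronoun("nurse"))   # "she" -- same mechanism, opposite pronoun
```

Google’s fix of offering both a masculine and a feminine translation sidesteps this winner-take-all behavior rather than changing the underlying frequencies.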
Discussion:
What’s your experience with NLG? For example, how helpful do you find Gmail’s suggestions for finishing your sentences in email?
What’s your view of Google’s attempt to decrease gender bias? Is this a worthy goal? Why or why not?