So much of our life is determined by algorithms. From what you see on your Facebook News Feed, to the books and knickknacks recommended to you by Amazon, to the disturbing videos YouTube shows to your children, our attention is systematically parsed and sold to the highest bidder.
These mysterious formulas that shape us, guide us, and nudge us toward someone else's idea of an optimal outcome are opaque by design. Which, well, perhaps makes it all the more frustrating when they turn out to be sexist.
Enter Google Translate, the automated service that makes so much of the web comprehensible to so many of us. Supporting 103 languages, the digital Babel Fish directly influences our understanding of languages and cultures different than our own. In providing such an important tool, Google has assumed the responsibility of accurately translating the content that passes through its servers.
But it doesn't always. Or, perhaps more precisely, where language leaves a gray area, Google Translate can fall into the same traps as humans.
That seems to have been demonstrated by a series of tweets showing Google Translate in the act of gendering professions in a way that can only be described as problematic.
"Turkish is a gender neutral language," tweeted writer Alex Shams. "There is no 'he' or 'she' - everything is just 'o'. But look what happens when Google translates to English."
The results, which he screengrabbed, are painful. "She is a cook," "he is an engineer," "he is a doctor," "she is a nurse," "he is hard working," "she is lazy," and so on.
And this is not a Turkish-to-English specific problem. Taika Dahlbom shared a similar outcome when she translated Finnish to English.
Look,how @Google Translate does #sexism! #Finnish has a gender neutral third-person pronoun. But Google decides, if a job title is good to go with the male or the female English third-person pronoun. Idea: @seyyedreza in #Turkish pic.twitter.com/jU9Su0JXd5
— Taika Dahlbom (@TaikaDahlbom) November 28, 2017
So what is going on here? A Google spokesperson was kind enough to partially fill us in.
"Translate works by learning patterns from many millions of examples of translations seen out on the web," the person explained over email. "Unfortunately, some of those patterns can lead to translations we’re not happy with. We’re actively researching how to mitigate these effects; these are unsolved problems in computer science, and ones we’re working hard to address."
This explanation fits in with the general understanding that currently exists. It all comes back to those algorithms that drive machine learning-powered services across the web.
Essentially, when an untold number of biases (gender or otherwise) exist in our literature and language, such as the assumptions that nurses are inherently women or that engineers are bound to be men, these can seep through into Google Translate's output.
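To see how simply "learning patterns from examples on the web" can reproduce bias, consider a deliberately crude sketch. This is not Google's actual system (which uses neural machine translation, not lookup), and the toy corpus and its counts are invented for illustration: if a model just favors the translation it has seen most often, a skewed corpus yields a skewed translator.

```python
from collections import Counter

# Toy parallel "corpus": gender-neutral Turkish sentences paired with the
# English translations a model might have scraped from the web.
# The sentences and their counts are invented for illustration only.
corpus = [
    ("o bir doktor", "he is a doctor"),
    ("o bir doktor", "he is a doctor"),
    ("o bir doktor", "she is a doctor"),
    ("o bir hemsire", "she is a nurse"),
    ("o bir hemsire", "she is a nurse"),
    ("o bir hemsire", "he is a nurse"),
]

def translate(source: str) -> str:
    """Return the most frequently observed translation for this sentence."""
    counts = Counter(tgt for src, tgt in corpus if src == source)
    return counts.most_common(1)[0][0]

print(translate("o bir doktor"))   # the majority translation wins
print(translate("o bir hemsire"))
```

The model isn't "deciding" anything about gender; it is faithfully echoing whichever pairing dominated its training data, which is exactly why skewed data produces the screenshots above.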
We've seen this before, as recently as October. It was only last month that another Google service — the Cloud Natural Language API — was spotted assigning negative values to statements like "I'm queer" and "I'm black."
Even that wasn't a wholly new observation. An August study in the journal Science found "that applying machine learning to ordinary human language results in human-like semantic biases."
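The bias that study describes can be measured geometrically: models represent words as vectors, and occupation words trained on web text end up closer to one gendered cluster than the other. A minimal sketch of that measurement, using hand-made 2-D vectors invented purely for illustration (real embeddings have hundreds of dimensions):

```python
import math

# Hand-made 2-D "embeddings", invented for illustration only.
# Real embeddings are learned from text, but the measurement is the same.
vectors = {
    "he":       (1.0, 0.1),
    "she":      (0.1, 1.0),
    "engineer": (0.9, 0.2),
    "nurse":    (0.2, 0.9),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def gender_lean(word: str) -> float:
    """Positive: the word sits closer to 'he'; negative: closer to 'she'."""
    v = vectors[word]
    return cosine(v, vectors["he"]) - cosine(v, vectors["she"])

print(gender_lean("engineer"))  # positive: leans toward "he"
print(gender_lean("nurse"))     # negative: leans toward "she"
```

A difference-of-similarities score like this is the core of the association tests used in the embedding-bias literature: the model's "opinion" about an occupation is just its position relative to gendered words.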
It seems that, in attempting to build an automatic translator that can approach a human in its ability, Google may have managed to pick up some rather human-like limitations along the way.
This story has been updated with a statement from Google.