Google Translate already offers results for them ... and they

So-called inclusive language consists of giving visibility to both genders on all those occasions where grammar offers us a single word as the solution and, as you know, it has sparked countless debates, with groups firmly in favour and others radically against it.

Since we do not expect to resolve that conflict here, we will simply note that technology companies are making significant efforts to eliminate so-called gender bias in their communications and translations, which caused professions such as doctor to be automatically assigned to a man and nurse to a woman.

Google offers gender alternatives.

That being so, Google's artificial intelligence team has published a complete report presenting a scalable model for its Translate service that makes it possible to obtain translations that take all possible gender alternatives into account. In other words, since the system does not know whether we are referring to him or to her, it will display both results so that the user decides which one to keep.

Google Translate results without gender bias.

As you can see in the screenshots just above, Google offered an example of the results that can be obtained from now on and compared them with those the translator returned previously. If we wrote "My friend is a doctor", the artificial intelligence used to assume by default that we were referring to a man; now it no longer does so and instead shows us the two possibilities.
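To make the behaviour concrete, here is a minimal sketch of the idea: instead of a single, male-by-default result, the translator returns one alternative per gender and lets the user choose. The function name and the toy lookup table below are invented for illustration only and are not part of any real Google API.

```python
def translate_with_gender_alternatives(text: str, target_lang: str) -> dict:
    """Return every gendered rendering of a gender-ambiguous source sentence."""
    # A hard-coded toy table standing in for an actual translation model.
    examples = {
        ("My friend is a doctor", "es"): {
            "feminine": "Mi amiga es doctora",
            "masculine": "Mi amigo es doctor",
        },
    }
    return examples.get((text, target_lang), {})


for gender, translation in translate_with_gender_alternatives(
        "My friend is a doctor", "es").items():
    print(f"{gender}: {translation}")
```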

In any case, the work behind these improvements is far from trivial. On the blog, the researchers explain that it was hard to make the platform aware of these particular nuances. They gave the example of translations from Turkish into English and other languages in which the final results contained errors: using what they call neural machine translation (NMT), "it was not possible to display gender-specific translations for more than 40%" of the queries made. A meagre percentage indeed.

The solution they found was to feed in millions of sentences in English to create a base model. They then added the feminine variant of every masculine sentence that allowed it, and vice versa, before merging both sets into a model capable of recognising these subtle changes in translation.
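As a rough illustration of that data-augmentation idea, the sketch below starts from a handful of English sentences, generates the opposite-gender variant of each one where possible, and keeps both versions as pairs. The word lists and sentences are invented for illustration; Google's actual pipeline is far more elaborate and is not published as code.

```python
# Toy substitution tables, built explicitly in both directions to avoid
# ambiguous reversals (e.g. "her" could map back to "his" or "him").
MASC_TO_FEM = {"he": "she", "his": "her", "himself": "herself", "actor": "actress"}
FEM_TO_MASC = {"she": "he", "her": "his", "herself": "himself", "actress": "actor"}


def gender_variant(sentence: str, table: dict) -> str:
    """Swap gendered words according to the table, leaving everything else intact."""
    return " ".join(table.get(word, word) for word in sentence.split())


def augment(corpus: list[str]) -> list[tuple[str, str]]:
    """Pair each sentence with its opposite-gender counterpart, in both directions."""
    pairs = []
    for sentence in corpus:
        for table in (MASC_TO_FEM, FEM_TO_MASC):
            variant = gender_variant(sentence, table)
            if variant != sentence:
                pairs.append((sentence, variant))
    return pairs


if __name__ == "__main__":
    corpus = ["he is a doctor", "she is an engineer", "the actor won an award"]
    for original, variant in augment(corpus):
        print(f"{original}  ->  {variant}")
```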