Google's Gboard virtual keyboard was spotted suggesting sexual terms when the word “neguinha” was typed. The application for Android phones and iPhone (iOS) has a prediction feature that suggests words based on what the user has previously typed.
However, in TechTudo's tests, inappropriate terms such as “hired” and “brand new” were suggested even immediately after installing the keyboard and using it for the first time on a Galaxy A10 and a Huawei P30 Pro.
In response, Google stated that “Gboard was designed to avoid biased predictions in its generic models, but human language is complex and, as with any system that filters sensitive phrases, inappropriate suggestions sometimes enter machine learning models. When we discover an inappropriate suggestion, we work quickly to remove it.” TechTudo confirmed on Thursday (30) that the offensive suggestions had been removed from the keyboard.
When the word “neguinha” was typed in Gboard, the terms the keyboard suggested in the tests were “brand new” and “assanada”. The application uses artificial intelligence to improve features such as spelling correction and text prediction, a technology that learns from the user's habits in order to return relevant results. However, displaying sexual terms even on first use of Gboard raises concerns about racism and sexism, since the same does not happen when typing variations such as “neguinho”, “negra” and “negro”, which return generic terms such as “that”, “e” and “da”.
This is not the first time Google's automated systems have failed at fair representation: last year, users found that the search tool displayed explicit sexual images, indexed from pornography sites, when searching for “black woman teaching”, while the same did not happen with equivalent keywords about white women. In France, Google had to change its search algorithm so that pornographic results would stop appearing in searches for “lesbienne” (“lesbian”, in French) and other terms related to sexual orientation.
The machine learning Google mentions in its statement uses artificial intelligence to observe user behavior and offer suggestions relevant to each person. The technology appears, for example, in Spotify's music recommendations and in Google Maps' alternative routes. Although it may seem “neutral” because it automates actions, the technique can reflect society's prejudices, since it learns from content produced by people.
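The mechanism behind text prediction can be illustrated with a toy model. The sketch below is a deliberately simplified bigram predictor, not Gboard's actual model: it counts which word follows each word in a training corpus and suggests the most frequent followers. It shows why such systems are only as “neutral” as their training data, since whatever associations the corpus contains are reproduced as suggestions.

```python
from collections import Counter, defaultdict

# Toy next-word predictor (bigram counts). Hypothetical illustration,
# not Gboard's real architecture.

def train(corpus):
    """Count, for every word, which words follow it in the corpus."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            follows[prev][nxt] += 1
    return follows

def suggest(follows, word, k=3):
    """Suggest the k most frequent words seen after `word`."""
    return [w for w, _ in follows[word.lower()].most_common(k)]

# The statistics simply mirror the training text: if loaded or biased
# pairings dominate the corpus, they dominate the suggestions too.
corpus = [
    "the movie was great",
    "the movie was boring",
    "the movie was great fun",
]
model = train(corpus)
print(suggest(model, "was"))  # prints ['great', 'boring']
```

Because “great” follows “was” twice and “boring” only once, “great” ranks first. A production keyboard uses far larger models, but the principle is the same: suggestions are learned from, and therefore shaped by, the text people produce.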