Artificial intelligence that can detect signs of racism

Researchers at the University of Virginia, in the United States, have developed an artificial intelligence (AI) system that can identify and quantify physiological signals associated with racism, a technology presented in a study recently published on arXiv.

In the study, 76 volunteers underwent an Implicit Association Test, a method that detects implicit racial prejudice from people's reactions when they look at images and words meant to be associated with expressions such as "dark skin", "fair skin", "bad" and "good".

During the test, participants wore devices such as an Apple Watch or another type of smartwatch, which measured their physiological reactions when they encountered unfamiliar people and possible threats, represented by the figures shown to them.

Then it was the AI algorithm's turn to act, analyzing the volunteers' responses together with the data collected by the smartwatch during the test. The goal was to determine whether a specific combination of physiological responses can reveal that a particular person is experiencing involuntary feelings of racism.
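The classification step described above can be sketched in a toy form: physiological features recorded by a wearable are fed to a classifier that predicts a bias label. The feature names, numbers, and the nearest-centroid model below are illustrative assumptions, not the study's actual method or data.

```python
# Hypothetical sketch: classify participants from made-up physiological
# features (heart-rate change, skin-conductance change). The nearest-centroid
# model and all values are assumptions for illustration only.
import statistics

# (heart_rate_delta, skin_conductance_delta) per participant, plus a label
samples = [
    ((4.2, 0.9), 1), ((3.8, 1.1), 1), ((5.0, 0.8), 1),
    ((0.5, 0.1), 0), ((1.0, 0.2), 0), ((0.8, 0.3), 0),
]

def centroid(points):
    # Mean of each feature across a group of samples
    return tuple(statistics.mean(c) for c in zip(*points))

def predict(x, c0, c1):
    # Assign the label of the nearer class centroid (squared distance)
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return 1 if d1 < d0 else 0

c0 = centroid([x for x, y in samples if y == 0])
c1 = centroid([x for x, y in samples if y == 1])

correct = sum(predict(x, c0, c1) == y for x, y in samples)
accuracy = correct / len(samples)
print(f"training accuracy: {accuracy:.1%}")
```

In practice the researchers would use a far richer model and held-out test data; the point here is only the shape of the pipeline: signals in, bias prediction out, accuracy as the score.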

What was the result?

According to the University of Virginia researchers, the AI they developed was able to predict implicit racial bias from the analysis of physiological signals with an accuracy of 76.1%.

This accuracy is considered low, but given that the purpose of the study was not to create a smartwatch capable of detecting racists, the result may help researchers better understand the mental association of dark skin color with something negative, and the physiological manifestations of that kind of reaction, as The Next Web points out.

In other words, the AI does not label prejudice or racism; it only points out the physiological "side effects" associated with them, and further tests are needed to confirm this ability.
