Several artificial intelligence systems have turned out to be racially biased

Systems for recognizing human reactions, emotions, and states of mind are, in one form or another, already in use at various large companies. At IBM and Unilever, for example, they help screen applicants for open positions; at Disney, artificial intelligence studies the emotions of theater audiences to evaluate new films; and Microsoft products track how students react to a teacher's presentation of course material. But recently the field was shaken by a serious scandal.

Any emotion recognition system consists of two modules. The first is a camera with computer vision that analyzes a person's behavior and recognizes their gestures and facial expressions. The second is a specialized AI that, based on data about the person's current actions and information about the situation, must infer what is going on in their head: assess the behavior, draw conclusions about the subject's inclinations, and check them against the criteria of the task.
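To make that two-module structure concrete, here is a minimal sketch in Python. It is illustrative only: every name in it (FaceFeatures, vision_module, emotion_module, and so on) is hypothetical and not taken from any of the products mentioned above, and the hand-written rules stand in for what would really be a trained model.

```python
# Illustrative two-module pipeline; all names are hypothetical,
# not taken from any product mentioned in the article.

from dataclasses import dataclass


@dataclass
class FaceFeatures:
    smile_intensity: float  # 0.0 (none) to 1.0 (broad smile)
    brow_raise: float       # 0.0 to 1.0
    eye_openness: float     # 0.0 to 1.0


def vision_module(frame: bytes) -> FaceFeatures:
    """Module 1: computer vision. A real system would run a face
    detector and landmark model on the camera frame; here we return
    placeholder values."""
    return FaceFeatures(smile_intensity=0.8, brow_raise=0.3, eye_openness=0.9)


def emotion_module(features: FaceFeatures, context: str) -> str:
    """Module 2: specialized AI. A real system would use a trained
    classifier over the features plus situational context; a couple
    of hand-written rules stand in for it here."""
    if features.smile_intensity > 0.6 and context == "interview":
        return "positive / engaged"
    if features.eye_openness < 0.3:
        return "tired / disengaged"
    return "neutral"


if __name__ == "__main__":
    frame = b"\x00" * 16  # stand-in for a camera frame
    features = vision_module(frame)
    print(emotion_module(features, context="interview"))
```

The key design point is the handoff: the vision module reduces raw pixels to a compact feature vector, and only the second module attaches meaning to those features. As the rest of the article shows, that second step is where biased training data does its damage.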

In early winter, researchers at Wake Forest University published findings about a major flaw in one such emotion analyzer from Microsoft. When analyzing facial expressions, the system consistently gave lower scores to applicants with darker skin. Even when they smiled sincerely, the AI labeled them "deceptive" and could significantly lower its assessment of the person's mental abilities. In short, it exhibited a near-textbook pattern of racial prejudice, one that did not appear with lighter-skinned people.

And now, several months later, similar problems have been found in many other systems for recognizing emotions, and even images, of people. The darker a person's skin tone, the more often the algorithms fail, to the point that the onboard AI of some self-driving cars did not register dark-skinned pedestrians as living beings. This is unlikely to be someone's large-scale joke or conspiracy; more likely, the issue lies in how the data used to train these AIs was collected in the first place. And that can be a big problem.
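The training-data explanation is straightforward to illustrate. Below is a hypothetical Python sketch of the kind of audit researchers run: it measures a model's accuracy separately for each demographic group in a labeled test set, which is how skews like the ones described above are usually detected. The sample data and the trivial predict function are invented for the example.

```python
# Hypothetical bias audit: compare a model's accuracy across
# demographic groups. The sample data and the "model" are made up.

from collections import defaultdict

# (group, true_label) pairs standing in for a labeled test set
test_set = [
    ("lighter", "smiling"), ("lighter", "smiling"), ("lighter", "neutral"),
    ("darker", "smiling"), ("darker", "smiling"), ("darker", "neutral"),
]


def predict(group: str, true_label: str) -> str:
    """Stand-in for a model trained on skewed data: it misreads
    smiles in the underrepresented group, mimicking the flaw
    described in the article."""
    if group == "darker" and true_label == "smiling":
        return "deceptive"  # systematic error for one group only
    return true_label


correct: dict = defaultdict(int)
total: dict = defaultdict(int)
for group, label in test_set:
    total[group] += 1
    if predict(group, label) == label:
        correct[group] += 1

for group in total:
    acc = correct[group] / total[group]
    print(f"{group}: accuracy {acc:.0%}")  # the gap exposes the bias
```

Run on this toy data, the audit reports 100% accuracy for the "lighter" group and 33% for the "darker" group: exactly the kind of per-group gap that, at scale, turns an underrepresented training set into the failures described above.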