Basic bigot bait: Build big black broad bots – non-white, female 'droids get all the abuse

What the fsck is wrong with people... judging from this study

If you want your robot to be abused, do as The Rolling Stones suggest and paint it black. Also, make it female.

Researchers from the University of Texas Rio Grande Valley in the US recently evaluated, for a research paper, how people respond to online videos of female-presenting robots and of women. Their goal was to assess whether machines given non-white social identities are mistreated more readily than those presenting as white, and whether robots fare worse than women of the same apparent race.

As other academics have noted, most robots are white, and perceptions of gender and race affect how people interact with robots.

Megan Strait, assistant professor of computer science at UTRGV, Ana Sanchez Ramos, program specialist, and students Virginia Contreras and Noemi Garcia reviewed more than 1,200 comments associated with six online videos.

Three of the videos depict robots designed to appear female – gynoids, as opposed to androids – and given black, white and Asian identities. The bots are referred to as Bina48, Nadine and Yangyang, respectively.

The other three videos depict women with corresponding racial identities – singer Beyonce Knowles, actor Cameron Diaz (who presents as white, despite her Spanish/Cuban ancestry, the paper explains) and model Liu Wen.

The researchers hypothesized that the white robot (Nadine) would be treated better in comments than the black- and Asian-presenting gynoids, Bina48 and Yangyang. They also anticipated that commenters would be more abusive toward robots than people.

Radical

They found support for both assumptions. "The data indicate general support for our hypothesis: that people readily extend racial biases and employ stereotypes to dehumanize robots implicitly racialized in the likeness of marginalized human identities," the paper explains.

The compu-boffins also observe that for all three racial models presented, "people were consistently and significantly more negative towards and dehumanizing of the gynoids relative to their human counterparts."

They claim such boorish behavior cannot simply be attributed to the online disinhibition that makes aggression less risky than it would be in face-to-face interaction. However, they allow that further research could help resolve whether the celebrity of the human subjects influenced the tenor of associated comments.

"It is not surprising that antisocial behavior in the form of gender-based stereotyping, bias, and aggression extends to human-like interactions with gynoids," the researchers observe. "What is surprising, however, is the frequency at and degree to which aggression towards female-gendered robots manifests."

While hostile commentary doesn't necessarily predict violence against robots in the real world, the researchers observe that their findings support a growing body of evidence that people will readily abuse robots.

They argue that those designing robots should consider abuse-avoidance mechanisms, conflict de-escalation strategies, and mediation options in the face of aggression during multi-person human-robot interactions. ®
