Virtual chatterbot sex

Sheryl Brahnam, Assistant Professor in Computer Information Systems at Missouri State University, has stated that “If we’re practicing abusing agents of different types, then I think it lends itself to real world abuse.” It is troubling that harassment of female chatbots may contribute to the already widespread problem of sexual harassment of women.

In a similar vein, studies have examined how violence against and sexual objectification of women in video games influence rape myth acceptance, the internalization of attitudes that justify or excuse rape.

Artificial intelligence may be crossing into the uncanny valley, a phenomenon where a design that is similar, but not identical, to a human being provokes a strongly negative response to the simulated likeness. Justine Cassell, Professor of Human-Computer Interaction at Carnegie Mellon, states that “The more human-like a system acts, the broader the expectations that people may have for it.” Because virtual assistants do not visually indicate their functionality, we may assume they have more abilities than they actually do, or probe them to find their limits. Since AIs lack emotional capability, does that mean these actions are harmless?

Though abusing a virtual assistant may seem innocuous, this behavior reflects back on our character.

When faced with unfamiliar artificial intelligence (AI) in the form of chatbots, from AIM’s SmarterChild to Apple’s Siri, humans try to push the boundaries of its capabilities. According to Brahnam, 10%-50% of our interactions with conversational agents (CAs) are abusive.

Many in the UX field are now considering hedonomics, or the “branch of science and design devoted to the promotion of pleasurable human-technology interaction”.

Research shows that across cultures, we associate qualities such as kindness, helpfulness, warmth, and communicativeness with women’s voices, which may be why people prefer female voices for their chatbots.

In my opinion, though, these female virtual assistants perpetuate the stereotype that women are subservient. Brahnam’s research found that users direct more sexual and profane comments towards female-presenting chatbots than male-presenting chatbots.

I liken it to teaching young girls how to avoid sexual assault rather than teaching others not to commit it.

While UX designers strive to create the ideal experience, what does it say about our society if cruelty is tolerated and widespread?

As designers, what is our ethical responsibility: to accommodate this behavior or not?

Among the AI virtual assistants that have emerged, such as Siri, Alexa, and Google Home, the majority of personas are female.

While some identify as gender-neutral, the default voice is typically that of a woman.
