Stereotyping Femininity in Disembodied Virtual Assistants
This paper examines the rhetorical means by which virtual assistants perpetuate Western (American) gender stereotypes in their interactions with users. While prior literature has studied gender stereotypes in online service chatbots, scholarly research on virtual assistants embedded in standalone devices (e.g., Siri, Cortana, Alexa) is lacking. To conduct this study, I draw on a combination of identification theory and gender performance theory to show how these virtual assistants enact gender with users.
The findings of this analysis demonstrate that virtual assistants readily enact harmful gender stereotypes precisely because they are disembodied representations of femininity. As evidence, media reviewers hoping to demonstrate the devices' abilities have asked the assistants questions that would be inappropriate in face-to-face communication.