Discussant: Sareeta Amrute (University of Washington)
Keywords: avatars, A.I., animation, simulation, race, disability studies, anthropology of technology
Is the avatar enjoying new life? From Snapchat selves, to virtual reality military training, to industrial robots in manufacturing warehouses, to conversational assistants like Siri and Alexa, people are coming into intimate proximity with artificial agents and objects intended to convincingly resemble or capture some aspect of “the human.” Designers of digital avatars strive for authenticity in portraying realistic voices, faces, and bodies. As queer feminist scholars have noted, however, such notions of authenticity depend upon notions of “spiritedness” (Ngai 2002) and “life-likeness” (Stacey and Suchman 2012) that tend to employ the affective labor of marginalized people, particularly people of color. Lauren Michele Jackson (2017), for example, has argued that the recent rise in the sharing of GIFs and memes featuring predominantly non-white individuals amounts to “digital Blackface” that outsources emotional expression to the moving bodies of Black people as avatars.
Running alongside these technological developments and the hopes and anxieties that follow them, anthropologists have reinvigorated debates around “fakery” and “resemblance,” bringing theories of secrecy (Simmel 1906), revelation (Jones 2014), virtuality (Boellstorff 2016; Helmreich 2000), and animation (Silvio 2011; Gershon 2015; Manning and Gershon 2013) to bear on newer practices of fabrication, misinformation, and virtual design. At the core of these concerns is the fidelity of representational and mimetic technologies to the objects they purport to portray. What is the work of making resemblance convincing? Whose labor is deployed to make the artificial “real enough”?
In this panel, we explore the reanimation of the virtual or artificial human at the nexus of digital play, data mining, commercial and military operations, and assistive technology. We invite papers that examine issues of artifice and artificiality, and that play at the tensions between, and interfaces of, the fake and the real. How might machines be “disclosing agents” (Suchman 2007) that expose the seam of what it means to be human? How is the machinic used to figure new forms of being human, but also to replicate and reinforce the dehumanization of some for the benefit of others? How do “new” virtual human and avatar projects replicate older racializing and gendering logics? We especially welcome papers that engage with historical questions of the erasure of certain kinds of raced, classed, dis/abled, and gendered labor in the history and social practice of computing.