Sharp Image, Vague Face: Disrupting Facial Transparency in A.I. through a Diasporic Approach

Abstract: Algorithmic bias often arises from a lack of data diversity. A commonly adopted solution is to improve an A.I. system with datasets drawn from minority groups, which implies a broader and more invasive process of data harvesting. Unfortunately, recent discussions of A.I. ethics fail to recognize this procedure as a constant exposure of the marginalized, including the diaspora, or to address the latent risks posed by such inescapable watching and listening. Meanwhile, the potential of the diaspora’s elusive identity has received scant attention as a possible means of resisting the persistent gaze of the dominant. Addressing this gap, my research criticizes compulsory transparency in facial recognition as an exercise of power while imagining an alternative, indistinct A.I. ethics originating from the diaspora. First, my study elucidates how power is exercised through the pursuit of facial transparency and certainty, and elaborates on how the making of facial datasets colludes with colonial photography in this respect. Second, my research unpacks a poetic opacity rooted in the nomadic identity of the diaspora. Such ambiguity has the potential to contribute to an A.I. ethics centered on marginalized communities and to resist top-down viewing. In addition, my study takes the documentary Welcome to Chechnya as a case study, arguing that the obfuscation created by deepfake technology in this work not only protects the privacy and dignity of the Chechen LGBTQ diaspora but also opens a chance to challenge the totalitarian surveillance system. Last, my research articulates that the potential of A.I. for the weak lies not in how accurate and transparent an algorithm can be, but in the extent to which those people can retain their opacity and invisibility with A.I. in the face of a viewing entangled with power.

Presented at Many Worlds of AI: Intercultural Approaches to the Ethics of Artificial Intelligence Conference at the University of Cambridge, 26-28 April 2023