Race After Technology by Ruha Benjamin

Author: Ruha Benjamin
Language: English
Format: epub
ISBN: 9781509526437
Publisher: Wiley
Published: 2019-09-09T00:00:00+00:00


Exposing Privacy

Fanon’s (2008) Black Skin, White Masks reverberates through the work of MIT Media Lab researcher Joy Buolamwini, who discovered that the facial recognition technology with which she was working could detect the contours of her face only when she put on a White mask. This is what she calls the “coded gaze.” Buolamwini established the Algorithmic Justice League as part of her quest for “full spectrum inclusion,” to counter the bias she experienced.62 She asks: “If we do not improve the systems and they continue to be used, what are the implications of having innocent people identified as criminal suspects?”

While inclusion and accuracy are worthy goals in the abstract, given the encoding of long-standing racism in discriminatory design, what does it mean to be included, and hence more accurately identifiable, in an unjust set of social relations? Innocence and criminality are not objective states of being that can be detected by an algorithm but are created through the interaction of institutions and individuals against the backdrop of a deeply racialized history, in which Blackness is coded as criminal. Inclusion in this context is more akin to possession, as in Fanon’s plea that the “tool never possess the man,” where possession alerts us to the way freedom is constrained.

Consider a population-wide facial recognition program in which the Zimbabwean government has contracted a China-based company to track millions of Zimbabwean citizens in order to make the Chinese database more comprehensive by “more clearly identify[ing] different ethnicities.” The benefit for Zimbabwe is access to a suite of technologies that can be used by law enforcement and other public agencies, while positioning China to become “the world leader in artificial intelligence.”63 Transnational algorithmic diversity training par excellence! Perhaps. Or, better, neocolonial extraction for the digital age in which the people whose faces populate the database have no rights vis-à-vis the data or systems that are built with their biometric input. Not only that. Since the biggest application of facial recognition is in the context of law enforcement and immigration control, Zimbabwe is helping Chinese officials to become more adept at criminalizing Black people within China and across the African diaspora.

Racist structures do not only marginalize but also forcibly center and surveil racialized groups that are “trapped between regimes of invisibility and spectacular hypervisibility,”64 threatened by inclusion in science and technology as objects of inquiry. Inclusion is no straightforward good but is often a form of unwanted exposure. Jasmine Nichole Cobb’s insight that “invisibility is … part of the social condition of blackness in modernity as well as an important representational tactic for people of African descent” – what Rusert describes as that “dialectic of calculated visibility and strategic invisibility” – is relevant to countering the New Jim Code.65

The figure of Saartjie (“Sara”) Baartman illustrates the violent underside of being forcibly seen. Baartman, who was taken from South Africa to Europe in 1810, was publicly displayed for large audiences in London and Paris, photographed, studied, and eventually dissected in death by the leading
