Voices (Towards Other Institutions) #13
Below is the beginning of Ramon Amaro’s contribution to Voices (Towards Other Institutions). To listen to the entire audio recording, click below.
«In our current social and political climate, issues of race and racism either are or should be at the forefront of any institutional conversation. This is all the more apparent in the events surrounding the brutal killing of George Floyd and in the increased attention given to anti-racism efforts around the globe. And the digital landscape is no exception. The collision of racialized bodies and technology long precedes the digital. Yet the ideological processes that created this allowability for the use of some bodies as experimental spaces in the production of culture have remained the same since.
The distances between our social relations shorten through the proliferation of data-driven technologies such as machine learning and artificial intelligence algorithms. The racial politics that constitute a critical foundation for technological use and implementation travel with these types of artifacts of culture. They also bridge the gap between the historical circumstances of race-based injustices and contemporary digital practices.
Today, racialized bodies continue to reside as experimental spaces between data-driven progress and the outcomes of analysis, where racial politics are an intermediary of the body's acceptance into the cultural milieu. In other words, the racialized body is often the place where technology plays out a systemic way of doing culture, a culture which functions by mediating the raciality of its users and techniques to create specific modalities of oppression and discrimination.
I want to consider a particular strand of this type of digital practice: the use of data and algorithms for the surveillance and policing of historically marginalized and racialized groups and individuals. Some of these algorithmic practices, for example the process commonly known as predictive policing, do little to consider that the data being analyzed might contain pre-existing biases, or that research into crime or gang violence, for instance, can stigmatize individuals and communities by creating nuanced views of individual and group characteristics. They also simplify the complex experiences of these individuals into a set of predetermined statistics. That these communities are predominantly racialized or ethnic minorities raises additional questions, whereby imbalanced data practices have led organizations such as the Leadership Conference on Civil and Human Rights to advocate for technologies that are designed and used in ways that respect the values of equal opportunity and equal justice, and that contribute to the life chances of racialized individuals.»
Ramon Amaro is a design engineer and researcher, currently teaching in the Department of Visual Cultures at Goldsmiths, University of London. His writing, research, and practice emerge at the intersections of Black Study, psychopathology, digital culture, and the critique of computational reason. His ultimate aim is to develop new methodologies for the study of race and digital culture.