An angry assemblage of algorithmic accountability


To understand the complexity of the actors involved in a case study of a biased commercial risk prediction algorithm, I created a ‘critical visualization’ that distills key concepts of systemic inequity into a visual assemblage. This is an example of an iterative research through design (RtD) activity that furthered my thinking at a critical point in my research.

This critical visualization examines a case study of bias in a medical algorithm that under-identified Black patients for a high-risk health prevention program by making the data set “color blind” (Obermeyer et al., 2019). I translated this case study into a diagram in order to understand how designers might differently approach problem-framing and choice-making by visualizing the case as a sociotechnical assemblage that includes concepts of justice and systemic inequity.

The research questions for this visualization are:

  • How does an understanding of the technology as a sociomaterial assemblage reveal new knowledge about the materiality of computation as well as systemic factors such as oppression?

  • How do asymmetries of power and flows of oppression become amplified or attenuated within the systemic context of an information and communication technology (ICT)?
