The Moral Affordances of Construing People as Cases: How Algorithms and the Data They Depend on Obscure Narrative and Noncomparative Justice

Like many modes of rationalized governance, algorithms depend on rendering people as cases: discrete entities defined by regularized, atemporal attributes. This rendering enables the computation behind the behavioral predictions that organizations increasingly use to allocate benefits and burdens. Yet it elides another foundational way of understanding people: as actors in the unfolding narratives of their lives. The distinction has epistemic implications because each cultural form entails a distinct information infrastructure. In this article, I argue that construing people as cases also carries consequences for moral reasoning, because different moral standards require different information. While rendering people as cases affords adjudications of comparative justice, parsing noncomparative justice often necessitates narrative. This explains why people frequently reach for stories that sit beyond the representations of individuals found in records and databases. With this argument, I contribute to the sociology of categorization and classification and draw broader conclusions about modern systems of bureaucratic, computational, and quantitative governance.