Abstract:
This paper examines the racial implications of police officers' interactions with algorithms, focusing on racial disparities in rearrest predictions. In our experiment, police officers viewed profiles of young offenders and predicted the probability of rearrest within three years, both before and after seeing an algorithm's assessment. We varied the visibility of each offender's race across three groups: revealed in one, hidden in another, and mixed (some profiles shown with race, some without) in a third. We also examined how informing officers about the model's accuracy affected their responses. Our findings indicate that officers adjust their predictions toward the algorithm's assessment when the profile's race is disclosed. These adjustments, however, exhibit significant racial disparities: a sizable gap in initial rearrest predictions persists between Black and White offenders even after controlling for all observable characteristics. Moreover, only Black officers significantly reduced their predictions after viewing the algorithm's assessments; White officers did not. These results reveal the limited and nuanced effectiveness of algorithms in reducing bias in recidivism predictions, underscoring the complexities of algorithm-assisted human judgment in criminal justice.