[deleted] wrote

celebratedrecluse wrote (edited)

being error prone is the worst part, perhaps. since the training pools for machine learning are smaller for marginalised groups, and developers can be implicitly racist too, the software produces false positives for lots of POC, "non-passing" people, etc.
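
to see concretely how a skewed training pool turns into skewed errors, here is a minimal sketch (the two-group setup, the sample sizes, and the feature shift are all illustrative assumptions, not measurements from any real system): a classifier fit mostly on the dominant group's data places its decision boundary around that group, so the underrepresented group's non-matches land on the wrong side far more often.

```python
# illustrative sketch only: synthetic data, made-up numbers.
# shows how an underrepresented group with a shifted feature
# distribution ends up with a much higher false positive rate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # non-matches centred at `shift`, true matches at `shift + 2`
    X = np.vstack([
        rng.normal(shift, 1.0, size=(n, 2)),        # label 0: no match
        rng.normal(shift + 2.0, 1.0, size=(n, 2)),  # label 1: match
    ])
    y = np.array([0] * n + [1] * n)
    return X, y

# group A dominates the training pool; group B is scarce, and its
# feature distribution is shifted relative to A's.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(100, shift=1.0)

clf = LogisticRegression().fit(np.vstack([Xa, Xb]),
                               np.concatenate([ya, yb]))

def false_positive_rate(X, y):
    # fraction of true non-matches the model flags as matches
    pred = clf.predict(X)
    negatives = y == 0
    return (pred[negatives] == 1).mean()

# evaluate on fresh held-out samples per group
Xa_test, ya_test = make_group(2000, shift=0.0)
Xb_test, yb_test = make_group(2000, shift=1.0)
print("group A FPR:", round(false_positive_rate(Xa_test, ya_test), 3))
print("group B FPR:", round(false_positive_rate(Xb_test, yb_test), 3))
```

the exact numbers don't matter; the point is that the model optimises for whoever fills the training set, and everyone else absorbs the error.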

so it reifies individual officers' profiling into a systemic, and thus technocratically legitimized, institution. basically, it bakes the racism into an algorithm. it is jim crow for the data era.
