> more likely to correctly identify black participants than participants from other ethnic groups.
> AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.
I wonder if they're more worried about putting too many men in prison or too many black people.
I am genuinely unsure what's going on.
My understanding of the article is that the system is problematic because it is more likely to correctly identify black people than "other ethnic groups". Is that right?
Addendum: Essex ethnicity breakdown: 85.1% White British · 5.2% Other White · 3.7% Asian · 2.5% Black · 2.4% Mixed · 1.1% Other · (2021).
from: https://en.wikipedia.org/wiki/Essex
ie: most accurate (however accurate that is) for the men of 2.5% of the region's population.
Not so accurate for the other 98.75% of the region's population.
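Those figures can be checked with a quick sketch, assuming (as the "men of 2.5%" framing implies) that men are roughly half of that group; the census shares are the ones quoted above, not figures from the article itself:

```python
# Illustrative arithmetic only, based on the 2021 Essex census share
# quoted upthread. Assumes men are ~half of the Black population.
black_share = 0.025      # Black residents as a fraction of Essex population
male_fraction = 0.5      # rough assumption

best_served = black_share * male_fraction   # group the system identifies most accurately
everyone_else = 1 - best_served             # everyone the system serves less accurately

print(f"most accurate for: {best_served:.2%} of the population")   # 1.25%
print(f"less accurate for: {everyone_else:.2%} of the population") # 98.75%
```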
Ditto men vs. women, mutatis mutandis.
So essentially they're pausing the use of it because it works too well for group A / not well enough for group B, potentially leading to disproportionate (albeit correct) arrests of group A.
Technology has no doubt moved on a lot; however, a lazy Google literature search turns up studies finding the opposite (and with order-of-magnitude errors) as recently as 2020:
> these algorithms were found to be between 10 and 100 times more likely to misidentify a Black or East Asian face than a white face
https://jolt.law.harvard.edu/digest/why-racial-bias-is-preva...
gib444•1h ago
Essex police, well aware of all the issues before using it, pause use until expected bad publicity dies down
Or
Essex police chosen as the force to take some flak for the issues while other forces steam ahead