https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.
It's annoying that both articles are calling this an AI error. This was human error: the police did the wrong thing, and the people of Fargo will end up paying for this fuckup.
When I load this URL I get "One more step Please complete the security check to access" and I cannot get past the archive.is computational paywall.
But the guardian article actually has text! Thanks.
There's a history of wrongly arrested individuals with mental disabilities being physically abused in jail, potentially to the point of death.
The AI made the call and humans licked its butthole
Many people are treating this as a matter of philosophy, which it isn't.
At a primitive, physiological level if you delegate to AI and most of the time you don't get in trouble for it, the resulting relationship you have with the AI could only be called "trust".
If you're expected to be 40% more productive at your job, your employer is making it crystal clear that you will trust the AI or you will be fired. Even if nobody ever said it, the sales pitch is that the AI does the work and people are mostly there to be its servants, keeping it fed with decisions we want made but don't want to be responsible for making.
Yes, that's what the OP's "incompetence and negligence" referred to.
Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?
_Some_ of the blame lies on the UX here. It must.
The cops need to be held accountable.
But it’s glaringly obvious that if you build tools like this and give them to the US police this is the outcome you will get. The toolmakers deserve blame too.
Sales will sell the dream, who cares if the real world outcomes don’t align?
Are AI code assist tools built in such a way as to deceive the user into a false sense of trust or certainty? Very much so (even if that isn't a primary objective).
Does any part of the blame lie on the UX if a dev submits a bad change? No, none.
You are ultimately, solely responsible for your work output, regardless of which tool you choose to use. If using your tool wrong means you make someone homeless, car-less, and also you kill their dog, then you should be a lot more cautious and perform a lot more verification than the average senior engineer.
That's setting aside the tendency for police to hire from the left side of the bell curve to avoid independent thinkers that might question authority, refuse to do bad shit, etc.
No, the tools work perfectly as they were designed to work. The problem is that the tools are flawed.
Ultimately, every single one of these decisions should be approved by a human, who should be held responsible for the fuckup no matter what the consequences are.
> _Some_ of the blame lies on the UX here. It must.
No, the blame lies with the person or the group who approve the usage of these tools, without understanding their shortcomings.
New LLM-related AIs are all supremely confident in every assertion, no matter how wrong.
Even if it also outputs a score, that score depends on how the model was trained. And the cops might ignore it anyway.
> No, the blame lies with the person or the group who approve the usage of these tools, without understanding their shortcomings.
The person who approved the tools might've understood, but that doesn't mean the user understands. _Some_ of the reason why the user doesn't understand the shortcomings of the tool might be because of misleading UX.
https://abovethelaw.com/2016/02/criminally-yours-indicting-a...
You can be arrested, indicted, and held in pretrial detention, and there is literally no recourse. There are many other ways jail can happen without due process. Where I live:
* Civil contempt. Absolute immunity. No due process. Record is about 16 years. Having a bad day? Judge can toss you in jail.
* "Dangerous." Half a year. No due process. He-said she-said.
* "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.
Absolutely no recourse. You come out with a gap in income, employment, and, if you missed rent/mortgage, no home. Landlords will simply throw your stuff away too.
You're also basically damned if things do move forward, since from jail, you have no access to evidence, to internet (for legal research), and no reasonable way to recruit a lawyer (and, for most people, pay for one).
Can happen to anyone. Less common if you're rich and can afford a good lawyer, but far from uncommon.
A friend of mine was committed for longer than 3 days without counsel or the ability to represent themselves in the hearing. Apparently the whole process of being committed is ex parte in practice in some states.
At one point, when I was brought to a hospital on false drug charges, they brought in nurses to curse and angrily hurl criminal accusations at me, trying to bait me into verbally defending myself against such brutal personal attacks so they could use it against me. I watched as two physicians carefully observed my reaction, and I managed to respond in the kindest way possible, because I realized they were basically baiting me into doing something they could write up as evidence to support a committal.
Just because you disagree with the outcome doesn't mean that due process wasn't given.
A judge should have to recuse themselves if they are acting as witness to the supposed infraction.
Qualified immunity protects individuals, not departments, from liability.
The particular thread (in this thread) that I was responding to:
>> I hope she wrings at least several million dollars out of the government.
> With all the lovely qualified immunity doctrine? That's wishful thinking.
I was responding to the claim that qualified immunity protected the government; it does not.
As long as you keep electing clowns that let the police do whatever they want, the police will... Do whatever they want.
Of course there's a balance that has to be struck so that police are empowered enough to act. So perhaps something like settlements against the police being 30% borne by the police pension fund and 70% by taxpayers is sufficient. I think this will also make police very enthusiastic about bodycams and holding each other accountable.
This is on us as voters. If we didn't piss our pants every time a police union sneezed, we'd realize that wholesale restarting police departments has precedent, even in our largest cities.
>...Abby Tiscareno, a licensed daycare provider in Utah, was wrongfully convicted of felony child abuse when a child under her care suffered brain hemorrhaging. After calling emergency services, subsequent medical tests supported these findings. However, during her trial, requested medical records from the Utah Division of Child and Family Services (DCFS) were not provided. It wasn’t until a civil suit that Ms. Tiscareno saw pathology reports suggesting the injury could have occurred outside of her care. She was granted a new trial and acquitted. Her subsequent lawsuit for due process violations, alleging that DCFS failed to provide exculpatory evidence, was dismissed due to lack of precedent indicating DCFS’s obligation to produce such evidence.
https://innocenceproject.org/news/what-you-need-to-know-abou...
which the citizens end up footing the bill for. yay.
Who stole her dog?!
Absolutely ridiculous, I hope she wins her civil case.
Same comment?
Yes, it's critical to remember that multiple parties can be at fault. In a case like this, it is true that
a) law enforcement misused a tool and demonstrated extreme negligence
b) the judiciary didn't catch this, which suggests systemic negligence there too when it comes to their oversight responsibilities
c) the company selling/providing this AI tool should have known it was likely to be misused and is responsible for damages caused by such predictable usage
We cannot have a just world until our laws and norms result in loss of jobs and legitimacy as punishment for this sort of normalized failure, from all three parties. Immunity is a failed experiment.
Modern AI seems incapable of any respectable amount of accuracy or precision. Trusting that to destroy somebody's life is even more farcical than the oppressive police in "Brazil".
Knowing that there are (N) new items is so useful (to some people), that as far back as the 1990s, we developed technology called "RSS" to give you this superpower over a website that doesn't provide anything of the sort. One that simply updates with new stuff when you hit refresh, with no UI to indicate what is new/changed.
Also me, reading further: Uh-oh.
The chief of police also resigned today; wouldn't be shocked if this was part of the reasoning.
Source?
That said, it's portrayed as a retirement, and doesn't seem to give any hints that it's connected.
Pretend the tool is 99.999999% specific. If it searches every face in the USA you're still getting about 3 false positives PER SEARCH.
You will never have a criminal AI tool safe enough to apply at a national scale.
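A quick sanity check of the arithmetic above. This is a sketch, not anything from the article: it assumes a US population of roughly 330 million and a hypothetical specificity of 99.999999% (i.e. a false-positive rate of 1e-8 per face compared).

```python
# Base-rate arithmetic: even a near-perfect matcher produces
# false positives when searched against an enormous gallery.
population = 330_000_000          # assumed US population
specificity = 0.99999999          # hypothetical: 99.999999%
false_positive_rate = 1 - specificity

# Expected number of innocent people flagged per nationwide search
expected_false_positives = population * false_positive_rate
print(f"Expected false positives per search: {expected_false_positives:.1f}")
```

With these assumptions the tool flags about 3 innocent people on every single search, which is the base-rate problem the comment is pointing at: at national scale, a tiny per-comparison error rate still swamps the one true match.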
> Once they were in hand, Fargo police met with him and Lipps at the Cass County jail on Dec. 19. She had already been in jail for more than five months. It was the first time police interviewed her.
How is this the fault of AI? It flagged a possible match. A live human detective confirmed it. And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for 5 months before even interviewing her or doing any due diligence.
There's a reason why we don't let AI autonomously jail people. Instead of scapegoating an AI bogeyman, maybe we should look instead at the professional human-in-the-loop who shirked all responsibility, and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.
It could be the fault of the company that's selling this service. They often make wildly inaccurate claims about the utility and accuracy of their systems. [0]
> There's a reason why we don't let AI autonomously jail people.
Yes we do. [1]
> and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.
Her guilt was assessed. That's why she had no bail. The system assessed it incorrectly, but the error is more complicated than your reaction implies.
[0]: https://thisisreno.com/2026/03/lawsuit-reno-police-ai-polici...
[1]: https://projects.tampabay.com/projects/2020/investigations/p...
Because we're seeing the first instances of what reality looks like with AI in the hands of the average bear. Just like the excuse was "but the computer said it was correct," now we're just shifting to "but the AI said it was correct."
Don't underestimate how much authority and thinking people will delegate to machines. Not to mention the lengths they'll go to weasel out of taking responsibility for a screw up like this (saw another comment in this thread about the Chief of Police stepping down but it being framed as "retirement").
I smell a lawsuit
Unfortunately we'll probably see a trend of people using AI and then blaming AI for cases where they mis-used AI in roles it's not good for or failed to review or monitor the AI.
There is far too little skepticism around the magic box that solves all problems which is causing issues like this. It's not the fault of the AI (as if it could be assigned liability) for being misused, but this kind of misuse is far too common right now so scare stories like this are helpful and we should highlight the use of AI in mistakes like this.
https://nob.cs.ucdavis.edu/classes/ecs153-2019-04/readings/c...
“A computer can never be held accountable, therefore a computer must never make a management decision.”
(https://www.ibm.com/think/insights/ai-decision-making-where-...)
This type of incident isn't new and is only going to get worse. The problem is our governments are doing absolutely nothing about it. I'll give two examples:
1. Hertz implemented a system where they falsely reported cars as being stolen. People were arrested and went to jail for rental cars that were sitting in the Hertz lot. Hertz ultimately had to pay $168 million in a settlement [1]. That's insufficient. If I, as an ordinary citizen, make a false police report that somebody stole my car I can be criminally charged. And rightly so. People should go to jail for this and it will continue until they do. These fines and settlements are just the cost of doing business; and
2. The UK government contracted Fujitsu to produce a new system for their post offices. That system was allowed to produce criminal charges for fraud that were completely false. People committed suicide over this. It went on for, what, a decade or more before resulting in a parliamentary inquiry and settlements? It's known as the British Post Office scandal [2]. Again, people should go to jail for this.
Everyone here is one LLM decision away from having their life ruined where nobody can explain why a decision was made, what the basis of that decision was and what can be done to fix it. No requirements exist to prove such claims. The burden falls on ordinary people to prove the claims are false.
[1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-accusa...
[2]: https://en.wikipedia.org/wiki/British_Post_Office_scandal
> Her bank records showed she was more than 1,200 miles away, at home in Tennessee at the same time police claimed she was in Fargo committing fraud.
> Unable to pay her bills from jail, she lost her home, her car and even her dog