frontpage.

Shall I implement it? No

https://gist.github.com/bretonium/291f4388e2de89a43b25c135b44e41f0
311•breton•1h ago•115 comments

Malus – Clean Room as a Service

https://malus.sh
912•microflash•8h ago•357 comments

Bubble Sorted Amen Break

https://parametricavocado.itch.io/amen-sorting
217•eieio•5h ago•74 comments

Reversing memory loss via gut-brain communication

https://med.stanford.edu/news/all-news/2026/03/gut-brain-cognitive-decline.html
169•mustaphah•5h ago•48 comments

ATMs didn't kill bank teller jobs, but the iPhone did

https://davidoks.blog/p/why-the-atm-didnt-kill-bank-teller
266•colinprince•7h ago•320 comments

Innocent woman jailed after being misidentified using AI facial recognition

https://www.grandforksherald.com/news/north-dakota/ai-error-jails-innocent-grandmother-for-months...
194•rectang•1h ago•102 comments

The Met releases high-def 3D scans of 140 famous art objects

https://www.openculture.com/2026/03/the-met-releases-high-definition-3d-scans-of-140-famous-art-o...
177•coloneltcb•6h ago•35 comments

Forcing Flash Attention onto a TPU and Learning the Hard Way

https://archerzhang.me/forcing-flash-attention-onto-a-tpu
21•azhng•4d ago•2 comments

Document poisoning in RAG systems: How attackers corrupt AI's sources

https://aminrj.com/posts/rag-document-poisoning/
20•aminerj•8h ago•7 comments

Show HN: OneCLI – Vault for AI Agents in Rust

https://github.com/onecli/onecli
105•guyb3•5h ago•36 comments

Bringing Chrome to ARM64 Linux Devices

https://blog.chromium.org/2026/03/bringing-chrome-to-arm64-linux-devices.html
27•ingve•2h ago•30 comments

Runners who churn butter on their runs

https://www.runnersworld.com/news/a70683169/how-to-make-butter-while-running/
52•randycupertino•3h ago•24 comments

Launch HN: IonRouter (YC W26) – High-throughput, low-cost inference

https://ionrouter.io
29•vshah1016•3h ago•13 comments

WolfIP: Lightweight TCP/IP stack with no dynamic memory allocations

https://github.com/wolfssl/wolfip
72•789c789c789c•6h ago•7 comments

Dolphin Progress Release 2603

https://dolphin-emu.org/blog/2026/03/12/dolphin-progress-report-release-2603/
281•BitPirate•13h ago•47 comments

An old photo of a large BBS (2022)

https://rachelbythebay.com/w/2022/01/26/swcbbs/
128•xbryanx•2h ago•93 comments

Converge (YC S23) Is Hiring a Founding Platform Engineer (NYC, Onsite)

https://www.runconverge.com/careers/founding-platform-engineer
1•thomashlvt•5h ago

Big data on the cheapest MacBook

https://duckdb.org/2026/03/11/big-data-on-the-cheapest-macbook
273•bcye•10h ago•241 comments

Show HN: Understudy – Teach a desktop agent by demonstrating a task once

https://github.com/understudy-ai/understudy
66•bayes-song•5h ago•18 comments

US private credit defaults hit record 9.2% in 2025, Fitch says

https://www.marketscreener.com/news/us-private-credit-defaults-hit-record-9-2-in-2025-fitch-says-...
174•JumpCrisscross•9h ago•304 comments

Show HN: Detect any object in satellite imagery using a text prompt

https://www.useful-ai-tools.com/tools/satellite-analysis-demo/
6•eyasu6464•4d ago•0 comments

Show HN: Axe – A 12MB binary that replaces your AI framework

https://github.com/jrswab/axe
120•jrswab•8h ago•85 comments

Show HN: OpenClaw-class agents on ESP32 (and the IDE that makes it possible)

https://pycoclaw.com/
4•pycoclaw•46m ago•1 comment

Are LLM merge rates not getting better?

https://entropicthoughts.com/no-swe-bench-improvement
88•4diii•10h ago•95 comments

The Cost of Indirection in Rust

https://blog.sebastiansastre.co/posts/cost-of-indirection-in-rust/
73•sebastianconcpt•3d ago•31 comments

The Road Not Taken: A World Where IPv4 Evolved

https://owl.billpg.com/ipv4x/
40•billpg•6h ago•67 comments

NASA's DART spacecraft changed an asteroid's orbit around the sun

https://www.sciencenews.org/article/spacecraft-changed-asteroid-orbit-nasa
91•pseudolus•3d ago•57 comments

Full Spectrum and Infrared Photography

https://timstr.website/blog/fullspectrumphotography.html
40•alter_igel•4d ago•22 comments

DDR4 SDRAM – Initialization, Training and Calibration

https://www.systemverilog.io/design/ddr4-initialization-and-calibration/
47•todsacerdoti•2d ago•13 comments

Italian prosecutors seek trial for Amazon, 4 execs in alleged $1.4B tax evasion

https://www.reuters.com/world/italian-prosecutors-seek-trial-amazon-four-execs-over-alleged-14-bl...
224•amarcheschi•6h ago•58 comments

Innocent woman jailed after being misidentified using AI facial recognition

https://www.grandforksherald.com/news/north-dakota/ai-error-jails-innocent-grandmother-for-months-in-north-dakota-fraud-case
193•rectang•1h ago

Comments

rectang•1h ago
> facial recognition showed she was the main suspect in what Fargo police called an organized bank fraud case.

> Her bank records showed she was more than 1,200 miles away, at home in Tennessee at the same time police claimed she was in Fargo committing fraud.

> Unable to pay her bills from jail, she lost her home, her car and even her dog

Jtsummers•1h ago
https://archive.is/yCaVV - Archive link to get around the paywall.

https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.

It's annoying that both articles are calling this an AI error. This was human error; the police did the wrong thing, and the people of Fargo will end up paying for this fuckup.

superkuh•1h ago
> https://archive.is/yCaVV

When I load this URL I get "One more step Please complete the security check to access" and I cannot get past the archive.is computational paywall.

But the guardian article actually has text! Thanks.

hrimfaxi•26m ago
That's a common issue if you use Cloudflare DNS.

janalsncm•10m ago
I would argue it was both. No doubt this company was marketing it in a way to make it seem very reliable. And all of the procedural things afterwards made the error so much more damaging.

But imo this is why local police departments should not have access to this kind of tool. It is too powerful, and the statistical interpretation is too complicated for random North Dakota cops to use responsibly. Neither the company nor the PD have an incentive to be careful.

mitchbob•1h ago
https://archive.ph/2026.03.12-183903/https://www.grandforksh...

jauer•1h ago
AI or not, it's unconscionable that victims of compulsory legal processes by way of mistaken identity are not made whole.

ryandrake•49m ago
People will defend this, too, saying “well, she was eventually exonerated, right? So the system works!” Ignoring how she’ll never be fully reimbursed for the time, money, and grief of going through the system.

munk-a•39m ago
We also need to question how many people might go through the same process without eventual exoneration and how much going through this process costs individuals. Being falsely prosecuted usually imparts a permanent black mark in search results about the person (outside of places with sane laws, like the EU), as well as causing stress or permanent injury.

Wrongly arrested individuals with mental disabilities have a history of physical abuse in jail potentially to the point of death.

janalsncm•17m ago
> In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial

This is from the Sixth Amendment. Where the rubber hits the road is what “speedy” means.

neaden•1h ago
I hate this headline (not blaming submitter). Police incompetence and negligence jailed her for months and left her stranded in a North Dakota winter. The AI is no more responsible than the cars and airplanes they used.

Edit: this is in reference to the original headline "AI error jails innocent grandmother for months in North Dakota fraud case", not the revised title.

conartist6•1h ago
I disagree. Clearly the police felt the AI was "responsible enough" to be the only thing they needed to trust.

The AI made the call and humans licked its butthole

nkrisc•1h ago
And that is a complete failure of the police and authorities. They made the decision to extradite her with such flimsy evidence.

conartist6•43m ago
If it didn't erase accountability, how would it create any value?

Many people are treating this as a matter of philosophy, which it isn't.

At a primitive, physiological level, if you delegate to AI and most of the time you don't get in trouble for it, the resulting relationship you have with the AI could only be called "trust".

If you're expected to be 40% more productive at your job, your employer is making it crystal clear that you will trust the AI or you will be fired. Even if nobody ever said it, the sales pitch is that AI does the work and people are mostly there to be their servants whose role is to keep them fed with decisions we want made but don't want to be responsible for making.

dmurray•1h ago
Even if she was guilty, they shouldn't have imprisoned her for 3+ months without interviewing her. The AI didn't tell them to do that.

throw-the-towel•1h ago
I think you actually agree with the GP? As I understand them, they're saying that it's not the AI tool that takes the most blame, it's the police.

rpdillon•1h ago
And the police were wrong, which is why they're the culpable ones.

Chris2048•54m ago
Even if the ID was correct, why would they leave her in jail for 5 months before the first interview and/or court appearance?

like_any_other•53m ago
> Clearly the police felt the AI was "responsible enough" to be the only thing they needed to trust.

Yes, that's what the OPs "incompetence and negligence" referred to.

PTOB•41m ago
No indication that the licking was consensual.
add-sub-mul-div•1h ago
Your picking apart the words doesn't matter if police are more incompetent with AI than without it. AI being the catalyst to a worse society is a more interesting and worthwhile topic than whether "AI is responsible" is the right way to phrase it.

_m_p•58m ago
A jury will probably decide the AI company's level of responsibility at trial. It is an open question until then!

mirekrusin•27m ago
Brave police officers wanted to show us all the dangers of AI slop.

mmooss•13m ago
If you make the AI software, then your software malfunctioned.

If the laser printer screws up a page in the middle of the document, and the user doesn't catch it and includes it in the board of directors binder, the laser printer still malfunctioned.

tony_cannistra•1h ago
Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves.

Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?

_Some_ of the blame lies on the UX here. It must.

sidrag22•1h ago
It must land as humans' fault or this will become more and more of a pattern to avoid accountability.

paulhebert•55m ago
It’s both.

The cops need to be held accountable.

But it’s glaringly obvious that if you build tools like this and give them to the US police this is the outcome you will get. The toolmakers deserve blame too.

hsbauauvhabzb•57m ago
Spoken like someone who isn’t built for a sales role at said company.

Sales will sell the dream, who cares if the real world outcomes don’t align?

ImPostingOnHN•56m ago
> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

Are AI code assist tools built in such a way as to deceive the user into a false sense of trust or certainty? Very much so (even if that isn't a primary objective).

Does any part of the blame lie on the UX if a dev submits a bad change? No, none.

You are ultimately, solely responsible for your work output, regardless of which tool you choose to use. If using your tool wrong means you make someone homeless, car-less, and also you kill their dog, then you should be a lot more cautious and perform a lot more verification than the average senior engineer.

tony_cannistra•43m ago
I agree with all that. Maybe the word isn't "blame," then. Surely there must be some code, perhaps moral or ethical, but ideally more rigorously enforceable, which ought to prevent the development of intentionally deceptive tools. Sure you could say this about all software, but that which can cause actual physical harm ought to be held to a higher standard.

ImPostingOnHN•30m ago
Yes, unfortunately technology is advancing faster than the average human brain evolves more neurons, so it will only become less comprehensible to the average person.

That's setting aside the tendency for police to hire from the left side of the bell curve to avoid independent thinkers that might question authority, refuse to do bad shit, etc.

throw_m239339•55m ago
> they don't know how to use than the tools themselves.

No, the tools work perfectly as they were designed to work. The problem is that the tools are flawed.

Ultimately, every single one of these decisions should be approved by a human, who should be responsible for the fuck up no matter what the consequences are.

> _Some_ of the blame lies on the UX here. It must.

No, the blame lies with the person or the group who approved the usage of these tools without understanding their shortcomings.

Pxtl•53m ago
I miss the days of earlier AI image-recognition software that would emit a confidence percentage.

New LLM-related AIs are all supremely confident in every assertion, no matter how wrong.

janalsncm•30m ago
I don’t know what tool they used, but it was very likely not an LLM. They probably have some database of drivers’ licenses and they ran a similarity search against the surveillance footage. This poor lady happened to be the top match.

Even if it also output a score, that score depends on how the model was trained. And the cops might ignore it anyways.
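A toy sketch in Python of the kind of top-match similarity search described above (the embeddings here are random stand-ins, not a real face model; the point is only that an argmax search always returns somebody):

```python
import numpy as np

# Hypothetical gallery of enrolled face embeddings (e.g. from license
# photos) and a probe embedding from surveillance footage. Real systems
# use learned embeddings; these random vectors are stand-ins.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))   # 1000 enrolled faces
probe = rng.normal(size=128)             # query face, matching nobody

# Cosine similarity of the probe against every enrolled face.
gallery_unit = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
probe_unit = probe / np.linalg.norm(probe)
scores = gallery_unit @ probe_unit

# A nearest-neighbor search always produces a "top match", even when
# the true person is not in the gallery at all.
top = int(np.argmax(scores))
print(top, round(float(scores[top]), 3))
```

Whether that weak top match gets treated as an identification is entirely up to the humans downstream.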

jolmg•36m ago
>> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

> No, the blame lies with the person or the group who approve the usage of these tools, without understanding their shortcomings.

The person who approved the tools might've understood, but that doesn't mean the user understands. _Some_ of the reason why the user doesn't understand the shortcomings of the tool might be because of misleading UX.

rpcope1•1h ago
There's no way this isn't a slam dunk case to sue the piss out of the Fargo Police, probably the US Marshals and maybe other orgs. The woman in the surveillance photo clearly looks way younger, among the many other obvious signs this woman didn't do it. I hope she wrings at least several million dollars out of the government.

Blackthorn•1h ago
With all the lovely qualified immunity doctrine? That's wishful thinking.

Jtsummers•59m ago
That may protect them personally, but not the city and the department itself from being sued.
blagie•45m ago
Nope.

https://abovethelaw.com/2016/02/criminally-yours-indicting-a...

You can be arrested, indicted, and held in jail on pretrial, and there is literally no recourse. There are many other ways jail can happen without due process. Where I live:

* Civil contempt. Absolute immunity. No due process. Record is about 16 years. Having a bad day? Judge can toss you in jail.

* "Dangerous." Half a year. No due process. He-said she-said.

* "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.

Absolutely no recourse. You come out with a gap in income, employment, and, if you missed rent/mortgage, no home. Landlords will simply throw your stuff away too.

You're also basically damned if things do move forward, since from jail, you have no access to evidence, to internet (for legal research), and no reasonable way to recruit a lawyer (and, for most people, pay for one).

Can happen to anyone. Less common if you're rich and can afford a good lawyer, but far from uncommon.

mothballed•37m ago
>* "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.

A friend of mine was committed for longer than 3 days without counsel or the ability to represent themselves in the hearing. Apparently the whole process of being committed is ex parte in practice in some states.

At one point when I was brought to a hospital on false drug charges, they brought in nurses to curse and angrily make criminal accusations at me, trying to bait me into saying something to verbally defend against such brutal personal attacks so they could use it against me (I made a complaint to the board because they also touched me without consent -- the board determined in a very wild conclusion that since police were present they were essentially deputized as police so can't be held liable for violating their nursing license). I watched as two physicians carefully watched for my reaction and managed to just respond in the kindest way possible because I realized they were basically baiting me to do something to write up as evidence to support a committal.

abduhl•34m ago
This is a bit hyperbolic and the exaggerations really undermine what I think is your broader point (that there is rarely recourse when you're held for short to moderate amounts of time). It is hard for me to believe that someone was held for 16 years on civil contempt without due process or that someone was held for half a year without due process after being deemed dangerous. The reason that is hard for me to believe is that the due process is implicit in the action you describe. Civil contempt is from a judge which implies that you're already in court - that's due process. Someone being labeled "dangerous" implies that a finding was made by a neutral party - that's due process.

Just because you disagree with the outcome doesn't mean that due process wasn't given.

mothballed•29m ago
Yeah it's "due process." In civil contempt the judge is a witness and prosecutor in the very "process" they're judging. That's the most perverted form of due process imaginable.

A judge should have to recuse themselves if they are acting as witness to the supposed infraction.

abduhl•19m ago
Civil contempt isn't some roving criminal charge that jumps out of the jury box randomly. It's meant to make somebody comply with a court order. Anybody in civil contempt holds the keys to the jailhouse door in their own hands, all they have to do is comply.

This statement should make you uncomfortable. It makes me uncomfortable because it is a pure expression of the power of the state. But it's still due process.

Jtsummers•29m ago
I don't know what you're responding to, but I don't think it's my comment.

Qualified immunity protects individuals, not departments, from liability.

The particular thread (in this thread) that I was responding to:

>> I hope she wrings at least several million dollars out of the government.

> With all the lovely qualified immunity doctrine? That's wishful thinking.

I was responding to the claim that qualified immunity protected the government, it does not.

djfobbz•59m ago
Criminal immunity? Sure. Civil immunity? Nope! She could definitely make a nice buck.

theLiminator•57m ago
Off of taxpayer money sadly. Imo we really need a fix for this. When cops are grossly negligent the money should come out of their aggregate pension fund (or at least partially).

SunshineTheCat•51m ago
Yes, this is the key point. Taxpayers get a nice big bill while the people who caused the problem get a nice paid vacation while they conduct an internal "investigation" that typically finds they did nothing wrong.

vkou•50m ago
There is a fix to it. Elect people who will hold them accountable.

As long as you keep electing clowns that let the police do whatever they want, the police will... Do whatever they want.

theLiminator•45m ago
Yeah, of course they need to held accountable, and we need to vote in people who will do so. What I'm suggesting is an alignment of incentives that will ensure that police will try to do their best to not be negligent.

Of course there's a balance that has to be struck so that police are empowered enough to act. So perhaps something like settlements against the police being 30% borne by the police pension fund and 70% by taxpayers is sufficient. I think this will also make police very enthusiastic about bodycams and holding each other accountable.

rectang•45m ago
“Tough on crime” -> lenient on police -> innocent grandmas in jail.

GuinansEyebrows•13m ago
despite this being something practically everybody wants, the fact that it hasn't happened is not a coincidence and speaks to the power of police unions/guilds and their lobbying arms. outside a few toothless instances, those groups are extremely good at reframing these attempts and mobilizing their bases to vote against the broader public interest.

it sucks.

JumpCrisscross•48m ago
> we really need a fix for this. When cops are grossly negligent the money should come out of their aggregate pension fund

This is on us as voters. If we didn’t piss our pants every time a police union sneezed, we’d realize wholesale restarting police departments has precedent in even our largest cities.

lotsofpulp•15m ago
Almost all taxpayer funded pension funds are already underfunded. It makes no difference if the funding decreases or increases, the government employee will still get their benefit. The government would have to go through bankruptcy to get the benefit amount reduced.

opo•37m ago
Qualified immunity doesn't apply to criminal cases. It is used to defend against civil suits. It is unfortunately very easy to find many cases where it leads to injustice. For example:

>...Abby Tiscareno, a licensed daycare provider in Utah, was wrongfully convicted of felony child abuse when a child under her care suffered brain hemorrhaging. After calling emergency services, subsequent medical tests supported these findings. However, during her trial, requested medical records from the Utah Division of Child and Family Services (DCFS) were not provided. It wasn’t until a civil suit that Ms. Tiscareno saw pathology reports suggesting the injury could have occurred outside of her care. She was granted a new trial and acquitted. Her subsequent lawsuit for due process violations, alleging that DCFS failed to provide exculpatory evidence, was dismissed due to lack of precedent indicating DCFS’s obligation to produce such evidence.

https://innocenceproject.org/news/what-you-need-to-know-abou...

anigbrowl•1h ago
imho the US Marshals are the only innocent party here, as my understanding is they don't do investigations and just serve warrants without any knowledge of the underlying case.

john_strinlai•54m ago
>I hope she wrings at least several million dollars out of the government.

which the citizens end up footing the bill for. yay.

eek2121•51m ago
Maybe the citizens will learn to elect better leaders.

phendrenad2•41m ago
Maybe they'll realize votes have consequences.

JumpCrisscross•45m ago
“Unable to pay her bills from jail, she lost her home, her car and even her dog.”

Who stole her dog?!

quickthrowman•1h ago
It’s obvious from the one photo they posted of the actual suspect that the lady they arrested is about 20-30 years older than the woman in the bank photo. The woman in the photo is maybe 25-30 years old, this grandma looks like she’s 65-70 (actual age of 50).

Absolutely ridiculous, I hope she wins her civil case.

Aardwolf•1h ago
This reminds me of the British Post Office Scandal: https://en.wikipedia.org/wiki/British_Post_Office_scandal

pdpi•52m ago
I followed the inquiry when it was ongoing — all of the depositions were live on YouTube. The level of both hubris and incompetence involved in that case was breathtaking.

kawsper•20m ago
If you can get your hands on it, I recommend the four-episode BAFTA-winning mini-series about it: https://en.wikipedia.org/wiki/Mr_Bates_vs_The_Post_Office

anigbrowl•1h ago
It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.

causal•53m ago
This x1000. We need to suspend this shared fiction that AI has any agency. Only humans can be responsible. Full stop.

irishcoffee•22m ago
ICE detains innocent woman 1200 miles away based on AI

Same comment?

GuinansEyebrows•6m ago
respectfully, can you elaborate on why the answer would not be yes? or am i just misreading your comment?

idle_zealot•45m ago
> It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff

Yes, it's critical to remember that multiple parties can be at fault. In a case like this, it is true that

a) law enforcement misused a tool and demonstrated extreme negligence

b) the judiciary didn't catch this, which suggests systemic negligence there too when it comes to their oversight responsibilities

c) the company selling/providing this AI tool should have known it was likely to be misused and is responsible for damages caused by such predictable usage

We cannot have a just world until our laws and norms result in loss of jobs and legitimacy as punishment for this sort of normalized failure, from all three parties. Immunity is a failed experiment.

recursivecaveat•40m ago
Even if she was a dead ringer (clearly not the same person to any human who glances at the image), common sense should tell you that among 340,000,000 Americans there are a lot of lookalikes. Clearly there's a kind of stupid belief in the mystic powers of an AI and a callous disregard for the well-being of suspects. No one should be dragged 1000 miles and held for months based on a facial match, especially when exculpatory evidence was easily available.

jchama•1h ago
The movie "Brazil" was right!

_doctor_love•55m ago
We do the work, you do the pleasure!

Pxtl•55m ago
Except in "Brazil" it was a mechanical error in a deterministic machine caused by an invasive outside actor. It would be reasonable to trust that the autotypewriter/printer would faithfully output the correct text.

Modern AI seems incapable of any respectable amount of accuracy or precision. Trusting that to destroy somebody's life is even more farcical than the oppressive police in "Brazil".

hsbauauvhabzb•58m ago
Why the fuck does a newspaper need a ‘notifications’ icon in the top right hand corner?
acuozzo•46m ago
How else can they report on BREAKING NEWS if it doesn't at least break your concentration?
kazinator•39m ago
Because it has an updating-feed-like structure, in which new items can appear.

Knowing that there are (N) new items is so useful (to some people), that as far back as the 1990s, we developed technology called "RSS" to give you this superpower over a website that doesn't provide anything of the sort. One that simply updates with new stuff when you hit refresh, with no UI to indicate what is new/changed.

holman•50m ago
Me: Whoa, cool, my hometown is atop Hacker News!

Also me, reading further: Uh-oh.

The chief of police also resigned today; wouldn't be shocked if this was part of the reason.

JumpCrisscross•46m ago
> chief of police also resigned today

Source?

waterhouse•44m ago
Googling "fargo police chief resigns": https://www.inforum.com/news/fargo/zibolski-announces-his-re... among other results.

That said, it's portrayed as a retirement, and doesn't seem to give any hints that it's connected.

JumpCrisscross•40m ago
Out of curiosity, was the guy known for being fast and loose with the rules? Put more simply, was he a good cop? Or did he have a history of going rogue?

Sl1mb0•11m ago
Are authoritarians good? That's basically what you are asking.

sumeno•3m ago
There are no good cops

PTOB•43m ago
I am from a town that gets national news coverage only for Shenanigans like this.

causal•50m ago
Wait - what was the AI tool and how did it have her face to begin with? If small-town police are doing face-matching searches across national databases then nobody is safe because the number of false positives is going to be MASSIVE by sheer number of people being searched every day.

Pretend the tool is 99.999999% specific. If it searches every face in the USA you're still getting about 3 false positives PER SEARCH.

You will never have a criminal AI tool safe enough to apply at a national scale.
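The back-of-the-envelope above works out as follows (Python; the 99.999999% specificity figure and the independence of comparisons are this comment's hypotheticals, not properties of any real system):

```python
# Expected false positives from one nationwide face search, assuming
# each of the ~340M non-matching faces independently triggers a false
# match with probability (1 - specificity).
population = 340_000_000
specificity = 0.99999999                  # hypothetical: "eight nines"
expected_false_matches = population * (1 - specificity)
print(expected_false_matches)             # roughly 3.4 per search
```

And real-world face matchers are far less specific than eight nines, so the actual false-positive count per nationwide search would be much higher.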

whack•48m ago
> According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo. In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.

> Once they were in hand, Fargo police met with him and Lipps at the Cass County jail on Dec. 19. She had already been in jail for more than five months. It was the first time police interviewed her.

How is this the fault of AI? It flagged a possible match. A live human detective confirmed it. And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for 5 months before even interviewing her or doing any due diligence.

There's a reason why we don't let AI autonomously jail people. Instead of scapegoating an AI bogeyman, maybe we should look instead at the professional human-in-the-loop who shirked all responsibility, and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.

RobRivera•47m ago
I think it's more nuanced; it is one error in a Tragedy of Errors.

themafia•44m ago
> How is this the fault of AI?

It could be the fault of the company that's selling this service. They often make wildly inaccurate claims about the utility and accuracy of their systems. [0]

> There's a reason why we don't let AI autonomously jail people.

Yes we do. [1]

> and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.

Her guilt was assessed. That's why she had no bail. The system assessed it incorrectly, but the error is more complicated than your reaction implies.

[0]: https://thisisreno.com/2026/03/lawsuit-reno-police-ai-polici...

[1]: https://projects.tampabay.com/projects/2020/investigations/p...

rglover•31m ago
> How is this the fault of AI? It flagged a possible match. A live human detective confirmed it.

Because we're seeing the first instances of what reality looks like with AI in the hands of the average bear. Just like the excuse was "but the computer said it was correct," now we're just shifting to "but the AI said it was correct."

Don't underestimate how much authority and thinking people will delegate to machines. Not to mention the lengths they'll go to weasel out of taking responsibility for a screw up like this (saw another comment in this thread about the Chief of Police stepping down but it being framed as "retirement").

obviouslynotme•18m ago
It's not. This is just an acceleration in the unraveling of society facilitated by AI. As someone whose childhood included so many "robots will kill humans" books and movies, I am flabbergasted that the AI apocalypse will be dumb humans overtrusting faulty AI in important matters until everything falls apart.

Most humans cannot distinguish AI from actual intelligence. When you combine that with bureaucrats' innate tendency to say, "Computer said so," you end up with bizarre situations like this. If a person had made this facial match, another human would have relentlessly jeered him. Since a computer running AI did it, no one even cared to think about it.

Computers are wildly dangerous, not because of anything innate but because of how humans act around them.

blitzar•17m ago
computer said yes
caconym_•16m ago
This particular "AI bogeyman" isn't just AI; it's cops with AI and in particular cops with facial recognition tools, dragnet LPR surveillance tools, and all this other new technology that essentially picks somebody's name out of a hat to have their life temporarily (or [semi-]permanently) ruined by shithead cops who won't ever face any real accountability.

This keeps happening, and the reason it keeps happening is that shithead cops have these tools and are using them. Until we can find a reliable way to prevent this from happening, which may or may not be possible, cops who may or may not be shitheads should not have access to these tools.

throwaway314155•6m ago
There’s nothing wrong with your comment per se, but it’s almost as if you didn’t even read the comment you’re responding to.
RobRivera•48m ago
>Unable to pay her bills from jail, she lost her home, her car and even her dog. Fargo police say the bank fraud case is still under investigation and no arrests have been made.

I smell a lawsuit

api•47m ago
It's not an AI error. It's a human error in mis-using AI in this way. Saying it's an AI error is like saying a hole in your drywall is a hammer error.

Unfortunately we'll probably see a trend of people blaming AI for cases where they misused it in roles it's not suited for, or failed to review or monitor its output.

munk-a•42m ago
It's both. It's good to acknowledge that AI is easy to misuse in this manner but it doesn't detract from the fact that the ultimate responsibility lies in those that should be verifying the tool output.

There is far too little skepticism around the magic box that solves all problems which is causing issues like this. It's not the fault of the AI (as if it could be assigned liability) for being misused, but this kind of misuse is far too common right now so scare stories like this are helpful and we should highlight the use of AI in mistakes like this.

zingar•44m ago
“Computers don’t argue” seemed charmingly wrong about how computers work until a few short years ago.

https://nob.cs.ucdavis.edu/classes/ecs153-2019-04/readings/c...

layer8•35m ago
This quote from a 1979 IBM training manual remains applicable:

“A computer can never be held accountable, therefore a computer must never make a management decision.”

(https://www.ibm.com/think/insights/ai-decision-making-where-...)

bethekidyouwant•40m ago
I read the article and I don't really understand... she was held in a jail in Tennessee but the article states they flew her to North Dakota? And somehow she's a fugitive, so that's why she doesn't get bail? But she's a fugitive held in her own state in a holding facility? But then when they release her, she's in North Dakota? So if some state says you're a fugitive, your home state will just hold you in jail until they come and put you on an airplane? Is that correct?
janalsncm•15m ago
I read it as her arrested and held in Tennessee temporarily then flown to North Dakota.
jmyeet•28m ago
I will keep saying this til I am blue in the face: we are at a crossroads. One path leads to displacing workers and suppressing wages to further concentrate wealth into the hands of elite Epstein acolytes. The other path leads to making all of our lives better by automating the menial tasks so we all have to work less. We are firmly headed down the first path.

This type of incident isn't new and is only going to get worse. The problem is our governments are doing absolutely nothing about it. I'll give two examples:

1. Hertz implemented a system where they falsely reported cars as being stolen. People were arrested and went to jail for rental cars that were sitting in the Hertz lot. Hertz ultimately had to pay $168 million in a settlement [1]. That's insufficient. If I, as an ordinary citizen, make a false police report that somebody stole my car I can be criminally charged. And rightly so. People should go to jail for this and it will continue until they do. These fines and settlements are just the cost of doing business; and

2. The UK government contracted Fujitsu to produce a new system for their post offices. That system was allowed to produce criminal charges for fraud that were completely false. People committed suicide over this. This went on for what? A decade or more? But it resulted in only a parliamentary inquiry and settlements. It's known as the British Post Office scandal [2]. Again, people should go to jail for this.

Everyone here is one LLM decision away from having their life ruined where nobody can explain why a decision was made, what the basis of that decision was and what can be done to fix it. No requirements exist to prove such claims. The burden falls on ordinary people to prove the claims are false.

[1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-accusa...

[2]: https://en.wikipedia.org/wiki/British_Post_Office_scandal

temp0826•24m ago
Even in Idiocracy they didn't have this problem
chrisjj•16m ago
There's an opportunity for an "AI" app here. Takes your photo, compares with mugshots on police databases, quotes you for requisite cosmetic surgery.

/i

puppycodes•12m ago
end qualified immunity.

see how fast cops start to do their jobs with care.

The problem is that there are almost never consequences for police; suing them just ends with your own community, i.e. regular citizens, paying the bill.

They do not care.