frontpage.

66 million-year-old dinosaur ‘mummy’ skin was actually a perfect clay mask

https://www.cnn.com/2025/10/23/science/duck-billed-dinosaur-mummy-clay-mask
1•breve•1m ago•0 comments

Forgejo v13.0.2 contains critical security fixes

https://codeberg.org/forgejo/forgejo/src/branch/forgejo/release-notes-published/13.0.2.md
1•kassner•3m ago•1 comments

Washington lawyer on furlough lives out dream of running a hot dog cart

https://www.reuters.com/world/us/washington-lawyer-furlough-lives-out-dream-running-hot-dog-cart-...
2•hansmayer•6m ago•0 comments

GenAI Image Editing Showdown

https://genai-showdown.specr.net/image-editing
2•Hard_Space•17m ago•0 comments

Show HN: Project Journal – Give AI coding assistants persistent memory

https://github.com/CursorWP/ai-project-journal
1•CursorWP•18m ago•0 comments

Sandbox Your Program Using FreeBSD's Capsicum [video]

https://www.youtube.com/watch?v=Ne4l5U_ETAw
1•todsacerdoti•18m ago•0 comments

TIL: Figma provides a helper function for gradient transforms

https://wpconverters.com/demystifying-figmas-gradient-transformations-a-developers-guide
1•drzivil•24m ago•1 comments

Scientists are racing to grow human teeth in the lab

https://www.cnn.com/science/lab-grown-human-teeth-spc
1•breve•26m ago•0 comments

We want to move Ruby forward

https://andre.arko.net/2025/10/26/we-want-to-move-ruby-forward/
3•ciconia•29m ago•0 comments

The Magic of Precision Engineering

https://www.hightechinstitute.nl/the-magic-of-precision-engineering/
2•o4c•51m ago•1 comments

Gluing and framing a 9000-piece jigsaw

https://river.me/blog/puzzle-glue-9000/
1•busymom0•57m ago•0 comments

AI Pullback Has Officially Started

https://www.planetearthandbeyond.co/p/ai-pullback-has-officially-started
3•danfritz•1h ago•0 comments

Lampedusa's 1958 Novel The Leopard skewered the super-rich

https://www.bbc.com/culture/article/20250304-the-leopard-the-1958-italian-novel-that-skewered-the...
1•walterbell•1h ago•0 comments

Practical Defenses Against Technofascism

https://micahflee.com/practical-defenses-against-technofascism/
3•HotGarbage•1h ago•0 comments

The Magna Anima Genius Project

https://magnaanimageniusproject.substack.com/
1•jbutlergenius•1h ago•0 comments

Raster Master v5.4 Sprite/Tile/Map Editor 88 Stars on GitHub

https://github.com/RetroNick2020/raster-master/releases/tag/v5.4R121
3•retronick2020•1h ago•0 comments

Salesforce Enterprise Deep Research

https://github.com/SalesforceAIResearch/enterprise-deep-research
2•Raven603•1h ago•2 comments

Operating Systems Written in Free Pascal

https://wiki.freepascal.org/Operating_Systems_written_in_FPC
2•kristianp•1h ago•0 comments

Sustained western growth and Artificial Intelligence

https://datagubbe.se/llmfix/
2•brazukadev•1h ago•0 comments

Tell HN: Don't Vibe Your Design

2•davidtranjs•1h ago•1 comments

Hey LLM, write production-ready code

https://wejn.org/2025/10/llm-write-production-ready-code/
1•wejn•1h ago•1 comments

Student Handcuffed After School's AI System Mistakes a Bag of Chips for a Gun

https://www.theguardian.com/us-news/2025/oct/24/baltimore-student-ai-gun-detection-system-doritos
4•m463•1h ago•0 comments

Show HN: I analyzed 3,465 remote job listings – 72% hide salary information

https://no-commute-jobs.com/blog/remote-work-statistics-2025
1•remimatteo•1h ago•1 comments

Why bosses need to wake up to dark patterns

https://www.economist.com/business/2025/10/16/why-bosses-need-to-wake-up-to-dark-patterns
1•Austin_Conlon•1h ago•0 comments

The Layer 1 Blockchain Built for AI Agent

https://harvestai.co/
1•salkahfi•2h ago•0 comments

Success Always Spawns Haters

https://world.hey.com/dhh/success-always-spawns-haters-75edaede
1•doppp•2h ago•0 comments

DHS Posts Video Featuring Song Popular with Nazi Creators

https://gizmodo.com/dhs-little-dark-age-nazi-video-2000676359
2•nobody9999•2h ago•1 comments

Language Modeling with Hierarchical Reasoning Models: Lessons from 1M Parameters

https://williamthurston.com/ml/language-models/transformers/2025/10/25/language-modeling-with-hie...
2•jhspaybar•2h ago•0 comments

GameStop Declares Console Wars Over

https://twitter.com/gamestop/status/1982213786221109263
2•avonmach•2h ago•1 comments

Quick Dungeon Crawler Update 3.5.0: New Passives, CRIT DMG Nerf

https://dungeon.werkstattl.com/
1•logTom•2h ago•3 comments

AI, Wikipedia, and uncorrected machine translations of vulnerable languages

https://www.technologyreview.com/2025/09/25/1124005/ai-wikipedia-vulnerable-languages-doom-spiral/
80•kawera•11h ago

Comments

foxglacier•9h ago
If nobody's reading them and nobody's writing them, then perhaps it doesn't matter. We could let Wikipedia-Greenlandic persist as its own evolved language that forks from the original.

> potentially pushing the most vulnerable languages on Earth toward the precipice as future generations begin to turn away from them.

OK? We have lots of dead languages. It's fine. People use whatever languages are appropriate to them and we don't need to maintain them forever.

aucisson_masque•8h ago
I see that this comment got downvoted, but I think we can agree on the fact that languages, just like species, die while others flourish. And that's fine.

Survival of the fittest, right? Not enough people speaking Greenlandic, too complicated even for its own population, who would rather speak Danish? The very reason I'm speaking English is that it was forced militarily by the UK during the 19th century and, since the 20th, by Hollywood.

Just like a virus, if a language doesn't spread, it dies.

jiggawatts•7h ago
As an immigrant to an anglophone country, I noticed a few things:

When people have varying levels of capability with languages, they’ll switch to whatever is the lowest common denominator — the language that the group can best communicate in. This tended to be English, even amongst a bunch of native speakers of a common foreign language.

Moreover, this is context dependent: when talking about technical matters (especially computing), the Lingua Franca (pun intended) is English. You’ll hear “locals” switch to either mixed or pure English, even if they’re not great at it. Science, aviation, etc… is the same.

Before English it was French that had this role, and before then it was Latin and Greek.

The thing is, when the whole world speaks one common language like Latin or English, this is a tiny bit sad for some Gaelic tribe that got wiped out culturally, but incredibly valuable for everybody everywhere. International commerce becomes practical. Students can study overseas, spreading ideas further and wider. Books have a bigger market, attracting smarter and better authors. There’s a bigger pool of talented authors to begin with, some of whom write educational textbooks of exceptional sparkling quality. These all compound to create a more educated, vibrant, and varied culture… because of, not despite, the single language.

don-bright•1h ago
We already see the 'best' LLMs switch between different languages while they are 'thinking'. It seems to me that the more languages it can 'think' in, the better off it will be. Different human languages have different concepts of time, numbers, nature, place, intention, relationships, and so forth and so on.

thaumasiotes•29m ago
> Survival of the fittest on a long time horizon means the more diversity the better the survival rate will be.

This is just a misapplication of the analogy. For a language, "fitness" refers to similarity to whatever language is spoken by people relevant to you. Diversity is the worst quality a language can exhibit, and is the quality that causes dying languages to die.

There is no such concept as an external force coming in that certain languages handle better, allowing them to temporarily outcompete other languages. Existing pools of diversity are not protective against this, because it can't happen.

Also unlike genetic diversity, linguistic diversity does not need to be maintained as a legacy of the past. It is constantly being generated in much larger quantities than are desired. If you managed to perform the opposite of the Tower of Babel miracle and replaced every currently-spoken language everywhere in the world with a perfect monoculture, within 1-2 generations you'd be back to having mutually unintelligible varieties in different regions.

arthurjj•6h ago
This was my take from the article also. These languages are clearly dying and not many people speak them as their primary language, so the human suffering is minimal. Which means keeping them around is a pastime that some people happen to enjoy (unless there is a Sapir-Whorf hypothesis I'm missing).

But the sentence `well-meaning Wikipedians who think that by creating articles in minority languages they are in some way “helping” those communities` clearly shows the author hasn't really considered the issue.

ratg13•9h ago
It's ironic that the "solution" to the problem is being driven by yet another person who isn't native to Greenland.

While they may be a Greenlandic teacher, it's almost assured that they are teaching western Greenlandic, which is similar to Canadian Inuktitut.

People in the East of Greenland speak a language that has similarities, but is different enough in vocabulary and sounds that it's often considered a separate language and not a dialect.

When people from East and West Greenland come together, they typically speak Danish because they can't understand each other in their own native language.

So we're talking about a country that has 55k people, and a portion of them don't even speak the official language. This guy would have no way of knowing whether something was written poorly by a computer or by a poorly educated Greenlandic native who maybe isn't so good with the official language.

Given that the majority of the country's citizens do not use the internet at all, it is not even clear what his solution is other than just deciding to be some sort of magic arbiter, which is not realistic or sustainable.

optionalsquid•9h ago
> Given that the majority of the country's citizens do not use the internet at all

On what do you base this assertion? I was not able to find up-to-date statistics, but 72% of participants in this survey from 2013 had internet access at home, either via PC or via mobile devices, and another 11% had internet access elsewhere:

https://digitalimik.gl/-/media/datagl/old_filer/strategi_201...

Uehreka•9h ago
I wish people on HN would stop acting like “magic arbiter” solutions are “not realistic”, when in reality it’s the only way things have ever worked. Are federal judges “magic arbiters”? Yes. Do judges make bad calls? Yes. Do we not like when large numbers of judges who are unfriendly to our side get life appointments? Yes. Has anyone proposed an actual better way of solving these kinds of problems? No.

So to get back to the point: Yes the solution is to appoint someone a magic arbiter, and hope they don’t screw up. The fact that it’s a deeply imperfect way of solving problems doesn’t mean it’s not workable. It just means it will backfire at some point, and someone else will get appointed instead.

AlienRobot•8h ago
As someone who isn't a native English speaker, I believe most people who use the Internet would benefit from simply learning English rather than having an unchecked AI translate things for them. Reddit, for example, has joined millions of terrible WordPress websites in auto-translating everything for SEO purposes, and Google seems to be fine with this for some reason. It's ironic that it has reached the point where, if you search for a "multi-language" plugin for WordPress, most of the results aren't about letting you write an article in multiple languages; they're just about automatically translating a single article into 30 languages with machine translation.

The reason none of this makes sense to me is that it's intellectually crippling Internet users. Computers and the Internet are tools. If you want something machine translated, you can use a tool like Google Translate to translate it for you. If the webmaster does this, it robs people of the opportunity to learn to use those tools, and they become dependent on third parties to do it for them when they would have a lot more freedom if they just did it themselves (or if they learned English).

Teach a man to fish...

spookie•4h ago
A lot of written text out there in other languages isn't available in English; simply put, you have many echo chambers of individual languages out there. Most people are ok with just reading what they understand.

bawolff•7h ago
> People in the East of Greenland speak a language that has similarities, but is different enough in vocabulary and sounds that it's often considered a separate language and not a dialect.

If this is true, then the easy solution would be to just have two separate wikipedia editions (assuming there is interest).

After all, if we have en, sco, jam and ang, surely there is room for two Greenlandics. The limiting factor is user interest.

thaumasiotes•32m ago
> the easy solution would be to just have two separate wikipedia editions (assuming there is interest)

That's... a reach.

An easier, and much more realistic, solution would be to just have one edition in Danish, which was already noted as the language Greenlanders have in common.

simonw•9h ago
I'm surprised this story didn't mention the scandal with Scots Wikipedia: https://www.theguardian.com/uk-news/2020/aug/26/shock-an-aw-...

> an American teenager – who does not speak Scots, the language of Robert Burns – has been revealed as responsible for almost half of the entries on the Scots language version of Wikipedia

It wasn't malicious either, it was someone who started editing Wikipedia at 12 and naively failed to recognise the damage they were doing.

TZubiri•6h ago
The Cebuano wiki is a similar case: the language isn't spoken much, but the wiki was the personal project of an editor who was mad at political articles and started making animal articles in the Cebuano wiki.

The solution is to differentiate and tag inputs and outputs, such that outputs can't be fed back in as inputs recursively. Funnily enough, wikipedia's sourcing policy does this perfectly: not only are sources the input and page content just an output, but page content is a tertiary source, and sources by policy should be secondary (and sometimes primary) sources, so the system is even protected against cross-tertiary-source pollution (say, an encyclopedia feeding off wikipedia and vice versa).

It is only when articles posing as secondary sources fail to cite wikipedia that a recursive quality loss can occur; see [[citogenesis]].
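
To make it concrete, here's a rough sketch of the tagging idea in Python. The provenance labels, URLs, and filter are made up for illustration; this is not wikipedia's actual tooling:

    from dataclasses import dataclass
    from enum import Enum, auto

    class Provenance(Enum):
        PRIMARY = auto()            # original documents, datasets, records
        SECONDARY = auto()          # journalism/scholarship about primary material
        TERTIARY = auto()           # encyclopedia articles, i.e. wiki pages themselves
        MACHINE_GENERATED = auto()  # unreviewed machine translation or LLM output

    @dataclass
    class Source:
        url: str
        provenance: Provenance

    def usable_as_citation(source: Source) -> bool:
        """Only primary or secondary material may feed a tertiary article,
        so tertiary or machine-generated output never becomes its own input."""
        return source.provenance in (Provenance.PRIMARY, Provenance.SECONDARY)

    sources = [
        Source("https://example.org/court-records", Provenance.PRIMARY),
        Source("https://example.org/news-analysis", Provenance.SECONDARY),
        Source("https://xx.wikipedia.org/wiki/Foo", Provenance.TERTIARY),
        Source("https://example.org/auto-translated-page", Provenance.MACHINE_GENERATED),
    ]

    accepted = [s for s in sources if usable_as_citation(s)]  # keeps only the first two

As long as everything downstream only ever reads from `accepted`, the output can't loop back in as input.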

galagawinkle489•4h ago
Many sources for Wikipedia articles refer to Wikipedia without citing it. Many journalists work from Wikipedia, and most of Wikipedia's sources are journalistic articles. Often this isn't noticed, because the information obtained this way is true and uncontroversial. Citogenesis only documents the examples where, by bad luck, the result is untrue information.

thaumasiotes•34m ago
> It is only when articles posing as secondary sources fail to cite wikipedia that a recursive quality loss can occur

I've seen a college professor cite wikipedia in support of a false claim. On investigation, the text in wikipedia was cited to an earlier blog post by that same professor.

I wasn't convinced.

aucisson_masque•9h ago
> Wehr, who now teaches Greenlandic in Denmark, speculates that perhaps only one or two Greenlanders had ever contributed.

That's the core issue; it's not those who use AI translators or, worse, Google Translate. If there isn't any Greenlander to contribute to their Wikipedia, they don't deserve to have one and must instead rely on other languages.

The difference between an empty Wikipedia and one filled with translated articles that contain errors isn't much. They should instead close that version of Wikipedia until there are enough volunteers.

consp•8h ago
That last part creates a chicken-and-egg problem. You can argue about it, but I will bet it will never get traction if there is no basis to start from.

bawolff•7h ago
Wikipedia has an "incubator" setup where people can start working on a language in the incubator until it demonstrates enough interest.

Symbiote•8h ago
The end of the article says they have closed it.

Mars008•7h ago
> who use AI translators or, worse, Google Translate

It's the same. Google Translate uses trained AI models.

tomlockwood•2h ago
> they don't deserve to have one

By what unholy pact have you been beknighted as the bestower of wikis, my friend?

mslt•2h ago
Not the commenter, but in this instance it seems like if you want something you need to either be able to make/maintain it or fund someone who will, no?

tjwebbnorfolk•1h ago
If the original authors stop maintaining an OSS project, and you are one of only a very few users, you have two options: do the work yourself, or watch it die. If you are unwilling to do the work yourself, then that's a signal it isn't important enough for anyone else to do the work either.

Why should a wiki be any different?

johnea•8h ago
Contrary to what the title of the post implies, I would say Wikipedia bears 0% of the blame for this issue.

I would put 50% of the blame on Google, for offering up translations that are wholly or partially in error without any indication, such as a warning message, to that effect.

Then I would assign 40% of the blame to LLM text generation based on models whose creators performed no review of their training data.

The final 10% of blame goes to anyone who would post rubbish without first-hand knowledge that at least the translation was correct.

Except for that final 10%, all of the blame goes to the profit motive. Foisting shit on the world for the sole purpose of profit.

And let's face it, this isn't exactly the first time marginalized people, or their languages, have suffered because of western capitalism...

p.s. fan-bois and kool-aid drinkers, feel free to start your down-voting now...

strogonoff•7h ago
Wikipedia editors are among the many communities that have for a long time relied, mostly successfully, on the tendency of relatively superficial, easy-to-validate capabilities (such as being able to use a website, write something resembling real language, and handle basic communication) to correlate with more valuable but harder-to-validate qualities (such as the ability to write reasonably well, follow rules and guidelines, and generally be a well-intentioned person) as one of their main barriers to entry. Thanks to the deluge of commercial LLMs[0] available at prices so low that their operators lose millions to billions of dollars in order to gain market share and, ultimately, profit, I suspect such communities may not be able to continue as-is for long: either they will be forced to institute more intrusive barriers (be that ID verification, invite-only membership, or something else) while the deluge lasts, or they will be effectively destroyed when members who secretly lack the requisite qualities and act in bad faith become a majority, damage the community's reputation, and drive out the existing members.

[0] Which, paradoxically, exist to a significant degree thanks to the unpaid work of volunteers in many such communities.

haiji2025•4h ago
yes

orbital-decay•8m ago
That happens by default with low-resource languages; no bad translations needed. They don't have enough written material to train an LLM, or labels for time periods and the various dialects in a continuum. For example, even the best multilingual models will lump all Berber languages into one unstable abomination nobody speaks, usually writing it in Neo-Tifinagh. Not much can be done about that; training a model on all of these varieties would require a huge specialized effort.