While they may be a Greenlandic teacher, it's almost assured that they are teaching West Greenlandic, which is similar to Canadian Inuktitut.
People in the East of Greenland speak a language that has similarities, but is different enough in vocabulary and sounds that it's often considered a separate language and not a dialect.
When people from East and West Greenland come together, they typically speak Danish because they can't understand each other in their own native language.
So we're talking about a country of 55k people, a portion of whom don't even speak the official language. This guy would have no way of knowing whether something was written poorly by a computer or by a poorly educated Greenlandic native who maybe isn't so good with the official language.
Given that the majority of the country's citizens do not use the internet at all, it is not even clear what his solution is, other than just deciding to be some sort of magic arbiter, which is neither realistic nor sustainable.
On what do you base this assertion? I was not able to find up-to-date statistics, but 72% of participants in this survey from 2013 had internet access at home, either via PC or via mobile devices, and another 11% had internet access elsewhere:
https://digitalimik.gl/-/media/datagl/old_filer/strategi_201...
So to get back to the point: yes, the solution is to appoint someone as a magic arbiter and hope they don't screw up. The fact that it's a deeply imperfect way of solving problems doesn't mean it's not workable. It just means it will backfire at some point, and someone else will get appointed instead.
The reason none of this makes sense to me is that it's intellectually crippling Internet users. Computers and the Internet are tools. If you want something machine translated for you, you can use a tool like Google Translate to do it. If the webmaster does this, it robs people of the opportunity to learn to use those tools, and they become dependent on third parties to do this for them when they would have a lot more freedom if they just did it themselves (or if they learned English).
Teach a man to fish...
If this is true, then the easy solution would be to just have two separate wikipedia editions (assuming there is interest).
After all, if we have en, sco, jam and ang, surely there is room for two Greenlandics. The limiting factor is user interest.
> an American teenager – who does not speak Scots, the language of Robert Burns – has been revealed as responsible for almost half of the entries on the Scots language version of Wikipedia
It wasn't malicious either, it was someone who started editing Wikipedia at 12 and naively failed to recognise the damage they were doing.
Papers and books will be written about naive heavy handed online censorship creating echo chambers and driving the US into fascism.
1. https://en.wikipedia.org/wiki/On_the_Internet,_nobody_knows_...
Please consider users of screen readers and other assistive technologies, as your use of nonstandard characters makes parsing your comment difficult, if not impossible. Not a slight or a correction, as I am a fan of Zalgo text myself, but after being informed by others about how inscrutable it can be to the differently abled, I have reconsidered using it.
I wonder if the future of screen reading applications is bypassing these issues + avoiding parsing weird websites by just doing AI-driven OCR.
AI has a hard time deriving how many r's are in strawberry, so I don't expect it to parse my text on behalf of others any time soon, though I don't think you meant any harm. In the interest of respect for those who don't have a choice in using tech to help them do what comes easily and naturally to me, I thought I'd pay forward the knowledge that the world, and our perceptions of it, are as unique as every individual.
But you do you.
The solution is to differentiate and tag inputs and outputs, such that outputs can't be fed back in as inputs recursively. Funnily enough, Wikipedia's sourcing policy does this perfectly: sources are the input and page content is just an output, and since page content is a tertiary source while sources by policy should be secondary (and sometimes primary) sources, the system is even protected against cross-tertiary-source pollution (say, an encyclopedia feeding off Wikipedia and vice versa).
It is only when articles posing as secondary sources fail to cite Wikipedia that a recursive quality loss can occur; see [[citogenesis]].
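To make the tagging idea concrete, here is a minimal sketch (purely illustrative; the Tier names, Document class, and usable_sources helper are my own invention, not anything Wikipedia actually implements in software):

    from enum import Enum

    class Tier(Enum):
        PRIMARY = 1    # original material: interviews, raw data, official documents
        SECONDARY = 2  # analysis of primary material: news articles, papers
        TERTIARY = 3   # summaries of secondary material: encyclopedias, Wikipedia itself

    class Document:
        def __init__(self, text: str, tier: Tier):
            self.text = text
            self.tier = tier

    def usable_sources(candidates):
        """Keep only primary/secondary sources and drop tertiary ones,
        so encyclopedia output can never be fed back in as its own input."""
        return [doc for doc in candidates if doc.tier is not Tier.TERTIARY]

    candidates = [
        Document("2013 household internet survey", Tier.PRIMARY),
        Document("Newspaper article analysing the survey", Tier.SECONDARY),
        Document("Entry from another encyclopedia", Tier.TERTIARY),  # rejected
    ]
    print([d.text for d in usable_sources(candidates)])

Citogenesis is then just a provenance failure: a document that is really derived from the tertiary output gets mislabeled as SECONDARY, and the loop closes.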
That's the core issue; it's not about those who use an AI translator or, worse, Google Translate. If there aren't any Greenlanders to contribute to their Wikipedia, they don't deserve to have one and must instead rely on other languages.
The difference between an empty Wikipedia and one filled with translated articles that contain errors isn't much. They should instead close that version of Wikipedia until there are enough volunteers.
It's the same. Google translate uses trained AI models.
I would put 50% of the blame on Google, for offering up translations that are wholly or partially in error without any indication to that effect, such as a warning message.
Then I would assign 40% of the blame to LLM text generation based on models whose creators performed no review of their training data.
The final 10% of blame goes to anyone who would post rubbish without first hand knowledge that at least the translation was correct.
Except for that final 10%, all of the blame goes to the profit motive. Foisting shit on the world for the sole purpose of profit.
And let's face it, this isn't exactly the first time marginalized people, or their languages, have suffered because of Western capitalism...
p.s. fan-bois and kool-aid drinkers, feel free to start your downvoting now...
[0] Which, paradoxically, to a significant degree exist thanks to the unpaid work of volunteers in many such communities.
foxglacier•5h ago
> potentially pushing the most vulnerable languages on Earth toward the precipice as future generations begin to turn away from them.
OK? We have lots of dead languages. It's fine. People use whatever languages are appropriate to them and we don't need to maintain them forever.
aucisson_masque•4h ago
Survival of the fittest, right? Not enough people speaking Greenlandic, too complicated even for its own population, who would rather speak Danish? The very reason I'm speaking English is that it was forced militarily during the 19th century by the UK, and since the 20th by Hollywood.
Just like a virus, if a language doesn't spread, it dies.
jiggawatts•3h ago
When people have varying levels of capability with languages, they'll switch to whatever is the lowest common denominator: the language that the group can best communicate in. This tended to be English, even when everyone in the group natively spoke the same (non-English) language.
Moreover, this is context dependent: when talking about technical matters (especially computing), the Lingua Franca (pun intended) is English. You’ll hear “locals” switch to either mixed or pure English, even if they’re not great at it. Science, aviation, etc… is the same.
Before English it was French that had this role, and before then it was Latin and Greek.
The thing is, when the whole world speaks one common language like Latin or English, this is a tiny bit sad for some Gaelic tribe that got wiped out culturally, but incredibly valuable for everybody everywhere. International commerce becomes practical. Students can study overseas, spreading ideas further and wider. Books have a bigger market, attracting smarter and better authors. There’s a bigger pool of talented authors to begin with, some of which write educational textbooks of exceptional sparkling quality. These all compound to create a more educated, vibrant, and varied culture… because of, not despite the single language.
arthurjj•2h ago
But the sentence `well-meaning Wikipedians who think that by creating articles in minority languages they are in some way “helping” those communities` clearly shows the author hasn't really considered the issue.