
Show HN: Han – A Korean programming language written in Rust

https://github.com/xodn348/han
55•xodn348•1h ago
A few weeks ago I saw a post about someone converting an entire C++ codebase to Rust using AI in under two weeks.

That inspired me — if AI can rewrite a whole language stack that fast, I wanted to try building a programming language from scratch with AI assistance.

I've also been noticing growing global interest in Korean language and culture, and I wondered: what would a programming language look like if every keyword was in Hangul (the Korean writing system)?

Han is the result. It's a statically-typed language written in Rust with a full compiler pipeline (lexer → parser → AST → interpreter + LLVM IR codegen).

It supports arrays, structs with impl blocks, closures, pattern matching, try/catch, file I/O, module imports, a REPL, and a basic LSP server.
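
For readers unfamiliar with the shape of such a pipeline, here is a minimal sketch in Python. It is purely illustrative: the names are invented, and this is not Han's actual (Rust) implementation.

```python
# Toy lexer → parser → AST → interpreter pipeline, in miniature.
# Illustrative only; Han itself is written in Rust and none of
# these names come from its codebase.

KEYWORDS = {"만약": "IF", "반환": "RETURN"}  # hypothetical Hangul keywords

def lex(src):
    """Split source text into (kind, text) tokens."""
    tokens = []
    for word in src.split():
        if word in KEYWORDS:
            tokens.append((KEYWORDS[word], word))
        elif word.isdigit():
            tokens.append(("INT", word))
        else:
            tokens.append(("OP", word))
    return tokens

def parse(tokens):
    """Build a tiny AST for `INT OP INT` expressions."""
    (_, a), (_, op), (_, b) = tokens
    return ("binop", op, ("int", int(a)), ("int", int(b)))

def evaluate(ast):
    """Walk the AST and compute a value."""
    kind = ast[0]
    if kind == "int":
        return ast[1]
    if kind == "binop":
        _, op, lhs, rhs = ast
        l, r = evaluate(lhs), evaluate(rhs)
        return l + r if op == "+" else l - r

print(evaluate(parse(lex("3 + 4"))))  # → 7
```

A real compiler replaces the final `evaluate` step with code generation (in Han's case, LLVM IR), but the staging is the same.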

This is a side project, not a "you should use this instead of Python" pitch. Feedback on language design, compiler architecture, or the Korean keyword choices is very welcome.

https://github.com/xodn348/han

Comments

raaspazasu•1h ago
I don't know Korean at all, but this looks cool and a fun project. I'm curious if this reduces typing or has any benefits being in Hangul vs Latin?
xodn348•1h ago
Thanks! One thing that motivated me was curiosity about prompt efficiency in the AI era. Hangul is beautifully dense — a single syllable block packs initial consonant + vowel + final consonant into one character. I wondered if Korean-keyword code might produce shorter prompts for LLMs.

I actually tested this with GPT-4o's tokenizer, and the result was the opposite — Korean keywords average 2-3 tokens vs 1 for English. A fibonacci program in Han takes 88 tokens vs 54 in Python.

The reason comes down to how LLM tokenizers work. They use BPE (Byte Pair Encoding), which starts with raw bytes and repeatedly merges the most frequent pairs into single tokens. Since training data is predominantly English, words like `function` and `return` appear billions of times and get merged into single tokens.

Korean text appears far less frequently, so the tokenizer doesn't learn to merge Hangul syllables — it falls back to splitting each character into 2-3 byte-level tokens instead.
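
The fallback is visible from UTF-8 alone. A rough illustration (real tokenizers do learn some byte merges, so exact counts vary):

```python
# Each Hangul syllable occupies 3 bytes in UTF-8, while ASCII is 1 byte.
# A byte-level BPE with no learned Hangul merges therefore starts from
# ~3 tokens per syllable, vs 1 token for a whole merged English keyword.
for word in ["function", "함수"]:
    data = word.encode("utf-8")
    print(f"{word!r}: {len(word)} chars, {len(data)} bytes")
# 'function': 8 chars, 8 bytes (and typically a single learned token)
# '함수':     2 chars, 6 bytes (up to 6 byte-level tokens before merges)
```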

It's a tokenizer training bias, not a property of Hangul itself. If a tokenizer were trained on a Korean-heavy corpus, `함수` could absolutely become a single token too.
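
A toy BPE trainer makes the bias concrete: merges chase whatever pairs the corpus repeats, so an English-heavy corpus learns English merges first. This is a minimal sketch of the training loop, not any production tokenizer:

```python
from collections import Counter

def bpe_merges(corpus: bytes, n_merges: int):
    """Greedy byte-pair merging: repeatedly fuse the most frequent
    adjacent pair into a new symbol, as BPE training does."""
    seq = list(corpus)  # start from raw bytes
    merges = []
    next_id = 256       # new symbols get ids above the byte range
    for _ in range(n_merges):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append((a, b))
        # Replace every occurrence of the pair with the new symbol.
        out, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
                out.append(next_id)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        next_id += 1
    return merges, seq

# English-heavy corpus: "fn" repeats, so its byte pair merges early;
# the lone Hangul word's bytes never repeat and stay unmerged.
corpus = "fn fn fn fn 함수".encode("utf-8")
merges, seq = bpe_merges(corpus, 3)
print(merges[0])  # (102, 110), i.e. the bytes of 'f' and 'n'
```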

So no efficiency benefit today. But it was a fun exploration, and Korean speakers can read the code like natural language. It could also be a fun way for people learning Korean to practice reading Hangul in a different context — every keyword is a real Korean word with meaning.

topce•41m ago
Very interesting...

I had a similar idea: train an LLM on Serbian, and even create a new encoding, https://github.com/topce/YUTF-8, inspired by YUSCII. I didn't have the time and money ;-) Great that you succeeded. The idea is that a model trained on Serbian text encoded in YUTF-8 (not UTF-8) would need fewer tokens for Serbian prompts than for English ones, and Serbian Cyrillic characters are 1 byte in YUTF-8 instead of 2 in UTF-8. Serbian is a phonetic language (we never ask how a word is spelled) and has both Latin and Cyrillic alphabets.

xodn348•34m ago
Really interesting approach — attacking token efficiency at the encoding level is more fundamental than what I did.

Even without retraining BPE from scratch, starting with YUTF-8 and measuring how existing tokenizers handle it would already be a worthwhile experiment.
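
For a rough sense of the headroom, here is what one-byte-per-letter encoding buys over UTF-8 for Cyrillic text. The one-byte mapping is only implied here; this is an illustration, not the actual YUTF-8 table:

```python
# Serbian Cyrillic letters are 2 bytes each in UTF-8; a single-byte
# national encoding (like YUSCII was, or YUTF-8 aims to be) can halve
# that for pure-Cyrillic text.
text = "ђурђевак"  # a Serbian word, all Cyrillic letters

utf8_len = len(text.encode("utf-8"))  # 2 bytes per letter in UTF-8
one_byte_len = len(text)              # 1 byte per letter if each letter
                                      # had its own single-byte code
print(utf8_len, one_byte_len)  # 16 vs 8
```

Fewer bytes in, fewer byte-level tokens out, before any BPE merges are even learned.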

Hope you find the time to build it, good luck!

ralferoo•20m ago
I don't know how to read Hangul (though I know the general idea of how the characters are composed). Just looking at the examples, the structure of the code doesn't seem as obvious to me as it does with Latin letters and punctuation. Actually, most punctuation looked OK, but the first couple of examples used arrays, and [ and ] seemed to just blend in with the identifiers wherever they appeared. I'm not sure how distinct they look to someone familiar with Hangul. I'm sure it's also nothing that colour syntax highlighting wouldn't make easier.
xodn348•12m ago
Fair point that [ ] can blend in.

For Korean readers the character systems look quite different, but I can see how it's hard to parse visually without familiarity.

As you said, syntax highlighting helps a lot — there's a colored screenshot at the top of the README showing how it looks in practice.

bbrodriguez•35m ago
Korean doesn’t reduce typing compared to English in my experience. What looks like a “character” is actually a syllable block called “eumjeol” that’s made up of consonants (moeum) and vowels (jaeum). You can’t have a vowel-only syllable either, so you always have to pair it with a null consonant no matter what (which kinda looks like a zero: ㅇ), and while nouns can be much more concise compared to English, verbs can get verbose.

The main benefit of Korean actually comes from the fact that the language fits perfectly onto the standard 27 alphabet keys, laid out in such a way that lets you type ridiculously fast. The consonant letters are always situated in the left half of the keyboard and the vowels in the right half. This makes it extremely easy to train muscle memory, because you’re mostly alternating keystrokes between your left hand and right hand.

Anecdotally, I feel like when I’m typing in English each half of my brain needs to coordinate more than when I’m typing in Korean: my right brain only needs to remember the consonant positions for my left hand, and my left brain only needs to remember the vowel positions.

xodn348•28m ago
만나서 반가워요! ("Nice to meet you!")

What you said is mostly right, and I didn't know that about typing in Korean, the left-hand versus right-hand split. Btw, it's the other way around: consonants are jaeum, vowels are moeum.

Experience-wise, your account is probably the accurate one.

danparsonson•13m ago
Wonderful! What a cool idea. For anyone interested, you can learn the whole of Hangul in an afternoon; it's cleverly designed to be very logical and has some handy mnemonics: https://korean.stackexchange.com/a/213
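
Hangul's regularity is visible in Unicode itself: every precomposed syllable is generated by one formula, code = 0xAC00 + (initial × 21 + vowel) × 28 + final, so decomposing a syllable into its jamo is pure arithmetic. A quick illustration:

```python
# Decompose a precomposed Hangul syllable into its component jamo
# using the Unicode composition formula:
#   code = 0xAC00 + (initial * 21 + vowel) * 28 + final
INITIALS = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")          # 19 choseong
VOWELS = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")      # 21 jungseong
FINALS = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")  # 28 jongseong (incl. none)

def decompose(syllable: str):
    offset = ord(syllable) - 0xAC00
    initial, rest = divmod(offset, 21 * 28)
    vowel, final = divmod(rest, 28)
    return INITIALS[initial], VOWELS[vowel], FINALS[final]

print(decompose("한"))  # ('ㅎ', 'ㅏ', 'ㄴ') -- the syllable "han"
```
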
xodn348•9m ago
That's deep knowledge that even native Koreans might not know. I'll add this site as a reference on GitHub. I'm glad to have you as a supporter!
xodn348•6m ago
Just added that link to the README — it fits perfectly in the "Beauty of Hangul" section.
apt-apt-apt-apt•11m ago
A simple translation of keywords seems straightforward; I wonder why it's not standard.

    # def two_sum(arr: list[int], target: int) -> list[int]:
    펀크 투섬(아래이: 목록[정수], 타개트: 정수) -> 목록[정수]:
    # n = len(arr)
    ㄴ = 길이(아래이)

    # start, end = 0, n - 1
    시작, 끝 = 0, ㄴ - 1
    # while start < end:
    동안 시작 < 끝:
Code would be more compact, allowing things like more descriptive keywords e.g. AbstractVerifiedIdentityAccountFactory vs 실명인증계정생성, but we'd lose out on the nice upper/lowercase distinction.

I hear that information processing speed is nearly the same across all languages regardless of density, though, so in terms of reading speed it may not make much difference.

xodn348•2m ago
Good point about compactness — 실명인증계정생성 vs AbstractVerifiedIdentityAccountFactory is a real example where Korean shines.

One distinction though: Han uses actual Korean words, not transliterations. 함수 means "function" in Korean, 만약 means "if" — they're real words Korean speakers already know.

Your example uses transliterations like 펀크 and 아래이 which would look odd to a Korean reader. That difference matters for readability.

xodn348•1m ago
funny examples, though.
marysminefnuf•7m ago
My dream is to one day make a Chaldean programming language for my kids. Stuff like this is inspiring.
xodn348•5m ago
The fact that you're already thinking about it means you can do it. Go for it!
