A Basic Just-In-Time Compiler (2015)

https://nullprogram.com/blog/2015/03/19/
105•ibobev•1mo ago

Comments

possiblywrong•1mo ago
Following is the closed form solution linked in the article (from a since-deleted Reddit comment):

    from functools import reduce

    def recurrence(ops, a0, n):
        def transform(x, op):
            return eval(repr(x) + op)  # apply a textual op like '+2' or '*3' to x
        ops = ops.split()
        m = reduce(transform, [op for op in ops if op[0] in ('*', '/')], 1)
        b = reduce(transform, ops, 0)
        for k in range(n + 1):  # closed form below assumes m != 1
            print('Term %d:' % k, a0 * m ** k + b * (m ** k - 1) / (m - 1))
> This is really only interesting if a particular (potentially really large) term of the sequence is desired, as opposed to all terms up to a point. The key observation is that any sequence of the given set of operations reduces to a linear recurrence which has the given solution.
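The reduction described above can be checked directly: one pass of the op string multiplies the current value by some m (the product of the multiplicative ops) and adds some b (whatever the pass maps 0 to), so the sequence satisfies a[k+1] = m*a[k] + b. A small self-contained sketch (the helper names here are my own, not from the article), comparing the closed form against direct iteration:

```python
def step(x, ops):
    # Apply one full pass of the op string to x, e.g. "*3 +2" maps x to 3*x + 2.
    for op in ops.split():
        x = eval(repr(x) + op)
    return x

def closed_form(ops, a0, k):
    # m is the product of the multiplicative ops; b is what a pass maps 0 to.
    m = 1
    for op in ops.split():
        if op[0] in ('*', '/'):
            m = eval(repr(m) + op)
    b = step(0, ops)
    if m == 1:                     # pure addition: arithmetic progression
        return a0 + b * k
    return a0 * m ** k + b * (m ** k - 1) / (m - 1)

# Check the closed form against direct iteration.
a, ops = 5, "*3 +2"
for k in range(6):
    assert closed_form(ops, 5, k) == a
    a = step(a, ops)
```

The m == 1 branch handles the case the original snippet misses, where the closed form would divide by zero.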
kscarlet•1mo ago
It seems that "JIT" is overloaded with at least two meanings.

The canonical definition of JIT is "compilation during execution of a program". Usually, a program is interpreted first, then switches to compiled code in the middle of execution. This is not what this article does.

What this article does is sometimes called on-the-fly AOT, or just on-the-fly compilation. I'd prefer not overloading the term "JIT".

MobiusHorizons•1mo ago
I always took the distinction to be the executable memory bit, as opposed to writing an executable and then launching it in the usual way. Of course high-performance runtimes that contain JITs do typically interpret first in order to get type information and reduce startup latency, but that's more a function of the language not being efficiently AOT-compilable than fundamental to the concept of a JIT.
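The "executable memory bit" can be demonstrated from Python on Linux: map a writable+executable page, write raw machine code into it, and jump to it via ctypes. A minimal sketch, assuming x86-64 Linux (the bytes encode `mov eax, 42; ret`; hardened kernels that enforce W^X may reject the mapping, and the call is guarded so other architectures only map the page):

```python
import ctypes, mmap, platform

# Map one anonymous page with read+write+execute permissions.
page = mmap.mmap(-1, mmap.PAGESIZE,
                 prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)

# x86-64 machine code: mov eax, 42; ret
page.write(bytes([0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3]))

# Get the address of the mapping and treat it as a C function pointer.
addr = ctypes.addressof(ctypes.c_char.from_buffer(page))
print("page mapped at", hex(addr))

if platform.machine() in ("x86_64", "AMD64"):
    f = ctypes.CFUNCTYPE(ctypes.c_int)(addr)
    print("jit says", f())  # prints 42 on x86-64
```

A production JIT would map the page writable, emit code, then flip it to read+execute with mprotect rather than keeping it writable and executable at once.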
kscarlet•1mo ago
In that sense almost every compiled Lisp/Scheme implementation, GHC, etc., or any other interactive programming system, counts as a JIT. But virtually nobody in those circles refers to such implementations as JITs. Instead, people say "I wish our implementation were a JIT to benefit from all those optimizations it enables"!
MobiusHorizons•1mo ago
Do they generate machine code in ram and jump to it? Or do they interpret byte code?

EDIT: at least GHC seems to be a traditional AOT compiler.

kscarlet•1mo ago
They generate native machine code.
throwaway17_17•1mo ago
Can you explain what you mean by a "language not being efficiently AOT compileable"? I am guessing you are referring to languages that emphasize REPL-based development or languages that are 'image' based (I think Smalltalk is the most common example).

Like the guesses above, I can understand difficulty with AOT compilation in conjunction with certain use cases; however, I cannot think of a language that, based on its definition, would be less amenable to AOT compilation.

Nevermark•1mo ago
The more the context is narrowed down, the more optimizations can be applied during compilation.

AOT situations where a lot of context is missing:

• Loosely typed languages. Code can be very general, much more general than how it is actually used in any given situation, but without knowing the full situation, all that generality must be compiled.

• Incremental AOT compilation. If modules have been compiled separately, useful context wasn't available during optimization.

• Code whose structure is very sensitive to data statistics or other conditional runtime information. This is the prime advantage of JIT over AOT. Unless the AOT compiler is working in conjunction with representative data and a profiler.

Those are all cases where JIT has advantages.

A language where JIT is optimal is, by definition, less amenable to AOT compilation.

MobiusHorizons•1mo ago
What I meant was dynamic typing primarily. Think JavaScript or Lua. These are both languages where the choice is between interpreter and JIT rather than JIT or AOT. VM-based languages like Java also fall in that category, not because of dynamic typing so much as because of shipping bytecode to the client for portability reasons.
MobiusHorizons•1mo ago
I should probably have said "not compilable to efficient code" rather than "not efficiently compilable". Basically I was referring to dynamic typing. Typically such languages are interpreted, although there is probably a way to compile the same sequence the interpreter would take as an AOT compiler. When types are not known until runtime, the compiler can't emit efficient code, and instead has to emit code that checks the type on each function invocation, which is basically exactly what the interpreter was already doing. This is where JITs really shine, since they can produce one or more compilations of the same function specialized to types that have been observed to be used frequently. The interpreter can then call those specialized versions of the functions directly when appropriate.
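The guard-plus-fast-path pattern described above can be sketched in plain Python (the function names are illustrative, not from any real JIT):

```python
# A generic interpreter-style operation must dispatch on types every call:
def generic_add(a, b):
    if isinstance(a, int) and isinstance(b, int):
        return a + b
    if isinstance(a, str) and isinstance(b, str):
        return a + b
    raise TypeError("unsupported operand types")

# A JIT that has observed int arguments emits a specialized version behind
# a cheap type guard, falling back (deoptimizing) when the guess is wrong:
def specialize_int(fallback):
    def specialized(a, b):
        if type(a) is int and type(b) is int:  # guard
            return a + b                       # fast path, no dispatch
        return fallback(a, b)                  # deoptimize to the generic path
    return specialized

add = specialize_int(generic_add)
assert add(2, 3) == 5          # fast path taken
assert add("a", "b") == "ab"   # guard fails, falls back
```

In a real JIT the fast path would be machine code with the type check compiled in, but the control flow is the same.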
dataflow•1mo ago
> Usually, a program is being interpreted first, then switches to compiled code in the middle of execution.

I agree what they have isn't JIT compilation, but not for that reason. Tiered execution was never a central part of JIT compilation either. It was a fairly new invention in comparison.

The reason what they describe isn't JIT compilation is IMO fairly boring: it's not compiling the input program in any meaningful way, but simply writing hard-coded logic into executable memory that it already knows the program intended to perform. Sure there's a small degree of freedom based on the particular arithmetic operations being mentioned, but that's... very little. When your compiler already knows the high-level source code logic before it's even read the source code, it's... not a compiler. It's just a dynamic code emitter.

As to the actual difference between JIT vs. AOT... it may just come down to accounting. That is, on whether you can always exclude the compilation time/cost from the overall program execution time/cost or not. If so, you're compiling ahead of (execution) time. If not, you're compiling during execution time.

ordu•1mo ago
> it's not compiling the input program in any meaningful way, but simply writing hard-coded logic into executable memory that it already knows the program intended to perform.

The program reads the logic from stdin and translates it into machine instructions. I can agree that there is not a lot of freedom in what can be done, but I think it just means that the source language is not Turing complete. I don't believe that a compiler needs to deal with a Turing-complete language to claim the title "JIT compiler".

dataflow•1mo ago
> The program reads the logic from stdin and translates it into machine instructions. I can agree that there is not a lot of freedom in what can be done, but I think it just means that the source language is not Turing complete. I don't believe that a compiler needs to deal with a Turing-complete language to claim the title "JIT compiler".

"Not Turing-complete" is quite the understatement.

A "compiler" is software that translates computer code from one programming language into another language. Not just any software that reads input and produces output.

The input language here is... not even a programming language to begin with. Literally all it can express is linear functions. My fixed-function calculator is more powerful than that! If this is a programming language then I guess everyone who ever typed on a calculator is a programmer too.

tgv•1mo ago
A compiler takes some language and translates it into something close(r) to the hardware. And that's what the OP does. And since it compiles in-process and executes it too, it's JIT, as opposed to AOT.

These terms are not related to the complexity of the problem. The first compilers could only translate formulas, hence FORTRAN.

Brian_K_White•1mo ago
That's not compilation, merely substitution.
dataflow•1mo ago
I'm not sure where you got your definition, but I basically copied Wikipedia's.
ordu•3w ago
> A "compiler" is software that translates computer code from one programming language into another language.

Yes, of course.

> Literally all it can express is linear functions.

And why can't a language that expresses linear functions be a programming language?

> My fixed-function calculator is more powerful than that!

But it isn't compiling, it is interpreting, is it? So your fixed-function calculator is not a compiler, it is an interpreter. It is irrelevant how powerful it is. There are far more powerful interpreters and far less powerful compilers.

The example we see gets computer code in one language and translates it into machine instructions. Talking about 'understatement', you are adding more constraints to this definition, and they are very fuzzy constraints on what counts as a programming language.

kscarlet•1mo ago
> As to the actual difference between JIT vs. AOT... it may just come down to accounting. That is, on whether you can always exclude the compilation time/cost from the overall program execution time/cost or not. If so, you're compiling ahead of (execution) time. If not, you're compiling during execution time.

Well, this includes what I refer to as "on-the-fly" AOT, like SBCL, CCL, Chez Scheme... Even ECL can be configured to work this way. As I mentioned in another comment, people in those circles do not refer to these as "JIT" at all, instead saying "I wish my implementation was JIT instead of on-the-fly AOT"!

orthoxerox•1mo ago
I think if your program starts to execute machine code that wasn't present in the binary, then it counts as having a JIT compiler.

Of course, there are edge cases like embedding libtcc, but I think it's a reasonable definition.

ignoramous•1mo ago
> This is not what this article does.

Discounting books, many other well written articles on JIT have been shared on HN over the years [0][1][2]; the one I particularly liked as it introduces the trinity in a concise way: Compiler, Interpreter, JIT https://nickdesaulniers.github.io/blog/2015/05/25/interprete... / https://archive.vn/HaFlQ (2015).

[0] How to JIT - an introduction, https://eli.thegreenplace.net/2013/11/05/how-to-jit-an-intro... (2013).

[1] Bytecode compilers and interpreters, https://bernsteinbear.com/blog/bytecode-interpreters/ (2019).

[2] Let's build a Simple Interp, https://ruslanspivak.com/lsbasi-part1/ (2015).

MobiusHorizons•1mo ago
Is it just me or is the author using static wrong? Someone mentioned it in the thread on the previous post, where it felt more like an oversight. But in this article it seems much more like an actual misunderstanding. Should it actually be the function-level static variable? I feel like in the right context the optimizer might be able to leave that in a register if the calling function is in the same compilation unit.
the-smug-one•1mo ago
Man, I friggin despise how hard (in a very not fun way) generating x86 assembly is :-). Generating Aarch64 is a lot easier.
mananaysiempre•1mo ago
I mean, I can’t do it by heart, but it’s not fundamentally very hard? Thumb-2 is definitely much more annoying.
senko•1mo ago
Ah, I thought this was going to be about a JITted BASIC. No such luck.
mananaysiempre•1mo ago
> On x86-64, pages may be 4kB, 2MB, or 1GB

But I believe sysconf(_SC_PAGESIZE) will always be 4KB, because the “may” is at the user’s discretion, not the system’s. Except on Cosmopolitan where it will always be 64KB, because Windows NT for Alpha (yes, seriously).
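This can be checked from Python, where the same value is exposed two ways (the specific number varies by platform: 4096 on typical x86-64 Linux, 16384 on Apple Silicon):

```python
import os, mmap

# sysconf reports the base page size the kernel uses for ordinary mappings;
# huge pages (2 MB / 1 GB on x86-64) are opt-in per mapping, so this value
# stays at the base size regardless of huge-page support.
page_size = os.sysconf("SC_PAGESIZE")
print("page size:", page_size)

assert page_size == mmap.PAGESIZE          # two views of the same value
assert page_size & (page_size - 1) == 0    # always a power of two
```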