frontpage.

Reverse Engineering Medium.com's Editor: How Copy, Paste, and Images Work

https://app.writtte.com/read/gP0H6W5
1•birdculture•1m ago•0 comments

Go 1.22, SQLite, and Next.js: The "Boring" Back End

https://mohammedeabdelaziz.github.io/articles/go-next-pt-2
1•mohammede•7m ago•0 comments

Laibach the Whistleblowers [video]

https://www.youtube.com/watch?v=c6Mx2mxpaCY
1•KnuthIsGod•9m ago•1 comment

I replaced the front page with AI slop and honestly it's an improvement

https://slop-news.pages.dev/slop-news
1•keepamovin•13m ago•1 comment

Economists vs. Technologists on AI

https://ideasindevelopment.substack.com/p/economists-vs-technologists-on-ai
1•econlmics•15m ago•0 comments

Life at the Edge

https://asadk.com/p/edge
2•tosh•21m ago•0 comments

RISC-V Vector Primer

https://github.com/simplex-micro/riscv-vector-primer/blob/main/index.md
3•oxxoxoxooo•25m ago•1 comment

Show HN: Invoxo – Invoicing with automatic EU VAT for cross-border services

2•InvoxoEU•25m ago•0 comments

A Tale of Two Standards, POSIX and Win32 (2005)

https://www.samba.org/samba/news/articles/low_point/tale_two_stds_os2.html
2•goranmoomin•29m ago•0 comments

Ask HN: Is the Downfall of SaaS Started?

3•throwaw12•30m ago•0 comments

Flirt: The Native Backend

https://blog.buenzli.dev/flirt-native-backend/
2•senekor•32m ago•0 comments

OpenAI's Latest Platform Targets Enterprise Customers

https://aibusiness.com/agentic-ai/openai-s-latest-platform-targets-enterprise-customers
1•myk-e•34m ago•0 comments

Goldman Sachs taps Anthropic's Claude to automate accounting, compliance roles

https://www.cnbc.com/2026/02/06/anthropic-goldman-sachs-ai-model-accounting.html
2•myk-e•37m ago•5 comments

Ai.com bought by Crypto.com founder for $70M in biggest-ever website name deal

https://www.ft.com/content/83488628-8dfd-4060-a7b0-71b1bb012785
1•1vuio0pswjnm7•38m ago•1 comment

Big Tech's AI Push Is Costing More Than the Moon Landing

https://www.wsj.com/tech/ai/ai-spending-tech-companies-compared-02b90046
4•1vuio0pswjnm7•40m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
2•1vuio0pswjnm7•41m ago•0 comments

Suno, AI Music, and the Bad Future [video]

https://www.youtube.com/watch?v=U8dcFhF0Dlk
1•askl•43m ago•2 comments

Ask HN: How are researchers using AlphaFold in 2026?

1•jocho12•46m ago•0 comments

Running the "Reflections on Trusting Trust" Compiler

https://spawn-queue.acm.org/doi/10.1145/3786614
1•devooops•51m ago•0 comments

Watermark API – $0.01/image, 10x cheaper than Cloudinary

https://api-production-caa8.up.railway.app/docs
1•lembergs•53m ago•1 comment

Now send your marketing campaigns directly from ChatGPT

https://www.mail-o-mail.com/
1•avallark•56m ago•1 comment

Queueing Theory v2: DORA metrics, queue-of-queues, chi-alpha-beta-sigma notation

https://github.com/joelparkerhenderson/queueing-theory
1•jph•1h ago•0 comments

Show HN: Hibana – choreography-first protocol safety for Rust

https://hibanaworks.dev/
5•o8vm•1h ago•1 comment

Haniri: A live autonomous world where AI agents survive or collapse

https://www.haniri.com
1•donangrey•1h ago•1 comment

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1h ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•1h ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
2•helloplanets•1h ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•1h ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•1h ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•1h ago•0 comments

Estimating Logarithms

https://obrhubr.org/logarithm-estimation
103•surprisetalk•8mo ago

Comments

xeonmc•8mo ago
Protip: since halving and doubling are the same logarithmic distance on either side of unity, and the logarithmic distance from 2.0 to 5.0 is just a tiny bit larger than that of doubling, you can roughly eyeball the intra-decade fraction by cutting each decade into thirds.
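For reference, a quick numeric check of the thirds rule (a sketch of mine, not part of the comment):

    import math
    # 2 and 5 sit close to the 1/3 and 2/3 marks of a decade on a log scale
    print(math.log10(2))   # 0.3010... ~ 1/3
    print(math.log10(5))   # 0.6989... ~ 2/3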
thechao•8mo ago
I don't know about powers of 10, but you can use something similar to bootstrap logs in your head.

So, 2^10 = 1024 ~ 10^3. That means log10(2) ~ 3/10 = 0.3. By log laws, log10(5) = 1 - log10(2) ~ 1 - 0.3 = 0.7.

Similarly, 3^9 = 19683 ~ 2*10^4, so log10(3)*9 ~ 4 + log10(2); thus log10(3) ~ .477.

Other prime numbers use similar "easy power rules".

Now, what's log10(80)? It's .3*3 + 1 ~ 1.9. (The real value is 1.903...).

The log10(75) ~ .7*2+.477 = 1.877 (the real answer is 1.875...).

Just knowing some basic "small prime" logs lets you rapidly calculate logs in your head.
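To see how close these mental estimates land, here is a small check (my sketch in Python, using only the values given above):

    import math
    for n, approx in [(2, 0.3), (3, 0.477), (5, 0.7)]:
        print(n, approx, round(math.log10(n), 4))
    # the worked examples from the comment
    print(3 * 0.3 + 1, round(math.log10(80), 4))      # 1.9   vs 1.9031
    print(2 * 0.7 + 0.477, round(math.log10(75), 4))  # 1.877 vs 1.8751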

madcaptenor•8mo ago
For log(3) I prefer the "musical" approximation 2^19 ~ 3^12. This is a "musical" fact because it translates into 2^(7/12) ~ 3/2 (that is, seven semitones make a perfect fifth). Together with log(2) ~ 3/10, that gives log(3) ~ 19/40.

Also easy to remember: 7^4 = 2401 ~ 2400. log(2400) = log(3) + 3 log(2) + 2 ~ 19/40 + 3 * 12/40 + 2 = 135/40, so you get log(7) ~ 135/160 = 27/32 = 0.84375.
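Checking both "musical" estimates numerically (my sketch, not the commenter's):

    import math
    print(2**19, 3**12)                      # 524288 vs 531441
    print(19/40, round(math.log10(3), 4))    # 0.475   vs 0.4771
    print(135/160, round(math.log10(7), 4))  # 0.84375 vs 0.8451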

thechao•8mo ago
These are both great! I learned most of these old tricks from my dad & grandfather.
thechao•8mo ago
I'll double-reply, since I think people might also appreciate this... there's a pretty easy way to find the square of 2-leading-digit numbers (and square-roots, too.)

Direct square: 54^2 = (50 + 4)^2 = 2500 + 2 * 50 * 4 + 4^2.

Direct square-root: 3800 = 61^2 + r = 3721 + 79, so sqrt(3800) ~ 61 + (3800-3721)/(2*61) = 61 + 79/122. To check: (61 + 79/122)^2 = 3,800.419. You can estimate the "overhead" by noting that 79/122 ~ 2/3 => (2/3)^2 = 0.444.

Of course, my grandfather would've just used a slide-rule directly.
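Both tricks are first-order expansions; here is a small sketch in Python (the helper names are mine, just for illustration):

    import math

    def square_2digit(tens, ones):
        # (a + b)^2 = a^2 + 2*a*b + b^2
        return tens**2 + 2 * tens * ones + ones**2

    def sqrt_estimate(n, base):
        # sqrt(b^2 + r) ~ b + r/(2b), expanding around a known square b^2
        return base + (n - base**2) / (2 * base)

    print(square_2digit(50, 4), 54**2)               # 2916, 2916
    print(sqrt_estimate(3800, 61), math.sqrt(3800))  # 61.6475..., 61.6441...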

briian•8mo ago
So much of economics maths/stats is built on this one little trick.

It's still pretty cool to me that (a) this works and (b) it can be used to do so much.

saulpw•8mo ago
Here's all you really need to know about logs when estimating in your head:

The number of digits minus one is the magnitude (integer). Then add the leading digit like so:

1x = ^0.0

2x = ^0.3 (actually ^0.301...)

pi = ^0.5 (actually ^0.497...)

5x = ^0.7 (actually ^0.699...)

Between these, you can interpolate linearly and it's fine for estimating. 3x is also close enough to pi to be considered ^0.5.

In fact, if all you're doing is estimating, you don't even really need to know the above log table. Just use the first digit of the original number as the first digit past the decimal. So like 6000 would be ^3.6 (whereas it's actually ^3.78). It's "wrong" but not that far off if you're using logarithmetic for napkin math.
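A tiny Python version of this napkin rule (my sketch; the function name is made up):

    import math

    def napkin_log10(n):
        # digits minus one gives the magnitude; the leading digit becomes
        # the first place past the decimal
        s = str(int(n))
        return (len(s) - 1) + int(s[0]) / 10

    print(napkin_log10(6000), round(math.log10(6000), 2))  # 3.6 vs 3.78
    print(napkin_log10(250), round(math.log10(250), 2))    # 2.2 vs 2.4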

xeonmc•8mo ago
And this is also the basis of the fast inverse square root algorithm. Floating point numbers are just linear interpolations between octaves.
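A rough illustration of that point (my sketch): reading a float's bit pattern as an integer gives a piecewise-linear approximation of log2, which is what the fast inverse square root trick exploits.

    import math, struct

    def bits_log2(x):
        # exponent field plus the mantissa read as a fraction
        i = struct.unpack('<I', struct.pack('<f', x))[0]
        return i / 2**23 - 127

    for x in (1.0, 1.5, 2.0, 10.0):
        print(x, round(bits_log2(x), 3), round(math.log2(x), 3))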
thomasahle•8mo ago
What is this ^notation?

Looks like 5x=^0.699 means log_10(5)=0.699.

xeonmc•8mo ago
5 = 10^0.699
saulpw•8mo ago
It's magnitude notation. ^X is short for 10^X.
dhosek•8mo ago
The pi = ^0.5 bit reminds me of a college physics professor who was fond of using the shortcut π²=10.
madcaptenor•8mo ago
This is wrong, π² = g, the acceleration due to gravity.
brucehoult•8mo ago
Slightly more accurate ..

    1: 0.0
    2: 0.3
    3: 0.475
    4: 0.6 (2*2)
    5: 0.7
    6: 0.775 (2*3)
    7: 0.85
    8: 0.9 (2*2*2)
    9: 0.95 (3*3)
The biggest error is 7 at +0.577% ... 0.845 is almost perfect. The others are at most 0.45% off.

So you only need to remember:

    2: 0.3
    7: 0.85
    9: 0.95

    1.4 = sqrt(2) -> 0.15
    3 = sqrt(9) -> 0.95/2 = 0.475
    pi = ~sqrt(10) -> 0.5
    4 = 2*2 -> 0.6
    5 = 10/2 -> 0.7
    6 = 2*3 -> 0.3 + 0.475 = 0.775
    2pi = 2*sqrt(10) -> 0.3+0.5 = 0.8
    8 = 2*2*2 -> 0.9
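A quick check of the table's relative errors (my sketch):

    import math
    table = {2: 0.3, 3: 0.475, 4: 0.6, 5: 0.7, 6: 0.775, 7: 0.85, 8: 0.9, 9: 0.95}
    for n, a in table.items():
        exact = math.log10(n)
        print(n, a, round(exact, 4), f"{(a - exact) / exact:+.2%}")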
madcaptenor•8mo ago
You barely even have to remember

  9: 0.95
since you can get it by interpolation between 8 and 10.
brucehoult•8mo ago
Yup.

Put the effort into remembering a 3 digit log for 7 instead?

Or keep the same precision with ...

    7 = ~sqrt(100/2) -> (2-0.3)/2 = 1.7/2 = 0.85
Log 2 is all you need?

Or even ...

    7 = sqrt(sqrt(2401)) = (3*8*100)^(1/4) -> 0.95/8 + 0.3*3/4 + 2/4 = 0.12 + 0.225 + 0.5 = 0.845
VERY accurate, but that's getting to be too much.
arkeros•8mo ago
You also don't need to remember 7, as 2*7^2 =~ 100 => log 7 =~ (2 - 0.3) / 2 = 0.85
brucehoult•8mo ago
Yes, as I mentioned in one of my other replies. Or 2400^0.25.

Same for 9: 9 = sqrt(81) ~ sqrt(80) = sqrt(8 * 10) -> (0.3*3 + 1) / 2 = 0.95

At some point it's easier to remember a few numbers than calculate a lot of formulas.
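For completeness, the square-root shortcuts from the last few comments checked against exact values (my sketch):

    import math
    print((2 - 0.3) / 2, round(math.log10(7), 4))                # 0.85    vs 0.8451 (7 ~ sqrt(100/2))
    print((0.95 / 2 + 3 * 0.3 + 2) / 4, round(math.log10(7), 4)) # 0.84375 vs 0.8451 (7 ~ 2400^(1/4))
    print((3 * 0.3 + 1) / 2, round(math.log10(9), 4))            # 0.95    vs 0.9542 (9 ~ sqrt(80))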

adrian_b•8mo ago
I find it much more useful to memorize the inverse of that table, i.e. the decibel table.

The numbers corresponding approximately to 0 dB, 1 dB, 2 dB, 3 dB, etc. (1.0, 1.25, 1.6, 2.0, etc.; more accurate values like 1.26 or 1.58 instead of 1.25 and 1.6 are typically not needed) are also frequently encountered in engineering practice as members of various series of standard nominal values, which makes them more useful to have in mind.

Then, for any mental computation, it is trivial to convert a given number to the corresponding dB value, perform the computation with additions and subtractions instead of multiplications and divisions, then convert the resulting dB value back to the corresponding linear number.

Doing mental computations in this way, even if it gives at most about 2 correct digits, is enough for debugging various hardware problems, or even for catching software bugs, where an impressive number of significant digits is often displayed but the order of magnitude is completely wrong.
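A sketch of that workflow in Python (the table and helper name are mine, using the rounded values mentioned above):

    # convert factors to rough dB values, add, convert back
    db = {1.0: 0, 1.25: 1, 1.6: 2, 2.0: 3, 2.5: 4, 3.2: 5, 4.0: 6, 5.0: 7, 6.3: 8, 8.0: 9}
    from_db = {v: k for k, v in db.items()}

    def mental_multiply(a, b):
        decades, rem = divmod(db[a] + db[b], 10)
        return from_db[rem] * 10**decades

    print(mental_multiply(2.5, 4.0))  # 10.0 (exact: 10)
    print(mental_multiply(1.6, 5.0))  # 8.0  (exact: 8)
    print(mental_multiply(6.3, 6.3))  # 40.0 (exact: 39.69)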

madcaptenor•8mo ago
Since 3 dB is so nearly 2.0, you can derive that series by taking small powers of two, say from 2^(-4) to 2^5 or 2^6:

0.0625, 0.125, 0.25, 0.5, 1, 2, 4, 8, 16, 32, (64)

and then multiplying each value by a power of 10 to bring it between 1 and 10:

1, 1.25, 1.6, 2, 2.5, 3.2, 4, 5, 6.25 (or 6.4), 8
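That derivation in a few lines of Python (my sketch):

    # powers of 2, rescaled into [1, 10), reproduce the standard series
    vals = []
    for k in range(-4, 7):
        x = 2.0 ** k
        while x < 1:
            x *= 10
        while x >= 10:
            x /= 10
        vals.append(round(x, 2))
    print(sorted(vals))  # [1.0, 1.25, 1.6, 2.0, 2.5, 3.2, 4.0, 5.0, 6.25, 6.4, 8.0]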

adrian_b•8mo ago
You are right; this is how the series of standardized nominal values traditionally used for some quantities (maximum dissipated power for resistors, maximum operating voltage for capacitors, maximum power for motors, hexagonal nut sizes for some ancient packages for stud-mounted power semiconductor devices, etc.) has been obtained.

Apparently whoever first standardized this series of values could not make up their mind between 6.25 and 6.4, so 6.3 was chosen instead of either of those two values.

gus_massa•8mo ago
Typo near the top, in case someone knows the author:

> log(100)≤log(N)<log(100)

There is a missing 0 in the last log. It should be

> log(100)≤log(N)<log(1000)

obrhubr•8mo ago
Thanks for pointing this out :), I fixed it!
stevefan1999•8mo ago
Related Wikipedia entry: https://en.wikipedia.org/wiki/Logarithmic_number_system

Also related: https://blog.timhutt.co.uk/fast-inverse-square-root/

(I see that someone already mentioned that the fast inverse square root algorithm is related to this. It was famously used by John Carmack, who is one of my heroes and who led me into the tech industry, even though I didn't end up in the gaming industry.)

teo_zero•8mo ago
The article makes weird use of the notation for successive exponentiations. It's difficult to represent the concept here since only text is allowed, but in short, the author uses a^b^c to mean (a^b)^c. This is counterintuitive, as in most mathematical contexts exponentiation is right-associative, that is, a^b^c means a^(b^c). A pair of parentheses would remove any doubt!
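For what it's worth, Python's ** operator follows the right-associative convention (my sketch):

    print(2 ** 3 ** 2)    # 512 = 2 ** (3 ** 2)
    print((2 ** 3) ** 2)  # 64  = what the article's a^b^c notation means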