Posit floating point numbers: thin triangles and other tricks (2019)

http://marc-b-reynolds.github.io/math/2019/02/06/Posit1.html
34•fanf2•4h ago

Comments

antiquark•3h ago
This seems to be related to the "Type III unum":

https://en.wikipedia.org/wiki/Unum_(number_format)#Posit_(Ty...

andrepd•3h ago
Posit is the name of the third in a series of John Gustafson's proposals for an alternative to IEEE floats.
andrepd•3h ago
Great dive! I'm very interested in posits (and IEEE float replacements in general) and had never read this post before. Tons of insightful points.
adrian_b•2h ago
The example where computing an expression with posits has much better accuracy than when computing with IEEE FP32 is extremely misleading.

Regardless of whether you use 32-bit posits or IEEE FP32, you can represent only the same count of numbers, i.e. the same count of points on the real number axis.

When choosing a representation format, you cannot change the number of representable points; you can only choose to distribute them in different places.

The IEEE FP32 format distributes the points so that the relative rounding error is approximately constant over the entire range.

Posits crowd the points into the segment of magnitudes near 1, obtaining a better rounding error there, at the price that segments distant from 1 (both very big and very small numbers) have very sparse points, i.e. very high rounding errors.
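
A minimal numpy sketch of what "approximately constant relative error" means for FP32: the gap to the next representable float32, taken relative to the value, barely moves across sixty decades (the sample magnitudes below are arbitrary):

    import numpy as np

    # IEEE float32: the gap to the next representable number, relative
    # to x, stays near 2**-23 across the whole normal range.
    for x in [1e-30, 1e-10, 1.0, 1e10, 1e30]:
        x32 = np.float32(x)
        ulp = np.nextafter(x32, np.float32(np.inf)) - x32
        print(f"x = {x:8.0e}   ulp(x)/x = {ulp / x32:.3e}")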

Posits behave pretty much like a fixed-point format with gradual overflow instead of a sharp cut-off. For big numbers you do not get an overflow exception that stops the computation; instead the accuracy of the results becomes very bad. For small numbers the accuracy is good, but not as good as for a fixed-point number, because some bit patterns must be reserved for representing the big numbers in order to avoid overflow.
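
To make the taper concrete, here is a rough sketch of the 32-bit posit (es = 2) bit budget: the regime field grows with the magnitude of the exponent and eats bits from the fraction. It follows the standard regime run-length encoding, but treat it as an approximation, not a full decoder:

    # Fraction bits left in a 32-bit posit (es = 2) for a value of
    # magnitude about 2**m: sign (1 bit) + regime + exponent (es bits)
    # come first, and whatever remains is fraction.
    def posit32_fraction_bits(m, nbits=32, es=2):
        r = m >> es                         # regime value = floor(m / 2**es)
        regime_len = r + 2 if r >= 0 else 1 - r
        return max(nbits - 1 - regime_len - es, 0)

    for m in (0, 8, 32, 64, 100, -64):
        print(f"near 2**{m:>4}: {posit32_fraction_bits(m):2d} fraction bits")

Near 1 this gives 27 fraction bits, more than FP32's fixed 23; near 2**100 (or 2**-100) it is down to about 2.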

The example that demonstrates better accuracy for posits is manufactured by choosing values in the range where posits have better accuracy. It is trivial to manufacture an almost identical example where posits have worse accuracy, by choosing values in an interval where FP32 has better accuracy.
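
As a rough worked example of the converse, under the same es = 2 format assumptions as the sketch above: with operands near 2**50, a 32-bit posit carries only about 15 fraction bits against FP32's constant 23, so the very same computation, merely rescaled into that range, comes out less accurate in posits than in FP32.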

There are indeed problems where posits can outperform IEEE FP32, but it is quite difficult to predict which those problems are, because for a complex problem it can be very difficult to predict the ranges of the intermediate results. This is the very reason why floating-point numbers are preferred over fixed-point numbers: to avoid the necessity of such analyses.

While for IEEE formats it is possible to estimate the relative errors of the results of a long computation, thanks to the guaranteed bounds on the relative error of each operation, that is pretty much impossible for posits, where the relative error is a function of the values of the operands, so you cannot estimate it without actually doing the computation.

For scientific and technical computations, posits are pretty much useless: those computations have very wide value ranges for their data, they need error estimates, and posits can have significant advantages only for small number formats of 32 bits or less, while such computations mostly need 64-bit numbers or even bigger.

Nevertheless, for special problems that are very well characterized, i.e. where you know with certainty some narrow ranges for the values of the input data and of the intermediate results, posits could get much more accuracy than IEEE FP32, but they could have good performance only if they were implemented in hardware.

wat10000•2h ago
Isn’t that pretty much the entire point of this article?
andrepd•10m ago
> The example where computing an expression with posits has much better accuracy than when computing with IEEE FP32 is extremely misleading.

Did you not rtfa or am I missing something?

dnautics•2h ago
One of the creators of posits here (I came up with the name, and I think es, the exponent-size field, is my idea; I did the first full software versions in Julia and designed the first circuits, including a cool optimization for addition). My personal stance is that posits are not great for scientific work, precisely because of the difficulties with actually solving error propagation. Hopefully I can give some more measured insight into why the "parlor tricks" appear in the posit context.

John's background is in scientific compute/HPC, and he previously advocated for unums (which do fully track errors). There is a version of posits (called valids) which tracks errors too, encouraging the user to combine it with other techniques that cut the error bounds using invariants, but that requires an algorithmic shift. Alas, a lot of the examples were lifted from the unums book and sort of square-peg/round-holed into posits. You can see an example of algorithmic shift in the matrix multiplication demo in the Stanford talk (that demo is me; linked in the OP).

As for me, I was much more interested in low-bit representations for ML applications, where you ~don't care about error propagation. This also appears in the talk.

As it wound up, Facebook took some interest in it for AI, but they NIH'd it and redid the mantissa as logarithmic (which I think was a mistake).

And anyway, redoing your silicon turns out to be a pain in the ass (quires only make sense from a burn-the-existing-world perspective, and are not so bad for training pipelines, where IIRC the Kronecker product dominates). The addition operation takes up quite a bit more floorspace, and just quantizing to int4 with grouped scaling factors is easier with existing GPU pipelines, and with custom hardware too.
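
For that last point, a minimal sketch of what "int4 with grouped scaling factors" means; the group size and the absmax scaling rule here are illustrative assumptions, not any particular product's recipe:

    import numpy as np

    # Group-wise int4 quantization: each group of `group` weights shares
    # one scale derived from the group's largest magnitude, so the stored
    # values are 4-bit integers while dynamic range adapts per group.
    def quantize_int4_grouped(w, group=32):
        w = w.reshape(-1, group)
        scale = np.abs(w).max(axis=1, keepdims=True) / 7.0   # int4: -8..7
        scale = np.maximum(scale, np.float32(1e-12))         # avoid /0
        q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
        return q, scale

    def dequantize_int4_grouped(q, scale):
        return (q.astype(np.float32) * scale).ravel()

    w = np.random.randn(256).astype(np.float32)
    q, s = quantize_int4_grouped(w)
    err = np.abs(w - dequantize_int4_grouped(q, s)).max()
    print(f"max reconstruction error: {err:.4f}")

The point of the grouping is that a few outliers only poison their own group's scale rather than the whole tensor's.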

Fun side fact: Positron.ai was so named on the off chance that using posits makes sense (you can see the through-line to science fiction that I was attempting to manifest when I came up with the name).

dnautics•2h ago
Turns out only the slides are linked in the OP. Here is the live recording:

https://youtu.be/aP0Y1uAA-2Y?feature=shared

andrepd•12m ago
> and designed the first circuits, including a cool optimization for addition

Curious, what trick? :)

Wishing for mainstream CPU support for anything but IEEE numbers was always a pipe dream on anything but a decades-long horizon, but I gotta be honest: I was hoping the current AI hype wave would bring some custom silicon for alternative float formats, posits included.

> the addition operation takes up quite a bit more floorspace, and just quantizing to int4 is with grouped scaling factors is easier with existing gpu pipelines

Can you elaborate on this part?

burnt-resistor•1h ago
Condensed IEEE-like formats cheat sheet I threw together and tested:

https://pastebin.com/aYwiVNcA

mserdarsanli•1h ago
A while ago I built an interactive tool to display posits (also IEEE floats, etc.): https://mserdarsanli.github.io/FloatInfo/

It is hard to understand at first, but after playing with it a bit it will make sense. As with everything, there are trade-offs compared to IEEE floats, but having more precision when numbers are close to 1 is pretty nice.
