My favorites out of those are XCharter, ScholaX, ETbb, and Erewhon.
Also have a look at Algol Revived, a remake of a font made for French Algol 60 books by the famed type designer Adrian Frutiger.
The blog post does it some justice, though: the author lives in the current century and has nice examples.
/me is a former LaTeX user from uni, not overly fond of it, for usability issues not unrelated to its legacy.
It's kind of ironic that a system that ships with Computer Modern doesn't end up creating more Bodoni/Didone fans.
Really, the whole Stanley Morison–era catalog of Monotype designs is great stuff. I think the only collection of type designs that rivals it is perhaps the Sumner Stone–era Adobe originals.
The former is currently sitting in my car, and I'll be trying to offload it to someone who actually wants it.
OTOH, I recall Tufte going on and on about maximizing the "data-ink ratio" to the point of making graphs that we generally understand at a glance suddenly very unintuitive. I can dig into the book again if necessary, but I recall he essentially argued that box-and-whisker plots should become just a few dots. There is meaning conveyed by the boxes and the whiskers, and abandoning that convention - even if it uses more ink than absolutely necessary - adds significant cognitive load.
That said, intuition is heavily influenced by existing practice and culture. My understanding is that Tufte wrote his first book as a reaction to the existing practice, the point being that what was then considered intuitive was both different from what we imagine and also suboptimal.
There is a book called Graphis Diagrams, released a few years before Tufte's The Visual Display of Quantitative Information, that compiled what the field considered "good" data visualization before Tufte. I'd call Graphis Diagrams more a collection of art pieces than a collection of good data visualizations, but that was the field before Tufte's work. Some of the visualizations are interesting, but many seem incredibly dated and make heavy use of "chartjunk" (Tufte's term). I'd argue that we wouldn't consider those supposed "good" examples very readable or useful by modern standards, which just goes to show that intuition can change (for better or for worse).
I do also agree that Tufte tends towards a certain performative minimalism that seems excessive at times. Often the better solution is to present the data in a different way (transform it mathematically, use a different visualization, or something else) rather than just reduce the amount of ink you are using. Box plots and pie charts are useless to me as media, since they simply cannot account for complexity, and no amount of minimalism is going to fix that. The answer is to use another kind of visualization (or perhaps use parallelism/small multiples to build those simple visualizations up into something more meaningful).
I personally find Visual Explanations to be Tufte's best book. It focuses much more on how good visualizations can help you explore data more readily (see what otherwise could not be seen), especially with regard to cause and effect. The Visual Display of Quantitative Information is certainly more famous, but it focuses on much more low-level implementation details like the "data-ink ratio" rather than bigger-picture things, in my opinion. I recommend giving Visual Explanations a read if you are interested.
https://tex.stackexchange.com/questions/654089/microtypograp...
So, for example, where https://dercuano.github.io/notes/finite-function-circuits.ht... says "Sᵢ ∈ Σ", the "S" is noticeably shorter than the other full-height characters. It looks a little better in the half-assed PDF rendering I produced with my hurriedly written HTML-to-PDF renderer: http://canonical.org/~kragen/dercuano.20191230.pdf#page=1572
The other big problem you can see on that PDF page is that I chose Latin Modern Typewriter Condensed (lmtlc) for fixed-width text so that I could get 80 columns onto the narrow cellphone screens I was targeting with the PDF, but lmtlc completely omits, for example, Greek, so the examples using Greek are totally screwed up.
The formula display in that note is definitely worse than LaTeX would do, but I flatter myself to think that my half-assed Python script still produced better-looking math output than I usually see from Microsoft Word.
Like the OP, I used to care a lot about fonts. Heck, at some point my Windows boot time got slowed down because of the sheer number of fonts it had to load!
I used to think the default LaTeX font gives off a "serious" and "scientific" vibe. And I thought to myself: why would anyone ever use TNR when more "soulful" fonts exist?
Now that I'm older (33), I've gone back to TNR or TeX Gyre Termes, but with one change: I add "FakeBold" to the text to make it look like old papers and books: https://x.com/OrganicGPT/status/1920202649481236745/photo/1. I just want my text to convey my thoughts, and I don't want any fancy "serifness" to get in the way (so no to Bembo and Palatino).
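For anyone who wants to reproduce that old-book effect: a minimal sketch, assuming LuaLaTeX and luaotfload's `embolden` raw feature (fontspec's `FakeBold` key does the same, but only for the bold series; the factor here is pure taste):

```latex
% Sketch: compile with LuaLaTeX. The embolden factor (1.5) is a guess;
% tune it by eye until the page looks like an old printed book.
\documentclass{article}
\usepackage{fontspec}
\setmainfont{TeX Gyre Termes}[RawFeature={embolden=1.5}]
\begin{document}
Body text with a slightly heavier, old-print look.
\end{document}
```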
Something like Palatino (or even Computer Modern Roman) for body text.
But for headings, humble Helvetica looks good, and a bit less "academic". (I really dislike the default CMR at large point sizes.)
For monospace bits, again I dislike the unusual-looking TeX default, so something serifed or otherwise clearly unambiguous (for "1" and "l", "0" and "O"), and thick enough to be legible (some Courier variants are too thin). Inline, at a slightly smaller point size than the body text, to look proportional, and maybe a little smaller still in code blocks.
For a book, I was thinking something slightly flashier for headings, at least on chapters, maybe Linux Biolinum.
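That combination could be sketched with the free TeX Gyre clones (Pagella for Palatino, Heros for Helvetica); `sectsty` and the auto-scaled mono are just one way to do it, and the mono choice here is a placeholder:

```latex
% Sketch, XeLaTeX/LuaLaTeX: Palatino-like body, Helvetica-like headings,
% monospace scaled to blend with the body text.
\documentclass{article}
\usepackage{fontspec}
\usepackage{sectsty}
\setmainfont{TeX Gyre Pagella}               % ~ Palatino
\newfontfamily\headingfont{TeX Gyre Heros}   % ~ Helvetica
\setmonofont{TeX Gyre Cursor}[Scale=MatchLowercase] % ~ Courier, auto-scaled
\allsectionsfont{\headingfont}               % Heros for all heading levels
\begin{document}
\section{A Heros Heading}
Pagella body text with inline \texttt{code} that sits at a matching size.
\end{document}
```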
> This survey focuses on serif fonts as these are the usual choice for longer documents such as articles or books (although sans-serifs have become more popular for longer text in recent years). However, in keeping with the reasoning above, I have also selected accompanying sans-serif fonts for each of the seven roman choices below (all of which have maths support of some form or another).
For many of the older generations, who spent their childhoods reading thousands of books printed in serif fonts, at a time when there was no easy access to computer terminals, serif fonts are easier to read.
In general, the readability of a typeface is determined more by other features than by whether or not it has serifs.
Sans serif typefaces first appeared after the Napoleonic Wars, as simplified typefaces suitable for the low-quality printing of titles or advertisements that had to be readable from a distance.
Having simplified letter forms, sans serif typefaces remain preferable for low-resolution displays, for very small text, or for text that must be read from a great distance.
However, until WWII the simplification of the letter forms was pushed too far, resulting in many letters so similar that they can no longer be distinguished. Many sans serif typefaces in use today, like Helvetica and Arial, have been modified very little from their pre-WWII ancestors; they are so over-simplified that they should be avoided in any computing application, given the high probability of misreading anything that is not plain English.
After WWII, and especially after 1990, the evolution of sans serif typefaces reversed, away from excessive simplification and towards making them more similar to serif typefaces, minus the serifs.
A well-known example of such a sans serif typeface is FF Meta (Erik Spiekermann, 1991), which has had many imitators. Such non-simplified sans serifs have, e.g., traditional Carolingian shapes for the lowercase "a", "g", and "l", and also true italic variants (i.e., not just oblique variants).
Besides the removal of serifs, a traditional simplification in sans serif typefaces is the removal of the contrast between thin and thick strokes, making the stroke thickness uniform. At least for me, any long text printed in a font with uniform stroke thickness looks boring, so I strongly prefer the sans serif fonts that go even further in their resemblance to serif fonts by retaining thin and thick strokes, for example Optima nova and Palatino Sans.
While by definition a sans serif typeface has no serifs, when Hermann Zapf designed Optima (released in 1958) he found an alternative to serifs that achieves a similar optical effect. Starting from a stroke shaped like a long rectangle, instead of attaching serifs to the ends, one can make the two long lateral edges of the rectangle concave instead of flat. After that, there are two alternatives for terminating the ends of the stroke. The first is to make the two short terminal edges concave as well. This gives the stroke sharp corners and is the solution Zapf chose in Optima. The second is to keep the terminal edges flat, or even slightly convex, and to round the corners where they meet the concave lateral edges. This is the method Akira Kobayashi chose in Palatino Sans (2006), under the influence of the similar stroke terminations used in Cooper Black and in the rounded sans serif typefaces that are popular for public signage in Japan.
This alternative to serifs, with concave lateral edges, is in my opinion superior both to serifs and to classic sans serifs, but unfortunately it is effective only on very high-resolution displays or on paper (even a cheap laser printer has better resolution than the most expensive monitors), because on low-resolution monitors any slightly concave edge becomes straight.
In any case, for the best readability, I never use fonts with ambiguous characters, like Helvetica/Arial and most other sans serifs. Serif fonts are better, but better still are good modern sans serifs that have been carefully designed to have distinctive characters. With good monitors, fonts with contrast between thin and thick strokes, or even with concave lateral edges, are preferable.
I read this HN thread rendered in the Palatino Sans mentioned in TFA (with the italic of Palatino nova configured as its italic form; the italic of Palatino Sans Informal is also a good choice, but I prefer a stronger contrast between the regular and the italic variants of a font; that is why I have also configured the italic of Bauer Bodoni as the italic for Optima nova).
While TFA dismissed Palatino Sans with regret, for having to be purchased, I have bought it, together with a few other high-quality typefaces, and I use them on Linux instead of the available free fonts. I consider the money spent on good typefaces some of the best purchasing decisions I have made.
However, for CLI/TUI applications and program editing, I use the free JetBrains Mono. Unlike for regular text, for programming there are many high-quality free fonts, but I prefer JetBrains Mono because it supports an extended character set, with many Unicode mathematical symbols that are missing in other programming fonts.
Examples
[1] https://lukeyoo.fyi/test/data/render/latex-statements-1.md
[2] https://lukeyoo.fyi/recap/2025/5/statistical-inference-1
[3] https://lukeyoo.fyi/test/data/render/latex-integrals-1.md
[4] https://lukeyoo.fyi/test/data/render/latex-integrals-in-comp...
For mono fonts there are a lot of nice choices, but I used PragmataPro for no other reason than that I own it and it provides a nice, readable contrast.
Otherwise, for the free options, Palatino + mathpazo or STIX Two Text + STIX Two Math are quite good. Honestly, anything but Computer Modern is a good option; IMHO it's not a very good font nowadays: it's way too thin, having been designed on the assumption that it would be printed on old, fairly bad printers with significant ink spread.
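Both free options drop into a pdfLaTeX preamble with a single package (a sketch; the `newpxtext`/`newpxmath` pair is a more current take on the Palatino route):

```latex
% Option 1: Palatino-style text with matching math (classic route)
\usepackage{mathpazo}

% Option 2: STIX Two text and math
% \usepackage{stix2}
```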
MinionPro / Minion 3 is Adobe's flagship font, so it basically supports every language and ligature under the sun.
HTML on the other hand printed perfect tiny vector glyphs, so it wasn't the printer's fault.
I no longer know enough to track down which bad font it was back then, but I hope the situation is better today for LaTeX PDFs...
One common issue is that you don't have all fonts embedded in the PDF file. There may not be anything wrong on screen, as the PDF reader can probably find the font if you have it installed. But the software you use for printing may not know enough about the capabilities of your printer, and it may choose to hope for the best. If your printer does not have the font and the printing software didn't bother sending it with the document, the printer will use another font as a substitute.
ahartmetz•8mo ago
If I ever have to do much LaTeX again, though, I'll check out the alternatives, because the mess of partially compatible modules and the trouble with figure placement are still bad in LaTeX.
dhosek•8mo ago
Note also that Palatino was originally designed for Linotype hot-metal typesetting and incorporates the limitations of that system in its design (which, in some ways, is actually a bonus for naïve digital setting where ligatures may be limited or non-existent). The most obvious case of this is the lack of character kerns: characters cannot extend beyond their set width. This makes the italics look cramped, since, e.g., d, l, and f cannot reach over the following letter with their ascenders.
Syzygies•8mo ago
I'm among the guilty. Palatino does appear spread out compared to alternatives, for better or worse.
The article does note that Palatino was originally designed for display text. I had long heard that Hermann Zapf was horrified at its adoption as a body font, which I couldn't confirm. The best I could do was to find a quote,
"One day I got up the nerve to ask 'Mr Zapf, what do you do?' He replied, 'I correct the errors of my youth.'"