Ask HN: Do we need "metadata in source code" syntax that LLMs will never delete?

1•andrewstuart•6m ago•1 comments

Pentagon cutting ties w/ "woke" Harvard, ending military training & fellowships

https://www.cbsnews.com/news/pentagon-says-its-cutting-ties-with-woke-harvard-discontinuing-milit...
2•alephnerd•8m ago•1 comments

Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? [pdf]

https://cds.cern.ch/record/405662/files/PhysRev.47.777.pdf
1•northlondoner•9m ago•1 comments

Kessler Syndrome Has Started [video]

https://www.tiktok.com/@cjtrowbridge/video/7602634355160206623
1•pbradv•11m ago•0 comments

Complex Heterodynes Explained

https://tomverbeure.github.io/2026/02/07/Complex-Heterodyne.html
3•hasheddan•12m ago•0 comments

EVs Are a Failed Experiment

https://spectator.org/evs-are-a-failed-experiment/
2•ArtemZ•23m ago•4 comments

MemAlign: Building Better LLM Judges from Human Feedback with Scalable Memory

https://www.databricks.com/blog/memalign-building-better-llm-judges-human-feedback-scalable-memory
1•superchink•24m ago•0 comments

CCC (Claude's C Compiler) on Compiler Explorer

https://godbolt.org/z/asjc13sa6
2•LiamPowell•26m ago•0 comments

Homeland Security Spying on Reddit Users

https://www.kenklippenstein.com/p/homeland-security-spies-on-reddit
3•duxup•29m ago•0 comments

Actors with Tokio (2021)

https://ryhl.io/blog/actors-with-tokio/
1•vinhnx•30m ago•0 comments

Can graph neural networks for biology realistically run on edge devices?

https://doi.org/10.21203/rs.3.rs-8645211/v1
1•swapinvidya•42m ago•1 comments

Deeper into the sharing of one air conditioner for 2 rooms

1•ozzysnaps•44m ago•0 comments

Weatherman introduces fruit-based authentication system to combat deep fakes

https://www.youtube.com/watch?v=5HVbZwJ9gPE
3•savrajsingh•45m ago•0 comments

Why Embedded Models Must Hallucinate: A Boundary Theory (RCC)

http://www.effacermonexistence.com/rcc-hn-1-1
1•formerOpenAI•47m ago•2 comments

A Curated List of ML System Design Case Studies

https://github.com/Engineer1999/A-Curated-List-of-ML-System-Design-Case-Studies
3•tejonutella•51m ago•0 comments

Pony Alpha: New free 200K context model for coding, reasoning and roleplay

https://ponyalpha.pro
1•qzcanoe•55m ago•1 comments

Show HN: Tunbot – Discord bot for temporary Cloudflare tunnels behind CGNAT

https://github.com/Goofygiraffe06/tunbot
2•g1raffe•58m ago•0 comments

Open Problems in Mechanistic Interpretability

https://arxiv.org/abs/2501.16496
2•vinhnx•1h ago•0 comments

Bye Bye Humanity: The Potential AMOC Collapse

https://thatjoescott.com/2026/02/03/bye-bye-humanity-the-potential-amoc-collapse/
3•rolph•1h ago•0 comments

Dexter: Claude-Code-Style Agent for Financial Statements and Valuation

https://github.com/virattt/dexter
1•Lwrless•1h ago•0 comments

Digital Iris [video]

https://www.youtube.com/watch?v=Kg_2MAgS_pE
1•vermilingua•1h ago•0 comments

Essential CDN: The CDN that lets you do more than JavaScript

https://essentialcdn.fluidity.workers.dev/
1•telui•1h ago•1 comments

They Hijacked Our Tech [video]

https://www.youtube.com/watch?v=-nJM5HvnT5k
2•cedel2k1•1h ago•0 comments

Vouch

https://twitter.com/mitchellh/status/2020252149117313349
38•chwtutha•1h ago•6 comments

HRL Labs in Malibu laying off 1/3 of their workforce

https://www.dailynews.com/2026/02/06/hrl-labs-cuts-376-jobs-in-malibu-after-losing-government-work/
4•osnium123•1h ago•1 comments

Show HN: High-performance bidirectional list for React, React Native, and Vue

https://suhaotian.github.io/broad-infinite-list/
2•jeremy_su•1h ago•0 comments

Show HN: I built a Mac screen recorder Recap.Studio

https://recap.studio/
1•fx31xo•1h ago•1 comments

Ask HN: Codex 5.3 broke toolcalls? Opus 4.6 ignores instructions?

1•kachapopopow•1h ago•0 comments

Vectors and HNSW for Dummies

https://anvitra.ai/blog/vectors-and-hnsw/
1•melvinodsa•1h ago•0 comments

Sanskrit AI beats CleanRL SOTA by 125%

https://huggingface.co/ParamTatva/sanskrit-ppo-hopper-v5/blob/main/docs/blog.md
1•prabhatkr•1h ago•1 comments

Cracovians: The Twisted Twins of Matrices

https://marcinciura.wordpress.com/2025/06/20/cracovians-the-twisted-twins-of-matrices/
81•mci•7mo ago

Comments

kubb•7mo ago
> However, the multiplication of a cracovian by another cracovian is defined differently: the result of multiplying an element from column i of the left cracovian by an element from column j of the right cracovian is a term of the sum in column i and row j of the result.

Am I the only one for whom this crucial explanation didn’t click? Admittedly, I might be stupid.

Wikipedia is a bit more understandable: "The Cracovian product of two matrices, say A and B, is defined by A ∧ B = (B^T)A."

tempodox•7mo ago
You're not the only one. That “explanation” is just really bad.

tgv•7mo ago
It's the crucial part, and even with the example, I couldn't understand it. Like I can't understand why the second column in the first matrix doesn't have signs. Or why the 0 in the result matrix is negative.

But in another link I found that it's column-by-column multiplication. So if A × B = C, then C[i][j] = sum(A[k][i] * B[k][j]). Unfortunately, the example doesn't match that definition...
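
(A quick NumPy check of this formula — a sketch of mine, not from the thread: the column-by-column rule computes A^T B, which is the transpose of Wikipedia's B^T A, so the two only agree up to a transpose.)

    import numpy as np

    A = np.array([[3.0, 2.0], [-1.0, 0.0]])   # the example matrices from the post
    B = np.array([[1.0, -4.0], [-2.0, 3.0]])

    # the column-by-column formula: C[i][j] = sum_k A[k][i] * B[k][j]
    C = np.einsum('ki,kj->ij', A, B)

    assert np.allclose(C, A.T @ B)        # exactly A^T B
    assert np.allclose(C, (B.T @ A).T)    # i.e. the Wikipedia product, transposed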

burnished•7mo ago
No, I think it is too ambiguous to be useful. The example wasn't helpful either; I think they needed to perform the individual calculations for clarity.

kubb•7mo ago
Yeah, usually you name the matrix elements a, b, c, d, etc. and write out the formula for the elements of the result.

AdamH12113•7mo ago
The example is simply wrong, according to other sources. This, along with the inconsistent formatting, makes me wonder if it was written by an LLM. It's a shame; this seems like an interesting topic.

andrewla•7mo ago
Agreed -- "is a term of the sum" is such an inverted way to look at it.

Better I think would be to say "the result in column i and row j is the sum of products of elements in column i of the left cracovian and column j of the right cracovian".

And even by this definition the example given doesn't seem to track (and the strangeness of sometimes saying "+" and sometimes not, and having both "0" and "-0" in the example is bananas!):

   {  3  2 } {  1  -4 }  =   {  5   -2 }
   { -1  0 } { -2   3 }  =   {  0    2 }


   3 * 1 + -1 * -2 == 5 -- check
   3 * -4 + -1 * 3 == -15 -- what?
   2 * 1 + 0 * -2 == 2 (okay, but shouldn't this be in the lower left, column 1 dotted with column 2?)
   2 * -4 + 0 * 3 = -8 (now I'm really missing something)
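
(A minimal NumPy check of the arithmetic above, under the Wikipedia definition A ∧ B = B^T A; the sketch is mine, not part of the comment. It reproduces exactly the four dot products computed above.)

    import numpy as np

    A = np.array([[3.0, 2.0], [-1.0, 0.0]])
    B = np.array([[1.0, -4.0], [-2.0, 3.0]])

    print(B.T @ A)
    # [[  5.   2.]
    #  [-15.  -8.]]
    # col1·col1 = 5 lands at (1,1), col1·col2 = -15 in column 1, row 2,
    # col2·col1 = 2 at (1,2), col2·col2 = -8 at (2,2),
    # matching the "column i, row j" rule -- so the post's displayed result was wrong.
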
mci•7mo ago
Thanks for the feedback, everyone. I pasted my Polish text into Gemini to translate it into English. Gemini hallucinated the translation of this example. Now it should be OK.

pomian•7mo ago
Even in Polish, this comes out Greek to me.

pomian•7mo ago
I mean, it makes some sort of visual sense, but I can't grasp the results from the matrices shown.

mci•7mo ago
I took the liberty of replacing my awkward wording with your "the result in column i and row j is the sum of products of elements in column i of the left cracovian and column j of the right cracovian". Hope you don't mind. Thanks!

fxj•7mo ago
I didn't get the explanation of the multiplication. After reading the Wikipedia article it made more sense:

https://en.wikipedia.org/wiki/Cracovian

The Cracovian product of two matrices, say A and B, is defined by

A ∧ B = B^T A,

where B^T and A are assumed compatible for the common (Cayley) type of matrix multiplication and B^T is the transpose of B.

Since (AB)^T = B^T A^T, the products (A ∧ B) ∧ C and A ∧ (B ∧ C) will generally be different; thus, Cracovian multiplication is non-associative.

A good reference on how to use them and why they are useful is here (PDF):

https://archive.computerhistory.org/resources/access/text/20...
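
(To make the non-associativity concrete, a short sketch under the Wikipedia definition above; mine, not from the linked reference.)

    import numpy as np

    def crac(A, B):
        # Cracovian product A ∧ B = B^T A
        return B.T @ A

    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

    left = crac(crac(A, B), C)    # (A ∧ B) ∧ C = C^T B^T A
    right = crac(A, crac(B, C))   # A ∧ (B ∧ C) = B^T C A

    print(np.allclose(left, right))   # False for generic inputs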

adastra22•7mo ago
As far as I can tell, I don't think it is correct to say that this isn't a matrix. B is just written down in transposed form. Whether that makes the math more or less clear is something you can argue for or against, but it's the same math and it is confusing to call it something else.

noosphr•7mo ago
It is a tensor of rank two with a special binary operation on tensors. These objects aren't matrices in the mathematical sense any more than convolution kernels are.

adastra22•7mo ago
A tensor of rank two is the same thing as a matrix…

noosphr•7mo ago
It isn't.

Matrices come with the matrix product defined over them.

This is one of four possible closed first order tensor contractions for an order two tensor, viz. A_ij B_ik, A_ij B_ki, A_ij B_jk, A_ij B_kj. Only the third is applicable to matrices; all other contractions only work for general tensors without transposition.

What we deal with in computer science are actually n-dimensional arrays, since we don't have the co- and contravariant indices that define tensors in physics.
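
(Here is how those four contractions read in NumPy's einsum — my sketch of the index notation above; only the third reproduces ordinary matrix multiplication.)

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))

    c1 = np.einsum('ij,ik->jk', A, B)   # A_ij B_ik = (A^T B)_jk, Cracovian-style (B ∧ A above)
    c2 = np.einsum('ij,ki->jk', A, B)   # A_ij B_ki = (A^T B^T)_jk
    c3 = np.einsum('ij,jk->ik', A, B)   # A_ij B_jk = (A B)_ik, plain matmul
    c4 = np.einsum('ij,kj->ik', A, B)   # A_ij B_kj = (A B^T)_ik

    assert np.allclose(c3, A @ B)       # only this one is ordinary matrix multiplication
    assert np.allclose(c1, A.T @ B)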

adastra22•7mo ago
I think we are talking past each other.

The thing described by the article can be summarized by: "Any time you see A x B, you replace it with A x B^T" and it would, in practice, be exactly the same. I'm not sure the author understood this, because they then go on to do a bunch of performance checks to see if there is any difference between the two. Which there isn't, because under the hood it is the exact same operations. They just multiply columns into columns (or rows into rows) instead of rows into columns. But the implicit transpose would undo that.

You can note (correctly) that this doesn't line up with the precise, but arbitrary traditional definition of a matrix, and that is correct. But that is just word games because you can very simply, using only syntax and no calculations, transform one into the other.

gnulinux•7mo ago
I guess I'm skeptical of using a non-associative algebra instead of something that can trivially be made into a ring or field (i.e. matrix algebra). What advantages does this give us?

mci•7mo ago
Author here. There are no practical advantages, as far as I know. Not even faster multiplication on today's computers.

hansvm•7mo ago
One thing that comes up in the sort of ML code I like to write is careful attention to memory layout. Cracovians, defined according to some sibling comment as (B^T)A, make that a little more natural, since B and A can now have the same layout. I haven't used them, though, so I don't have a good sense of whether that's more or less painful than other approaches.
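
(One concrete way to see the layout point in NumPy, assuming both operands are stored row-major; a sketch of mine, not the commenter's code.)

    import numpy as np

    A = np.ones((1024, 1024))   # C-contiguous, i.e. row-major
    B = np.ones((1024, 1024))   # same layout as A

    Bt = B.T                    # a zero-copy view: only the strides are swapped
    assert not Bt.flags['C_CONTIGUOUS'] and Bt.base is B

    C = Bt @ A                  # B^T A without materializing a transposed copy
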
bravesoul2•7mo ago
It should be the same on a computer, right? The change is in human perception, not in what actually happens when multiplying.

esafak•7mo ago
Missed a chance to call it the twisted sister!

TimorousBestie•7mo ago
What an interesting little nook of matrix analysis history! Thanks for sharing. :)

noosphr•7mo ago
In Einstein notation this operation is A_ij B_kj, which incidentally shows why Einstein notation is so useful.

Syzygies•7mo ago
I'm a mathematician who taught linear algebra for decades. I love Einstein notation. I don't find Cracovians interesting at all.

Old texts got really worked up over whether a vector was a row or column. The programming language APL resolved this quite nicely: a scalar has no dimensions, a vector has its length as its one dimension, ... Arbitrary rank objects all played nicely with each other in this system.

A Cracovian is a character or two's difference in APL code. There's a benign form of mental illness in learning anything, where one clutches onto something novel and obsesses over it, rather than asking "That was exciting! What novel idea will I learn in the next five minutes?" I have friends from my working class high school who still say "ASSUME makes an ass of u and me" as if they just heard it for the first time, while the most successful mathematicians that I know keep moving like sharks.

I wouldn't stall too long thinking about Cracovians, as amusing a skim as the post provided.

noosphr•7mo ago
I mean there's nothing special about naming binary operations on tensors of fixed rank. Matrices have some nice mathematical properties, which is why they are studied so much in mathematics. But for number crunching there is no reason to prefer them to cracovians, or vice versa, without knowing what the underlying memory layout is in hardware.

semiinfinitely•7mo ago
Uhh so it's just matrices where the left slot of matmul is transposed?

gus_massa•7mo ago
> It turns out that multiplying cracovians by computers is not faster than multiplying matrices.

That's very specific to Python. A few years ago we were multiplying a lot of matrices in Fortran and we tried to transpose one of the matrices before the multiplication. With -O0 it made a huge difference, because the calculation used contiguous numbers and was more cache friendly. Anyway, with -O3 the compiler made some trick that made the difference disappear, but I never tried to understand what the compiler was doing.
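
(For anyone who wants to poke at this from Python, a sketch along the same lines; like -O3 in Fortran, the tuned BLAS behind NumPy's @ will usually hide the layout difference.)

    import numpy as np
    from timeit import timeit

    n = 2048
    A = np.random.rand(n, n)
    B = np.random.rand(n, n)
    B_colmajor = np.asfortranarray(B)   # same values, column-major storage

    print(timeit(lambda: A @ B, number=5))
    print(timeit(lambda: A @ B_colmajor, number=5))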

wjholden•7mo ago
I would expect that Julia could similarly show performance boosts here because of its column-major memory layout.

rundigen12•7mo ago
Read to the end and... what was the point of that? Where's the payoff?

There was a claim near the top that some things are easier to compute when viewed as cracovians, then some explanation, then suddenly it switches to numpy and shows that the time is the same.

New title: "Cracovians are a Waste of (the Reader's) Time"?