
Cracovians: The Twisted Twins of Matrices

https://marcinciura.wordpress.com/2025/06/20/cracovians-the-twisted-twins-of-matrices/
81•mci•5mo ago

Comments

kubb•5mo ago
> However, the multiplication of a cracovian by another cracovian is defined differently: the result of multiplying an element from column i of the left cracovian by an element from column j of the right cracovian is a term of the sum in column i and row j of the result.

Am I the only one for whom this crucial explanation didn’t click? Admittedly, I might be stupid.

Wikipedia is a bit more understandable: "The Cracovian product of two matrices, say A and B, is defined by A ∧ B = (Bᵀ)A."
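That Wikipedia definition is easy to check numerically; a minimal NumPy sketch (the `cracovian_product` name is made up for illustration):

```python
import numpy as np

def cracovian_product(a, b):
    # Wikipedia's definition: A ∧ B = (B^T) A, i.e. entry (i, j) is the
    # dot product of column i of B with column j of A.
    return b.T @ a

a = np.array([[3.0, 2.0], [-1.0, 0.0]])
b = np.array([[1.0, -4.0], [-2.0, 3.0]])

# The matrix form and the column-by-column form agree entry by entry.
manual = np.array([[b[:, i] @ a[:, j] for j in range(2)] for i in range(2)])
assert np.allclose(cracovian_product(a, b), manual)
```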

tempodox•5mo ago
You're not the only one. That “explanation” is just really bad.
tgv•5mo ago
It's the crucial part, and even with the example, I couldn't understand it. Like I can't understand why the second column in the first matrix doesn't have signs. Or why the 0 in the result matrix is negative.

But in another link I found that it's column by column multiplication. So A × B = C, then C[i][j] = sum(A[k][i] * B[k][j]). Unfortunately, the example doesn't match that definition...

burnished•5mo ago
No, I think it is too ambiguous to be useful. The example wasn't helpful either, I think they needed to perform the individual calculations for clarity.
kubb•5mo ago
Yeah, usually you name the matrix elements a, b, c, d, etc. and write out the formula for the elements of the result.
AdamH12113•5mo ago
The example is simply wrong, according to other sources. This along with the inconsistent formatting makes me wonder if it was written by an LLM. It's a shame; this seems like an interesting topic.
andrewla•5mo ago
Agreed -- "is a term of the sum" is such an inverted way to look at it.

Better I think would be to say "the result in column i and row j is the sum of product of elements in column i of the left cracovian and column j of the right cracovian".

And even by this definition the example given doesn't seem to track (and the strangeness of sometimes saying "+" and sometimes not, and having both "0" and "-0" in the example is bananas!):

   {  3  2 } {  1  -4 }  =   {  5   -2 }
   { -1  0 } { -2   3 }  =   {  0    2 }


   3 * 1 + -1 * -2 == 5 -- check
   3 * -4 + -1 * 3 == -15 -- what?
   2 * 1 + 0 * -2 == 2 (okay, but shouldn't this be in the lower left, column 1 dotted with column 2?)
   2 * -4 + 0 * 3 == -8 (now I'm really missing something)
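For what it's worth, the four products worked out above are exactly the entries of AᵀB read row by row, so neither column-by-column convention reproduces the result printed in the (since-corrected) example; a quick NumPy check, with the arrays taken from the comment above:

```python
import numpy as np

# The two arrays from the example above.
a = np.array([[3, 2], [-1, 0]])
b = np.array([[1, -4], [-2, 3]])

# Two plausible column-by-column conventions:
# entry (i, j) = column i of A · column j of B  ->  A^T B
# entry (i, j) = column i of B · column j of A  ->  B^T A
print("A^T B =\n", a.T @ b)  # the four values computed above, row by row
print("B^T A =\n", b.T @ a)  # the same values, transposed
```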
mci•5mo ago
Thanks for the feedback, everyone. I pasted my Polish text into Gemini to translate it into English. Gemini hallucinated the translation of this example. Now it should be OK.
pomian•5mo ago
Even in Polish, this comes out Greek to me.
pomian•5mo ago
I mean, it makes some sort of visual sense, but can't grasp the results from the matrices shown.
mci•5mo ago
I took the liberty to replace my awkward wording with your "the result in column i and row j is the sum of product of elements in column i of the left cracovian and column j of the right cracovian". Hope you don't mind. Thanks!
fxj•5mo ago
I didn't get the explanation of the multiplication. After reading the Wikipedia article it made more sense:

https://en.wikipedia.org/wiki/Cracovian

The Cracovian product of two matrices, say A and B, is defined by

A ∧ B = Bᵀ A,

where Bᵀ and A are assumed compatible for the common (Cayley) type of matrix multiplication and Bᵀ is the transpose of B.

Since (AB)ᵀ = Bᵀ Aᵀ, the products (A ∧ B) ∧ C and A ∧ (B ∧ C) will generally be different; thus, Cracovian multiplication is non-associative.
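The non-associativity is easy to confirm numerically; a sketch using random matrices (the `crac` helper is just shorthand for the definition above):

```python
import numpy as np

def crac(x, y):
    # A ∧ B = (B^T) A, per the Wikipedia definition quoted above.
    return y.T @ x

rng = np.random.default_rng(0)
a, b, c = (rng.standard_normal((3, 3)) for _ in range(3))

# (A ∧ B) ∧ C and A ∧ (B ∧ C) generally differ: ∧ is non-associative.
left = crac(crac(a, b), c)
right = crac(a, crac(b, c))
assert not np.allclose(left, right)
```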

A good reference how to use them and why they are useful is here (pdf):

https://archive.computerhistory.org/resources/access/text/20...

adastra22•5mo ago
As far as I can tell I don’t think it is correct to say that this isn’t a matrix. B is just written down in transposed form. Whether that makes the math more or less clear is something you can argue for or against, but it’s the same math and it is confusing to call it something else.
noosphr•5mo ago
It is a tensor of rank two with a special binary operation on tensors. These objects aren't matrices in the mathematical sense any more than convolution kernels are.
adastra22•5mo ago
A tensor of rank two is the same thing as a matrix…
noosphr•5mo ago
It isn't.

Matrices come with the matrix product defined over them.

This is one of four possible closed first-order tensor contractions for an order-two tensor, viz. A_ij B_ik, A_ij B_ki, A_ij B_jk, A_ij B_kj. Only the third is applicable to matrices; all other contractions only work for general tensors without transposition.

What we deal with in computer science are actually n dimensional arrays since we don't have the co and contravariant indices that define tensors in physics.
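The four contractions listed above map directly onto `np.einsum` index strings, which is a concrete way to see how Einstein-style notation disambiguates them (a sketch; the dictionary labels are just the index patterns from the comment):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((2, 2))
b = rng.standard_normal((2, 2))

# The four closed first-order contractions of two order-two tensors.
contractions = {
    "A_ij B_ik": np.einsum("ij,ik->jk", a, b),  # = A^T B
    "A_ij B_ki": np.einsum("ij,ki->jk", a, b),  # = A^T B^T
    "A_ij B_jk": np.einsum("ij,jk->ik", a, b),  # = A B, the matrix product
    "A_ij B_kj": np.einsum("ij,kj->ik", a, b),  # = A B^T
}
# Only the third contraction is the ordinary matrix product.
assert np.allclose(contractions["A_ij B_jk"], a @ b)
assert np.allclose(contractions["A_ij B_ik"], a.T @ b)
```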

adastra22•5mo ago
I think we are talking past each other.

The thing described by the article can be summarized by: "Any time you see A x B, you replace it with A x B^T" and it would, in practice, be exactly the same. I'm not sure the author understood this, because they then go on to do a bunch of performance checks to see if there is any difference between the two. Which there isn't, because under the hood it is the exact same operations. They just multiply columns into columns (or rows into rows) instead of rows into columns. But the implicit transpose would undo that.

You can note (correctly) that this doesn't line up with the precise, but arbitrary traditional definition of a matrix, and that is correct. But that is just word games because you can very simply, using only syntax and no calculations, transform one into the other.

gnulinux•5mo ago
I guess I'm skeptical of using a non-associative algebra instead of something that can trivially be made into a ring or field (i.e. matrix algebra). What advantages does this give us?
mci•5mo ago
Author here. There are no practical advantages, as far as I know. Not even faster multiplication on today's computers.
hansvm•5mo ago
One thing that comes up in the sort of code ML I like to write is a careful attention to memory layout. Cracovians, defined according to some sibling comment as (B^T)A, make that a little more natural, since B and A can now have the same layout. I haven't used them though, so I don't have a good sense of whether that's more or less painful than other approaches.
bravesoul2•5mo ago
Shouldn't it be the same on a computer? The change is in human perception, not in what actually happens when multiplying.
esafak•5mo ago
Missed a chance to call it the twisted sister!
TimorousBestie•5mo ago
What an interesting little nook of matrix analysis history! Thanks for sharing. :)
noosphr•5mo ago
In Einstein notation this operation is A_ij B_kj, which incidentally shows why Einstein notation is so useful.
Syzygies•5mo ago
I'm a mathematician who taught linear algebra for decades. I love Einstein notation. I don't find Cracovians interesting at all.

Old texts got really worked up over whether a vector was a row or a column. The programming language APL resolved this quite nicely: a scalar has no dimensions, a vector has its length as its one dimension, ... Arbitrary rank objects all played nicely with each other in this system.

A Cracovian is a character or two's difference in APL code. There's a benign form of mental illness in learning anything, where one clutches onto something novel and obsesses over it, rather than asking "That was exciting! What novel idea will I learn in the next five minutes?" I have friends from my working class high school who still say "ASSUME makes an ass of u and me" as if they just heard it for the first time, while the most successful mathematicians that I know keep moving like sharks.

I wouldn't stall too long thinking about Cracovians, as amusing a skim as the post provided.

noosphr•5mo ago
I mean, there's nothing special about naming binary operations on tensors of fixed rank. Matrices have some nice mathematical properties, which is why they are studied so much in mathematics. But for number crunching there is no reason to prefer them to cracovians, or vice versa, without knowing what the underlying memory layout is in hardware.
semiinfinitely•5mo ago
Uhh so it's just matrices where the left slot of matmul is transposed?
gus_massa•5mo ago
> It turns out that multiplying cracovians by computers is not faster than multiplying matrices.

That's very specific to Python. A few years ago we were multiplying a lot of matrices in Fortran and we tried transposing one of the matrices before the multiplication. With -O0 it made a huge difference, because the calculation used contiguous numbers and was more cache friendly. Anyway, with -O3 the compiler did some trick that made the difference disappear, but I never tried to understand what the compiler was doing.
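NumPy behaves like the -O3 case here: `@` dispatches to an optimized BLAS that handles the operand layout internally, so transposing first makes little difference. A rough timing sketch (timings are machine-dependent, so no numbers are claimed):

```python
import timeit

import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal((1000, 1000))
b = rng.standard_normal((1000, 1000))

# With a BLAS backend, both layouts are handled without a manual transpose.
t_plain = timeit.timeit(lambda: a @ b, number=5)
t_trans = timeit.timeit(lambda: a.T @ b, number=5)
print(f"A @ B:   {t_plain:.3f} s")
print(f"A.T @ B: {t_trans:.3f} s")
```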

wjholden•5mo ago
I would expect that Julia could similarly show performance boosts here because of its column-major memory layout.
rundigen12•5mo ago
Read to the end and... what was the point of that? Where's the payoff?

There was a claim near the top that some things are easier to compute when viewed as cracovians, then some explanation, then suddenly it switches to numpy and shows that the time is the same.

New title: "Cracovians are a Waste of (the Reader's) Time"?