frontpage.

MetaImGui

https://github.com/andynicholson/MetaImGUI
1•andynicholson•12s ago•0 comments

OpenAI Is Doing Everything Poorly

https://www.theatlantic.com/technology/2026/03/sora-openai-identity-crisis/686544/
1•JumpCrisscross•3m ago•0 comments

A Phone-Free Childhood? One Irish Village Is Making It Happen

https://www.nytimes.com/2026/03/25/realestate/ireland-cell-phones-children.html
1•JumpCrisscross•3m ago•0 comments

IPhey: Fingerprinting and IP Checker

https://momoproxy.com/blog/iphey
1•udpchannel•4m ago•0 comments

2023 Or, Why I am Not a Doomer

https://www.hyperdimensional.co/p/2023
1•sebg•6m ago•0 comments

Treat Errors as Warnings

https://thejoylab.ai/p/ewarning
1•xerxes249•7m ago•0 comments

In Math, Rigor Is Vital. But Are Digitized Proofs Taking It Too Far?

https://www.quantamagazine.org/in-math-rigor-is-vital-but-are-digitized-proofs-taking-it-too-far-...
1•isaacfrond•7m ago•0 comments

LetPlant GreenFocus Pomodoro Timer

https://chromewebstore.google.com/detail/letplant-greenfocus-pomod/faafihopigjlmphlldphfgdgagioejmk
1•doener•7m ago•0 comments

Supreme Court rejects Sony's attempt to kick music pirates off the Internet

https://arstechnica.com/tech-policy/2026/03/supreme-court-rejects-sonys-attempt-to-kick-music-pir...
4•isaacfrond•8m ago•0 comments

We cache too much

https://websmith.studio/blog/we-cache-too-much/
1•titanslayer•8m ago•0 comments

Mazda may have found the apex in ICE design with the Skyactiv-Z

https://newatlas.com/automotive/mazda-skyactiv-z/
4•breve•10m ago•0 comments

Show HN: Imrobot – Reverse CAPTCHA that verifies AI agents, not humans

https://github.com/leopechnicki/im_robot
1•leo_pechnicki•10m ago•0 comments

Ireland's first mobile video call via satellite is made

https://www.rte.ie/news/business/2026/0326/1565222-satellite-call-ireland/
1•austinallegro•10m ago•0 comments

Why Sora Failed: $15M/day inference cost vs. $2.1M lifetime revenue

https://www.revolutioninai.com/2026/03/%20chatgpt-gpt-54-mini-silent-switch-march-2026.html
2•vinodpandey7•11m ago•0 comments

Running Sonnet 4.5 Level LLM's on Your Own Servers: Kimi K2.5 Economics

https://twitter.com/CDerinbogaz/status/2037101565249487079
1•textcortex•11m ago•0 comments

MSA on memory issues with AI-a [pdf]

https://github.com/EverMind-AI/MSA/blob/main/paper/MSA__Memory_Sparse_Attention_for_Efficient_End...
1•Liriel•13m ago•0 comments

Vim_gym – Practice Vim by competing against other people

https://www.vimgym.app/
1•Aaronmacaron•13m ago•0 comments

Why pylock.toml includes digital attestations

https://snarky.ca/why-pylock-toml-includes-digital-attestations/
1•lumpa•14m ago•0 comments

Paper: Reducing hallucination in English–Hindi LLMs using citation grounding

https://arxiv.org/abs/2603.18911
1•vedantpandya•16m ago•0 comments

Is there any way to remove an already-pushed commit from GitLab?

1•bluewhalecove•17m ago•1 comments

Engineers do get promoted for writing simple code

https://www.seangoedecke.com/simple-work-gets-rewarded/
3•lalitmaganti•22m ago•0 comments

Intel Arc Pro B70 and Arc Pro B65 GPUs Bring 32GB of RAM to AI and Pro Apps

https://www.tomshardware.com/pc-components/gpus/intel-arc-pro-b70-and-arc-pro-b65-gpus-bring-32gb...
4•throwaway270925•22m ago•1 comments

The Inside Story of the Greatest Deal Google Ever Made: Buying DeepMind

https://www.wsj.com/tech/ai/deepmind-google-demis-hassabis-5bd6de54
1•bookofjoe•22m ago•1 comments

SidClaw – The approval layer for AI agents (open-source)

https://github.com/sidclawhq/platform
1•sidclaw•22m ago•0 comments

Scientists heated a Rocky Mountain wildlife meadow by 2C?

https://www.theguardian.com/environment/2026/mar/25/flowers-heated-2c-meadow-climate-crisis-exper...
1•robaato•24m ago•0 comments

Show HN: An x402 gateway for buying a finished local business website

https://boosterpack.xyz/x402
2•Martibis•24m ago•0 comments

Show HN: Neural DNA – 258 params that grow network topology

https://github.com/tejassudsfp/ndna
1•tejassuds•30m ago•0 comments

Searching for Fast Astronomical Transients in Archival Photographic Plates

https://arxiv.org/abs/2603.20407
1•solarist•30m ago•0 comments

Microsoft's Rust Training

https://github.com/microsoft/RustTraining
2•tayadeamit•34m ago•0 comments

2k Words Becomes One Word – a short story co-created with Claude Opus

https://medium.com/@cmitre/how-2-000-words-becomes-one-word-5365cf8df07b
2•ceemite•38m ago•0 comments

Numerical Linear Algebra Class in Julia TUM

https://venkovic.github.io/NLA-for-CS-and-IE.html
145•darboux•10mo ago

Comments

staplung•10mo ago
Not exactly the same material but U. Michigan has their Robotics 101 course up as well: Computational Linear Algebra, also in Julia.

https://github.com/michiganrobotics/rob101/tree/main

ted_dunning•10mo ago
This is a nicely comprehensive course, but it looks fairly fast-paced, especially in the last few lectures (some of the later slides definitely aren't finished).

As a reference, it looks very useful.

stabbles•10mo ago
A good resource is Gerard Sleijpen's course: https://webspace.science.uu.nl/~sleij101/Opgaven/NumLinAlg/

me3meme•10mo ago
I just picked Lecture 07 to take a look: it covers QR factorization and Householder reflections. The author proves how to construct a reflection that introduces zeros in the first column, and then simply claims that repeating this procedure on the other columns finishes the proof. But he should prove, or at least justify, why the subsequent reflections do not destroy the zeros produced by the previous ones. He also proves that a certain vector v is the one that defines the reflection (though there is a factor of 2 that was not correctly simplified, maybe a LaTeX error); I think it would be more general, and easier, to prove that for any w, the vector from w to its image f(w) is orthogonal to the plane of the reflection.

I thank the author for the slides, but this little proof needs some more care; I don't know about the quality of the other sections or of the slides overall. Anyway, I like how he tries to make things easy, but good work is hard.

Edited: I was wondering whether an LLM reading Lecture 7 would detect what was missing in the proof. I tried with DeepSeek, but its first feedback on the lecture was positive; only when prompted about the incomplete proof did it recognize it as a common error and explain how to complete the proof. I also had to prompt it about the bad factor of 2 before it detected that. So DeepSeek does not seem to be a useful tool for judging the quality of math content without very expert guidance; it suggested asking the LLM to compare this proof with another proof to detect important differences.
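The construction under discussion can be sketched numerically. This is a hedged NumPy sketch (not taken from the slides): for a vector x, the Householder vector v = x + sign(x1)·||x||·e1 defines a reflection H = I − 2vvᵀ/(vᵀv) that maps x onto a multiple of e1 — and the single factor of 2 is where a simplification slip would show up.

```python
import numpy as np

def householder_vector(x):
    """Householder vector v for x: H = I - 2 v v^T/(v^T v) maps x onto a multiple of e1."""
    sign = np.sign(x[0]) if x[0] != 0 else 1.0
    v = x.astype(float).copy()
    v[0] += sign * np.linalg.norm(x)   # sign chosen to avoid cancellation
    return v

def apply_reflection(v, x):
    # Apply H to x without forming the matrix; note the single factor of 2.
    return x - 2.0 * (v @ x) / (v @ v) * v

x = np.array([3.0, 4.0, 0.0])          # ||x|| = 5
v = householder_vector(x)
print(apply_reflection(v, x))          # -> [-5.  0.  0.]
```

All entries of Hx below the first vanish, which is exactly the "making zeros in the first column" step of the lecture.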

Certhas•10mo ago
That's an absolutely obvious step, though? Detailed lecture notes should maybe elaborate with a sentence, but in a lecture I would not put this on the slides; I would mention the core point and expect students at this level (who should have seen some more theoretical linear algebra courses by then) to understand how to do the one-line calculation.

There aren't even any real details to fill in: you iterate on the lower-right block, so anything you do is orthogonal to the upper-left block. Do a 2x2 block matrix multiplication to convince yourself that this preserves the form achieved so far.
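That block multiplication can be checked numerically. A hedged sketch (the block sizes and the random orthogonal Q are illustrative assumptions, not from the slides): embed the smaller reflection as blockdiag(I, Q) and multiply against a partially triangularized matrix; the already-reduced block and the zeros below it are untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

# After some Householder steps the matrix has the block form [[R, B], [0, A]]
R = np.triu(rng.standard_normal((2, 2)))          # already-triangularized block
B = rng.standard_normal((2, 3))
A = rng.standard_normal((3, 3))                   # block still to be reduced
M = np.block([[R, B], [np.zeros((3, 2)), A]])

# Orthogonal transformation computed for A, embedded as blockdiag(I, Q)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # any orthogonal Q works here
H = np.block([[np.eye(2), np.zeros((2, 3))], [np.zeros((3, 2)), Q]])

M2 = H @ M
# Only A changes (A -> Q A); R, B, and the zero block are preserved.
assert np.allclose(M2[:2, :2], R)
assert np.allclose(M2[:2, 2:], B)
assert np.allclose(M2[2:, :2], 0)
assert np.allclose(M2[2:, 2:], Q @ A)
```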

me3meme•10mo ago
-- Do a 2x2 block matrix multiplication to convince yourself that this preserves the form achieved so far.

I don't consider this a proof. Perhaps you have in mind two simple but key properties of the reflection about the hyperplane orthogonal to a vector v: (a) the hyperplane is the fixed-point set of the reflection, and (b) the hyperplane is the orthogonal complement of the space spanned by v. From these two properties it follows that each step of making zeros does not change the previous zeros.

Your claim that for advanced students there is no need to comment on details is not falsifiable. To cite Mac Lane: a monad is just a monoid in the category of endofunctors.

But from a practical point of view, one can see the very basic level and simplicity of the definitions and calculations preceding the proof. So at this level of detail, I think that noting that one must be careful not to destroy the previous zeros matches the level of the discourse.
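The two properties can be written in symbols (a standard identity, not from the slides). For the reflection defined by v:

```latex
H_v = I - \frac{2\,v v^{\top}}{v^{\top} v},
\qquad
H_v v = v - 2v = -v,
\qquad
H_v w = w \quad \text{whenever } v^{\top} w = 0 .
```

In the QR iteration, the k-th Householder vector v is supported on components k..n, while each already-reduced column w has nonzeros only in components 1..k−1; hence vᵀw = 0 and H_v fixes those columns, so the previous zeros survive.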

Certhas•10mo ago
[ 1  0 ] [ L  B ]   [ L  B' ]
[ 0  Q ] [ 0  A ] = [ 0  A' ]

The proof says to iterate on A, so that obviously creates a lower-dimensional rotation Q which acts on the full space as above.
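Putting the two pieces together, the full iteration can be sketched as follows. This is a hedged NumPy sketch of textbook Householder QR (an illustration of the argument, not the lecture's code); the assertions check exactly the disputed point — earlier zeros are preserved all the way to an upper-triangular R.

```python
import numpy as np

def householder_qr(A):
    """QR factorization via Householder reflections; returns Q, R with A = Q @ R."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = R[k:, k]
        v = x.copy()
        v[0] += (np.sign(x[0]) if x[0] != 0 else 1.0) * np.linalg.norm(x)
        if np.allclose(v, 0):
            continue                     # column already zero below the diagonal
        H = np.eye(m)                    # reflection embedded as blockdiag(I, H_sub)
        H[k:, k:] -= 2.0 * np.outer(v, v) / (v @ v)
        R = H @ R                        # zeros column k below the diagonal;
        Q = Q @ H                        # rows/columns < k are untouched

    return Q, R

A = np.random.default_rng(1).standard_normal((4, 4))
Q, R = householder_qr(A)
assert np.allclose(Q @ R, A)             # a valid factorization
assert np.allclose(np.tril(R, -1), 0)    # earlier zeros were preserved
assert np.allclose(Q.T @ Q, np.eye(4))   # Q is orthogonal
```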

Absolutely mention this in lecture notes/during the lecture.

slwvx•10mo ago
I guess a better title would be "Numerical Linear Algebra Class in Julia at TUM". I.e., the "TUM" in the title does not mean that there's some new "TUM" version of Julia, but rather that the class is at the Technical University of Munich.