frontpage.

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•28s ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•3m ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
1•helloplanets•5m ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•13m ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•15m ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•16m ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•17m ago•0 comments

Sony BMG copy protection rootkit scandal

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal
1•basilikum•19m ago•0 comments

The Future of Systems

https://novlabs.ai/mission/
2•tekbog•20m ago•1 comments

NASA now allowing astronauts to bring their smartphones on space missions

https://twitter.com/NASAAdmin/status/2019259382962307393
2•gbugniot•24m ago•0 comments

Claude Code Is the Inflection Point

https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point
3•throwaw12•26m ago•1 comments

Show HN: MicroClaw – Agentic AI Assistant for Telegram, Built in Rust

https://github.com/microclaw/microclaw
1•everettjf•26m ago•2 comments

Show HN: Omni-BLAS – 4x faster matrix multiplication via Monte Carlo sampling

https://github.com/AleatorAI/OMNI-BLAS
1•LowSpecEng•27m ago•1 comments

The AI-Ready Software Developer: Conclusion – Same Game, Different Dice

https://codemanship.wordpress.com/2026/01/05/the-ai-ready-software-developer-conclusion-same-game...
1•lifeisstillgood•29m ago•0 comments

AI Agent Automates Google Stock Analysis from Financial Reports

https://pardusai.org/view/54c6646b9e273bbe103b76256a91a7f30da624062a8a6eeb16febfe403efd078
1•JasonHEIN•32m ago•0 comments

Voxtral Realtime 4B Pure C Implementation

https://github.com/antirez/voxtral.c
2•andreabat•35m ago•1 comments

I Was Trapped in Chinese Mafia Crypto Slavery [video]

https://www.youtube.com/watch?v=zOcNaWmmn0A
2•mgh2•41m ago•0 comments

U.S. CBP Reported Employee Arrests (FY2020 – FYTD)

https://www.cbp.gov/newsroom/stats/reported-employee-arrests
1•ludicrousdispla•43m ago•0 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•48m ago•1 comments

Show HN: SVGV – A Real-Time Vector Video Format for Budget Hardware

https://github.com/thealidev/VectorVision-SVGV
1•thealidev•49m ago•0 comments

Study of 150 developers shows AI generated code no harder to maintain long term

https://www.youtube.com/watch?v=b9EbCb5A408
1•lifeisstillgood•50m ago•0 comments

Spotify now requires premium accounts for developer mode API access

https://www.neowin.net/news/spotify-now-requires-premium-accounts-for-developer-mode-api-access/
1•bundie•52m ago•0 comments

When Albert Einstein Moved to Princeton

https://twitter.com/Math_files/status/2020017485815456224
1•keepamovin•54m ago•0 comments

Agents.md as a Dark Signal

https://joshmock.com/post/2026-agents-md-as-a-dark-signal/
2•birdculture•56m ago•0 comments

System time, clocks, and their syncing in macOS

https://eclecticlight.co/2025/05/21/system-time-clocks-and-their-syncing-in-macos/
1•fanf2•57m ago•0 comments

McCLIM and 7GUIs – Part 1: The Counter

https://turtleware.eu/posts/McCLIM-and-7GUIs---Part-1-The-Counter.html
2•ramenbytes•1h ago•0 comments

So whats the next word, then? Almost-no-math intro to transformer models

https://matthias-kainer.de/blog/posts/so-whats-the-next-word-then-/
1•oesimania•1h ago•0 comments

Ed Zitron: The Hater's Guide to Microsoft

https://bsky.app/profile/edzitron.com/post/3me7ibeym2c2n
2•vintagedave•1h ago•1 comments

UK infants ill after drinking contaminated baby formula of Nestle and Danone

https://www.bbc.com/news/articles/c931rxnwn3lo
1•__natty__•1h ago•0 comments

Show HN: Android-based audio player for seniors – Homer Audio Player

https://homeraudioplayer.app
3•cinusek•1h ago•2 comments

Gaussian Processes for Machine Learning (2006) [pdf]

https://gaussianprocess.org/gpml/chapters/RW.pdf
72•susam•5mo ago

Comments

abhgh•5mo ago
This is the definitive reference on the topic! I also have some notes on it, if you want something concise that doesn't ignore the math [1].

[1] https://blog.quipu-strands.com/bayesopt_1_key_ideas_GPs#gaus...

C-x_C-f•5mo ago
These are very cool, thanks. Do you know what kinds of jobs are more likely to require Gaussian process expertise? I have experience using GPs for surrogate modeling and will be on the job market soon.

Also a resource I enjoyed is the book by Bobby Gramacy [0] which, among other things, spends a good bit on local GP approximation [1] (and has fun exercises).

[0] https://bobby.gramacy.com/surrogates/surrogates.pdf

[1] https://arxiv.org/abs/1303.0383

abhgh•5mo ago
Aside from Secondmind [1] I don't know of any companies (only because I haven't looked)... But if I had to look for places with a strong research culture around GPs, I would find relevant papers on arXiv and Google Scholar and see if any of them come from industry labs. If I had to guess which industries use Bayesian tools at work, maybe the ones to look at would be advertising and healthcare. I would also look out for places that hire econometricians.

Also thank you for the book recommendation!

[1] https://www.secondmind.ai/

CamperBob2•5mo ago
Your tutorials show a real talent for visualization. I never grokked SVMs before I came across your Medium page at https://medium.com/cube-dev/support-vector-machines-tutorial... . Thanks!
abhgh•5mo ago
Thank you for your kind comment!
memming•5mo ago
Stationary GPs are just stochastic linear dynamical systems, and not just those with the Matérn covariance kernel.
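
For the simplest case this correspondence is easy to check numerically. A minimal numpy sketch of my own (not the commenter's), assuming a uniform time grid and the exponential / Matérn-1/2 kernel: the GP covariance matrix and the covariance implied by a stationary AR(1) recursion coincide.

    # Matern-1/2 (exponential) GP on a uniform grid vs. the stationary AR(1)
    # linear dynamical system x_{k+1} = phi * x_k + noise, with matched parameters.
    import numpy as np

    n, dt, ell, sigma2 = 200, 0.1, 1.0, 1.0
    t = np.arange(n) * dt

    # GP view: K[i, j] = sigma2 * exp(-|t_i - t_j| / ell)
    K_gp = sigma2 * np.exp(-np.abs(t[:, None] - t[None, :]) / ell)

    # State-space view: phi = exp(-dt / ell) gives Cov(x_i, x_j) = sigma2 * phi**|i-j|
    phi = np.exp(-dt / ell)
    idx = np.arange(n)
    K_ar1 = sigma2 * phi ** np.abs(idx[:, None] - idx[None, :])

    print(np.max(np.abs(K_gp - K_ar1)))   # ~0 up to floating point: same covariances
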
FL33TW00D•5mo ago
For the visually inclined: https://distill.pub/2019/visual-exploration-gaussian-process...
tomhow•5mo ago
On the HN front page for 16 hours (though with strangely little discussion) just two days ago:

A Visual Exploration of Gaussian Processes (2019) - https://news.ycombinator.com/item?id=44919831 - Aug 2025 (1 comment)

maxrobeyns•5mo ago
Good to see GPs still being discussed in 2025!

Here was my attempt at a 'second' introduction a few years ago: https://maximerobeyns.com/second_intro_gps

heinrichhartman•5mo ago
Why would you learn Gaussian processes today? Is there any application where they are still leading and have not been superseded by deep NNets?
cjbgkagh•5mo ago
AFAIK state of the art is still a mix of new DNN and old school techniques. Things like parameter efficiency, data efficiency, runtime performance, and understandability would factor into the decision making process.
timdellinger•5mo ago
Bayesian optimization of, say, hyperparameters is the canonical modern usage in my view, and there are other similar optimization problems where it's the preferred approach.
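
To make that concrete, here is a minimal Bayesian-optimization loop of my own (not the commenter's), assuming scikit-learn and scipy are available and using a toy 1-D objective as a stand-in for an expensive training run:

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def objective(x):                      # stand-in for an expensive training run
        return np.sin(3 * x) + 0.1 * x**2

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(4, 1))    # a few initial evaluations
    y = objective(X).ravel()
    grid = np.linspace(-3, 3, 400).reshape(-1, 1)

    for _ in range(15):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
        mu, sd = gp.predict(grid, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sd, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement (minimizing)
        x_next = grid[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next)[0])

    print(X[np.argmin(y)], y.min())        # best "hyperparameter" found and its score
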
hodgehog11•5mo ago
I would argue there are more applications overall where Gaussian processes are superior, as most scientific applications have smaller data sets. Not everything has enough data to take advantage of feature learning in NNs. GPs are generally reliable, interpretable, and provide excellent uncertainty estimates for free. They can be made multiscale, achieving higher precision as a function approximator than most other methods. Plus, they can exhibit reversion to the prior when you need that.

Another example where it is used is for emulating outputs of an agent-based model for sensitivity analyses.
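
To make the "uncertainty estimates for free" point concrete: in exact GP regression the predictive variance falls out of the same Cholesky factorization used for the predictive mean (this is essentially Algorithm 2.1 in the Rasmussen and Williams book under discussion). A small numpy sketch of my own, on a toy dataset:

    import numpy as np

    def rbf(A, B, ell=0.5, s2=1.0):
        d2 = (A[:, None, 0] - B[None, :, 0]) ** 2
        return s2 * np.exp(-0.5 * d2 / ell**2)

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 5, size=(8, 1))                 # "small data" regime
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(8)
    Xs = np.linspace(0, 5, 100).reshape(-1, 1)

    noise = 0.1**2
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kss = rbf(Xs, Xs)

    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                                 # posterior mean
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v**2, axis=0)           # posterior variance, same factorization
    print(mean[:3], np.sqrt(var[:3]))                   # predictions with error bars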

xpe•5mo ago
To reduce the risk of being a lemming. It is in everyone's interests for some people not to follow the herd / join the plague of locusts.
roadside_picnic•5mo ago
Basically they're incredibly useful for any situation where you have "medium" data: not enough data to properly train a NN (which are very data-hungry in practice), but enough that a more traditional approach wouldn't exploit all the information.

GPs essentially allow you to get a lot of the power of a NN while also being able to encode a bunch of domain knowledge you have (which is necessary when you don't have enough data for the model to effectively learn that domain knowledge). On top of that, you get variance estimates which are very important for things like forecasting.

The only real drawback to GPs is that they absolutely do not fit into the "fit/predict" paradigm. Properly building a scalable GP takes a deeper understanding of the model than most methods demand. The mathematical foundations required to really understand what's happening when you train a sparse GP greatly exceed what is required to understand a NN, and on top of that a fair amount of practical insight into kernel development is required as well. But the payoff is fantastic.

It's also worth recognizing that once you see "attention" as kernel smoothing, transformers are essentially learning sophisticated stacked kernels, so they ultimately share a lot in common with GPs.
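
To make the attention/kernel-smoothing reading concrete, a toy numpy sketch of my own (not the commenter's code): a single scaled dot-product attention head computes exactly a Nadaraya-Watson kernel-weighted average of the values.

    import numpy as np

    rng = np.random.default_rng(0)
    d, n = 16, 10
    Q = rng.standard_normal((n, d))
    K = rng.standard_normal((n, d))
    V = rng.standard_normal((n, d))

    # Standard scaled dot-product attention
    scores = Q @ K.T / np.sqrt(d)
    A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    attn_out = A @ V

    # The same thing written as kernel smoothing: k(q_i, k_j) = exp(q_i . k_j / sqrt(d)),
    # output_i = sum_j k(q_i, k_j) v_j / sum_j k(q_i, k_j)
    kern = np.exp(Q @ K.T / np.sqrt(d))
    smooth_out = (kern @ V) / kern.sum(axis=1, keepdims=True)

    print(np.allclose(attn_out, smooth_out))   # True: identical computations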

ysaatchi•5mo ago
you can combine deep NNets with GPs, e.g. here https://arxiv.org/abs/1511.02222

So it isn't a matter of which is better. If you ever need to imbue your deep nets with good confidence estimates, it is definitely worth checking out.
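
For a sense of the structure (a sketch of my own, not the linked paper's code): the "deep kernel" idea is to push inputs through a neural feature map and put an ordinary GP kernel on the features, so k(x, x') = k_rbf(phi(x), phi(x')). In the paper the network weights and kernel hyperparameters are then learned jointly by maximizing the GP marginal likelihood; here the "network" is just fixed random weights to show the shape of the construction.

    import numpy as np

    rng = np.random.default_rng(0)

    # A tiny fixed two-layer "network" standing in for a learned feature extractor.
    W1, b1 = rng.standard_normal((5, 32)), rng.standard_normal(32)
    W2, b2 = rng.standard_normal((32, 8)), rng.standard_normal(8)

    def phi(X):
        return np.tanh(X @ W1 + b1) @ W2 + b2      # feature map phi: R^5 -> R^8

    def deep_rbf_kernel(Xa, Xb, ell=1.0, s2=1.0):
        Fa, Fb = phi(Xa), phi(Xb)
        d2 = ((Fa[:, None, :] - Fb[None, :, :]) ** 2).sum(-1)
        return s2 * np.exp(-0.5 * d2 / ell**2)

    X = rng.standard_normal((20, 5))
    K = deep_rbf_kernel(X, X)                       # plug into any standard GP routine
    print(K.shape, np.allclose(K, K.T))             # (20, 20) True: a valid symmetric kernel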

timdellinger•5mo ago
My take is that the Rasmussen book isn't especially approachable, and that this book has actually held back the wider adoption of GPs in the world.

The book has been seen as the authoritative source on the topic, so people were hesitant to write anything else. At the same time, the book borders on impenetrable.