A Visual Exploration of Gaussian Processes (2019) - https://news.ycombinator.com/item?id=44919831 - Aug 2025 (1 comment)
Here was my attempt at a 'second' introduction a few years ago: https://maximerobeyns.com/second_intro_gps
Another application is emulating the outputs of an agent-based model for sensitivity analysis.
GPs essentially allow you to get a lot of the power of a NN while also being able to encode a bunch of the domain knowledge you have (which is necessary when you don't have enough data for the model to learn that domain knowledge effectively). On top of that, you get variance estimates, which are very important for things like forecasting.
The only real drawback to GPs is that they absolutely do not fit into the "fit/predict" paradigm. Properly building a scalable GP takes a deeper understanding of the model than most use cases demand. The mathematical foundations required to really understand what's happening when you train a sparse GP greatly exceed what is required to understand a NN, and on top of that a fair amount of practical insight into kernel development is required as well. But the payoff is fantastic.
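For a flavor of what "sparse GP" means here, below is a hedged sketch of the simplest inducing-point scheme (Subset of Regressors, a precursor to the variational methods the comment alludes to), in plain NumPy with hypothetical function names. It replaces the O(n^3) exact solve with an O(n m^2) one through m pseudo-inputs Z.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel between rows of a and rows of b."""
    d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_mean(X, y, Z, Xs, noise=1e-2):
    """Subset-of-Regressors predictive mean with m inducing inputs Z.
    Only m x m systems are solved, so cost is O(n m^2), not O(n^3)."""
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for stability
    Knm = rbf(X, Z)                          # n x m cross-covariance
    Ksm = rbf(Xs, Z)                         # test/inducing cross-covariance
    A = Kmm + Knm.T @ Knm / noise            # m x m system
    return Ksm @ np.linalg.solve(A, Knm.T @ y) / noise

# Toy check: 200 samples of sin compressed through 10 inducing points.
X = np.linspace(0, 5, 200)[:, None]
y = np.sin(X).ravel()
Z = np.linspace(0, 5, 10)[:, None]   # inducing inputs, here a fixed grid
mean = sparse_gp_mean(X, y, Z, X)
```

Real sparse GP training (e.g. variational inducing points) also optimizes Z and the kernel hyperparameters against a lower bound on the marginal likelihood, which is where most of the mathematical subtlety lives.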
It's worth recognizing that "attention" is really just kernel smoothing, so transformers are essentially learning sophisticated stacked kernels and ultimately share a lot in common with GPs.
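A small NumPy sketch of that correspondence (my own illustration, not from the comment): a row of softmax attention is exactly a Nadaraya-Watson kernel smoother, and for unit-norm queries and keys the Gaussian-kernel weights coincide with dot-product attention weights, since -||q-k||^2/2 = q.k - 1 and softmax is invariant to per-row shifts.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    """Dot-product attention: each output row is a convex combination of V."""
    return softmax(Q @ K.T) @ V

def nadaraya_watson(Q, X, Y, bandwidth=1.0):
    """Classic kernel smoothing with a Gaussian kernel over inputs X."""
    d2 = np.sum((Q[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return softmax(-d2 / (2 * bandwidth**2)) @ Y

# With unit-norm queries and keys the two are numerically identical.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 3)); Q /= np.linalg.norm(Q, axis=1, keepdims=True)
K = rng.normal(size=(6, 3)); K /= np.linalg.norm(K, axis=1, keepdims=True)
V = rng.normal(size=(6, 2))
```

(The 1/sqrt(d) temperature used in transformers is omitted here to keep the equivalence exact; it only rescales the kernel bandwidth.)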
So it isn't a matter of which is better. If you ever need to imbue your deep nets with good confidence estimates, GPs are definitely worth checking out.
The book has been seen as the authoritative source on the topic, so people were hesitant to write anything else. At the same time, the book borders on impenetrable.
abhgh•5mo ago
[1] https://blog.quipu-strands.com/bayesopt_1_key_ideas_GPs#gaus...
C-x_C-f•5mo ago
Also a resource I enjoyed is the book by Bobby Gramacy [0] which, among other things, spends a good bit on local GP approximation [1] (and has fun exercises).
[0] https://bobby.gramacy.com/surrogates/surrogates.pdf
[1] https://arxiv.org/abs/1303.0383
abhgh•5mo ago
Also thank you for the book recommendation!
[1] https://www.secondmind.ai/