Another way of looking at it:
probability: parameter/model is fixed; outcome is random/unknown.
likelihood: outcome/data is fixed; we vary the parameter/model to assess how well it explains the data.
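For instance (an illustrative coin example): with a fair coin (parameter fixed at p = 0.5), the probability of observing 7 heads in 10 flips is C(10,7)·0.5^10 ≈ 0.117. Once you have actually observed 7 heads in 10 flips (data fixed), the likelihood L(p) = C(10,7)·p^7·(1-p)^3 is a function of the parameter p, and it is maximised at p = 0.7.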
The probability of a model M given data X, or P(M|X), is the posterior probability. The likelihood of data X given model M, or P(X|M), is the probability (if your data is discrete) or probability density (if it is continuous) of observing data X given model M. We are often given un-normalised likelihoods, which is what the linked paper talks about. These quantities are related via Bayes' Theorem.
Now, you may ask: isn't the probability of observing data X given model M still a probability? I mean, yeah, a properly normalised likelihood is indeed a probability. Likelihood is not the mirror image of probability; it is just an un-normalised probability (or probability density) of the data given a model or model parameters.
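To make the "un-normalised" point concrete, here is a rough Python sketch (the coin example, the small grid of models, and the variable names are mine, just for illustration): the un-normalised quantities are likelihood times prior, P(X|M)·P(M), and dividing by their sum P(X) is exactly the normalisation Bayes' Theorem performs to turn them into posterior probabilities P(M|X).

    from math import comb

    # Illustrative example: data X is 7 heads in 10 coin flips; candidate models M
    # are a few possible head-probabilities.
    heads, flips = 7, 10
    models = [0.3, 0.5, 0.7]
    prior = {m: 1.0 / len(models) for m in models}             # P(M), uniform here

    def likelihood(m):
        # P(X | M): binomial probability of the observed data under model m
        return comb(flips, heads) * m**heads * (1 - m)**(flips - heads)

    unnormalised = {m: likelihood(m) * prior[m] for m in models}    # P(X|M) * P(M)
    evidence = sum(unnormalised.values())                           # P(X)
    posterior = {m: u / evidence for m, u in unnormalised.items()}  # P(M|X), sums to 1

    print(posterior)   # the m = 0.7 model gets the most posterior weight here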
nerdponx•3mo ago
If that sentence doesn't make sense, then it's helpful to just write out the likelihood function. You will notice that it is in fact just the joint probability density of your model.
The only thing that makes it a "likelihood function" is that you fix the data and vary the parameters, whereas normally probability is a function of the data.
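A minimal sketch of that in Python (a binomial model, my own illustrative choice, not from the parent comment): the same function is a probability distribution over the data when the parameter is held fixed, and a likelihood function of the parameter when the observed data are held fixed.

    from math import comb

    def binom_pmf(k, n, p):
        # The model's probability of the data: k heads in n flips, head-probability p
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # Probability: parameter fixed (p = 0.5), data k varies -> sums to 1 over all outcomes
    print(sum(binom_pmf(k, 10, 0.5) for k in range(11)))         # ~1.0

    # Likelihood: data fixed (observed k = 7), parameter p varies -> need not sum to 1
    for p in (0.1, 0.3, 0.5, 0.7, 0.9):
        print(p, binom_pmf(7, 10, p))                            # largest at p = 0.7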
wiz21c•3mo ago
"For conditional probability, the hypothesis is treated as a given, and the data are free to vary. For likelihood, the data are treated as a given, and the hypothesis varies."