This is basically what Bayes's formula is. It is utilized under the hood of several ...

### What is likelihood?
|
|
|
|
|
|
|
|
|
The likelihood function gives us the joint probability of the observed data as a function of the model parameters. In other words, it describes how likely it is to make that particular observation with those parameters. As usual, this is best seen with an illustration:
|
|
|
|
|
|
|
|

|
|
|
|
|
|
|
|
<div align="center">
<img src="uploads/7a6f01eb27597c075e4d8fcee7eee1b8/bs1.png" width="400">
</div>
|
|
|
|
|
|
|
|
In this sample space $`X`$, we make an observation. The likelihood describes the relative plausibility of a distribution with parameters ($`\mu, \sigma`$) for a given observation $`x_1`$. So, in the likelihood, what changes are the model parameters, not the observation.
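To make this concrete, here is a minimal sketch in Python (the observation and the candidate parameter pairs are made-up values, purely for illustration): the observation $`x_1`$ stays fixed while we evaluate its likelihood under several Gaussian models.

```python
from scipy.stats import norm

# A single fixed observation (hypothetical value for illustration).
x1 = 1.5

# Candidate Gaussian models: the observation stays fixed,
# the parameters (mu, sigma) are what we vary.
candidates = [(0.0, 1.0), (1.0, 1.0), (1.5, 1.0), (1.5, 0.5)]

for mu, sigma in candidates:
    # Likelihood of x1 under N(mu, sigma^2): the density evaluated at x1.
    L = norm.pdf(x1, loc=mu, scale=sigma)
    print(f"mu={mu:4.1f}, sigma={sigma:3.1f} -> L(mu, sigma | x1) = {L:.4f}")
```

The highest value comes from the model centered on the observation with the smallest spread, which is exactly the "relative plausibility" reading above.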
|
|
|
|
|
|
|
|
In Bayesian inference, the likelihood indicates the compatibility of the evidence with the given hypothesis. Let's talk through an example. Imagine a linear model:
|
|
|
|
|
|
|
|
```math
y_p(x, w) = w_0 + w_1 x_1
```
|
|
|
|
|
|
|
|
Here we have two model parameters, $`w_0`$ and $`w_1`$. What we want to know is the likelihood of observing $`y_1`$ given $`x_1`$ under the above model, i.e. the function $`p(y | x, w)`$; a minimal sketch of evaluating it follows below.
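This sketch assumes Gaussian observation noise with a standard deviation `sigma_n`; the noise model, and every number below, are assumptions for illustration, not given in the text:

```python
from scipy.stats import norm

def likelihood(y1, x1, w0, w1, sigma_n=1.0):
    """p(y1 | x1, w): density of y1 under the prediction y_p = w0 + w1 * x1,
    assuming Gaussian noise with std sigma_n (an assumption made here)."""
    y_pred = w0 + w1 * x1
    return norm.pdf(y1, loc=y_pred, scale=sigma_n)

# Hypothetical observation and weights, for illustration only.
print(likelihood(y1=2.0, x1=1.0, w0=0.5, w1=1.0))  # density of y1 around the prediction 1.5
```

Under this Gaussian-noise assumption, $`p(y_1 | x_1, w)`$ is simply the normal density centered at the model's prediction.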
|
|
|
|
|
|
|
|
Let's assume that the model weights have the following probability distributions:
|
|