|
|
```math
y_p(x, w) = w_0 + w_1 x_1
```
|
|
|
|
|
|
|
|
Here we have two model parameters, $`w_0, w_1`$. What we want to know is the likelihood of observing $`y_1`$, given $`x_1`$, under the above model equation: the function $`p(y|x,w)`$. Let's assume that the model weights have the following probability distributions:
|
|
|
|
|
|
|
|
|
<div align="center">
|
|
|
|
<img src="uploads/619342fc426ef02e69f0f85a45629f92/bs2.png" width="400">
|
|
|
|
</div>
|
|
|
|
|
|
|
|
We simply assume these to be Gaussians with means of zero. After making an observation $`P = (x_1, y_1)`$, we can calculate the likelihood for varying combinations of $`w_0, w_1`$:
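A minimal sketch of this grid calculation in Python is given below. The observed point and the noise level $`\sigma`$ are assumptions chosen for illustration, since the text does not specify them; the observation is placed on the $`y = 1 - x`$ line to match the figure discussed next.

```python
import numpy as np

# Assumed values for illustration (not specified in the text):
# one observed point P = (x1, y1) on the y = 1 - x line, and a noise level.
x1, y1 = 0.5, 0.5
sigma = 0.2  # assumed standard deviation of the observation noise

# Grid of candidate weight combinations (w0, w1)
w0, w1 = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))

# Model prediction for each weight combination: y_p = w0 + w1 * x1
y_pred = w0 + w1 * x1

# Gaussian likelihood p(y1 | x1, w) = N(y1; y_p, sigma^2),
# evaluated at every point of the weight grid
likelihood = np.exp(-0.5 * ((y1 - y_pred) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
```

Weight combinations for which $`w_0 + w_1 x_1`$ is close to $`y_1`$ receive the highest likelihood, which is why the high-likelihood region forms a band around a line in weight space.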
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
Here, we can see that the most likely weight combinations are spread around the $`y = 1 - x`$ line. The true values are also shown as + on the figure. In Bayesian regression, for instance, this likelihood is combined with the prior distribution $`p(w)`$ to update the weight probabilities into the posterior $`p(w|y,X)`$. We will discuss the procedure in more detail [later on](DDE-1/Regression#bayesian-linear-regression).
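The update itself can be sketched on the same grid as above, multiplying the likelihood by the zero-mean Gaussian priors pointwise. This is only a sketch of the Bayes rule step; the prior standard deviation is an assumed value.

```python
# Continuing from the grid above: combine the zero-mean Gaussian priors
# with the likelihood to get the posterior weight probabilities.
prior_sigma = 1.0  # assumed prior standard deviation for both weights
prior = (np.exp(-0.5 * (w0 / prior_sigma) ** 2) *
         np.exp(-0.5 * (w1 / prior_sigma) ** 2))

# Bayes rule on the grid: p(w | y, X) is proportional to p(y | x, w) * p(w)
posterior = likelihood * prior
posterior /= posterior.sum()  # normalize over the grid so it sums to 1
```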
|
|
|
|
|
|
|
|
|
|