Herein, we will represent the probabilities for the weights in terms of Gaussians.
Here on the right, you can see lines drawn by sampling from the weight probabilities given in the middle. As you can see, they do not agree on a solution. This is expected, since at this stage the model has not been trained at all.
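The lines on the right can be produced by sampling weight vectors from the Gaussian prior and plotting the corresponding line for each sample. A minimal sketch (the prior scale, seed, and x-range are illustrative assumptions, not values from the figure):

```python
import numpy as np

# Sample weight vectors (w0, w1) from a standard Gaussian prior N(0, I)
# and compute the corresponding lines y = w0 + w1 * x.
rng = np.random.default_rng(0)   # fixed seed, for reproducibility
n_lines = 6
weights = rng.normal(loc=0.0, scale=1.0, size=(n_lines, 2))

xs = np.linspace(-1.0, 1.0, 50)
# Each row of `lines` is one sampled line evaluated over xs.
lines = weights[:, [0]] + weights[:, [1]] * xs
```

Plotting each row of `lines` against `xs` reproduces the disagreeing lines: before any data, the prior admits many equally plausible weight settings.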
What happens as we see examples? Let's show the model the very first example (below, blue circle on the right plot). In Bayesian learning, we use the likelihood function, p(y|x,w) -- the probability of observing the true y value given the weights and the example. For the example we pass in, the calculated likelihood is shown below (left); the true weights are shown as "+" for comparison. The likelihood indicates that the weights of the model should lie around this zone. By combining the likelihood with our prior (the middle figure from before seeing any data), we now calculate the posterior probabilities, given in the middle:
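With a Gaussian prior and a Gaussian likelihood, this prior-times-likelihood update has a closed form. A minimal sketch of the posterior computation for a single example; the noise level, prior scale, and data point below are hypothetical placeholders, not values used in the figures:

```python
import numpy as np

# Bayesian update for a line y = w0 + w1*x with Gaussian noise.
alpha = 1.0          # assumed prior std: w ~ N(0, alpha^2 I)
sigma = 0.2          # assumed observation noise std

prior_mean = np.zeros(2)
prior_cov = alpha**2 * np.eye(2)

x, y = 0.5, 0.3              # one hypothetical observed example
phi = np.array([1.0, x])     # feature vector [bias, x]

# Posterior precision: S_N^{-1} = S_0^{-1} + phi phi^T / sigma^2
post_prec = np.linalg.inv(prior_cov) + np.outer(phi, phi) / sigma**2
post_cov = np.linalg.inv(post_prec)
# Posterior mean: m_N = S_N (S_0^{-1} m_0 + phi y / sigma^2)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean
                        + phi * y / sigma**2)
```

The posterior covariance shrinks relative to the prior in the direction of the observed feature vector, which is exactly the narrowing seen in the middle plot after one example.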
<img src="uploads/056765b94996330d69d7dc932f5576b0/br2.png" width="600">