|
|
|
|
|
|
|
Even though the test seems relatively accurate at catching sick people, the probability of actually being sick given that you tested positive is only 22 %. As the sickness gets rarer, a positive test result becomes less and less reliable. There is also a very nice [visual illustration here](https://seeing-theory.brown.edu/bayesian-inference/index.html#section1) that you can play with.
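To make the effect of rarity concrete, here is a minimal sketch in Python. The sensitivity of 85 %, false-positive rate of 3 %, and prevalence of 1 % are assumed values chosen to reproduce the 0.22 above; they are not spelled out in this section.

```python
# Minimal sketch: posterior probability of being sick given a positive test,
# for increasingly rare diseases. Sensitivity and false-positive rate are
# assumed values that reproduce the 0.22 figure above at 1 % prevalence.

def p_sick_given_positive(prevalence, sensitivity=0.85, false_positive_rate=0.03):
    """Bayes' formula: p(sick | +) = p(+ | sick) p(sick) / p(+)."""
    p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
    return sensitivity * prevalence / p_positive

for prevalence in (0.1, 0.01, 0.001, 0.0001):
    print(f"prevalence {prevalence:>7}: p(sick | +) = {p_sick_given_positive(prevalence):.3f}")
```

The posterior drops from roughly 0.76 at 10 % prevalence to about 0.22 at 1 % and below 0.03 at 0.1 %, which is exactly the effect described above.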
|
|
|
|
|
|
|
|
|
|
|
|
Our concern here is: how can we generalize the approach to N events? In the above formulation, we only considered two events in our weighted summation. If we have N possible events constituting the sample space, we just need to extend this description for an event E_m:
|
|
|
|
|
|
|
|
|
```math
p(E_n|E_m) = \frac{p(E_m E_n)}{p(E_m)}
```
|
|
|
|
|
|
|
|
|
|
|
|
By the product rule, the joint probability in the numerator can be written as p(E_m E_n) = p(E_m|E_n) p(E_n), which gives Bayes’s formula for a pair of events:

```math
p(E_n|E_m) = \frac{p(E_m|E_n)\,p(E_n)}{p(E_m)}
```
|
|
|
|
|
|
|
|
|
|
The denominator is simply the total probability of E_m, obtained by summing over all N events in the sample space:

```math
p(E_n|E_m) = \frac{p(E_m|E_n)\,p(E_n)}{\sum_{k=1}^{N} p(E_m|E_k)\,p(E_k)}
```
|
|
|
|
|
|
|
|
This is basically what Bayes’s formula is. It is utilized under the hood of several models we will learn about throughout the lecture. Let’s look at an example, for which we have [a visual illustration](https://seeing-theory.brown.edu/bayesian-inference/index.html#section1) you can play with.
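Numerically, the formula boils down to multiplying each prior by its likelihood and normalizing. Below is a minimal sketch in Python; the function name and the three-event numbers are made up for illustration and are not part of the lecture material.

```python
# Minimal sketch of Bayes' formula for N events: given priors p(E_n) and
# likelihoods p(E_m | E_n) of an observed event E_m, the posterior p(E_n | E_m)
# is the prior-times-likelihood product, normalized by the total probability of E_m.

def bayes_posterior(priors, likelihoods):
    """Return p(E_n | E_m) for every n, given p(E_n) and p(E_m | E_n)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]  # p(E_m | E_n) p(E_n)
    evidence = sum(joint)                                  # p(E_m) = sum_k p(E_m | E_k) p(E_k)
    return [j / evidence for j in joint]

# Hypothetical three-event sample space: priors over E_1..E_3 and the likelihood
# of the observed event under each of them.
print(bayes_posterior(priors=[0.5, 0.3, 0.2], likelihoods=[0.1, 0.4, 0.7]))
# -> [0.161..., 0.387..., 0.451...]  (posteriors sum to 1)
```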
|
|
|
|
|
|
|
|
[Here](https://www.youtube.com/watch?v=HZGCoVF3YvM), you can also find a nice geometric description of the Bayesian approach.
|
|
|
|
|
|
|
|
### What is likelihood?
|
|
|
|
|
|
|
|