```math
p(E_1) = p(E_1|E_2)p(E_2) + p(E_1|E_2^c)(1-p(E_2))
```

This is exactly what we described in text at the beginning, where the weights are now explicitly seen to be $`p(E_2)`$ and $`1-p(E_2)`$, respectively. In other words, each conditional probability is scaled by the probability of its condition happening.

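To see this weighted-average behaviour numerically, here is a small Python sketch (all probability values below are made up purely for illustration):

```python
# Law of total probability: p(E1) is a weighted average of the two
# conditional probabilities, with weights p(E2) and 1 - p(E2).
# All numbers are illustrative, not taken from the text.
p_E2 = 0.3             # p(E_2)
p_E1_given_E2 = 0.9    # p(E_1 | E_2)
p_E1_given_E2c = 0.2   # p(E_1 | E_2^c)

p_E1 = p_E1_given_E2 * p_E2 + p_E1_given_E2c * (1 - p_E2)
print(round(p_E1, 2))  # 0.41
```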
How is this useful to us? Let's discuss it over an example. We would like to know the probability that a person has a disease (e.g., Covid) given that the test result is positive (e.g., a quick antigen test). Assume that the test is not that accurate: it gives true positives 85% of the time on sick people, while it gives 3% false positives on healthy people. We suspect that, on average, 1% of the population has the disease. The question is: what is the probability that we are actually sick, given a positive test result?

In order to simplify things, let's denote testing positive and negative with + and - signs, and symbolize sick and healthy people as $`D`$ and $`H`$ (i.e., $`D^c`$), respectively. So, what we want to know is:

```math
p(D|+) = p(D \cap +)/p(+)
```

Note that:

```math
p(D \cap +) = p(+|D)p(D)
```

So, $`p(D|+)`$ becomes:

```math
p(D|+) = p(+|D)p(D)/p(+)
```

We now know how to evaluate $`p(+)`$ using the total probability formula:

```math
p(D|+) = p(+|D)p(D)/(p(+|D)p(D)+p(+|D^c)p(D^c))
```

If we put in the numbers:

```math
p(D|+) = (0.85)(0.01)/((0.85)(0.01)+(0.03)(0.99))
```

```math
p(D|+) = (0.0085)/(0.0382) \approx 0.22
```

It seems that even though the test is relatively accurate at catching sick people, the probability of actually being sick given a positive test result is only 22%. The reason is that the disease is rare (1% prevalence), so the many false positives from the large healthy population outweigh the true positives. There is also a very nice [visual illustration here](https://seeing-theory.brown.edu/bayesian-inference/index.html#section1) that you can play with.

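As a quick sanity check of the arithmetic above, the same computation in Python (variable names are ours):

```python
# Bayes' formula for the test example:
# p(D|+) = p(+|D) p(D) / (p(+|D) p(D) + p(+|D^c) p(D^c))
p_pos_given_D = 0.85    # true positive rate of the test
p_pos_given_Dc = 0.03   # false positive rate on healthy people
p_D = 0.01              # prevalence: 1% of the population is sick

p_pos = p_pos_given_D * p_D + p_pos_given_Dc * (1 - p_D)  # total probability
p_D_given_pos = p_pos_given_D * p_D / p_pos
print(round(p_pos, 4), round(p_D_given_pos, 2))  # 0.0382 0.22
```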
How can we generalize this approach to N events?

...

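Assuming the events $`B_1, \dots, B_N`$ form a partition of the sample space, the generalization is the law of total probability, $`p(A) = \sum_{i=1}^{N} p(A|B_i)p(B_i)`$. As a minimal sketch (the helper name is ours):

```python
def total_probability(cond_probs, priors):
    """Law of total probability for a partition B_1..B_N:
    p(A) = sum_i p(A|B_i) * p(B_i)."""
    assert abs(sum(priors) - 1.0) < 1e-9, "the B_i must cover all outcomes"
    return sum(c * p for c, p in zip(cond_probs, priors))

# The two-event case reproduces p(+) from the test example:
print(round(total_probability([0.85, 0.03], [0.01, 0.99]), 4))  # 0.0382
```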
Bayes’s formula is utilized under the hood of several models we will learn throughout the lecture. Herein, we will update our understanding of the probabilistic world (i.e., a probability distribution) by making new observations. Let’s look at an example, for which we have [a visual illustration](https://seeing-theory.brown.edu/bayesian-inference/index.html#section1) you can play with.

### What is likelihood?