### Bayesian statistics
The probability that $`E_1`$ occurs, $`p(E_1)`$, can be represented as a weighted average of two conditional probabilities: (i) the probability of observing $`E_1`$ given that $`E_2`$ is observed, and (ii) the probability of observing $`E_1`$ given that $`E_2`$ is not observed:

```math
p(E_1) = p(E_1E_2) + p(E_1E_2^c)
```
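This decomposition is easy to check numerically. Below is a minimal sketch (the fair die, the events, and their labels are assumptions chosen for illustration, not from the text) that enumerates a toy sample space and verifies that the two disjoint pieces add up to $`p(E_1)`$:

```python
from fractions import Fraction

# Hypothetical toy sample space: a fair six-sided die.
omega = set(range(1, 7))
p = {w: Fraction(1, 6) for w in omega}

E1 = {2, 4, 6}   # assumed event E1: the roll is even
E2 = {4, 5, 6}   # assumed event E2: the roll is greater than 3

def prob(event):
    """Probability of an event as the sum of its outcome probabilities."""
    return sum(p[w] for w in event)

# E1∩E2 and E1∩E2^c partition E1, so their probabilities sum to p(E1).
assert prob(E1) == prob(E1 & E2) + prob(E1 & (omega - E2))
print(prob(E1))  # 1/2
```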
This is best seen on a [Venn diagram](https://en.wikipedia.org/wiki/Venn_diagram), where the regions occupied by the events are easy to see ($`c`$ denotes the complement):
<div align="center">
<img src="uploads/262d0b21f39beae9a935822bb21456b5/s1.png" width="330">
</div>

If we replace the joint probabilities with the conditional expressions discussed above, we get the following:
```math
p(E_1) = p(E_1|E_2)p(E_2) + p(E_1|E_2^c) p(E_2^c)
```
Substituting $`p(E_2^c) = 1 - p(E_2)`$:
```math
p(E_1) = p(E_1|E_2)p(E_2) + p(E_1|E_2^c)(1-p(E_2))
```
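As a quick sanity check of this weighted-average form, here is a sketch over the same kind of assumed toy sample space (the die and both events are hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# Hypothetical toy sample space: a fair six-sided die.
omega = set(range(1, 7))
p = {w: Fraction(1, 6) for w in omega}

E1 = {2, 4, 6}   # assumed event E1: the roll is even
E2 = {4, 5, 6}   # assumed event E2: the roll is greater than 3

def prob(event):
    """Probability of an event as the sum of its outcome probabilities."""
    return sum(p[w] for w in event)

def cond(a, b):
    """Conditional probability p(a|b) = p(a ∩ b) / p(b)."""
    return prob(a & b) / prob(b)

# p(E1) = p(E1|E2) p(E2) + p(E1|E2^c) (1 - p(E2))
weighted = cond(E1, E2) * prob(E2) + cond(E1, omega - E2) * (1 - prob(E2))
assert weighted == prob(E1)
print(weighted)  # 1/2
```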
This is exactly what we described in words at the beginning, where the weights are $`p(E_2)`$ and $`1-p(E_2)`$, respectively.
...