Bayes' theorem
Logical foundations
Conditional probabilities
- We often need to turn a conditional probability around: from the one we can compute, \(P(B|A)\), to the one we want, \(P(A|B)\)
- Because we already know that only the case "condition true, conclusion false" lets us fully reject a hypothesis; the other cases cannot strongly prove it
Bayes' theorem
\[P(A|B)= \frac{P(B|A)P(A)}{P(B)} \]
Markov chain
- Stochastic model
- My understanding: essentially we compute a transition probability matrix over all states; the chain diagram is just a visualization of this
- Probabilities of the state transitions depend on the state the system is currently in, not its history
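The "memoryless" property above can be sketched in code. Below is a minimal simulation using a hypothetical 2-state weather model (the states and transition probabilities are illustrative assumptions, not from the notes): the next state is drawn using only the current state.

```python
import random

# Hypothetical 2-state Markov chain: rows of P are the current state,
# entries are the probabilities of moving to each next state.
states = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state using only the current state (no history)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # fallback for floating-point rounding

rng = random.Random(0)
chain = ["sunny"]
for _ in range(10):
    chain.append(step(chain[-1], rng))
print(chain)
```

Note that `step` never looks at earlier entries of `chain`; that is exactly the Markov property.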
Bayes factor
We have two hypotheses, \(H_1\) and \(H_2\), and some data \(D\)
- Prior: the probability assigned before collecting data in the current analysis, based on experience and earlier calculations
My prior is just a guess. What if it's a bad guess?
- That's OK. The priors are just a starting point. The whole point is that we are updating our beliefs when we get new information.
How do I get P(D|H)?
- Often this can be done using simulation. You have done this before!
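A minimal sketch of estimating \(P(D|H)\) by simulation, using an assumed coin-flip example (not from the notes): \(H\) says the coin is fair, and \(D\) is observing 8 heads in 10 flips. We simulate many experiments under \(H\) and count how often the data are reproduced.

```python
import random

rng = random.Random(42)

def simulate_likelihood(p_heads, n_flips, heads_observed, trials=100_000):
    """Estimate P(D|H): the fraction of simulated experiments under H
    that reproduce the observed data exactly."""
    hits = 0
    for _ in range(trials):
        heads = sum(rng.random() < p_heads for _ in range(n_flips))
        if heads == heads_observed:
            hits += 1
    return hits / trials

# H: fair coin (p = 0.5); D: 8 heads in 10 flips.
p_d_given_h = simulate_likelihood(0.5, 10, 8)
print(p_d_given_h)  # close to the exact binomial value C(10,8)/2**10 ≈ 0.0439
```

The same function works for any hypothesis about `p_heads`, which is what we need when comparing hypotheses below.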
How do I know P(D)?
- If you are comparing two hypotheses, the nice thing is you don't need to know it
Sometimes we want to compare two hypotheses.
If we consider two hypotheses \(H_1, H_2\), then the ratio
$$\frac{P(D|H_1)}{P(D|H_2)}$$
is called the Bayes factor (\(\alpha\)): it tells us how strongly the data favor one hypothesis over the other, via$$\frac{P(H_1|D)}{P(H_2|D)}=\alpha \frac{P(H_1)}{P(H_2)}$$
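A worked sketch of the Bayes factor, using assumed hypotheses (the numbers are illustrative, not from the notes): \(H_1\) says the coin is biased with \(p = 0.8\), \(H_2\) says it is fair, and \(D\) is 8 heads in 10 flips. Exact binomial likelihoods are used; note that \(P(D)\) cancels in the ratio, so we never need it.

```python
from math import comb

def binomial_likelihood(p, n, k):
    """P(D|H): probability of k heads in n flips if P(heads) = p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_d_h1 = binomial_likelihood(0.8, 10, 8)  # H1: biased coin, p = 0.8
p_d_h2 = binomial_likelihood(0.5, 10, 8)  # H2: fair coin, p = 0.5

alpha = p_d_h1 / p_d_h2  # Bayes factor: P(D) cancels in this ratio
print(alpha)             # ≈ 6.87, so the data favor H1

# Posterior odds = alpha * prior odds; with equal priors they equal alpha.
prior_odds = 1.0
posterior_odds = alpha * prior_odds
```

A Bayes factor near 1 means the data do not distinguish the hypotheses; the further \(\alpha\) is from 1, the more the data favor \(H_1\) (or \(H_2\), if \(\alpha < 1\)).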