Grammar

posterior probability - the probability that some hypothesis H is true given some data D
bayes rule - for a single hypothesis, it gives the posterior as

\[P(H \mid D) = \frac{P(D \mid H) \cdot P(H)}{P(D)}\]

prior probability - what you think the probability of the hypothesis is before seeing the data
likelihood - the probability of observing the data given that the hypothesis is true
marginal likelihood - the probability of observing the data, without assuming H is either true or false

a quick aside on the marginal likelihood

the marginal likelihood is obtained using the sum rule

there are 2 ways the data D could occur: either H is true, or it is false

\[P(D) = P(H)P(D \mid H) + P(\overline{H})P(D \mid \overline{H})\]
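the sum rule above can be sketched in python - the numbers here are made up purely for illustration:

```python
# single hypothesis H vs its complement ~H (made-up numbers)
p_h = 0.3              # prior P(H)
p_d_given_h = 0.8      # likelihood P(D|H)
p_d_given_not_h = 0.2  # likelihood P(D|~H)

# marginal likelihood via the sum rule: the two ways D can occur
p_d = p_h * p_d_given_h + (1 - p_h) * p_d_given_not_h

# bayes rule for the posterior P(H|D)
posterior = p_d_given_h * p_h / p_d
```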

or if there is a set of hypotheses H_1, ..., H_N (mutually exclusive and exhaustive)

the posterior for a single hypothesis is:

\[P(H_i \mid D) = \frac{P(D \mid H_i) \cdot P(H_i)}{P(D)}\]

\[P(D) = \sum_{i=1}^{N} P(H_i)P(D \mid H_i)\]
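the same idea for N hypotheses, again with made-up priors and likelihoods - note the posteriors come out normalised because P(D) is exactly the sum in the denominator:

```python
# made-up priors P(H_i) (sum to 1) and likelihoods P(D|H_i)
priors = [0.5, 0.3, 0.2]
likelihoods = [0.9, 0.5, 0.1]

# marginal likelihood: sum over all hypotheses
p_d = sum(p * l for p, l in zip(priors, likelihoods))

# posterior for each hypothesis via bayes rule
posteriors = [p * l / p_d for p, l in zip(priors, likelihoods)]
```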


Example
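a classic worked example (the numbers here are invented for illustration): suppose a disease has prior probability \(P(H) = 0.01\), a test detects it with likelihood \(P(D \mid H) = 0.9\), and false-positives with \(P(D \mid \overline{H}) = 0.05\). plugging into bayes rule with the sum rule in the denominator:

\[P(H \mid D) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.05 \times 0.99} = \frac{0.009}{0.0585} \approx 0.154\]

so even after a positive test, the posterior is only about 15%, because the prior is so low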

