Bayesian Inference: A Computational Introduction
1. Theoretical Foundation
Bayesian inference is a method of statistical inference that uses Bayes' theorem to update the probability of a hypothesis as more evidence becomes available. Unlike frequentist approaches, Bayesian methods treat parameters as random variables with probability distributions.
1.1 Bayes' Theorem
The cornerstone of Bayesian inference is Bayes' theorem:

$$P(\theta \mid D) = \frac{P(D \mid \theta)\, P(\theta)}{P(D)}$$

Where:
- $P(\theta \mid D)$ is the posterior distribution: our updated belief about the parameter $\theta$ after observing data $D$
- $P(D \mid \theta)$ is the likelihood: the probability of observing data $D$ given parameter $\theta$
- $P(\theta)$ is the prior distribution: our initial belief about $\theta$ before seeing data
- $P(D) = \int P(D \mid \theta)\, P(\theta)\, d\theta$ is the marginal likelihood (evidence): a normalizing constant
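As a quick sanity check of the theorem, here is a minimal discrete sketch (the hypotheses and numbers are illustrative, not taken from the notebook's code): two candidate coins and a single observed head.

```python
# Two competing hypotheses about a coin, updated after observing one head.
prior = {"fair": 0.5, "biased": 0.5}        # P(H): equal prior belief in each coin
likelihood = {"fair": 0.5, "biased": 0.7}   # P(heads | H): assumed per-coin head rates

evidence = sum(prior[h] * likelihood[h] for h in prior)  # P(heads), the normalizer
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
print(posterior)  # {'fair': 0.4167, 'biased': 0.5833}
```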
1.2 Conjugate Priors
A prior is said to be conjugate to a likelihood if the posterior distribution belongs to the same family as the prior. This property enables analytical solutions.
Beta-Binomial Conjugacy
For a binomial likelihood with parameter $\theta$ (probability of success):

$$P(k \mid n, \theta) = \binom{n}{k}\, \theta^{k} (1 - \theta)^{n - k}$$

With a Beta prior:

$$\theta \sim \mathrm{Beta}(\alpha, \beta)$$

the posterior is also Beta:

$$\theta \mid k, n \sim \mathrm{Beta}(\alpha + k,\; \beta + n - k)$$

The Beta distribution PDF is:

$$f(\theta; \alpha, \beta) = \frac{\theta^{\alpha - 1} (1 - \theta)^{\beta - 1}}{B(\alpha, \beta)}$$

where $B(\alpha, \beta) = \frac{\Gamma(\alpha)\,\Gamma(\beta)}{\Gamma(\alpha + \beta)}$ is the Beta function.
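Since the notebook's code cells are not reproduced here, the following is a minimal sketch of the conjugate update using SciPy, with illustrative values for the data and hyperparameters:

```python
from scipy import stats

n, k = 20, 14                  # illustrative data: k successes in n trials
alpha0, beta0 = 1.0, 1.0       # Beta(1, 1) prior, i.e. uniform on [0, 1]

# Conjugate update: the posterior is Beta(alpha0 + k, beta0 + n - k).
posterior = stats.beta(alpha0 + k, beta0 + n - k)
print("posterior mean:", posterior.mean())   # (alpha0 + k) / (alpha0 + beta0 + n)
```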
1.3 Posterior Predictive Distribution
To predict future observations $\tilde{D}$, we integrate over the posterior:

$$P(\tilde{D} \mid D) = \int P(\tilde{D} \mid \theta)\, P(\theta \mid D)\, d\theta$$
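For a single future Bernoulli trial, this integral reduces to the posterior mean, $E[\theta \mid D] = \alpha' / (\alpha' + \beta')$ for a $\mathrm{Beta}(\alpha', \beta')$ posterior. A sketch verifying this numerically, using illustrative posterior parameters:

```python
import numpy as np
from scipy import stats

alpha_post, beta_post = 15.0, 7.0            # illustrative posterior Beta parameters

# P(next trial succeeds | D) = integral of theta * p(theta | D) over [0, 1]
theta = np.linspace(0.0, 1.0, 10_001)
pdf = stats.beta.pdf(theta, alpha_post, beta_post)
p_success = np.sum(theta * pdf) * (theta[1] - theta[0])   # simple Riemann sum

print(p_success, alpha_post / (alpha_post + beta_post))   # both ~ 0.6818
```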
1.4 Credible Intervals
A $100(1 - \alpha)\%$ credible interval $[l, u]$ satisfies:

$$P(l \le \theta \le u \mid D) = 1 - \alpha$$
Unlike frequentist confidence intervals, credible intervals have a direct probabilistic interpretation.
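One way to compute an equal-tailed credible interval for a Beta posterior is directly from its quantile function; a minimal sketch with illustrative posterior parameters:

```python
from scipy import stats

alpha_post, beta_post = 15.0, 7.0            # illustrative posterior Beta parameters

# Equal-tailed 95% credible interval: cut 2.5% of posterior mass from each tail.
lo, hi = stats.beta.ppf([0.025, 0.975], alpha_post, beta_post)
print(f"95% credible interval: [{lo:.3f}, {hi:.3f}]")
```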
2. Implementation
We will demonstrate Bayesian inference using the Beta-Binomial model to estimate the probability of success in a series of Bernoulli trials.
2.1 Generate Synthetic Data
We simulate $n$ coin flips from a biased coin with a known true success probability $\theta_{\text{true}}$.
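A minimal sketch of the simulation; the true bias, sample size, and seed below are illustrative choices, not necessarily those used in the original notebook:

```python
import numpy as np

rng = np.random.default_rng(42)        # fixed seed for reproducibility
theta_true = 0.7                       # assumed true bias of the coin (illustrative)
n = 50                                 # number of flips (illustrative)

flips = rng.binomial(1, theta_true, size=n)   # 1 = heads, 0 = tails
k = int(flips.sum())
print(f"{k} heads in {n} flips")
```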
2.2 Define Prior, Likelihood, and Posterior
We compare three different priors (a sketch of the comparison follows this list):
- Uniform prior: $\mathrm{Beta}(1, 1)$, encoding no prior information
- Informative prior: a symmetric $\mathrm{Beta}(\alpha, \alpha)$ with $\alpha > 1$, centered at 0.5
- Strong prior: a sharply concentrated Beta distribution, encoding a confident prior belief about $\theta$
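A sketch of the comparison, using hypothetical hyperparameters for the informative and strong priors (the notebook's exact values were not preserved):

```python
from scipy import stats

n, k = 50, 36                          # illustrative data: k heads in n flips

priors = {
    "uniform":     (1, 1),             # Beta(1, 1): flat over [0, 1]
    "informative": (5, 5),             # hypothetical: symmetric, centered at 0.5
    "strong":      (20, 20),           # hypothetical: sharply concentrated near 0.5
}

for name, (a, b) in priors.items():
    post = stats.beta(a + k, b + n - k)      # conjugate Beta-Binomial update
    print(f"{name:>11}: posterior mean = {post.mean():.3f}")
```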
2.3 Visualization of Bayesian Updating
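A minimal plotting sketch, showing how each prior is pulled toward the data (the priors and data are the illustrative values from above):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

theta = np.linspace(0, 1, 500)
n, k = 50, 36                                          # illustrative data
priors = {"Beta(1,1)": (1, 1), "Beta(5,5)": (5, 5), "Beta(20,20)": (20, 20)}

fig, axes = plt.subplots(1, len(priors), figsize=(12, 3), sharey=True)
for ax, (label, (a, b)) in zip(axes, priors.items()):
    ax.plot(theta, stats.beta.pdf(theta, a, b), "--", label="prior")
    ax.plot(theta, stats.beta.pdf(theta, a + k, b + n - k), label="posterior")
    ax.set_title(f"prior {label}")
    ax.set_xlabel(r"$\theta$")
    ax.legend()
plt.tight_layout()
plt.show()
```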
3. Posterior Predictive Distribution
We compute the probability of observing $\tilde{k}$ successes in the next $m$ trials, marginalizing over the posterior uncertainty in $\theta$.
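For the Beta-Binomial model this predictive distribution is available directly in SciPy as `scipy.stats.betabinom`; a sketch with illustrative posterior parameters:

```python
from scipy import stats

alpha_post, beta_post = 37, 15         # illustrative posterior Beta parameters
m = 10                                 # number of future trials

# Beta-Binomial predictive: the binomial marginalized over the Beta posterior.
pred = stats.betabinom(m, alpha_post, beta_post)
for k_new in range(m + 1):
    print(f"P({k_new:2d} successes in {m} trials) = {pred.pmf(k_new):.4f}")
```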
4. Model Comparison: Bayes Factor
The Bayes factor compares two models $M_1$ and $M_2$:

$$BF_{12} = \frac{P(D \mid M_1)}{P(D \mid M_2)}$$

For the Beta-Binomial model, the marginal likelihood is available in closed form:

$$P(D \mid M) = \binom{n}{k}\, \frac{B(\alpha + k,\; \beta + n - k)}{B(\alpha, \beta)}$$
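A sketch of the computation in log space for numerical stability (the data and the two candidate priors are illustrative):

```python
import numpy as np
from scipy.special import betaln, gammaln

def log_marginal_likelihood(k, n, a, b):
    """log P(D | M) for k successes in n trials under a Beta(a, b) prior."""
    log_binom = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
    return log_binom + betaln(a + k, b + n - k) - betaln(a, b)

n, k = 50, 36                          # illustrative data
log_m1 = log_marginal_likelihood(k, n, 1, 1)     # M1: flat Beta(1, 1) prior
log_m2 = log_marginal_likelihood(k, n, 20, 20)   # M2: prior concentrated at 0.5
print("BF_12 =", np.exp(log_m1 - log_m2))
```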
5. Numerical Integration: Grid Approximation
For non-conjugate models, we can use grid approximation to compute the posterior numerically.
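A minimal sketch of the grid recipe, applied to the same Beta-Binomial data so the result can be checked against the analytical posterior (values are illustrative):

```python
import numpy as np
from scipy import stats

n, k = 50, 36                          # illustrative data
theta = np.linspace(0, 1, 1_000)       # grid over the parameter space
dtheta = theta[1] - theta[0]

prior = stats.beta.pdf(theta, 1, 1)            # any prior works; no conjugacy needed
likelihood = stats.binom.pmf(k, n, theta)      # likelihood evaluated on the grid

unnormalized = prior * likelihood
posterior = unnormalized / (unnormalized.sum() * dtheta)   # normalize numerically

# Posterior summaries computed straight from the grid.
mean = np.sum(theta * posterior) * dtheta
cdf = np.cumsum(posterior) * dtheta
lo, hi = theta[np.searchsorted(cdf, 0.025)], theta[np.searchsorted(cdf, 0.975)]
print(f"mean = {mean:.3f}, 95% credible interval = ({lo:.3f}, {hi:.3f})")
```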
6. Summary
This notebook demonstrated key concepts in Bayesian inference:
- Bayes' theorem provides a principled framework for updating beliefs given data
- Conjugate priors (Beta-Binomial) enable analytical posterior computation
- Prior sensitivity: different priors lead to different posteriors, but with sufficient data the posteriors converge
- Credible intervals provide direct probabilistic statements about parameter values
- Posterior predictive distributions account for parameter uncertainty in predictions
- Bayes factors enable formal model comparison
- Grid approximation extends Bayesian methods to non-conjugate models
Key Takeaways
- Bayesian methods naturally quantify uncertainty through probability distributions
- The choice of prior matters most when data are limited
- Conjugacy provides computational convenience but is not always realistic
- Numerical methods (grid approximation, MCMC) extend Bayesian inference to complex models