This week introduces Monte Carlo methods, which allow us to approximate Bayesian quantities when analytical solutions are unavailable.
We explore how random sampling can be used to estimate expectations, posterior summaries, and probabilities.
By the end of this week, students will understand how Monte Carlo simulation underpins modern Bayesian computational methods such as Markov chain Monte Carlo (MCMC).
By the end of Week 3, you should be able to:

- explain why posterior expectations often require approximation rather than closed-form evaluation;
- compute Monte Carlo estimates of expectations, posterior summaries, and probabilities from simulated draws;
- quantify the simulation error of a Monte Carlo estimate and choose the number of draws \(M\) accordingly.
Bayesian inference often requires evaluating integrals such as: \[ E[h(\theta) \mid y] = \int h(\theta)\, p(\theta \mid y)\, d\theta, \] which are rarely available in closed form.
If we can sample \(\theta^{(1)}, \ldots, \theta^{(M)}\) from the posterior \(p(\theta \mid y)\),
then we can approximate the expectation by: \[
\hat{E}[h(\theta)] = \frac{1}{M} \sum_{m=1}^M h(\theta^{(m)}).
\] This is called the Monte Carlo estimator.
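The same plug-in recipe works for any \(h\): probabilities are expectations of indicator functions, and transformed summaries use \(h(\theta)\) directly. A short sketch in R, assuming (purely for illustration) a posterior of the form \(\theta \mid y \sim \text{Beta}(3, 9)\):

```r
set.seed(2)
M <- 1e5
theta <- rbeta(M, 3, 9)          # draws from the assumed illustrative posterior

mean(theta)                      # h(theta) = theta: estimates E[theta | y]
mean(theta > 0.5)                # h = indicator: estimates P(theta > 0.5 | y)
mean(log(theta / (1 - theta)))   # h = logit: estimates E[log-odds | y]
```

Note that `mean(theta > 0.5)` works because \(P(\theta > 0.5 \mid y) = E[\mathbf{1}\{\theta > 0.5\} \mid y]\): a probability is just the expectation of an indicator.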
By the Law of Large Numbers, \(\hat{E}[h(\theta)] \to E[h(\theta)]\) as \(M \to \infty\). The Central Limit Theorem gives the rate of convergence: \[ \sqrt{M}\,\bigl(\hat{E}[h(\theta)] - E[h(\theta)]\bigr) \xrightarrow{d} N\bigl(0, \text{Var}[h(\theta)]\bigr), \] so the approximation error shrinks at rate \(1/\sqrt{M}\).
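To see the Law of Large Numbers at work with the same illustrative Beta(3, 9) posterior, whose exact mean is \(3/(3+9) = 0.25\), watch the estimate tighten as \(M\) grows:

```r
set.seed(3)
for (M in c(1e2, 1e4, 1e6)) {
  est <- mean(rbeta(M, 3, 9))   # Monte Carlo estimate with M draws
  cat("M =", format(M, scientific = FALSE), " estimate =", round(est, 4), "\n")
}
```

Each hundredfold increase in \(M\) cuts the typical error by about a factor of ten, consistent with the \(1/\sqrt{M}\) rate.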
We can estimate the simulation error by the Monte Carlo standard error:
\[ \text{SE}(\hat{E}) \approx \sqrt{\frac{\widehat{\text{Var}}[h(\theta)]}{M}}, \] where \(\widehat{\text{Var}}[h(\theta)]\) is the sample variance of \(h(\theta^{(1)}), \ldots, h(\theta^{(M)})\). A larger \(M\) gives a more accurate approximation (halving the error requires four times as many draws) but increases computation time.
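In code, the standard error is the sample standard deviation of the \(h(\theta^{(m)})\) values divided by \(\sqrt{M}\). A sketch, again with the illustrative Beta(3, 9) posterior and \(h(\theta) = \theta\):

```r
set.seed(4)
M <- 1e5
h <- rbeta(M, 3, 9)       # h(theta) = theta, so the draws are used directly

est <- mean(h)
se  <- sd(h) / sqrt(M)    # Monte Carlo standard error

# CLT-based 95% interval for the simulation error of the estimate
c(estimate = est, lower = est - 1.96 * se, upper = est + 1.96 * se)
```

This interval reflects simulation error only, that is, uncertainty from using a finite \(M\); it says nothing about posterior uncertainty in \(\theta\) itself.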
Compute \(E[\theta]\) for \(\theta \sim \text{Beta}(2,5)\) using Monte Carlo, and compare your estimate with the exact value \(E[\theta] = 2/(2+5) = 2/7 \approx 0.2857\). A solution sketch follows.
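A minimal solution in R; the second quantity is the squared Monte Carlo standard error of the estimate:

```r
set.seed(1)
M <- 1e5
theta <- rbeta(M, 2, 5)

mean(theta)       # Monte Carlo estimate of E[theta]
#> [1] 0.2861808

var(theta) / M    # estimated variance (squared SE) of the estimator
#> [1] 2.56548e-07
```

The estimate 0.2862 lies within one standard error (\(\sqrt{2.57 \times 10^{-7}} \approx 5 \times 10^{-4}\)) of the exact value \(2/7 \approx 0.2857\).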