Bayesian Methods Beyond the Course
Weight: 20%
Due date: April 27, 2026
Last update: April 06, 2026
Objective
The goal of this project is to explore a Bayesian topic that goes beyond the core course material and to demonstrate
- conceptual understanding,
- computational implementation,
- critical thinking, and
- clear communication.
You are encouraged to go somewhat beyond what we covered in class. The emphasis is not only on fitting a Bayesian model, but also on evaluating, comparing, and interpreting Bayesian methods thoughtfully.
Collaboration Policy
- Master’s students may work in groups of up to 2 students.
- PhD students must work independently.
All submissions must clearly indicate the names of contributors and, when applicable, briefly describe each member’s contribution.
Group Information
| Group | Member | Topic |
|---|---|---|
| 1 | Wenpu Ma | |
| 2 | Ruihang Han | |
| 3 | Zilong Zhang | |
| 4 | Kalen Jinnah | Dirichlet process |
| 5 | Zhe Zhong | Gaussian process |
| 6 | Yang Zong | Hierarchical Bayesian regression |
| 7 | Taiwo Ayeni | Comparison study of Bayesian and frequentist logistic regression with a prior sensitivity analysis, using data from the Framingham Heart Study |
| 8 | Andrew Krause | Metropolis–Hastings algorithm |
Page Limit
- The main report is limited to 12 pages.
- This limit excludes references and appendices.
- Appendices may include:
- additional derivations,
- extra figures,
- supplementary results, and
- code.
The main text should focus on clarity, key ideas, and main results.
Project Options
You may choose one of the following directions.
Option 1: Method Exploration
Study a Bayesian method that was not fully covered in class.
Possible topics include:
- Metropolis–Hastings algorithm
- Hamiltonian Monte Carlo (HMC)
- Variational Bayes
- Bayesian model selection
- Bayesian nonparametrics (for example, Dirichlet process models)
- Gaussian processes
- Hierarchical Bayesian models
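To give a flavor of the first topic on this list, here is a minimal random-walk Metropolis–Hastings sampler. It is sketched in Python purely for illustration; for the project itself you would implement it in R, JAGS, or Stan. The target here is a standard normal log-density, an assumption chosen only to keep the example self-contained.

```python
import math
import random

def metropolis_hastings(log_post, init, steps, scale=1.0, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    log_post: log posterior density, known only up to an additive constant.
    scale:    standard deviation of the symmetric Gaussian proposal.
    """
    rng = random.Random(seed)
    x = init
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)           # symmetric proposal
        log_alpha = log_post(prop) - log_post(x)   # log acceptance ratio
        # Accept with probability min(1, exp(log_alpha)); the >= 0 branch
        # avoids taking log of a random draw when acceptance is certain.
        if log_alpha >= 0 or math.log(rng.random()) < log_alpha:
            x = prop
        samples.append(x)
    return samples

# Illustrative target: standard normal log-density (up to a constant).
draws = metropolis_hastings(lambda t: -0.5 * t * t, init=0.0, steps=20000)
post_mean = sum(draws[5000:]) / len(draws[5000:])  # discard burn-in
```

Even in a toy example like this, you would be expected to discuss the proposal scale, burn-in, and mixing, which connects directly to the Diagnostics component below.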
Option 2: Applied Bayesian Analysis
Conduct a full Bayesian analysis on a real dataset, going beyond simple conjugate models.
Possible topics include:
- Bayesian regression (linear or logistic)
- Hierarchical models for grouped data
- Mixture models
- Time series models
- Spatial models
Option 3: Comparison Study
Compare two Bayesian approaches, models, or prior choices.
Possible comparisons include:
- Gibbs vs. Metropolis–Hastings
- MCMC vs. variational inference
- JAGS vs. Stan
- Bayesian vs. frequentist analysis
- informative prior vs. weakly informative prior
- hierarchical vs. non-hierarchical models
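As a tiny illustration of the prior-comparison direction, the snippet below contrasts the posterior mean of a success probability under a weak and an informative Beta prior in the conjugate Beta–Binomial case. The data (7 successes in 10 trials) and the prior parameters are invented for illustration only; Python is used here for brevity, while your project would use R or Stan.

```python
def beta_binomial_posterior_mean(a, b, y, n):
    """Posterior mean of theta under a Beta(a, b) prior,
    given y successes in n Bernoulli trials."""
    return (a + y) / (a + b + n)

# Hypothetical data: 7 successes in 10 trials.
weak = beta_binomial_posterior_mean(1, 1, y=7, n=10)      # uniform prior
strong = beta_binomial_posterior_mean(20, 20, y=7, n=10)  # informative prior centered at 0.5
```

With so little data, the informative prior pulls the posterior mean noticeably toward 0.5; a full prior sensitivity analysis would vary the prior systematically and report how the conclusions change.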
You are also welcome to propose your own topic. If you are unsure whether your idea is appropriate, please discuss it with the instructor.
Required Components
Your project should be written as a short report containing the following components.
1. Introduction and Background [10%]
- What is the problem or question?
- Why is a Bayesian approach appropriate?
- What background does the reader need in order to understand your project?
2. Model and Method [15%]
- Clearly describe the model:
\[ p(y \mid \theta), \qquad p(\theta) \]
- Explain the inferential or computational method being used.
- Define the main quantities of interest.
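As one concrete instance of this notation, consider the simple conjugate Beta–Binomial case (shown only to illustrate the notation; your project model will typically be richer):
\[ y \mid \theta \sim \mathrm{Binomial}(n, \theta), \qquad \theta \sim \mathrm{Beta}(a, b), \qquad \theta \mid y \sim \mathrm{Beta}(a + y,\; b + n - y). \]
Here the likelihood \( p(y \mid \theta) \), the prior \( p(\theta) \), and the main quantity of interest (the posterior of \( \theta \)) are all stated explicitly; your report should do the same for your model.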
3. Computation [20%]
- Implement your analysis using:
- R (custom code), and/or
- JAGS, Stan, or another Bayesian software tool.
- Clearly explain the computational procedure.
- Include enough code and explanation for the analysis to be reproducible.
4. Results and Inference [15%]
Present the main results of your analysis, such as
- posterior summaries,
- credible intervals,
- posterior probabilities, and
- model-based conclusions.
5. Diagnostics and Evaluation [10%]
Include appropriate diagnostics and discuss the quality of the fit or computation.
Examples include:
- trace plots,
- convergence diagnostics,
- posterior predictive checks, and
- discussion of model limitations.
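One standard convergence diagnostic from this list, the Gelman–Rubin statistic \( \hat{R} \), compares between-chain and within-chain variance across multiple chains. The sketch below is a basic (non-split) version in Python for illustration; in practice you would rely on the diagnostics built into R packages such as `coda` or into Stan.

```python
import random
import statistics

def gelman_rubin(chains):
    """Basic Gelman-Rubin R-hat for a list of equal-length chains.

    Values near 1 suggest the chains have mixed; values well above 1
    indicate non-convergence.
    """
    n = len(chains[0])                                   # draws per chain
    means = [statistics.fmean(c) for c in chains]
    W = statistics.fmean(statistics.variance(c) for c in chains)  # within-chain
    B = n * statistics.variance(means)                   # between-chain
    var_hat = (n - 1) / n * W + B / n                    # pooled variance estimate
    return (var_hat / W) ** 0.5

# Illustration: four independent chains of i.i.d. draws should give R-hat near 1.
rng = random.Random(1)
chains = [[rng.gauss(0, 1) for _ in range(2000)] for _ in range(4)]
rhat = gelman_rubin(chains)
```

Reporting such a diagnostic, alongside trace plots and posterior predictive checks, is exactly what this component asks for.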
6. Comparison Component [10%]
Your project must include at least one meaningful comparison, for example:
- between two methods,
- between two models, or
- between two prior choices.
You should explain
- what is being compared,
- why the comparison is meaningful, and
- what conclusion you draw.
7. Simulation Study or Empirical Evaluation [10%]
You must include one of the following:
- a simulation study under controlled settings, or
- a systematic empirical evaluation.
Your study should
- state a clear goal,
- vary at least one factor (such as sample size, prior choice, or noise level), and
- summarize findings using plots or tables.
8. Interpretation and Discussion [5%]
- Explain the results in clear, plain language.
- Discuss what you learned from the project.
- Comment on strengths, weaknesses, or possible extensions.
9. Reflection [5%]
Briefly discuss
- what worked well,
- what was difficult, and
- what you would improve if you had more time.
Key Requirement
Your project must answer at least one decision question, such as:
- Which model is better, and why?
- How sensitive are the results to the prior?
- Does the method scale well?
- What are the practical limitations of the approach?
Your conclusions should be supported by evidence from your analysis.
Grading Rubric
| Component | Excellent | Good | Needs Improvement | Points |
|---|---|---|---|---|
| Introduction and Background | Problem is clearly motivated and the Bayesian context is well explained | Motivation is adequate but could be sharper | Motivation is unclear or incomplete | 10 |
| Model and Method | Model and method are clearly specified and well justified | Mostly clear, with minor gaps | Important pieces are missing or unclear | 15 |
| Computation | Code is correct, reproducible, and well explained | Mostly correct with minor issues | Major errors or weak reproducibility | 20 |
| Results and Inference | Results are clearly presented and interpreted appropriately | Results are mostly clear | Results are incomplete or poorly explained | 15 |
| Diagnostics and Evaluation | Diagnostics are appropriate and thoughtfully discussed | Diagnostics are present but limited | Little or no meaningful diagnostics | 10 |
| Comparison Component | Comparison is meaningful and conclusions are well supported | Comparison is present but limited | Comparison is weak or missing | 10 |
| Simulation / Empirical Evaluation | Evaluation is well designed and clearly presented | Evaluation is adequate | Evaluation is weak, incomplete, or missing | 10 |
| Interpretation and Discussion | Strong interpretation and insight | Some interpretation, but limited depth | Minimal or unclear interpretation | 5 |
| Reflection | Thoughtful and specific | Adequate but brief | Minimal | 5 |
| Total | | | | 100 |
Deliverables
Submit the following:
- PDF report
- Source file (`.Rmd` or `.qmd`)
- Any additional code files needed to reproduce the analysis
Your submission must be reproducible.
Recommended Report Structure
A typical project report may be organized as follows:
- Introduction
- Background
- Model / Method
- Computation
- Results
- Diagnostics / Evaluation
- Comparison or Simulation Study
- Discussion and Reflection
Bonus (up to +5%)
You may earn bonus points for work that goes meaningfully beyond the basic requirements, such as:
- implementing your own MCMC algorithm,
- comparing multiple models in a thoughtful way,
- using advanced tools such as Stan or PyMC, or
- providing especially strong insight or creativity in the analysis.
A few general notes:
- You do not need to produce a perfect or publishable analysis.
- Start with a simple version, then extend it.
- A clear, well-executed project is better than an overly ambitious but incomplete one.
- Clarity, reasoning, and interpretation matter as much as technical complexity.
Summary
This project is about learning by exploring Bayesian ideas beyond the classroom.
\[ \text{Model} \;\longrightarrow\; \text{Computation} \;\longrightarrow\; \text{Evaluation} \;\longrightarrow\; \text{Insight} \]
Focus on
- understanding,
- computation,
- comparison, and
- communication.
Good luck. This is your opportunity to explore a Bayesian topic that genuinely interests you!