Lecture Learning Objectives#

Lecture 1 - Frequentist and Bayesian Overview, Probabilistic Generative Models, and Stan#

  1. Review statistical inference (frequentist so far!).

  2. Pave the way to Bayesian statistics.

  3. Introduce probabilistic generative models.

  4. Illustrate the basic use of Stan and rstan via Monte Carlo simulations.

  5. Differentiate between probability and likelihood in statistics.
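The idea behind objective 4, answering probability questions by simulation, can be previewed without Stan or rstan. Below is a minimal Monte Carlo sketch in plain Python (not the course's R code): the probability of an event is approximated by the proportion of simulated draws in which it occurs.

```python
import random

random.seed(42)

# Monte Carlo estimate of P(X > 1) for X ~ Normal(0, 1):
# draw many samples and take the proportion that exceed 1.
n = 100_000
draws = [random.gauss(0.0, 1.0) for _ in range(n)]
estimate = sum(d > 1.0 for d in draws) / n

print(round(estimate, 3))  # close to the true value, about 0.159
```

The same simulate-then-summarize logic is what Stan automates for posterior distributions later in the course.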

Lecture 2 - Conditional Probabilities, Bayes’ Rule, and Maximum a Posteriori Estimation#

  1. Illustrate the concept of conditional probability.

  2. Apply Bayes’ rule to probabilistic inquiries.

  3. Introduce the concept of maximum a posteriori (MAP) estimation.

  4. Relate MAP to maximum likelihood estimation (MLE).
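The MAP-versus-MLE relationship in objectives 3 and 4 has a clean closed form for a binomial proportion with a Beta prior. A minimal sketch in plain Python, with made-up data and an assumed Beta(2, 2) prior:

```python
# MLE vs. MAP for a binomial proportion with a Beta(a, b) prior.
# Hypothetical data: y = 7 successes in n = 10 trials.
y, n = 7, 10
a, b = 2.0, 2.0  # weakly informative Beta(2, 2) prior (assumed for illustration)

mle = y / n                              # maximizes the likelihood alone
map_est = (y + a - 1) / (n + a + b - 2)  # mode of the Beta(y + a, n - y + b) posterior

print(mle, round(map_est, 3))  # 0.7 vs. about 0.667: the prior pulls toward 0.5
```

With a flat Beta(1, 1) prior the MAP estimate reduces exactly to the MLE, which is the key link between the two.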

Lecture 3 - Bayesian Statistics in Action: The Beta-Binomial Model#

  1. Illustrate the elements and process of Bayesian modelling.

  2. Recall some probability distribution concepts.

  3. Explain Bayes’ rule via analytically tractable models.

  4. Describe the Beta-Binomial model via Bayes’ rule.

  5. Explain key posterior metrics for Bayesian inference.
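Objectives 3 to 5 hinge on the conjugate Beta-Binomial update, which needs no sampling at all. A minimal sketch in plain Python, with hypothetical data and a flat prior:

```python
# Conjugate Beta-Binomial update: a Beta(a, b) prior combined with Binomial
# data (y successes in n trials) gives a Beta(a + y, b + n - y) posterior.
a, b = 1.0, 1.0   # flat Beta(1, 1) prior (assumed)
y, n = 12, 20     # hypothetical data

a_post, b_post = a + y, b + n - y

# One key posterior metric: the posterior mean of the success probability.
post_mean = a_post / (a_post + b_post)
print(round(post_mean, 3))  # 13/22, about 0.591
```

Because the posterior is again a Beta distribution, metrics such as the posterior mean, mode, and credible intervals are available in closed form.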

Lecture 4 - Markov Chain Monte Carlo, Stan, and Complex Bayesian Models#

  1. Illustrate the steps to build a likelihood in a Bayesian design.

  2. Explore a complex dataset involving categorical, count, and continuous variables.

  3. Apply a Bayesian model design with the proper random variables (Normal, Gamma, and Poisson).

  4. Explain the general procedure of Markov Chain Monte Carlo (MCMC).

  5. Practice computational Bayesian inference via Stan.

  6. Draw inferential conclusions from our posterior distribution of parameters.

Lecture 5 - Bayesian Normal Linear Regression and Hypothesis Testing#

  1. Describe the basics of Bayesian Normal linear regression.

  2. Explore and contrast Bayesian Normal linear regression versus Ordinary Least-Squares (OLS).

  3. Revisit the concept of the posterior credible interval in a Beta-Binomial framework.

  4. Introduce one-sided hypothesis testing in a Beta-Binomial framework.

  5. Relate the posterior credible interval with the two-sided hypothesis in a Beta-Binomial framework.
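Objectives 3 to 5 can be previewed with a quick sketch: given a Beta posterior, an equal-tailed credible interval comes from posterior quantiles, and a one-sided test reduces to a posterior probability. The example below is plain Python with a hypothetical Beta(13, 9) posterior, using sampling since the standard library has no Beta quantile function:

```python
import random

random.seed(123)

# Equal-tailed 95% credible interval and a one-sided posterior probability
# for an assumed Beta(13, 9) posterior, estimated from posterior draws.
draws = sorted(random.betavariate(13, 9) for _ in range(100_000))

lower = draws[int(0.025 * len(draws))]
upper = draws[int(0.975 * len(draws))]
prob_gt_half = sum(d > 0.5 for d in draws) / len(draws)  # P(theta > 0.5 | data)

print(round(lower, 2), round(upper, 2), round(prob_gt_half, 2))
```

For the two-sided version, checking whether a null value (say 0.5) falls inside the 95% credible interval parallels a two-sided test at the 5% level.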

Lecture 6 - Bayesian Binary Logistic Regression#

  1. Extend the paradigm of Bayesian regression to Generalized Linear Models (GLMs), specifically Binary Logistic regression.

  2. Explore and contrast Bayesian Binary Logistic regression versus its frequentist counterpart.

  3. Illustrate the Bayesian modelling setup in Binary Logistic regression.

  4. Compare the Bayesian prior and posterior sigmoid curves in simple Binary Logistic regression.

  5. Compare the Bayesian prior and posterior predicted probabilities in simple Binary Logistic regression.

Lecture 7 - Bayesian Hierarchical Models#

  1. Explain the concept of the Bayesian hierarchical model.

  2. Contrast the hierarchical model versus standard Bayesian models.

  3. Demonstrate the advantages of hierarchical modelling in Bayesian inference.

  4. Apply hierarchical modelling to predictive inquiries.

Lecture 8 - More Hierarchical Modelling and MCMC Diagnostics#

  1. Apply Bayesian hierarchical modelling in a more complex case.

  2. Formulate the Bayesian modelling setup in this complex case.

  3. Interpret the simulation results in light of the initial statistical inquiries.

  4. Evaluate MCMC posterior sampling via model diagnostics.

Tutorial on MCMC and the Gamma-Poisson Model#

  1. Describe the Gamma-Poisson model via Bayes’ rule.

  2. Describe the Monte Carlo algorithm.

  3. Describe a Markov Chain.

  4. Give an overview of MCMC via Bayes’ rule.

  5. Apply MCMC via the Metropolis-Hastings algorithm.

  6. Apply MCMC via the Metropolis algorithm.

  7. Practice R coding of the MCMC via the Metropolis algorithm with a Gamma-Poisson example.
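The tutorial's closing exercise (a Metropolis sampler for a Gamma-Poisson example, coded in R) can be previewed in plain Python. This is an assumption-laden sketch with made-up counts and prior hyperparameters; since the Gamma prior is conjugate to the Poisson likelihood, the exact posterior mean is available to check the sampler against:

```python
import math
import random

random.seed(7)

# Metropolis sampler for the rate of a Poisson model with a Gamma prior.
# A Gamma(a, b) prior plus Poisson counts y gives a Gamma(a + sum(y), b + n)
# posterior, so the chain can be validated against the known posterior mean.
y = [3, 5, 4, 6, 2, 4, 5, 3]  # hypothetical counts
a, b = 2.0, 1.0               # assumed prior hyperparameters

def log_post(lam):
    if lam <= 0:
        return -math.inf      # zero density outside the support
    # Log of the (unnormalized) Gamma(a + sum(y), b + n) posterior density.
    return (a + sum(y) - 1) * math.log(lam) - (b + len(y)) * lam

samples, lam = [], 1.0
for _ in range(20_000):
    prop = lam + random.gauss(0.0, 0.5)  # symmetric random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop                        # accept; otherwise keep current value
    samples.append(lam)

kept = samples[5_000:]                    # discard burn-in
post_mean = sum(kept) / len(kept)
exact = (a + sum(y)) / (b + len(y))       # exact Gamma posterior mean
print(round(post_mean, 2), round(exact, 2))
```

Because the proposal is symmetric, the Metropolis-Hastings acceptance ratio reduces to the plain Metropolis ratio used here; the chain's mean should land close to the exact conjugate answer.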