Publications by Ken Wood

Hierarchical Modeling using Bayesian Methods

30.07.2023

Data Input Let’s fit our hierarchical model for counts of chocolate chips. The data can be found in cookies.dat. dat = read.table(file="cookies.dat", header=TRUE) head(dat) ## chips location ## 1 12 1 ## 2 12 1 ## 3 6 1 ## 4 13 1 ## 5 12 1 ## 6 12 1 table(dat$location) ## ## 1 2 3 ...


Poisson Regression using Bayesian Methods

27.07.2023

For an example of Poisson regression, we’ll use the badhealth data set from the COUNT package in R. library("COUNT") ## Loading required package: msme ## Loading required package: MASS ## Loading required package: lattice ## Loading required package: sandwich data("badhealth") ?badhealth head(badhealth) ## numvisit badh age ## 1 30 0 ...


Logistic Regression using Bayesian Methods

26.07.2023

Data For an example of logistic regression, we’ll use the urine data set from the boot package in R. The response variable is r, which takes on values of \(0\) or \(1\). We will remove some rows from the data set which contain missing values. library("boot") data("urine") ?urine head(urine) ## r gravity ph osmo cond urea calc ## 1 0 1.021 4...


ANOVA - Bayesian Methods

26.07.2023

Introduction ANOVA (analysis of variance) is used when we have categorical explanatory variables where the observations belong to groups, i.e., we compare the variability of responses between groups. If the variability between groups is large relative to the variability within groups, we conclude that there is a ‘grouping effect’. Data As an e...
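The "between vs. within" comparison described above is the one-way ANOVA F statistic. A minimal sketch (toy data invented here, not the post's data; the post itself works in R) computes it directly:

```python
import statistics

# Assumed toy example: three groups of responses (not from the post).
groups = [
    [12.0, 11.5, 13.1, 12.4],   # group 1
    [14.2, 15.0, 14.8, 13.9],   # group 2
    [11.0, 10.4, 11.8, 10.9],   # group 3
]
k = len(groups)
n = sum(len(g) for g in groups)
grand = statistics.mean(x for g in groups for x in g)

# Variability of group means around the grand mean (between groups)
ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
# Variability of observations around their own group mean (within groups)
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(F, 1))
```

A large F means the between-group variability dominates the within-group variability, i.e. evidence of a grouping effect.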


Linear Regression Quizzes

26.07.2023

library("car") # load the 'car' package ## Loading required package: carData data("Anscombe") # load the data set ?Anscombe # read a description of the data head(Anscombe) # look at the first few lines of the data ## education income young urban ## ME 189 2824 350.7 508 ## NH 169 3259 345.9 564 ## VT 230 3072 348....


Linear Regression using Bayesian Methods

24.07.2023

If we have \(k\) different predictor variables \(x_1,x_2,...,x_k\), what is the primary advantage of fitting a joint linear model (multiple regression \(E(y)=\beta_0+\beta_1 x_1+...+\beta_k x_k\)) over fitting \(k\) simple linear regressions (\(E(y)=\beta_0+\beta_j x_j\)), one for each predictor? Answer: Each coefficient in the multiple regression ...
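The question above concerns joint versus separate fits. A small simulation sketch (data and coefficients invented here for illustration, not from the post) shows why: when predictors are correlated, a simple regression slope absorbs the effect of the omitted predictor, while the joint fit's partial slope adjusts for it:

```python
import random
import statistics

def cov(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

rng = random.Random(0)
n = 5000
x1 = [rng.gauss(0, 1) for _ in range(n)]
x2 = [xi + rng.gauss(0, 0.5) for xi in x1]   # x2 correlated with x1
y = [xi + rng.gauss(0, 1) for xi in x1]      # y truly depends on x1 only

# Simple regression y ~ x2: slope = cov(y, x2) / var(x2)
b_simple = cov(y, x2) / statistics.variance(x2)

# Joint fit y ~ x1 + x2: closed-form partial slope for x2
v1, v2, c12 = statistics.variance(x1), statistics.variance(x2), cov(x1, x2)
b2_partial = (cov(y, x2) * v1 - cov(y, x1) * c12) / (v1 * v2 - c12 ** 2)

print(round(b_simple, 2), round(b2_partial, 2))
```

The simple slope is strongly positive even though x2 has no direct effect; the partial slope is near zero, because the joint model interprets each coefficient holding the other predictors fixed.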


Gibbs Sampling

20.07.2023

Introduction So far, we have demonstrated MCMC for a single parameter. What if we seek the posterior distribution of multiple parameters, and that posterior distribution does not have a standard form? One option is to perform Metropolis-Hastings (M-H) by sampling candidates for all parameters at once, and accepting or rejecting all of those candida...
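The alternative Gibbs sampling offers is to update one parameter at a time from its full conditional. A minimal language-neutral sketch (Python here; the target, a bivariate normal with assumed correlation rho = 0.8, is chosen because its full conditionals are known exactly):

```python
import random
import statistics

def gibbs_bivariate_normal(rho=0.8, n_iter=20000, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals: x | y ~ N(rho*y, 1 - rho^2), and symmetrically for y.
    Alternating draws from these conditionals targets the joint distribution.
    """
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    sd = (1.0 - rho ** 2) ** 0.5
    xs, ys = [], []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal()
print(round(statistics.mean(xs), 2), round(statistics.stdev(xs), 2))
```

Each marginal should recover mean 0 and standard deviation 1, even though no draw ever came from the joint distribution directly.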


Introduction to JAGS

19.07.2023

Introduction There are several software packages available that will handle the details of MCMC for us. See the supplementary material for a brief overview of options. The package we will use in this course is JAGS (Just Another Gibbs Sampler) by Martyn Plummer. The program is free, and runs on Mac OS, Windows, and Linux. Better yet, the program ca...


Metropolis-Hastings Algorithm

18.07.2023

Introduction Metropolis-Hastings is an algorithm that allows us to sample from a generic probability distribution (which we will call the target distribution), even if we do not know the normalizing constant. To do this, we construct and sample from a Markov chain whose stationary distribution is the target distribution. It consists of picking an a...
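The key point, that only an unnormalized target is needed, can be seen in a short random-walk Metropolis sketch (Python, with an assumed target: an unnormalized N(2, 1) log-density with its constant deliberately dropped):

```python
import math
import random
import statistics

def metropolis_sample(log_target, n_iter=20000, step=1.0, x0=0.0, seed=1):
    """Random-walk Metropolis: log_target need only be known up to a constant."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_iter):
        cand = x + rng.gauss(0.0, step)      # symmetric proposal
        lp_cand = log_target(cand)
        # Accept with probability min(1, target(cand)/target(x));
        # the unknown normalizing constant cancels in this ratio.
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Unnormalized log-density of N(2, 1): normalizing constant omitted on purpose
draws = metropolis_sample(lambda x: -0.5 * (x - 2.0) ** 2)
print(round(statistics.mean(draws), 2))
```

Because the acceptance ratio divides the target by itself, the normalizing constant cancels, which is exactly why the algorithm works on unnormalized posteriors.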


Monte Carlo Simulation of Bayesian Models

12.07.2023

\(y_i=\mu+\epsilon_i,\ \ \epsilon_i \stackrel{iid}{\sim}N(0,\sigma^2),\ i=1,\dots,n\) \(\therefore\ y_i \stackrel{iid}{\sim}N(\mu,\sigma^2)\) Likelihood: \(P(y|\theta) = \frac{P(y,\theta)}{P(\theta)}\), where \(P(\theta)\) is the prior probability distribution of \(\theta\). Posterior: \(P(\theta|y) = \frac{P(y,\theta)}{P(y)} = \frac{...
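The Monte Carlo idea is to approximate posterior quantities by averaging simulated draws. A sketch under a conjugate normal model (example data and prior invented here, so the closed-form posterior is available to check the simulation against):

```python
import random
import statistics

# Assumed conjugate setup: mu ~ N(m0, s0^2) prior, y_i | mu ~ N(mu, 1).
# The posterior is N(m1, s1^2) in closed form, so the Monte Carlo
# estimate of E[mu | y] can be compared with the exact value.
rng = random.Random(42)
m0, s0 = 0.0, 1.0
y = [2.1, 1.7, 2.4, 2.0, 1.9]            # toy data
n, ybar = len(y), statistics.mean(y)

s1_sq = 1.0 / (1.0 / s0 ** 2 + n)        # posterior variance
m1 = s1_sq * (m0 / s0 ** 2 + n * ybar)   # posterior mean

# Monte Carlo: average draws from the posterior
draws = [rng.gauss(m1, s1_sq ** 0.5) for _ in range(50000)]
print(round(statistics.mean(draws), 3), round(m1, 3))
```

With 50,000 draws the Monte Carlo average agrees with the exact posterior mean to a few decimal places; the same averaging works when the posterior has no closed form and the draws come from MCMC instead.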
