Publications by Jason Louwagie
Assignment 5
2. For parts (a) through (c), indicate which of i. through iv. is correct. Justify your answer. (a) The lasso, relative to least squares, is: i. More flexible and hence will give improved prediction accuracy when its increase in bias is less than its decrease in variance. ii. More flexible and hence will give improved prediction accuracy when it...
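The bias-variance trade-off this question points to can be seen directly by comparing test MSE for least squares and the lasso. A minimal R sketch on simulated data (not the published solution; the object names and parameter values are illustrative):

# Compare least squares and the lasso on simulated data (illustrative only)
library(glmnet)
set.seed(1)
n <- 100; p <- 20
x <- matrix(rnorm(n * p), n, p)
beta <- c(rep(2, 5), rep(0, p - 5))          # only the first 5 predictors matter
y <- drop(x %*% beta + rnorm(n))
train <- sample(n, n / 2)

# Least squares: fit on the training half, evaluate test MSE
dat <- data.frame(y = y, x)
ls.fit <- lm(y ~ ., data = dat[train, ])
ls.mse <- mean((y[-train] - predict(ls.fit, dat[-train, ]))^2)

# Lasso: choose lambda by cross-validation on the training half, evaluate test MSE
cv.out <- cv.glmnet(x[train, ], y[train], alpha = 1)
lasso.pred <- predict(cv.out, s = "lambda.min", newx = x[-train, ])
lasso.mse <- mean((y[-train] - lasso.pred)^2)

c(least.squares = ls.mse, lasso = lasso.mse)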
Assignment 4
3. We now review k-fold cross-validation. (a) Explain how k-fold cross-validation is implemented. K-fold cross-validation is implemented by dividing the set of observations into k groups (folds) of roughly equal size. The first of the k folds is treated as a validation set and the method is fit on the remaining k − 1 folds. The MSE is then computed on th...
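The procedure described in part (a) can be written out by hand in a few lines. A minimal R sketch using the built-in mtcars data and a simple linear model as placeholders:

# Manual k-fold cross-validation for a simple linear model (illustrative)
set.seed(1)
k <- 10
n <- nrow(mtcars)
folds <- sample(rep(1:k, length.out = n))    # assign each row to one of k folds

cv.errors <- rep(NA, k)
for (j in 1:k) {
  fit  <- lm(mpg ~ hp + wt, data = mtcars[folds != j, ])    # fit on the k - 1 training folds
  pred <- predict(fit, mtcars[folds == j, ])                # predict the held-out fold
  cv.errors[j] <- mean((mtcars$mpg[folds == j] - pred)^2)   # MSE on the held-out fold
}
mean(cv.errors)    # k-fold CV estimate of the test MSE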
Assignment 3
10. This question should be answered using the Weekly data set, which is part of the ISLR package. This data is similar in nature to the Smarket data from this chapter’s lab, except that it contains 1,089 weekly returns for 21 years, from the beginning of 1990 to the end of 2010. (a) Produce some numerical and graphical summaries of the Weekly d...
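Part (a) asks for numerical and graphical summaries; a minimal sketch of the sort of commands involved (assuming the ISLR package is installed, and that Direction is the ninth, qualitative column of Weekly):

# Numerical and graphical summaries of the Weekly data set
library(ISLR)
summary(Weekly)        # numerical summary of each variable
cor(Weekly[, -9])      # correlations among the quantitative variables (drop Direction)
pairs(Weekly)          # scatterplot matrix
plot(Weekly$Volume)    # trading volume over time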
Assignment 2
Chapter 2 2. Carefully explain the differences between the KNN classifier and KNN regression methods. The KNN classifier first identifies the K points in the training data that are closest to \(X_{0}\). It then estimates the conditional probability for class j. KNN regression is similar in that it first identifies the K points that are closest to...
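The difference described here is easy to see side by side: the classifier returns the majority class among the K nearest neighbours, while the regression method returns their average response. A minimal R sketch using class::knn and FNN::knn.reg on simulated data (illustrative only, not the published code):

# KNN classification vs. KNN regression on the same simulated predictors
set.seed(1)
x <- matrix(rnorm(200), ncol = 2)                                # 100 observations, 2 predictors
y.class <- factor(ifelse(x[, 1] + x[, 2] > 0, "Up", "Down"))     # qualitative response
y.num   <- x[, 1] + 2 * x[, 2] + rnorm(100)                      # quantitative response
train <- 1:70

# Classifier: majority vote among the K nearest training points
knn.class <- class::knn(x[train, ], x[-train, ], y.class[train], k = 5)
# Regression: average response of the K nearest training points
knn.reg.fit <- FNN::knn.reg(x[train, ], x[-train, ], y.num[train], k = 5)

table(knn.class, y.class[-train])                 # confusion matrix on the test set
mean((y.num[-train] - knn.reg.fit$pred)^2)        # test MSE for the regression fit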
Assignment 6
6. In this exercise, you will further analyze the Wage data set considered throughout this chapter. (a) Perform polynomial regression to predict wage using age. Use cross-validation to select the optimal degree d for the polynomial. What degree was chosen, and how does this compare to the results of hypothesis testing using ANOVA? Make a plot of ...
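A minimal sketch of the cross-validation loop and the ANOVA comparison described in part (a), following the approach used in the ISLR lab (not the published solution; the maximum degree of 10 is an arbitrary choice here):

# Choose the polynomial degree for wage ~ age by 10-fold cross-validation
library(ISLR)
library(boot)    # cv.glm()
set.seed(1)

cv.errors <- rep(NA, 10)
for (d in 1:10) {
  fit <- glm(wage ~ poly(age, d), data = Wage)
  cv.errors[d] <- cv.glm(Wage, fit, K = 10)$delta[1]   # 10-fold CV error for degree d
}
which.min(cv.errors)    # degree selected by cross-validation

# Compare with hypothesis tests on nested polynomial fits
anova(lm(wage ~ poly(age, 1), data = Wage),
      lm(wage ~ poly(age, 2), data = Wage),
      lm(wage ~ poly(age, 3), data = Wage),
      lm(wage ~ poly(age, 4), data = Wage),
      lm(wage ~ poly(age, 5), data = Wage))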
Assignment 8
5. We have seen that we can fit an SVM with a non-linear kernel in order to perform classification using a non-linear decision boundary. We will now see that we can also obtain a non-linear decision boundary by performing logistic regression using non-linear transformations of the features. (a) Generate a data set with n = 500 and p = 2, such tha...
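Part (a)'s data-generating step, and the two non-linear fits being compared, can be sketched as follows (the quadratic boundary x1^2 − x2^2 = 0 and the SVM tuning values are illustrative choices, not the published solution):

# Generate two classes separated by a quadratic decision boundary
set.seed(1)
x1 <- runif(500) - 0.5
x2 <- runif(500) - 0.5
y  <- 1 * (x1^2 - x2^2 > 0)
dat <- data.frame(x1, x2, y = as.factor(y))

# Logistic regression with non-linear transformations of the features
glm.fit  <- glm(y ~ poly(x1, 2) + poly(x2, 2), data = dat, family = binomial)
glm.prob <- predict(glm.fit, dat, type = "response")
glm.pred <- ifelse(glm.prob > 0.5, 1, 0)

# SVM with a radial kernel for comparison
library(e1071)
svm.fit  <- svm(y ~ x1 + x2, data = dat, kernel = "radial", gamma = 1, cost = 1)
svm.pred <- predict(svm.fit, dat)

table(glm.pred, dat$y)    # non-linear logistic regression vs. true class
table(svm.pred, dat$y)    # radial-kernel SVM vs. true class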
Assignment 7
3. Consider the Gini index, classification error, and entropy in a simple classification setting with two classes. Create a single plot that displays each of these quantities as a function of \(\hat{p}_{m1}\). The x-axis should display \(\hat{p}_{m1}\), ranging from 0 to 1, and the y-axis should display the value of the Gini index, classification error, and entropy. H...
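In the two-class case the three impurity measures have closed forms (Gini = 2p(1 − p), classification error = 1 − max(p, 1 − p), entropy = −p log p − (1 − p) log(1 − p)), so the plot reduces to a few lines. A minimal R sketch; the colours and grid are arbitrary choices:

# Gini index, classification error, and entropy as functions of p_hat_m1
p <- seq(0.001, 0.999, length.out = 500)     # avoid log(0) at the endpoints
gini    <- 2 * p * (1 - p)
error   <- 1 - pmax(p, 1 - p)
entropy <- -(p * log(p) + (1 - p) * log(1 - p))

matplot(p, cbind(gini, error, entropy), type = "l", lty = 1,
        col = c("red", "green", "blue"),
        xlab = expression(hat(p)[m1]), ylab = "Value")
legend("topright", legend = c("Gini index", "Classification error", "Entropy"),
       col = c("red", "green", "blue"), lty = 1)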