Publications by Xi Jia

505-90-Assignment1

19.05.2020

2E1. Which of the expressions below correspond to the statement: the probability of rain on Monday?
(1) Pr(rain)
(2) Pr(rain|Monday)
(3) Pr(Monday|rain)
(4) Pr(rain, Monday)/Pr(Monday)
# (2) and (4)
# For (4), Pr(rain, Monday) = Pr(rain | Monday) * Pr(Monday) => Pr(rain | Monday) = Pr(rain, Monday) / Pr(Monday)
2E2. Which of the following state...
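A minimal R sketch (with an assumed joint probability, not values from the assignment) confirming that expression (4) reproduces Pr(rain | Monday):

# assumed joint and marginal probabilities, for illustration only
p_rain_and_monday <- 0.05   # Pr(rain, Monday)
p_monday <- 1/7             # Pr(Monday)
# definition of conditional probability: Pr(rain | Monday) = Pr(rain, Monday) / Pr(Monday)
p_rain_and_monday / p_monday
## [1] 0.35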


lab4_505

16.06.2020

5E1. Which of the linear models below are multiple linear regressions?
\[\begin{align}
\mu_i &= \alpha + \beta x_i \tag{1}\\
\mu_i &= \beta_x x_i + \beta_z z_i \tag{2}\\
\mu_i &= \beta_x x_i + \beta_z z_i \tag{3}\\
\mu_i &= \alpha + \beta(x_i - z_i) \tag{4}\\
\mu_i &= \alpha + \beta_x x_i + \beta_z z_i \tag{5}
\end{align}\]
# 2, 3, 5 are the multiple linear regressions
5E2....
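A minimal R sketch, using simulated data and lm() rather than the assignment's own code, of what a multiple linear regression such as model (5) looks like: two predictors, one slope each.

set.seed(11)
N <- 100
x <- rnorm(N)                                     # first predictor
z <- rnorm(N)                                     # second predictor
y <- 1 + 0.5 * x - 0.3 * z + rnorm(N, sd = 0.2)   # outcome built from both slopes
# mu_i = alpha + beta_x * x_i + beta_z * z_i
fit <- lm(y ~ x + z)
coef(fit)   # estimates should land near 1, 0.5, and -0.3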


505-10-lab5

23.06.2020

6E1. List three mechanisms by which multiple regression can produce false inferences about causal effects.
# Multicollinearity: when two predictor variables are very strongly correlated, including both in a model makes the estimates hard to interpret, and you can't identify the true contributor.
# Post-treatment bias: mistaken inferences arising from including vari...
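A minimal R simulation (assumed values, not part of the original answer) illustrating the first mechanism: when x2 is nearly a copy of x1, the joint model can no longer say which predictor is doing the work.

set.seed(7)
N <- 200
x1 <- rnorm(N)
x2 <- x1 + rnorm(N, sd = 0.05)   # almost identical to x1, so highly collinear
y <- 2 * x1 + rnorm(N)
summary(lm(y ~ x1))$coefficients        # precise slope estimate
summary(lm(y ~ x1 + x2))$coefficients   # inflated standard errors for both slopes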


assignment9

04.08.2020

11E1. If an event has probability 0.35, what are the log-odds of this event?
log(0.35/(1 - 0.35))
## [1] -0.6190392
11E2. If an event has log-odds 3.2, what is the probability of this event?
exp(3.2)/(1 + exp(3.2))
## [1] 0.9608343
11E3. Suppose that a coefficient in a logistic regression has value 1.7. What does this imply about the proportional change in odds of t...
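A quick check with R's built-in logistic functions, qlogis() (probability to log-odds) and plogis() (log-odds to probability), which match the hand calculations above:

qlogis(0.35)   # log-odds of probability 0.35
## [1] -0.6190392
plogis(3.2)    # probability implied by log-odds 3.2
## [1] 0.9608343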


Document7

14.07.2020

8E1. For each of the causal relationships below, name a hypothetical third variable that would lead to an interaction effect:
Bread dough rises because of yeast.
Education leads to higher income.
Gasoline makes a car go.
# 1: temperature. If the temperature of the dough is too high, the yeast dies, and a lower temperature would slow down the do...
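A minimal R sketch (simulated data, hypothetical variable names) of how a third variable such as temperature would enter the model as an interaction term:

set.seed(8)
N <- 100
yeast <- rbinom(N, 1, 0.5)                 # yeast added (1) or not (0)
temp <- runif(N, 15, 45)                   # dough temperature in degrees C
# rising needs yeast, and the yeast effect shrinks as temperature climbs
rise <- yeast * pmax(0, 35 - temp) * 0.1 + rnorm(N, sd = 0.1)
fit <- lm(rise ~ yeast * temp)   # yeast * temp expands to yeast + temp + yeast:temp
coef(fit)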


assignment6

06.07.2020

7E1. State the three motivating criteria that define information entropy. Try to express each in your own words.
# (1) be measured on a continuous scale, so that a small change in probability produces only a small change in uncertainty
# (2) capture the size of the possibility space, so that its value scales with the number of possible outcomes
# (3) be additive fo...
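A minimal R sketch of the entropy these criteria motivate, H(p) = -sum(p * log(p)), applied to assumed probability vectors:

entropy <- function(p) -sum(p * log(p))   # information entropy in nats
entropy(c(0.3, 0.7))                # two possible outcomes
## [1] 0.6108643
entropy(c(0.2, 0.25, 0.25, 0.3))    # more outcomes, higher entropy (criterion 2)
## [1] 1.376227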


Assignment8

21.07.2020

9E1. Which of the following is a requirement of the simple Metropolis algorithm?
(1) The parameters must be discrete.
(2) The likelihood function must be Gaussian.
(3) The proposal distribution must be symmetric.
# 3. The proposal distribution must be symmetric.
9E2. Gibbs sampling is more efficient than the Metropolis algorithm. How does it achieve this ex...
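A minimal R sketch of the simple Metropolis algorithm with a symmetric normal proposal, targeting a standard normal density as a stand-in for the posterior:

set.seed(9)
n_samples <- 5000
chain <- numeric(n_samples)
current <- 0
for (i in 1:n_samples) {
  proposal <- rnorm(1, mean = current, sd = 0.5)   # symmetric proposal distribution
  # with a symmetric proposal the acceptance ratio uses only the target density
  r <- dnorm(proposal) / dnorm(current)
  if (runif(1) < r) current <- proposal
  chain[i] <- current
}
c(mean(chain), sd(chain))   # should be close to 0 and 1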
