Answers to graded exercise 7 with brief explanations. Question numbers marked with asterisks were frequently missed. Questions? Ask!

1) A (!)
2) 76 (given in the first summary, 45+31, and in dim(PTSD))
*3) 0.5921 (the mean of the csa variable in the 2nd summary, or 45/76)
4) 0.3688162 (given in the cor(PTSD) correlation matrix, bottom left corner)
5) B (correlation between a dummy-coded categorical variable and a numeric variable)
6) B (the correlation is positive; csa=1 cases are more likely to be high on cpa)
7) 7.2458 (from the first regression, the coefficient for csa)
8) 6.2728 (from the second regression, the coefficient for csa)
9) D (any of those would have given the same result; try them and see!)
10) 10.2474 (csa=1 combines the csa coefficient with the intercept)
11) 0.5507 (the coefficient of cpa in the 2nd regression)
12) 3.9746 (csa=0 makes the csa coefficient drop out; hence, just the intercept)
13) 0.5507 (same as no. 11)
14) 0.5787 (multiple R-squared from the second regression)
*15) D (we would need to do lm(ptsd~cpa) and lm(ptsd~cpa+csa), or use aov(ptsd~cpa+csa), to get this)
*16) C (from the 2nd regression: csa 6.2728 0.8219 7.632 6.89e-11 ***)
*17) 4.6348026 (this and 18 are the third line of confint(lm.ptsd))
*18) 7.9107063
19) A (!)
20) B (interaction not significant? leave it out!)
21) C (any effect not included in the analysis becomes part of the error term)
22) A (aov and lm test different effects of the first term entered)
23) 0.24 (from the aov summary: compute SS.tot = 1856, then 450.1/1856)
*24) 0.33 (624.0/1856; how could so many people get 23 and miss this? I don't understand.)
*25) A (of course they do!)
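Most of the numeric answers above are simple arithmetic on the printed R output. A minimal sketch in Python (values copied from the answers above; the PTSD dataset itself is not needed):

```python
# Q3: proportion of csa = 1, i.e. 45 of 76 participants
prop_csa = 45 / 76
print(round(prop_csa, 4))        # 0.5921

# Q10 / Q12: predictions from the second regression at cpa = 0.
# ptsd-hat = intercept + b_csa * csa; with csa = 0 only the intercept remains.
intercept, b_csa = 3.9746, 6.2728
pred_csa1 = intercept + b_csa * 1
pred_csa0 = intercept + b_csa * 0
print(round(pred_csa1, 4))       # 10.2474  (Q10)
print(round(pred_csa0, 4))       # 3.9746   (Q12)

# Q23 / Q24: proportions of total sum of squares from the aov table
ss_tot = 1856.0
print(round(450.1 / ss_tot, 3))  # 0.243 (Q23, reported as 0.24)
print(round(624.0 / ss_tot, 3))  # 0.336 (Q24, reported as 0.33)
```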
26) A (yes, from the degrees of freedom: df.tot = 1+1+73 = 75, one less than N)
27) 8.985 (from either of the summaries in the first analysis)
*28) 24.747 (from the second analysis: SS.tot = 1856, df.tot = 75, so MS.tot = 1856/75)
*29) no (from either summary in the first analysis: Q1 and Q3, as well as Min and Max, are reasonably symmetrical around the median, and the median is very close to the mean)
30) 3.9746 (from the second regression: when all predictors are zero, the prediction is just the intercept)
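The degrees-of-freedom and mean-square answers (26 and 28) follow the same pattern; a quick sketch, again using the values quoted above:

```python
# Q26: total df = df(csa) + df(cpa) + residual df; N is one more than df.tot
df_tot = 1 + 1 + 73
n = df_tot + 1
print(df_tot, n)            # 75 76

# Q28: total mean square is total SS divided by total df
ms_tot = 1856 / df_tot
print(round(ms_tot, 3))     # 24.747
```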