Question
The data set below contains one response variable Y and four predictor variables X1, X2, X3, and X4 measured on n = 11 observations:

Obs    Y     X1    X2    X3    X4
1     51.4   0.2  17.8  24.6  18.9
2     72.0   1.9  29.4  20.7   8.0
3     53.2   0.2  17.0  18.5  22.6
4     83.2  10.7  30.2  10.6   7.1
5     57.4   6.8  15.3   8.9  27.3
6     66.5  10.6  17.6  11.1  20.8
7     98.3   9.6  35.6  10.6   5.6
8     74.8   6.3  28.2   8.8  13.1
9     92.2  10.8  34.7  11.9   5.9
10    97.9   9.6  35.8  10.8   5.5
11    88.1  10.5  29.6  11.7   7.8

(a) Fit a simple linear model of Y on one independent variable; report (1) the Y intercept and regression coefficient, and (2) test H0: β1 = 0. (10 pts)
(b) Fit a full linear model of Y on ALL Xs; report (1) the Y intercept and partial regression coefficients, (2) test if there is a significant multiple regression relationship between Y and the Xs, and (3) test H0: β3 = 0 (for X3). (15 pts)
Explanation / Answer
One can use any statistical software for this analysis; here I use R.
# Enter the data (response Y and predictors X1-X4)
y  = c(51.4, 72, 53.2, 83.2, 57.4, 66.5, 98.3, 74.8, 92.2, 97.9, 88.1)
x1 = c(0.2, 1.9, 0.2, 10.7, 6.8, 10.6, 9.6, 6.3, 10.8, 9.6, 10.5)
x2 = c(17.8, 29.4, 17, 30.2, 15.3, 17.6, 35.6, 28.2, 34.7, 35.8, 29.6)
x3 = c(24.6, 20.7, 18.5, 10.6, 8.9, 11.1, 10.6, 8.8, 11.9, 10.8, 11.7)
x4 = c(18.9, 8, 22.6, 7.1, 27.3, 20.8, 5.6, 13.1, 5.9, 5.5, 7.8)
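If preferred, the five vectors can also be gathered into one data frame for easier inspection; this step is optional and the models below are fitted from the plain vectors. A minimal sketch (the name dat is just an illustrative choice):

dat = data.frame(y, x1, x2, x3, x4)   # combine response and predictors
head(dat)                             # preview the first rows
nrow(dat)                             # n = 11 observations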
(a)
yx1 = lm(y ~ x2)    # simple linear regression of Y on X2
summary(yx1)
Call:
lm(formula = y ~ x2)
Residuals:
Min 1Q Median 3Q Max
-9.9003 -3.9828 -0.3377 4.0341 8.7507
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 21.7273 6.5659 3.309 0.0091 **
x2 2.0467 0.2383 8.589 1.25e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 6.037 on 9 degrees of freedom
Multiple R-squared: 0.8913, Adjusted R-squared: 0.8792
F-statistic: 73.76 on 1 and 9 DF, p-value: 1.25e-05
(1) From the output, the fitted Y intercept is 21.7273 and the regression coefficient of X2 is 2.0467, so the fitted line is Yhat = 21.7273 + 2.0467*X2.
(2) The p-value for the slope (1.25e-05) is less than the significance level, so we reject the null hypothesis H0: β1 = 0. Hence there is a significant linear relationship between Y and X2.
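If the intercept, slope, and test results are wanted as numbers rather than read off the printout, they can be extracted from the fitted object directly. A small sketch using the yx1 object fitted above (standard R accessors only):

coef(yx1)                    # intercept and slope for X2
summary(yx1)$coefficients    # estimates, standard errors, t values, p-values
confint(yx1, level = 0.95)   # 95% confidence intervals for both coefficients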
(b)
yf = lm(y ~ x1 + x2 + x3 + x4)    # full model: Y regressed on all four predictors
summary(yf)
Call:
lm(formula = y ~ x1 + x2 + x3 + x4)
Residuals:
Min 1Q Median 3Q Max
-3.9430 -1.5599 -0.8015 2.1704 3.4277
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) -17.4663 39.3679 -0.444 0.6728
x1 1.7712 0.6052 2.927 0.0264 *
x2 2.4361 0.7628 3.194 0.0187 *
x3 0.4263 0.5506 0.774 0.4682
x4 0.8263 0.7927 1.042 0.3374
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 3.095 on 6 degrees of freedom
Multiple R-squared: 0.9809, Adjusted R-squared: 0.9682
F-statistic: 77.22 on 4 and 6 DF, p-value: 2.728e-05
(1) From the output, the Y intercept is -17.4663 and the partial regression coefficients are 1.7712 (X1), 2.4361 (X2), 0.4263 (X3), and 0.8263 (X4).
(2) The p-value of the overall F-statistic (2.728e-05) is less than the significance level, so we reject the null hypothesis; there is a significant multiple regression relationship between Y and the set of Xs.
(3) For X3, the p-value (0.4682) is greater than the significance level, so we do not reject H0: β3 = 0. Hence X3 does not make a significant contribution to the model once the other predictors are included.
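The same conclusions can be reproduced programmatically from the yf object fitted above; this is only a convenience sketch using standard R accessors:

fs = summary(yf)$fstatistic                   # overall F value with its degrees of freedom
pf(fs[1], fs[2], fs[3], lower.tail = FALSE)   # p-value for H0: all slopes equal 0
summary(yf)$coefficients["x3", ]              # estimate, SE, t value, p-value for X3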
(c) Comparing the two fits, the full model in (b) is better than the simple model in (a): it explains more of the variation in Y, as shown by its higher R-squared (0.9809 vs 0.8913) and adjusted R-squared (0.9682 vs 0.8792).
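Since the simple model y ~ x2 is nested within the full model, the comparison in (c) can also be made with a partial F-test; a short sketch using the yx1 and yf objects fitted above:

anova(yx1, yf)               # partial F-test: do x1, x3, x4 add significantly to x2?
summary(yx1)$adj.r.squared   # adjusted R-squared, simple model
summary(yf)$adj.r.squared    # adjusted R-squared, full model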
(d)