Question
True or false and why
1. Adding another explanatory variable to a regression model will raise the error sum of squares.
2. Coefficient estimates that are precise (small standard errors of coefficients) are also unbiased (expected value equals parameter value).
3. Unlike simple regression with one regressor, in multiple regression the regressors are always independent of the error term, epsilon, because in multiple regression there is no need to exclude any explanatory variable.
4. Unlike simple regression, in multiple regression the residuals need not add to zero. For example, if the residuals add to zero in a regression of Y on X1, then they might not add to zero when another X variable is added to the model.
Explanation / Answer
1)
Answer: false
Least squares minimizes the error sum of squares (SSE), the sum of squared residuals. Adding an explanatory variable can only increase (or leave unchanged) the variance explained, and therefore can only decrease (or leave unchanged) the SSE: at worst the fitted coefficient on the new variable is zero and the fit is exactly what it was before. A new explanatory variable never increases the error sum of squares.
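A quick simulation makes this concrete. Below is a minimal sketch (NumPy, entirely made-up data and variable names) that fits the same response with and without an extra pure-noise regressor and compares the two error sums of squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 + rng.normal(size=n)   # true model uses only x1
x2 = rng.normal(size=n)                   # pure-noise regressor

def sse(X, y):
    """Error sum of squares from an OLS fit of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *X])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return resid @ resid

print("SSE with x1 only:   ", sse([x1], y))
print("SSE with x1 and x2: ", sse([x1, x2], y))  # never larger than the line above
```

Even though x2 carries no information about y, the second SSE is (slightly) smaller, because least squares can always fall back to a zero coefficient but will exploit any chance correlation.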
2)
Answer: false
Precision and unbiasedness are different properties, and neither implies the other. The standard error of a coefficient measures how much the estimate varies from sample to sample: it is roughly the standard error of the regression (the overall "noise" in the data) divided by a factor proportional to the standard deviation of X (the strength of the "signal" in X), so a small standard error means the coefficient is measured precisely, i.e., with a good signal-to-noise ratio. Unbiasedness, by contrast, means the expected value of the estimate equals the true parameter, and it requires the model to be correctly specified: the dependent variable really is a linear function of the included regressors, with errors uncorrelated with them. If that assumption fails, for example because a relevant variable correlated with X is omitted, the coefficient estimate is biased no matter how small its standard error is; the estimate is simply tightly clustered around the wrong value.
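To see precision and unbiasedness come apart, here is a minimal Monte Carlo sketch (NumPy, made-up parameters): a relevant regressor x2, correlated with x1, is omitted from the fitted model, and the estimate of x1's coefficient comes out tightly clustered (small standard error) around the wrong value:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, beta1, beta2 = 5000, 500, 1.0, 1.0
estimates = []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.8 * x1 + rng.normal(size=n)     # x2 correlated with x1
    y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1])  # x2 omitted from the fit
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    estimates.append(b[1])

estimates = np.asarray(estimates)
print("mean estimate:", estimates.mean())  # ~1.8, not the true 1.0 (biased)
print("std of estimate:", estimates.std()) # ~0.02 (very precise)
```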
3)
Answer: false
In multiple regression the regressors are not always independent of the error term epsilon. Adding regressors does not guarantee that every relevant variable is in the model: any omitted variable that is correlated with an included regressor is absorbed into the error term, making that regressor correlated with epsilon. The omitted-variable problem exists in multiple regression just as in simple regression.
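A short continuation of the omitted-variable sketch above (same made-up setup) shows this directly: the error term that the misspecified model actually faces is correlated with the included regressor:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
eps = rng.normal(size=n)
# The model that omits x2 effectively has error term beta2*x2 + eps.
effective_error = 1.0 * x2 + eps
print(np.corrcoef(x1, effective_error)[0, 1])  # ~0.49, clearly nonzero
```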
4)
Answer: false
Whenever the model includes an intercept, the least-squares residuals sum to zero, in multiple regression just as in simple regression. This follows from the first normal equation: the derivative of the SSE with respect to the intercept is -2 times the sum of the residuals, and setting it to zero forces the residuals to sum to zero no matter how many X variables are in the model.
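A minimal check (NumPy, made-up data) confirms the residuals still sum to zero, up to floating-point error, after another X variable is added:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

def residuals(X, y):
    """OLS residuals from a fit of y on X plus an intercept."""
    X = np.column_stack([np.ones(len(y)), *X])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return y - X @ beta

print(residuals([x1], y).sum())      # ~0 (up to floating point)
print(residuals([x1, x2], y).sum())  # still ~0 after adding x2
```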