Linear Regression

28 questions
Question 1 of 28

Given that the sum of cross-deviations is 153.30 and the sum of squared deviations of X is 122.64, the least-squares slope coefficient is closest to?
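A quick arithmetic check, assuming the standard least-squares slope formula, slope = Σ(Xᵢ − X̄)(Yᵢ − Ȳ) / Σ(Xᵢ − X̄)²:

```python
# Least-squares slope = sum of cross-deviations / sum of squared deviations of X
s_xy = 153.30  # sum of (Xi - Xbar)(Yi - Ybar), from the question
s_xx = 122.64  # sum of (Xi - Xbar)^2, from the question
b1 = s_xy / s_xx
print(round(b1, 2))  # 1.25
```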

Question 2 of 28

Assertion (A): In a log-log model, the slope coefficient measures the relative change in the dependent variable for a relative change in the independent variable.
Reason (R): In a lin-log model, the slope coefficient measures the absolute change in the dependent variable for a relative change in the independent variable.

Question 3 of 28

If the calculated $t$-statistic for testing a zero slope is 4.00131, the equivalent $F$-statistic is closest to?
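In simple linear regression, the F-statistic for the overall test is the square of the slope t-statistic; a quick check:

```python
t = 4.00131
f = t ** 2  # F = t^2 when the numerator has one degree of freedom
print(round(f, 2))  # 16.01
```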

Question 4 of 28

A portfolio-versus-index regression had an original $R^2$ of 0.9921. After correcting a data error, the revised $R^2$ was 0.6784. The original fit was overstated by closest to?
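The overstatement here is just the difference between the two reported R² values:

```python
r2_original = 0.9921  # R-squared before the data correction
r2_revised = 0.6784   # R-squared after the correction
overstatement = r2_original - r2_revised
print(round(overstatement, 4))  # 0.3137
```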

Question 5 of 28

Consider the following:
I. When the dependent variable is in different forms across models, raw R-squared and F-statistics are still directly comparable.
II. Random residuals are useful evidence when deciding whether a functional form fits well.
III. Selecting a functional form should rely only on the highest R-squared, without examining the standard error of the estimate or residual patterns.
How many of the above statements are most accurate?

Question 6 of 28

Assertion (A): In simple linear regression, the calculated t-statistic for testing whether the correlation is zero can differ from the calculated t-statistic for testing whether the slope is zero.
Reason (R): In simple linear regression, the t-statistic for testing a zero slope and the t-statistic for testing zero correlation are the same value.

Question 7 of 28

Consider the following:
I. The predicted value of the dependent variable is obtained by inserting the forecasted independent variable into the estimated regression equation.
II. The standard error of the forecast is minimized when the forecasted independent variable is farthest from the mean of the independent variable.
III. A prediction interval around the forecast uses only the standard error of the estimate, without using the standard error of the forecast.
How many of the above statements are most accurate?

Question 8 of 28

Assertion (A): The standard error of the estimate is an absolute measure of fit, whereas the coefficient of determination and the F-statistic are relative measures of fit.
Reason (R): The standard error of the estimate equals the square root of MSE and measures the distance between observed values of the dependent variable and the values predicted from the estimated regression.

Question 9 of 28

Assertion (A): In simple linear regression with an intercept, the fitted line is not chosen by minimizing the sum of residuals, even though the residuals sum to zero.
Reason (R): Ordinary least squares selects the intercept and slope that minimize the sum of squared residuals, and with an estimated intercept the residuals sum to zero by design.

Question 10 of 28

A regression has standard error of the estimate 1.8618987, sample size 8, mean of X equal to 7.5, and variance of X equal to 4.285714. If the forecasted X value is 15, the standard error of the forecast is closest to?
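A numeric check, assuming the usual standard-error-of-forecast formula, s_f² = s_e² · [1 + 1/n + (X_f − X̄)² / ((n − 1)·s_X²)]:

```python
import math

se = 1.8618987    # standard error of the estimate
n = 8             # sample size
x_bar = 7.5       # mean of X
var_x = 4.285714  # sample variance of X
x_f = 15          # forecasted value of X

# s_f^2 = se^2 * [1 + 1/n + (x_f - x_bar)^2 / ((n - 1) * var_x)]
s_f = se * math.sqrt(1 + 1 / n + (x_f - x_bar) ** 2 / ((n - 1) * var_x))
print(round(s_f, 4))  # ≈ 3.2249
```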

Question 11 of 28

For the Amtex regression, using $\hat{Y}_f = 0.0071$, $s_f = 0.0469$, and a 99% critical value of 2.728, the lower bound of the prediction interval is closest to?
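The lower bound is the point forecast minus the critical value times the standard error of the forecast:

```python
y_f = 0.0071    # point forecast
s_f = 0.0469    # standard error of the forecast
t_crit = 2.728  # 99% two-sided critical value given in the question
lower = y_f - t_crit * s_f
print(round(lower, 4))  # -0.1208
```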

Question 12 of 28

Consider the following:
I. In a lin-log model, the slope gives the absolute change in the dependent variable for a relative change in the independent variable.
II. In a log-lin model, the predicted value of the dependent variable is obtained by taking the antilog of the predicted logarithmic value.
III. In a log-log model, the slope gives the absolute change in the dependent variable for an absolute change in the independent variable.
How many of the above statements are most accurate?

Question 13 of 28

A regression is estimated with $n = 5$ and SST = 95.2. The sample variance of the dependent variable is closest to?
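SST is the total sum of squared deviations of the dependent variable, so the sample variance is SST/(n − 1):

```python
n = 5
sst = 95.2
var_y = sst / (n - 1)  # sample variance divides by n - 1
print(var_y)  # 23.8
```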

Question 14 of 28

Consider the following:
I. In simple linear regression, the coefficient of determination equals the square of the pairwise correlation.
II. The regression F-statistic is left-tailed because the analyst tests whether explained variation is smaller than unexplained variation.
III. If the null hypothesis that the slope equals zero is not rejected, the intercept must also equal zero.
How many of the above statements are most accurate?

Question 15 of 28

From the ANOVA table with SSR = 576.1485 and SST = 2,449.7100, the coefficient of determination is closest to?
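The coefficient of determination is the explained (regression) sum of squares over the total sum of squares:

```python
ssr = 576.1485   # regression (explained) sum of squares
sst = 2449.7100  # total sum of squares
r2 = ssr / sst
print(round(r2, 4))  # 0.2352
```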

Question 16 of 28

Consider the following:
I. Residuals that appear random when plotted against the independent variable are consistent with the linearity assumption.
II. Residuals clustering into regimes with markedly different variances support the homoskedasticity assumption.
III. Normality in simple linear regression requires the dependent and independent variables themselves to be normally distributed.
How many of the above statements are most accurate?

Question 17 of 28

Given $\bar{Y} = 12.5$, $\bar{X} = 6.1$, and $\hat{b}_1 = 1.25$, the regression intercept is closest to?
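The fitted line passes through the point of means, so the intercept is b₀ = Ȳ − b₁·X̄:

```python
y_bar = 12.5
x_bar = 6.1
b1 = 1.25
b0 = y_bar - b1 * x_bar  # intercept from the point of means
print(round(b0, 3))  # 4.875
```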

Question 18 of 28

If the mean square error from ANOVA is 2.4, the standard error of the estimate is closest to?
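The standard error of the estimate is the square root of MSE:

```python
import math

mse = 2.4
see = math.sqrt(mse)  # standard error of the estimate = sqrt(MSE)
print(round(see, 3))  # 1.549
```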

Question 19 of 28

Assertion (A): In simple linear regression, a two-sided test of whether the slope equals 1.0 uses the same degrees of freedom as a two-sided test of whether the slope equals 0.
Reason (R): To test a slope hypothesis, the analyst subtracts the hypothesized population slope from the estimated slope and divides by the standard error of the slope.

Question 20 of 28

Assertion (A): A residual plot that shows a curved pattern against the independent variable is evidence against the linearity assumption.
Reason (R): The linearity assumption requires the variance of residuals to be the same for all observations.

Question 21 of 28

A regression model has $R^2 = 0.2089$ in lin-lin form and $R^2 = 0.8491$ after a log-log transformation. The increase in $R^2$ is closest to?
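The improvement is the difference between the two R² values:

```python
r2_loglog = 0.8491  # R-squared after the log-log transformation
r2_linlin = 0.2089  # R-squared in lin-lin form
increase = r2_loglog - r2_linlin
print(round(increase, 4))  # 0.6402
```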

Question 22 of 28

A portfolio-versus-index regression had an original standard error of the estimate of 2.8619. After correcting a data error, the revised standard error was 2.0624. The standard error fell by closest to?
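The decline is the difference between the two standard errors:

```python
see_original = 2.8619  # standard error of the estimate before correction
see_revised = 2.0624   # standard error after the correction
decrease = see_original - see_revised
print(round(decrease, 4))  # 0.7995
```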

Question 23 of 28

Suppose the estimated model is $\ln Y = -7 + 2X$. If $X = 2.5$, the predicted level of $Y$ is closest to?
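A log-lin model predicts ln Y, so the level of Y comes from taking the antilog of the fitted value:

```python
import math

ln_y = -7 + 2 * 2.5  # fitted value of ln(Y) at X = 2.5
y = math.exp(ln_y)   # antilog recovers the level of Y
print(round(y, 4))  # 0.1353
```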

Question 24 of 28

Using $\hat{Y} = 16.5 - 1.3X$, the predicted net profit margin when RDR = 5 is closest to?
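Substituting RDR = 5 into the fitted equation:

```python
rdr = 5
npm = 16.5 - 1.3 * rdr  # plug RDR into the estimated regression
print(npm)  # 10.0
```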

Question 25 of 28

Consider the following:
I. In simple linear regression, the least squares criterion fits the line by minimizing the sum of squared residuals.
II. Because the intercept equals the predicted value of the dependent variable when the independent variable is zero, it is always economically meaningful.
III. The estimated slope equals the covariance of Y and X divided by the standard deviation of X.
How many of the above statements are most accurate?

Question 26 of 28

Consider the following:
I. In an ANOVA table for simple linear regression, the standard error of the estimate equals the square root of mean square error.
II. In an ANOVA table for simple linear regression, the error degrees of freedom equal n minus 1.
III. The standard error of the estimate is a relative measure of fit in the same sense as R-squared.
How many of the above statements are most accurate?

Question 27 of 28

Assertion (A): For a given regression and significance level, the prediction interval is narrowest when the forecasted value of the independent variable is closest to its sample mean.
Reason (R): The prediction interval is widest at the sample mean of the independent variable because the term involving $(X_f - \bar{X})^2$ is largest there.

Question 28 of 28

For the model $\ln(\mathrm{NPM}) = 0.5987 + 0.2951\,\mathrm{FATO}$, the predicted NPM when FATO = 2 is closest to?
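Because the dependent variable is in log form, the predicted level of NPM is the antilog of the fitted value:

```python
import math

fato = 2
ln_npm = 0.5987 + 0.2951 * fato  # fitted ln(NPM) at FATO = 2
npm = math.exp(ln_npm)           # antilog gives the predicted NPM level
print(round(npm, 2))  # ≈ 3.28
```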