R - Multiple regression - Quiz

<quiz display=simple>
{The multiple correlation (R) is: (check all that apply)
|type="[]"}
+The correlation between predicted and observed scores.
-The sum of the simple r's.
-The highest simple r.
+Always between 0 and 1 (inclusive).

{

Answer >>

R is the correlation between the predicted and observed scores when there are two or more predictors, and it is always between 0 and 1 (inclusive). The R sketch below demonstrates this.

}
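
A minimal R sketch, using made-up data (not from the quiz), of the two correct options: the multiple correlation is the simple correlation between observed and predicted scores, and it equals the non-negative square root of R2:

<syntaxhighlight lang="r">
# Hypothetical data, purely for illustration
y  <- c(10, 12, 9, 15, 14, 11, 16, 13)
x1 <- c(1, 2, 1, 4, 3, 2, 5, 3)
x2 <- c(3, 1, 2, 5, 4, 2, 4, 3)

fit <- lm(y ~ x1 + x2)

# Multiple R: the correlation between observed and predicted scores
cor(y, fitted(fit))

# It equals the square root of R-squared, so it lies between 0 and 1
sqrt(summary(fit)$r.squared)
</syntaxhighlight>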

{In multiple regression there are:
|type="[]"}
-multiple criterion variables.
+multiple predictor variables.
-two predictor variables.

{

Answer >>

Having two or more predictor variables is what distinguishes multiple regression from simple regression.

}

{The difference between a regression weight and a beta weight is:
|type="[]"}
-A regression weight assumes linearity.
-A beta weight is for the population while a regression weight is for the sample.
-A regression weight is less biased.
+A beta weight is a standardized regression weight.

{

Answer >>

A beta weight is a standardized regression weight: the weight obtained when the criterion and all of the predictors are first converted to z-scores. The R sketch below shows the equivalence.

}
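
A minimal R sketch, again with made-up data, of the relationship between the two kinds of weights:

<syntaxhighlight lang="r">
# Hypothetical data, purely for illustration
y  <- c(10, 12, 9, 15, 14, 11, 16, 13)
x1 <- c(1, 2, 1, 4, 3, 2, 5, 3)
x2 <- c(3, 1, 2, 5, 4, 2, 4, 3)

# Raw (unstandardized) regression weights
b <- coef(lm(y ~ x1 + x2))

# Beta weights: the same regression after converting everything to z-scores
coef(lm(scale(y) ~ scale(x1) + scale(x2)))

# Equivalently, rescale each raw weight by the ratio of standard deviations
b["x1"] * sd(x1) / sd(y)   # matches the beta weight for x1
b["x2"] * sd(x2) / sd(y)   # matches the beta weight for x2
</syntaxhighlight>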

{In the regression equation Y' = b1X1 + b2X2 + A, if b1 = 5, then how much would the predicted value of Y differ for two observations that had the same value of X2 but differed by 7 on X1?

|type="{}"}
{ 35 }

{

Answer >>

35. Holding X2 constant, the predicted values differ by b1 times the difference in X1: 5 × 7 = 35.

}

{The difference between a regression weight and a regression coefficient is: (check all that apply)
|type="[]"}
-The regression weight is more important.
-The regression weight is unbiased.
-The regression weight is added rather than multiplied.

{

Answer >>

There is no difference: "regression weight" and "regression coefficient" are synonyms, so none of the options apply.

}

{A regression weight is a partial slope because:
|type="[]"}
+It is the slope when the part of the predictor independent of the other predictors is used to predict the criterion.
-It is only one of several slopes, so it is only part of the prediction equation.
-It is the relationship between the significant part of a predictor and the criterion.
-It is only an estimate of the true slope and so is a partial solution.

{

Answer >>

It is the slope when the part of the predictor independent of the other predictors is used to predict the criterion; the other predictors are "partialled out." The R sketch below illustrates this.

}
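
A minimal R sketch, with made-up data, of the partialling idea: the weight for x1 in the full model equals the simple slope obtained when the part of x1 that is independent of x2 is used to predict y:

<syntaxhighlight lang="r">
# Hypothetical data, purely for illustration
y  <- c(10, 12, 9, 15, 14, 11, 16, 13)
x1 <- c(1, 2, 1, 4, 3, 2, 5, 3)
x2 <- c(3, 1, 2, 5, 4, 2, 4, 3)

# Weight for x1 in the full model
coef(lm(y ~ x1 + x2))["x1"]

# The part of x1 independent of x2 (x2 "partialled out")
x1_resid <- resid(lm(x1 ~ x2))

# Simple slope using that independent part: the same value
coef(lm(y ~ x1_resid))["x1_resid"]
</syntaxhighlight>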

{Find the value of the multiple correlation (R). You should use a computer to find the solution.

    Y   X1   X2   X3
 27.6    1    4    4
  9.4    3    5    3
 15.6    4    7    1
 20.3    5    5    4
 12.3    3    7    3
  8.7    5    3    6
  7.3    7    5    7
 14.9    6    4    8
 17.0    5    3    9
 -0.8    4    2    0

|type="{}"}
{ 0.7575 }

{

Answer >>

0.7575: the square root of the R2 from regressing Y on X1, X2, and X3. See the R sketch below.

}
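
One way to compute it in R, entering the data from the table above (0.7575 is the quiz's stated answer):

<syntaxhighlight lang="r">
# Data from the table in the question
y  <- c(27.6, 9.4, 15.6, 20.3, 12.3, 8.7, 7.3, 14.9, 17.0, -0.8)
x1 <- c(1, 3, 4, 5, 3, 5, 7, 6, 5, 4)
x2 <- c(4, 5, 7, 5, 7, 3, 5, 4, 3, 2)
x3 <- c(4, 3, 1, 4, 3, 6, 7, 8, 9, 0)

fit <- lm(y ~ x1 + x2 + x3)

# Multiple R: the square root of R-squared
sqrt(summary(fit)$r.squared)   # about 0.7575, per the quiz answer
</syntaxhighlight>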

{These are the same data as in the previous question. Find the value of b2. You should use a computer to find the solution.

    Y   X1   X2   X3
 27.6    1    4    4
  9.4    3    5    3
 15.6    4    7    1
 20.3    5    5    4
 12.3    3    7    3
  8.7    5    3    6
  7.3    7    5    7
 14.9    6    4    8
 17.0    5    3    9
 -0.8    4    2    0

|type="{}"}
{ 1.6848 }

{

Answer >>

1.6848: the estimated regression weight for X2. See the R sketch below.

}
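
The same fit in R, this time extracting b2 (1.6848 is the quiz's stated answer):

<syntaxhighlight lang="r">
# Same data as the previous question
y  <- c(27.6, 9.4, 15.6, 20.3, 12.3, 8.7, 7.3, 14.9, 17.0, -0.8)
x1 <- c(1, 3, 4, 5, 3, 5, 7, 6, 5, 4)
x2 <- c(4, 5, 7, 5, 7, 3, 5, 4, 3, 2)
x3 <- c(4, 3, 1, 4, 3, 6, 7, 8, 9, 0)

fit <- lm(y ~ x1 + x2 + x3)

# b2 is the estimated weight for x2
coef(fit)["x2"]   # about 1.6848, per the quiz answer
</syntaxhighlight>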

{The sum of squares explained is 200 and the sum of squares error is 100. What is the R2?

|type="{}"}
{ 0.667 }

{

Answer >>

0.667. R2 = SSQ explained / SSQ total = 200 / (200 + 100) ≈ 0.667.

}

{The sum of the simple r2's is typically:
|type="[]"}
-less than R2
-equal to R2
+greater than R2

{

Answer >>

Greater than R2. The predictors typically overlap in the variance they explain, so the shared variance is counted more than once in the sum of the simple r2's but only once in R2. The R sketch below shows this with two correlated predictors.

}
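
A minimal R sketch with simulated, deliberately correlated predictors; the sum of the simple r2's exceeds 1, which R2 can never do:

<syntaxhighlight lang="r">
set.seed(1)                       # made-up data, for illustration only
x1 <- rnorm(200)
x2 <- x1 + rnorm(200, sd = 0.3)   # x2 is strongly correlated with x1
y  <- x1 + x2 + rnorm(200)

# Sum of the simple r2's: the shared variance is counted twice
cor(y, x1)^2 + cor(y, x2)^2

# R2 from the multiple regression: smaller, and never above 1
summary(lm(y ~ x1 + x2))$r.squared
</syntaxhighlight>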

{Which of the following assumptions pertain to inferential statistics in multiple regression? (check all that apply)
|type="[]"}
-The predictor variables are normally distributed.
-The criterion variable is normally distributed.
+The errors of prediction (the residuals) are normally distributed.
+The variance about the regression line is the same for all predicted values.
+The predictor variables are linearly related to the criterion.

{

Answer >>

The residuals are normally distributed; the variance about the regression line is the same for all predicted values (homoscedasticity); and the predictor variables are linearly related to the criterion. The R sketch below shows basic checks of these assumptions.

}
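
A minimal R sketch, with made-up data, of basic checks for these assumptions:

<syntaxhighlight lang="r">
# Hypothetical data, purely for illustration
set.seed(2)
x1 <- rnorm(50)
x2 <- rnorm(50)
y  <- 2 + x1 + x2 + rnorm(50)

fit <- lm(y ~ x1 + x2)

# Normality of the residuals
qqnorm(resid(fit))
qqline(resid(fit))

# Homoscedasticity: residuals vs. predicted values should show no fan shape
plot(fitted(fit), resid(fit))

# R's built-in diagnostic plots cover these checks as well
plot(fit)
</syntaxhighlight>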