Hierarchical Regression Analysis



2007-12-08 | Category: Professional Supplement


In a hierarchical multiple regression, the researcher decides not only how many predictors to enter but also the order in which they enter. Usually, the order of entry is based on logical or theoretical considerations.

The following data set has three predictor variables and one criterion variable. The researcher decided on the order of entry X1, then X2, then X3.

SPSS for Windows

1. Enter Data.

2. Choose Analyze /Regression / Linear.

Dependent: Move y to the Dependent variable list by clicking the variable y and then clicking the right arrow.

Block 1 of 1

Independent(s): Move the first predictor variable x1 to the Independent(s) box, then click the Next button.

Block 2 of 2

Move the predictor variable x2 to the Independent(s) box, then click the Next button again.

Block 3 of 3

Click the predictor variable x3 and move it to the Independent(s) box.

3. Click the Statistics button. Check R squared change.

Click Continue and OK.
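The point-and-click steps above can be sketched in code. The sketch below uses a hypothetical five-case data set (the document's actual data values are not reproduced here) and fits ordinary least squares with NumPy, entering the predictors block by block the way the SPSS Next button does and printing R square at each step:

```python
import numpy as np

# Hypothetical 5-case data set: columns are x1, x2, x3 (NOT the document's values)
X = np.array([[1., 2., 1.],
              [2., 1., 3.],
              [3., 4., 2.],
              [4., 3., 5.],
              [5., 5., 4.]])
y = np.array([2., 3., 5., 6., 9.])

def r_squared(X_block, y):
    """R^2 from an OLS fit that includes an intercept column."""
    A = np.column_stack([np.ones(len(y)), X_block])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

# Enter predictors one block at a time: x1, then x1+x2, then x1+x2+x3
r2 = [r_squared(X[:, :k], y) for k in (1, 2, 3)]
for k, r in zip((1, 2, 3), r2):
    print(f"Model {k}: R square = {r:.3f}")
# R square change = difference between successive models
print([round(r2[0], 3)] + [round(b - a, 3) for a, b in zip(r2, r2[1:])])
```

Because the models are nested, R square can only stay the same or grow as each block is added; the changes printed at the end are the increments the SPSS "R squared change" output reports.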

SPSS Output

1. R Square Change

R Square and R Square Change

Order of Entry

Model 1: Enter X1

Model 1: R square = .25

The predictor X1 alone accounts for 25% of the variance in Y.

R2 = .25

Model 2: Enter X2 next

Model 2: R square = .582

The increase in R square: .582 - .25 = .332

The predictor X2 accounts for 33% of the variance in Y after controlling for X1.

R2 = .25 + .332 = .582

Model 3: Enter X3 third

Model 3: R square = .835

The increase in R square: .835 - .582 = .253

The predictor X3 accounts for 25% of the variance in Y after X1 and X2 have been partialed out of X3.

R2 = .25 + .332 + .253 = .835

About 84% of the variance in the criterion variable is explained by the first (25%), second (33%), and third (25%) predictor variables combined.
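As a quick arithmetic check, the three R-square increments reported above sum to the full-model R square:

```python
# Unique R-square contribution at each step, as reported above
increments = [0.25, 0.332, 0.253]  # X1; X2 given X1; X3 given X1, X2
total = sum(increments)
print(round(total, 3))  # → 0.835
```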

2. Adjusted R Square

Our example has only five subjects but three predictors. Recall that R square may be overestimated when a data set has few cases (n) relative to the number of predictors (k).

Data sets with a small sample size and a large number of predictors will have a greater difference between the obtained and adjusted R square (.25 vs. .000, .582 vs. .165, and .835 vs. .338).
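The adjusted values in parentheses follow the standard formula, adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1). A minimal check with the rounded R-square values above (the last digits differ slightly from the SPSS output because SPSS works with unrounded intermediates):

```python
def adjusted_r2(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

n = 5  # five subjects
for k, r2 in [(1, 0.25), (2, 0.582), (3, 0.835)]:
    print(k, round(adjusted_r2(r2, n, k), 3))
# → 1 0.0
# → 2 0.164   (SPSS: .165, from unrounded R square)
# → 3 0.34    (SPSS: .338, from unrounded R square)
```

The penalty grows as k approaches n, which is why the gap between obtained and adjusted R square is so large in this five-case example.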

3. F Change and Sig. F Change

If the R square change associated with a predictor variable in question is large, it means that the predictor variable is a good predictor of the criterion variable.

In the first step, we entered the predictor variable x1. This resulted in an R square of .25, which was not statistically significant (F Change = 1.000, p > .05). In the second step, we added x2, which increased R square by 33%; the change was not statistically significant (F Change = 1.592, p > .05). In the third step, we added x3, which increased R square by an additional 25%; again the change was not statistically significant (F Change = 1.592, p > .05).
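F Change can be recomputed from the R-square change with the standard incremental-F formula, F = (ΔR²/df_change) / ((1 − R²_full)/(n − k_full − 1)). Using the rounded values above (n = 5):

```python
def f_change(delta_r2, r2_full, n, k_full, df_change=1):
    """Incremental F for adding df_change predictors to reach the full model."""
    return (delta_r2 / df_change) / ((1 - r2_full) / (n - k_full - 1))

n = 5
print(round(f_change(0.25, 0.25, n, 1), 2))    # → 1.0, matching F Change = 1.000
print(round(f_change(0.332, 0.582, n, 2), 2))  # → 1.59, matching F Change = 1.592
```

Later steps computed from the rounded R squares can drift a little from the SPSS output, which carries unrounded values.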

4. ANOVA Table

Model 1:

About 25% (2.5/10 = .25) of the variance in the criterion variable (Y) can be accounted for by X1. The first model, which includes one predictor variable (X1), resulted in an F ratio of 1.000 with p > .05.
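The Model 1 figures can be verified directly from the ANOVA sums of squares (SS regression = 2.5, SS total = 10, n = 5, one predictor):

```python
ss_reg, ss_tot = 2.5, 10.0
ss_res = ss_tot - ss_reg           # residual sum of squares = 7.5
df_reg, df_res = 1, 5 - 1 - 1      # n = 5, one predictor
F = (ss_reg / df_reg) / (ss_res / df_res)
print(round(ss_reg / ss_tot, 2), round(F, 3))  # → 0.25 1.0
```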

Model 2