Questions such as how much of the variation in sales can be explained by advertising expenditures, prices, and the level of distribution can be answered with the statistical technique called multiple regression.
The general form of the multiple regression model is the following:
Y = β0 + β1X1 + β2X2 + … + βkXk + e
This multiple regression model is estimated using the following equation:
Ŷ = a + b1X1 + b2X2 + … + bkXk
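As a minimal sketch of how these estimates can be obtained, the following Python example fits a multiple regression by ordinary least squares using statsmodels. The variable names (advertising, price, distribution) and the simulated data are assumptions chosen to mirror the sales example above, not figures from any real study.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: sales modeled as a function of advertising,
# price, and distribution (all values simulated for illustration only).
rng = np.random.default_rng(0)
n = 100
advertising = rng.uniform(10, 50, n)
price = rng.uniform(5, 15, n)
distribution = rng.uniform(0, 1, n)
sales = (20 + 3.0 * advertising - 2.0 * price
         + 15.0 * distribution + rng.normal(0, 5, n))

# Estimate Y-hat = a + b1*X1 + b2*X2 + b3*X3 by ordinary least squares.
X = sm.add_constant(np.column_stack([advertising, price, distribution]))
results = sm.OLS(sales, X).fit()

# results.params holds the intercept a followed by b1, b2, b3.
print(results.params)
```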
Several statistics are used when conducting a multiple regression analysis.
R² in multiple regression is the coefficient of multiple determination. This coefficient measures the strength of association, that is, the proportion of variation in the dependent variable that is explained by the independent variables.
The F test in multiple regression is used to test the null hypothesis that the coefficient of multiple determination in the population is equal to zero.
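As a hedged sketch, both R² and the overall F test can be read directly from a fitted statsmodels model; the data and coefficients below are simulated assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data (illustration only): one dependent variable, three predictors.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = 1.0 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 1, 100)

results = sm.OLS(y, sm.add_constant(X)).fit()

# Coefficient of multiple determination and the overall F test of
# H0: the population coefficient of multiple determination is zero
# (equivalently, all partial regression coefficients are zero).
print(results.rsquared)   # R-squared
print(results.fvalue)     # overall F statistic
print(results.f_pvalue)   # p-value for the overall F test
```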
The partial F test in multiple regression is used to test the significance of a partial regression coefficient. This incremental F statistic is based on the increment in the explained sum of squares that results from adding an independent variable to the regression equation after all the other independent variables have been included.
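A sketch of this incremental (partial) F test, assuming simulated data and statsmodels: the reduced model omits X3, and the statistic is computed from the change in the residual (equivalently, explained) sum of squares when X3 is added.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Simulated data (illustration only): does adding X3 to a model that
# already contains X1 and X2 significantly increase the explained SS?
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
y = 1.0 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 1, 200)

full = sm.OLS(y, sm.add_constant(X)).fit()            # X1, X2, X3
reduced = sm.OLS(y, sm.add_constant(X[:, :2])).fit()  # X1, X2 only

# Incremental F with 1 and n-k-1 degrees of freedom.
numerator = (reduced.ssr - full.ssr) / 1
denominator = full.ssr / full.df_resid
partial_F = numerator / denominator
p_value = stats.f.sf(partial_F, 1, full.df_resid)
print(partial_F, p_value)
```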
The partial regression coefficient of X1 in multiple regression is denoted by b1. It represents the change in the predicted value Ŷ per unit change in X1 when the other independent variables are held constant.
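A short sketch on simulated data illustrating this interpretation: the difference in predicted values for a one-unit change in X1, with the other predictors held constant, equals b1.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data (illustration only).
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = 1.0 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 1, 100)
results = sm.OLS(y, sm.add_constant(X)).fit()

# Two prediction points that differ by one unit in X1 only,
# with X2 and X3 held constant.  Columns are [const, X1, X2, X3].
x_a = np.array([[1.0, 0.0, 0.0, 0.0]])
x_b = np.array([[1.0, 1.0, 0.0, 0.0]])
change = results.predict(x_b)[0] - results.predict(x_a)[0]

# The change in the predicted value equals b1.
print(change, results.params[1])
```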
In SPSS, multiple regression is conducted by selecting “Regression” from the “Analyze” menu and then choosing the “Linear” option. When the Linear Regression dialog box appears, the researcher enters one numeric dependent variable and two or more independent variables and then runs the multiple regression.
The following assumptions are made in multiple regression statistical analysis:
The first assumption of multiple regression involves the proper specification of the model. This assumption is important because if relevant variables are omitted from the model, the common variance they share with the included variables is wrongly attributed to those included variables, and the error term is inflated.
The second assumption is that the residual errors in multiple regression are normally distributed. In other words, the residual errors should follow a normal distribution with a mean of zero and a constant variance (a simple check of this assumption is sketched after the assumptions below).
The third assumption in multiple regression is that of unbounded data. The regression line produced by OLS (ordinary least squares) can be extrapolated in both directions, but it is meaningful only within the natural upper and lower bounds of the dependent variable.
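As a hedged sketch of checking the normality-of-residuals assumption above, the following example applies a Shapiro-Wilk test to the residuals of a model fitted to simulated data; the data and the choice of test are assumptions made for illustration, not the only way to assess normality.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Simulated data (illustration only).
rng = np.random.default_rng(4)
X = rng.normal(size=(100, 3))
y = 1.0 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 1, 100)
results = sm.OLS(y, sm.add_constant(X)).fit()

# Shapiro-Wilk test of the null hypothesis that the residuals are
# normally distributed; a small p-value suggests a violation.
stat, p_value = stats.shapiro(results.resid)
print(stat, p_value)
```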