# Simple and Multiple Regression Techniques Report (Assessment)

## Abstract

The paper carried out an analysis of bivariate correlation and simple, multiple, and stepwise regression. It is established that simple regression is similar to bivariate correlation, except that bivariate correlation does not distinguish between the dependent and independent variables. The Pearson correlation coefficient between self-esteem and negative affect is -0.569. Simple linear regression yields an intercept of 89.726 and a coefficient of -3.189. The t-test reveals that negative affect is a significant determinant of self-esteem, and the F-test shows that the overall regression line is significant. The coefficient of determination (32.3%) implies that negative affect has weak explanatory power. The results of multiple regression show that negative affect and openness are not statistically significant. The overall regression line is statistically significant, and the explanatory power of the independent variables is 60.7%. Finally, the results of multiple regression are similar to those of stepwise multiple regression, except for the significance of negative affect and openness.

## Similarities and Differences between the Models

Simple linear regression is similar to correlation in the sense that both measure the extent of linear association between two variables. The difference is that correlation analysis does not distinguish between explanatory and explained variables, whereas the main purpose of regression is to forecast the value of the dependent variable from the values of the independent variables. Simple and multiple regression models are both used to develop linear equations. The difference between the two is that simple regression has only one explanatory variable, while multiple regression has more than one (Verbeek, 2017).

## Bivariate Correlation

This statistical tool establishes whether there is a relationship between self-esteem and negative affect.

### Results

The correlation coefficients are displayed in the attached SPSS output file.

### Discussion

The results show the mean and standard deviation for self-esteem and negative affect. The value of the Pearson correlation is -0.569, which implies a moderate negative relationship between the two variables. This relationship is expected because higher levels of self-esteem are often associated with lower levels of negative emotions. The significance of the correlation coefficient is tested by comparing the significance level (0.000) with the alpha level of 0.05; since the p-value is below alpha, the correlation coefficient is statistically significant (Wooldridge, 2013).
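The correlation and its significance test described above can be sketched in a few lines of code. Since the original SPSS dataset is not reproduced here, the example below uses synthetic scores with a built-in negative relationship purely for illustration, so the numbers will differ from the reported -0.569.

```python
import numpy as np

# Synthetic stand-in data (assumption: the SPSS file is unavailable), built
# with a negative relationship between negative affect and self-esteem.
rng = np.random.default_rng(42)
negative_affect = rng.uniform(10, 40, size=200)
self_esteem = 90 - 3.2 * negative_affect + rng.normal(0, 10, size=200)

# Pearson correlation coefficient between the two variables.
r = np.corrcoef(negative_affect, self_esteem)[0, 1]

# t-statistic for testing H0: rho = 0; compare |t| with the critical value.
n = len(self_esteem)
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)

print(f"r = {r:.3f}, t = {t_stat:.3f}")
```

As in the paper's analysis, a large negative t-statistic leads to rejecting the null hypothesis of no correlation at the 5% level.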

### Strengths and Weaknesses

One advantage of the correlation coefficient is that it shows both the direction and strength of the relationship between two variables. Secondly, it is easy to use and interpret. A major weakness of this tool is that the coefficient is often affected by extreme values. Also, it assumes a linear relationship between the variables.

## Simple Linear Regression

Simple linear regression is a statistical tool that approximates a linear relationship between two variables. The equation takes the form Y = a + bX, where a is the intercept and b is the slope coefficient.

### Regression Output

The results of simple regression are displayed in the attached SPSS output file.

### Discussion

Based on the results, the regression equation is Y = 89.726 - 3.189X. The intercept of 89.726 is the predicted self-esteem score when negative affect equals zero; it also absorbs the average influence of omitted variables. The coefficient is negative, which implies a negative relationship between the two variables. Further, in simple linear regression the standardized beta coefficient equals the Pearson correlation coefficient. The absolute t-value (14.163) exceeds the critical value (1.96), which implies that negative affect is a significant determinant of self-esteem. In the ANOVA table, the calculated F-statistic is 200.578 and the significance level is 0.000, so the overall regression line is statistically significant. Further, the R-square value is 0.323, meaning that negative affect explains 32.3% of the variation in self-esteem; this indicates a weak explanatory variable. Finally, the Durbin-Watson statistic is close to 2, which signals the absence of autocorrelation in the residuals (Verbeek, 2017).
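The quantities discussed above can be computed directly from the ordinary least squares formulas. The sketch below again uses synthetic data (the SPSS file is not included), so its intercept and slope will differ from the reported 89.726 and -3.189; it also verifies the claim that the standardized beta equals the Pearson correlation and that R-square equals the squared correlation.

```python
import numpy as np

# Synthetic stand-in data (assumption: original dataset unavailable).
rng = np.random.default_rng(42)
x = rng.uniform(10, 40, size=200)               # negative affect
y = 90 - 3.2 * x + rng.normal(0, 10, size=200)  # self-esteem

# OLS closed form: b = cov(x, y) / var(x), a = mean(y) - b * mean(x).
b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
a = y.mean() - b * x.mean()

# Coefficient of determination from residual and total sums of squares.
y_hat = a + b * x
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Standardized beta: slope rescaled by the standard deviations; in simple
# regression this equals the Pearson correlation coefficient.
beta_std = b * x.std(ddof=1) / y.std(ddof=1)
r = np.corrcoef(x, y)[0, 1]

print(f"a = {a:.3f}, b = {b:.3f}, R^2 = {r_squared:.3f}")
```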

### Strengths and Limitations

A major strength of the simple regression model is that it performs well when the relationship between the variables is approximately linear. A limitation of this statistical tool is that it is often incorrectly used to model non-linear relationships. Secondly, it cannot be used for non-numerical data.

## Multiple Regression

This statistical tool develops a linear relationship between one explained variable and more than one explanatory variable. In this case, there are six explanatory variables.

### Regression Result

The results of the multiple regression analysis are displayed in the attached SPSS output file.

### Discussion

All the correlation coefficients are statistically significant apart from those for extraversion and openness. Further, there is a strong negative relationship between self-esteem and negative affect, trait anxiety, and neuroticism; the other independent variables have a positive relationship with self-esteem. Based on the results, the regression equation takes the form Y = 96.885 - 0.386X1 - 0.646X2 + 0.185X3 + 0.088X4 + 1.338X5 - 0.477X6. The signs of the coefficients show a positive association between self-esteem and extraversion, openness, and positive affect. The t-test reveals that negative affect and openness are not statistically significant.

The other four variables are statistically significant at the 5% level of significance. The ANOVA table shows that the calculated F-statistic is 106.356 and the significance level is 0.000, which implies that the overall regression line is statistically significant. Further, the R-square value is 0.607, indicating that the independent variables explain 60.7% of the variation in self-esteem and therefore have strong explanatory power. A comparison of the simple and multiple regression models shows that adding explanatory variables improves the coefficient of determination. Finally, the Durbin-Watson statistic is close to 2, which indicates the absence of autocorrelation in the residuals (Meyers, Gamst, & Guarino, 2013).
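The observation that adding predictors raises the coefficient of determination can be demonstrated with a least-squares fit on six simulated predictors, mirroring the structure of the analysis (the predictor values and coefficients below are illustrative assumptions, not the study's data). For nested OLS models, the R-square of the larger model can never be lower than that of the smaller one.

```python
import numpy as np

# Synthetic stand-in data with six predictors (assumption: dataset unavailable).
rng = np.random.default_rng(42)
n = 300
X = rng.normal(size=(n, 6))  # stand-ins for negative affect, trait anxiety,
                             # extraversion, openness, positive affect, neuroticism
true_coefs = np.array([-0.4, -0.6, 0.2, 0.1, 1.3, -0.5])
y = 96.9 + X @ true_coefs + rng.normal(0, 1.0, n)

def ols_r2(design, target):
    """Fit OLS with an intercept via least squares; return coefficients and R^2."""
    A = np.column_stack([np.ones(len(target)), design])
    coefs, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ coefs
    ss_tot = (target - target.mean()) @ (target - target.mean())
    return coefs, 1 - (resid @ resid) / ss_tot

# Simple model (first predictor only) versus the full six-predictor model.
coefs_simple, r2_simple = ols_r2(X[:, [0]], y)
coefs_multi, r2_multi = ols_r2(X, y)

print(f"R^2 simple = {r2_simple:.3f}, R^2 multiple = {r2_multi:.3f}")
```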

### Stepwise Multiple Regression

The results of the multiple regression are similar to those of the stepwise multiple regression, except for the results of the t-test: in the stepwise model, all the retained explanatory variables are statistically significant. Further, all four regression models fitted at successive steps are statistically significant.
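Stepwise selection can be sketched as a greedy loop. SPSS's stepwise procedure uses F-based entry and removal tests; the simplified forward-selection version below instead adds, at each step, the predictor that most improves R-square and stops when the gain falls below a threshold (the data, the 0.01 threshold, and the omission of a removal step are all illustrative assumptions).

```python
import numpy as np

# Synthetic stand-in data; predictor 3 is given a zero coefficient so that
# selection has something to leave out (assumption: dataset unavailable).
rng = np.random.default_rng(42)
n = 300
X = rng.normal(size=(n, 6))
y = 96.9 + X @ np.array([-0.4, -0.6, 0.2, 0.0, 1.3, -0.5]) + rng.normal(0, 1.0, n)

def r2_of(cols):
    """R^2 of an OLS fit of y on an intercept plus the listed columns of X."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coefs
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Forward selection: greedily add the predictor with the largest R^2 gain.
selected, remaining, best_r2 = [], list(range(6)), 0.0
while remaining:
    gains = {c: r2_of(selected + [c]) for c in remaining}
    best = max(gains, key=gains.get)
    if gains[best] - best_r2 < 0.01:  # entry threshold (illustrative)
        break
    selected.append(best)
    remaining.remove(best)
    best_r2 = gains[best]

print(f"selected predictors: {selected}, R^2 = {best_r2:.3f}")
```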

### Strengths and Weaknesses of Multiple Regression

A major advantage of the multiple regression model is that it makes it possible to determine the relative impact of each explanatory variable on the dependent variable. The second advantage is the ability to identify anomalies and outliers. A drawback of this tool is that it is often incorrectly used to model non-linear relationships. Secondly, it cannot be used for non-numerical data (Gujarati, 2014).

## Graphical Representation

Scatter diagrams are used to represent the data because the pattern of the dots gives information on the nature of the relationship between the variables. For instance, in the diagram for self-esteem and negative affect, the dots slope downwards, which signifies a negative association between the two variables. The scatter diagrams are displayed in the attached chart output file.

## Conclusion

Simple and multiple regressions were used to develop a linear relationship between self-esteem as the dependent variable and the other independent variables. In the case of simple regression, it is established that negative affect is statistically significant and that the overall regression line is significant; however, negative affect has weak explanatory power. In the case of multiple regression, only two variables are not statistically significant, and the overall regression line is significant. Further, there is no autocorrelation in the residuals for either model.

## References

Gujarati, D. (2014). Econometrics by example (2nd ed.). New York, NY: Macmillan Publishers Limited.

Meyers, L. S., Gamst, G. C., & Guarino, A. J. (2013). Performing data analysis using IBM SPSS (6th ed.). Hoboken, NJ: John Wiley & Sons, Inc.

Verbeek, M. (2017). A guide to modern econometrics (5th ed.). Hoboken, NJ: John Wiley & Sons, Inc.

Wooldridge, J. M. (2013). Introductory econometrics: A modern approach (5th ed.). Mason, OH: South-Western Cengage Learning.
