Difference between R square and Adjusted R square?
Naveen
R-square (the coefficient of determination) measures how well a linear regression model fits the observed data. It is defined as the proportion of variance in the dependent variable Y that the model explains: R-square = 1 - SS_res / SS_tot, where SS_res is the sum of squared residuals and SS_tot is the total sum of squares of Y around its mean. In simple linear regression it equals the square of the correlation coefficient between X and Y. On the data the model was fitted to, R-square ranges from 0 to 1: a value of 1 means the model explains all of the variation in Y, a value of 0 means it explains none of it, and larger values indicate a closer fit to the observed data.
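As a minimal sketch, here is R-square computed by hand from the SS_res / SS_tot definition and checked against scikit-learn's r2_score. The data is made up purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=2.0, size=50)  # linear trend plus noise

model = LinearRegression().fit(X, y)
y_pred = model.predict(X)

# R-square = 1 - SS_res / SS_tot
ss_res = np.sum((y - y_pred) ** 2)        # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)      # total sum of squares
r2_manual = 1 - ss_res / ss_tot

print(r2_manual)             # manual calculation
print(r2_score(y, y_pred))   # same value from scikit-learn
```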
Adjusted R-square, on the other hand, corrects R-square for the number of predictors in the model. Plain R-square never decreases when you add another predictor, even an irrelevant one, so it can overstate how well a model with many predictors will generalize. Adjusted R-square applies a penalty for each additional predictor: adjusted R-square = 1 - (1 - R-square) * (n - 1) / (n - p - 1), where n is the number of observations and p is the number of predictors. It increases only when a new predictor improves the fit more than would be expected by chance, and it can be negative for a very poor model. A perfect model would still have an adjusted R-square of 1. When comparing two regression models with different numbers of predictors on the same data, the one with the higher adjusted R-square is generally the better fit (see the sketch below).
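The sketch below illustrates the point under assumed, made-up data: adding a pure-noise predictor never lowers R-square, but it can lower adjusted R-square. The helper function adjusted_r2 is hypothetical, written here just to apply the (n - 1) / (n - p - 1) penalty; scikit-learn is assumed for the regression fit.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def adjusted_r2(r2, n, p):
    """Adjusted R-square for n observations and p predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

rng = np.random.default_rng(1)
n = 50
x_useful = rng.uniform(0, 10, n)
x_noise = rng.normal(size=n)                 # pure-noise predictor
y = 2.0 * x_useful + 1.0 + rng.normal(scale=2.0, size=n)

for X in (x_useful.reshape(-1, 1),                    # 1 predictor
          np.column_stack([x_useful, x_noise])):      # 2 predictors (second is noise)
    model = LinearRegression().fit(X, y)
    r2 = model.score(X, y)                            # R-square on the training data
    print(X.shape[1], "predictor(s):",
          "R2 =", round(r2, 4),
          "adjusted R2 =", round(adjusted_r2(r2, n, X.shape[1]), 4))
```

Running this, R-square ticks up slightly when the noise column is added, while adjusted R-square typically goes down, which is exactly the behavior that makes adjusted R-square the better criterion for comparing models with different numbers of predictors.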