| Aspect | MSE (Mean Squared Error) | R-squared (R²) |
| --- | --- | --- |
| Definition | The average of the squared differences between the predicted values and the true values. | The proportion of the variance in the dependent variable explained by the independent variables. |
| Range of Values | Any non-negative value; 0 indicates a perfect fit. | Typically between 0 and 1; it can be negative when the model fits worse than simply predicting the mean. |
| Interpretation | Lower MSE indicates better model performance, as it reflects the average squared prediction error. | Higher R-squared indicates better model performance, as the model explains a larger proportion of the variance in the data. |
| Sensitivity to Outliers | Highly sensitive to outliers, because squaring the differences amplifies the impact of large errors. | Less sensitive to outliers in interpretation, since it reports the proportion of variance explained rather than an absolute error magnitude. |
| Relationship to Data | Directly measures prediction accuracy by quantifying the average squared deviation between predicted and actual values. | Indirectly measures goodness of fit by assessing how well the model explains the variability in the data. |
| Combination of Errors | Collapses all sources of error, systematic and random, into a single metric. | Partitions the total variance into an explained component and an unexplained (residual) component. |
| Usage | Useful for comparing different models or tuning hyperparameters, as it provides a single numerical measure of prediction error. | Widely used for model evaluation and selection, as it indicates the model's explanatory power. |
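
The two definitions in the table correspond to the standard formulas

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2, \qquad R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}.$$

To make the comparison concrete, here is a minimal sketch that computes both metrics from scratch with NumPy. The function names and the toy values are illustrative assumptions, not part of the comparison above; in practice, `sklearn.metrics.mean_squared_error` and `sklearn.metrics.r2_score` give equivalent results.

```python
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean squared error: average of squared differences between predictions and truth."""
    return float(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination: 1 - (residual sum of squares / total sum of squares)."""
    ss_res = np.sum((y_true - y_pred) ** 2)           # unexplained (residual) variation
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)  # total variation around the mean
    return float(1.0 - ss_res / ss_tot)

# Toy data, for illustration only.
y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.1, 7.4, 8.7])

print(f"MSE: {mse(y_true, y_pred):.4f}")              # average squared prediction error
print(f"R-squared: {r_squared(y_true, y_pred):.4f}")  # proportion of variance explained
```

Note how the two numbers answer different questions: MSE reports error in the squared units of the target, while R² is unitless and expresses how much of the target's variance the model accounts for.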