Parametric Fitting

If you increase the number of fitted coefficients in your model, R-square might
increase although the fit may not improve. To avoid this situation, you should
use the degrees of freedom adjusted R-square statistic described below.
Note that it is possible to get a negative R-square for equations that do not
contain a constant term. Because R-square is defined as the proportion of
variance explained by the fit, a fit that is actually worse than just fitting a
horizontal line yields a negative R-square. In this case, R-square cannot be
interpreted as the square of a correlation.
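To see how this can happen, consider the short Python sketch below. It is illustrative only and not part of the toolbox; the data and the no-constant model y = b*x are hypothetical. A straight line forced through the origin fits the flat data worse than a horizontal line at the mean, so R-square = 1 - SSE/SST comes out negative:

```python
import numpy as np

# Hypothetical data: essentially flat response far from the origin,
# so a line through the origin fits worse than the horizontal mean line.
x = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
y = np.array([5.0, 4.8, 5.2, 4.9, 5.1])

# Least-squares slope for the no-constant model y = b*x
b = np.dot(x, y) / np.dot(x, x)
residuals = y - b * x

sse = np.sum(residuals**2)           # sum of squared errors of the fit
sst = np.sum((y - y.mean())**2)      # total sum of squares about the mean
r_square = 1 - sse / sst

print(f"R-square = {r_square:.3f}")  # negative: fit is worse than the mean line
```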
Degrees of Freedom Adjusted R-Square. This statistic uses the R-square statistic
defined above, and adjusts it based on the residual degrees of freedom. The
residual degrees of freedom is defined as the number of response values n
minus the number of fitted coefficients m estimated from the response values:

$$v = n - m$$

v indicates the number of independent pieces of information involving the n
data points that are required to calculate the sum of squares. Note that if
parameters are bounded and one or more of the estimates are at their bounds,
then those estimates are regarded as fixed. The degrees of freedom is increased
by the number of such parameters.
The adjusted R-square statistic is generally the best indicator of the fit quality
when you add additional coefficients to your model:

$$\textrm{adjusted R-square} = 1 - \frac{\mathrm{SSE}(n-1)}{\mathrm{SST}(v-1)}$$

The adjusted R-square statistic can take on any value less than or equal to 1,
with a value closer to 1 indicating a better fit.
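To make the formula concrete, the following Python sketch (not from the toolbox; the SSE, SST, and count values are hypothetical placeholders) evaluates the degrees of freedom adjusted R-square exactly as defined above:

```python
def adjusted_r_square(sse, sst, n, m):
    """Degrees of freedom adjusted R-square, where v = n - m is the
    residual degrees of freedom (n response values, m fitted coefficients)."""
    v = n - m
    return 1 - (sse * (n - 1)) / (sst * (v - 1))

# Hypothetical example: 10 data points, 2 fitted coefficients
print(adjusted_r_square(sse=2.0, sst=20.0, n=10, m=2))  # approx. 0.871
```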
Root Mean Squared Error. This statistic is also known as the fit standard error
and the standard error of the regression:

$$\mathrm{RMSE} = s = \sqrt{\mathrm{MSE}}$$

where MSE is the mean square error or the residual mean square:

$$\mathrm{MSE} = \frac{\mathrm{SSE}}{v}$$

An RMSE value closer to 0 indicates a better fit.
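Continuing in the same hypothetical Python setting, this sketch computes RMSE from a vector of residuals and the number of fitted coefficients; the residual values are invented for illustration:

```python
import numpy as np

def rmse(residuals, m):
    """Root mean squared error: sqrt(SSE / v), where v = n - m is the
    residual degrees of freedom."""
    n = residuals.size
    v = n - m
    sse = np.sum(residuals**2)
    mse = sse / v                 # residual mean square
    return np.sqrt(mse)

# Hypothetical residuals from a fit with 2 coefficients
r = np.array([0.3, -0.1, 0.2, -0.4, 0.1])
print(rmse(r, m=2))               # approx. 0.321
```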