Goodness of a regression without an intercept term



kalbracht47
10-19-2007, 08:49 PM
Dear Biomech-L readers,

I have some problems fitting a linear regression to my measured data, and
especially with determining the goodness of the fit. The regressions we
need have two substantial differences from a simple linear regression:

1) Due to physical considerations, the regression model has no intercept
term, i.e. it passes through the origin (y = b*x)
2) Both variables are subject to measurement error (a small simulated
example follows this list)
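
To make this concrete, here is a minimal simulated sketch (assuming NumPy;
the slope, sample size and noise levels are placeholders, not our data):

import numpy as np

# Simulated example of the situation described above: the true relation
# passes through the origin (y = b*x) and *both* variables are observed
# with measurement error.
rng = np.random.default_rng(0)
b_true = 2.5                                              # placeholder slope
x_true = rng.uniform(1.0, 10.0, size=50)                  # placeholder design
x_obs = x_true + rng.normal(0.0, 0.3, size=50)            # error in x
y_obs = b_true * x_true + rng.normal(0.0, 0.8, size=50)   # error in y

# Ordinary least-squares slope through the origin (vertical distances only):
b_ols = np.sum(x_obs * y_obs) / np.sum(x_obs * x_obs)

The ordinary least-squares slope only accounts for vertical deviations,
which is why we consider orthogonal distances below.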

My question is: how can I calculate a 'valid' R^2 and the standard
error of such a regression?

In the literature (Casella, G. & Berger, R. L. 2002: Statistical
Inference, Duxbury, pp 581-583) we found that, because both variables
are subject to error, the orthogonal least-squares distance is used
instead of the ordinary (vertical) least-squares distance to fit the
regression. In addition, we found that the calculation of R^2 differs
for the no-intercept model compared with the commonly used intercept
model (Hahn, G. J. 1977: Journal of Quality Technology 9(2), pp 56-61;
Eisenhauer, J. G. 2003: Teaching Statistics 25(3), pp 76-80). For the
intercept model, R^2 is the proportion of the initial variation, as
measured by the sum of squares around the mean of Y, that is accounted
for by the regression. For the no-intercept model, however, the
variation around the fitted regression could exceed the variation
around the mean, resulting in a negative value of R^2. Therefore, for
the no-intercept model it is recommended to calculate R^2 relative to
the variation around the origin.
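
To illustrate these two points, here is a minimal NumPy sketch (for the
mechanics only; it is not taken from the cited references): the first
function fits y = b*x by minimizing the squared perpendicular distances
to a line through the origin, and the other two compare R^2 computed
about the origin, as Hahn and Eisenhauer recommend for no-intercept
models, with the conventional R^2 about the mean, which can indeed
become negative here.

import numpy as np

def orthogonal_slope_through_origin(x, y):
    # Slope b of y = b*x that minimizes the sum of squared *perpendicular*
    # distances; equivalently, the direction of the leading eigenvector of
    # the second-moment matrix of (x, y) about the origin.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
    # Minimizing root of  sxy*b^2 + (sxx - syy)*b - sxy = 0
    return ((syy - sxx) + np.hypot(syy - sxx, 2.0 * sxy)) / (2.0 * sxy)

def r_squared_about_origin(x, y, b):
    # R^2 measured against the variation of y about the *origin*
    # (uncorrected total sum of squares), as suggested for no-intercept models.
    ss_res = np.sum((y - b * x) ** 2)
    return 1.0 - ss_res / np.sum(y ** 2)

def r_squared_about_mean(x, y, b):
    # Conventional R^2 about the mean of y; for a no-intercept fit this can
    # become negative, which is the problem described above.
    ss_res = np.sum((y - b * x) ** 2)
    return 1.0 - ss_res / np.sum((y - np.mean(y)) ** 2)

Note that r_squared_about_origin uses the vertical residuals y - b*x even
though the slope was fitted with orthogonal distances; whether that
combination is meaningful is exactly what I am unsure about (see below).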

However, I am wondering whether I can also apply this calculation when I
use orthogonal least-squares distances to fit the regression.
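
For the fitting itself (not for the R^2 question), scipy.odr implements
orthogonal distance regression; it can be restricted to the no-intercept
model, reports a standard error for the slope, and accepts measurement
uncertainties for both variables via sx and sy. A rough sketch, with a
placeholder starting value:

import numpy as np
from scipy.odr import Model, RealData, ODR

def odr_slope_through_origin(x, y, sx=None, sy=None):
    # Orthogonal distance regression of the no-intercept model y = b*x.
    # sx, sy (optional) are the measurement standard deviations of x and y.
    model = Model(lambda beta, x: beta[0] * x)   # single parameter: the slope
    data = RealData(x, y, sx=sx, sy=sy)
    out = ODR(data, model, beta0=[1.0]).run()
    return out.beta[0], out.sd_beta[0]           # slope and its standard error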

Any help would be greatly appreciated

Kirsten Albracht

--
Kirsten Albracht
Institute for Biomechanics and Orthopaedics
German Sport University Cologne
Carl Diem Weg 6
50933 Cologne

Email: albracht@dshs-koeln.de
Tel.: +49 221 4982-5680
Fax.: +49 221 4971598