Dear Biomch-L readers,
With the apparent popularity of the GCVSPL FORTRAN package on the Biomch-L
fileserver and in the GCV sublibrary of netlib@research.att.com and its
companion servers elsewhere, it is a real pleasure to cross-post Grace
Wahba's book advert below from the Neuron Digest.
Enjoy! -- hjw
- = - = - = - = - = - = - = - = - = - = - = - = - = - = - = - = - = - = -
Subject: Book Advert-CV,GCV, et al
From: Grace Wahba
Date: Wed, 27 May 92 20:43:00 -0600
Sender: Neuron Digest Vol. 9, Nr. 24 (3 Jun 92)
BOOK ADVERT - CV, GCV, DF SIGNAL, The BIAS-VARIANCE TRADEOFF
AND ALL THAT ....
Spline Models for Observational Data by G. Wahba, Vol. 59 in the SIAM
CBMS-NSF Regional Conference Series in Applied Mathematics
Although this book is written in the language of statistics, it covers a
number of topics that are increasingly recognized as important to the
computational learning community. It is well known that models such as
neural nets, radial basis functions, splines, and other Bayesian models
that are adapted to fit the data very well may in fact overfit the
data, leading to large generalization error. In particular, minimizing
generalization error, a.k.a. the bias-variance tradeoff, is discussed in
the context of smooth multivariate function estimation with noisy data.
Here, reducing the bias (fitting the data well) increases the variance
(a proxy for the generalization error) and vice versa. Included is an
in-depth discussion of ordinary cross validation, generalized cross
validation, and unbiased risk as criteria for optimizing the bias-
variance tradeoff. The role of "degrees of freedom for signal", as well as
the relationships between Bayes estimation, regularization, optimization
in (reproducing kernel) Hilbert spaces, splines, and certain radial basis
functions, are covered, along with a discussion of the relationship
between generalized cross validation and maximum likelihood estimates of
the main parameter(s) controlling the bias-variance tradeoff, both in the
context of a well-known prior for the unknown smooth function and in
the general context of (smooth) regularization.
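
[Editorial note: the GCV criterion mentioned above is easy to compute for any
linear smoother with influence ("hat") matrix A(lambda), namely
GCV(lambda) = n * ||(I - A(lambda)) y||^2 / (n - tr A(lambda))^2, where
tr A(lambda) is the "degrees of freedom for signal". The following is a
minimal Python/NumPy sketch, not taken from the book or from GCVSPL, using a
ridge-penalized polynomial basis as a stand-in for a spline basis.]

    # Sketch: pick a smoothing parameter by generalized cross validation (GCV).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    x = np.linspace(0.0, 1.0, n)
    y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)  # noisy data

    # Polynomial basis with a penalty on the higher-order coefficients
    # (a crude surrogate for a spline roughness penalty).
    X = np.vander(x, 10, increasing=True)
    P = np.eye(X.shape[1])
    P[:2, :2] = 0.0                      # leave constant and linear terms unpenalized

    def gcv_score(lam):
        # Influence matrix A = X (X'X + lam P)^{-1} X'
        A = X @ np.linalg.solve(X.T @ X + lam * P, X.T)
        resid = y - A @ y
        df_signal = np.trace(A)          # "degrees of freedom for signal"
        return n * (resid @ resid) / (n - df_signal) ** 2

    lams = np.logspace(-8, 2, 60)
    best = lams[np.argmin([gcv_score(l) for l in lams])]
    print("GCV-selected lambda:", best)

[The chosen lambda balances fit (bias) against effective model complexity
(variance); GCVSPL performs the analogous computation for smoothing splines.]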
..................
Spline Models for Observational Data, by Grace Wahba, Vol. 59 in the
CBMS-NSF Regional Conference Series in Applied Mathematics, SIAM,
Philadelphia, PA, March 1990. Softcover, 169 pages, bibliography,
author index. ISBN 0-89871-244-0
List Price US $24.75, SIAM or CBMS* Member Price $19.80
(Domestic 4th class postage free, UPS or Air extra)
May be ordered from SIAM by mail, electronic mail, or phone:
o e-mail (internet) service@siam.org
o SIAM P. O. Box 7260 Philadelphia, PA 19101-7260 USA
o Toll-Free 1-800-447-7426 (8:30-4:45 Eastern Standard Time, USA)
Regular phone: +1(215)382-9800, FAX +1(215)386-7999
May be ordered on American Express, Visa or Mastercard, or paid by check
or money order in US dollars, or may be billed (extra charge).
* CBMS member organizations include AMATC, AMS, ASA, ASL, ASSM, IMS,
MAA, NAM, NCSM, ORSA, SOA and TIMS.