Fast exact leave-one-out cross-validation of sparse least-squares support vector machines.
Neural Netw; 17(10): 1467-75, 2004 Dec.
Article in English | MEDLINE | ID: mdl-15541948
Leave-one-out cross-validation has been shown to give an almost unbiased estimator of the generalisation properties of statistical models, and therefore provides a sensible criterion for model selection and comparison. In this paper we show that exact leave-one-out cross-validation of sparse Least-Squares Support Vector Machines (LS-SVMs) can be implemented with a computational complexity of only O(l n²) floating point operations, rather than the O(l² n²) operations of a naïve implementation, where l is the number of training patterns and n is the number of basis vectors. As a result, leave-one-out cross-validation becomes a practical proposition for model selection in large scale applications. For clarity the exposition concentrates on sparse least-squares support vector machines in the context of non-linear regression, but is equally applicable in a pattern recognition setting.
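The sketch below (not the authors' code) illustrates how such an exact leave-one-out estimate can be obtained in O(l n²) time for a reduced-rank kernel least-squares model: the leave-one-out residuals follow from the diagonal of the hat matrix, so no explicit refitting over l training patterns is needed. The RBF kernel, random basis selection and regularisation parameter here are illustrative assumptions only; the paper's sparse LS-SVM formulation may differ in detail (e.g. treatment of the bias term).

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_residuals_sparse_ls(X, y, basis, lam=1e-2, gamma=1.0):
    """Exact leave-one-out residuals for a sparse least-squares kernel model.

    Hypothetical sketch: the model is treated as reduced-rank kernel ridge
    regression on n basis vectors, and the linear-smoother identity
    e_loo_i = (y_i - yhat_i) / (1 - h_ii) is used, where h_ii is the i-th
    diagonal entry of the hat matrix.  Total cost is O(l n^2) plus an
    O(n^3) factorisation, rather than l separate refits.
    """
    Phi = rbf_kernel(X, basis, gamma)              # l x n design matrix
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])   # n x n regularised normal matrix
    c = np.linalg.solve(A, Phi.T @ y)              # coefficients of the fitted model
    yhat = Phi @ c                                 # in-sample predictions
    # Diagonal of the hat matrix H = Phi A^{-1} Phi^T, computed without
    # ever forming the full l x l matrix.
    h = np.einsum('ij,ij->i', Phi @ np.linalg.inv(A), Phi)
    return (y - yhat) / (1.0 - h)

# Usage: leave-one-out mean squared error as a model-selection criterion.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
basis = X[rng.choice(200, size=20, replace=False)]   # n = 20 basis vectors
e = loo_residuals_sparse_ls(X, y, basis, lam=1e-2, gamma=0.5)
print("LOO MSE:", np.mean(e ** 2))
```

The leave-one-out MSE computed this way can then be minimised over the kernel and regularisation parameters, which is the model-selection use case motivating the paper's complexity result.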
Collections: 01-internacional
Database: MEDLINE
Main subject: Algorithms / Least-Squares Analysis / Models, Statistical / Neural Networks, Computer
Study type: Prognostic studies / Risk factors studies
Language: En
Publication year: 2004
Document type: Article