Fast exact leave-one-out cross-validation of sparse least-squares support vector machines

Neural Netw. 2004 Dec;17(10):1467-75. doi: 10.1016/j.neunet.2004.07.002.

Abstract

Leave-one-out cross-validation has been shown to give an almost unbiased estimator of the generalisation properties of statistical models, and therefore provides a sensible criterion for model selection and comparison. In this paper we show that exact leave-one-out cross-validation of sparse Least-Squares Support Vector Machines (LS-SVMs) can be implemented with a computational complexity of only O(ln²) floating point operations, rather than the O(l²n²) operations of a naïve implementation, where l is the number of training patterns and n is the number of basis vectors. As a result, leave-one-out cross-validation becomes a practical proposition for model selection in large scale applications. For clarity the exposition concentrates on sparse least-squares support vector machines in the context of non-linear regression, but the approach is equally applicable in a pattern recognition setting.
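To illustrate how an O(ln²) leave-one-out estimate can arise for a model of this kind, the sketch below uses the standard virtual leave-one-out (PRESS) identity for regularised least-squares models that are linear in their parameters: the exact LOO residual is r_i / (1 − h_ii), where h_ii is the i-th diagonal element of the hat matrix. This is a minimal illustration in that spirit, not the paper's algorithm: it assumes a fixed set of basis vectors, omits the bias term, and uses a plain ridge penalty rather than the LS-SVM's kernel-weighted penalty; the names rbf_kernel and sparse_ls_svm_loo and all parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and rows of Z."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * d2)

def sparse_ls_svm_loo(X, y, basis_idx, lam=1e-2, gamma=1.0):
    """Fit a sparse kernel least-squares model on n fixed basis vectors
    and return exact LOO residuals via the PRESS identity
    r_i^(-i) = r_i / (1 - h_ii).

    Cost: the n x n normal matrix and the leverages each take O(l n^2)
    operations, versus O(l^2 n^2) for refitting the model l times.
    """
    K = rbf_kernel(X, X[basis_idx], gamma)   # l x n design matrix
    A = K.T @ K + lam * np.eye(K.shape[1])   # regularised normal matrix
    L = np.linalg.cholesky(A)                # factorise once: O(n^3)
    # Solve A alpha = K^T y via the two triangular systems.
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, K.T @ y))
    resid = y - K @ alpha                    # training residuals
    # Leverages h_ii = k_i^T A^{-1} k_i for all i in O(l n^2):
    # with V = L^{-1} K^T, h_ii is the squared norm of column i of V.
    V = np.linalg.solve(L, K.T)
    h = np.sum(V**2, axis=0)
    return resid / (1.0 - h)                 # exact LOO residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(200)
    basis = rng.choice(200, size=20, replace=False)
    loo = sparse_ls_svm_loo(X, y, basis, lam=1e-2, gamma=0.5)
    print("LOO mean squared error:", np.mean(loo**2))
```

The mean of the squared LOO residuals is the almost unbiased generalisation estimate the abstract refers to, and can be minimised over hyperparameters such as lam and gamma during model selection. A subtlety this sketch ignores is the case where the left-out pattern is itself a basis vector, which a full treatment must handle separately.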

Publication types

  • Research Support, Non-U.S. Gov't
  • Validation Study

MeSH terms

  • Algorithms*
  • Least-Squares Analysis*
  • Mathematics
  • Models, Statistical*
  • Neural Networks, Computer*
  • Nonlinear Dynamics
  • Pattern Recognition, Automated / methods