Nested support vector machines

Gyemin Lee, Clayton Scott

Research output: Contribution to journal › Article › peer-review

34 Scopus citations

Abstract

One-class and cost-sensitive support vector machines (SVMs) are state-of-the-art machine learning methods for estimating density level sets and solving weighted classification problems, respectively. However, the solutions of these SVMs do not necessarily produce set estimates that are nested as the parameters controlling the density level or cost-asymmetry are continuously varied. Such nesting not only reflects the true sets being estimated, but is also desirable for applications requiring the simultaneous estimation of multiple sets, including clustering, anomaly detection, and ranking. We propose new quadratic programs whose solutions give rise to nested versions of one-class and cost-sensitive SVMs. Furthermore, like conventional SVMs, the solution paths in our construction are piecewise linear in the control parameters, although here the number of breakpoints is directly controlled by the user. We also describe decomposition algorithms to solve the quadratic programs. These methods are compared to conventional (non-nested) SVMs on synthetic and benchmark data sets, and are shown to exhibit more stable rankings and decreased sensitivity to parameter settings.
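The abstract's starting observation can be illustrated with a minimal sketch: fitting scikit-learn's standard (non-nested) `OneClassSVM` independently at two values of the density-level parameter `nu` and checking, on a grid, whether the resulting set estimates are nested. This uses the conventional one-class SVM, not the nested quadratic programs the paper proposes, and the data and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))  # training sample from a 2-D standard Gaussian

# Fit conventional one-class SVMs at two density levels.
# nu upper-bounds the fraction of training points left outside the estimated set,
# so the high-nu ("tight") estimate should be a smaller set than the low-nu ("loose") one.
loose = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X)
tight = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.5).fit(X)

# Evaluate both set estimates on a grid covering the data.
xx, yy = np.meshgrid(np.linspace(-3, 3, 80), np.linspace(-3, 3, 80))
grid = np.c_[xx.ravel(), yy.ravel()]
in_loose = loose.decision_function(grid) >= 0
in_tight = tight.decision_function(grid) >= 0

# If the estimates were nested, every grid point in the tight set would also
# lie in the loose set; each counted point below is a nesting violation.
violations = int(np.sum(in_tight & ~in_loose))
print("grid points in loose set:", int(in_loose.sum()))
print("grid points in tight set:", int(in_tight.sum()))
print("nesting violations      :", violations)
```

Because the two problems are solved independently, nothing in the conventional formulation forces `violations` to be zero; the paper's nested quadratic programs enforce this containment by construction across the whole parameter path.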

Original language: English
Pages (from-to): 1648-1660
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 58
Issue number: 3, Part 2
State: Published - Mar 2010

Keywords

  • Cost-sensitive support vector machine (CS-SVM)
  • Machine learning
  • Nested set estimation
  • One-class support vector machine (OC-SVM)
  • Pattern classification
  • Solution paths
