Abstract
One-class and cost-sensitive support vector machines (SVMs) are state-of-the-art machine learning methods for estimating density level sets and solving weighted classification problems, respectively. However, the solutions of these SVMs do not necessarily produce set estimates that are nested as the parameters controlling the density level or cost-asymmetry are continuously varied. Such nesting not only reflects the true sets being estimated, but is also desirable for applications requiring the simultaneous estimation of multiple sets, including clustering, anomaly detection, and ranking. We propose new quadratic programs whose solutions give rise to nested versions of one-class and cost-sensitive SVMs. Furthermore, like conventional SVMs, the solution paths in our construction are piecewise linear in the control parameters, although here the number of breakpoints is directly controlled by the user. We also describe decomposition algorithms to solve the quadratic programs. These methods are compared to conventional (non-nested) SVMs on synthetic and benchmark data sets, and are shown to exhibit more stable rankings and decreased sensitivity to parameter settings.
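To make the nesting property concrete, here is a minimal pure-Python sketch (not the paper's quadratic-programming method): when density level sets are obtained by thresholding a single fixed density estimate, nesting holds by construction, since {x : f(x) ≥ t₂} ⊆ {x : f(x) ≥ t₁} whenever t₂ ≥ t₁. The paper's contribution is to enforce this same monotone structure on one-class and cost-sensitive SVM set estimates, where it does not hold automatically. All function names, sample values, and levels below are illustrative assumptions.

```python
import math

def kde(x, samples, bandwidth=0.5):
    """Gaussian kernel density estimate at point x (1-D, pure Python)."""
    h = bandwidth
    return sum(
        math.exp(-((x - s) / h) ** 2 / 2) / (h * math.sqrt(2 * math.pi))
        for s in samples
    ) / len(samples)

def level_set(samples, grid, t):
    """Grid points where the estimated density meets or exceeds level t."""
    return {x for x in grid if kde(x, samples) >= t}

# Toy 1-D sample with two modes (illustrative data, not from the paper).
samples = [-1.0, -0.8, 0.0, 0.9, 1.1, 1.2]
grid = [i / 10 for i in range(-30, 31)]

low = level_set(samples, grid, 0.10)   # lower density level -> larger set
high = level_set(samples, grid, 0.25)  # higher density level -> smaller set

# Thresholding one density estimate yields nested sets by construction.
assert high <= low
```

A family of one-class SVMs trained independently at different parameter values offers no such guarantee, which is precisely the inconsistency the nested formulation removes.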
Original language | English
---|---
Pages (from-to) | 1648-1660
Number of pages | 13
Journal | IEEE Transactions on Signal Processing
Volume | 58
Issue number | 3, Part 2
DOIs | 
State | Published - Mar 2010
Keywords
- Cost-sensitive support vector machine (CS-SVM)
- Machine learning
- Nested set estimation
- One-class support vector machine (OC-SVM)
- Pattern classification
- Solution paths