The core of the support vector machine (SVM) problem is a quadratic programming problem with a linear constraint and bounded variables. This problem can be transformed into a second-order cone programming (SOCP) problem. An interior-point method (IPM) can be designed for the SOCP problems that is efficient in both storage requirements and computational complexity when the kernel matrix has low rank. If the kernel matrix does not have low rank, it can be approximated by a low-rank positive semi-definite matrix, which in turn is fed to the optimizer. In this paper we present two SOCP formulations for each of the SVM classification and regression problems. Several search-direction methods exist for implementing SOCPs; our main goal is to find the better search direction for implementing the SOCP formulations of the SVM problems. Two popular search-direction methods, HKM and AHO, are analyzed for the SVM problems and implemented efficiently. The per-iteration computational costs of the HKM and AHO search-direction methods are shown to be the same for the SVM problems, so the training time depends on the number of IPM iterations. Our experimental results show that the HKM method converges faster than the AHO method. We also compare our results with the method proposed in Fine and Scheinberg (2001), which also exploits the low rank of the kernel matrix, and with the state-of-the-art SVM optimization software *SVMTorch* and SVM^{light}. The proposed methods are also compared with Joachims' linear SVM method on the linear kernel.
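As context for the low-rank approximation step the abstract describes, the following is a minimal illustrative sketch (not the authors' algorithm) of replacing a full-rank kernel matrix with a low-rank positive semi-definite approximation via truncated eigendecomposition, using NumPy; the rank `r` and the RBF kernel are arbitrary choices for the example:

```python
import numpy as np

def low_rank_psd_approx(K, r):
    """Return a rank-(at most r) PSD approximation of a symmetric kernel matrix K.

    Keeps the r largest eigenvalues (clipped at zero to preserve positive
    semi-definiteness) and reconstructs K ~= V_r diag(w_r) V_r^T.
    """
    w, V = np.linalg.eigh(K)            # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:r]       # indices of the r largest eigenvalues
    w_r = np.clip(w[idx], 0.0, None)    # drop any small negative noise
    V_r = V[:, idx]
    return V_r @ np.diag(w_r) @ V_r.T

# Example: RBF kernel on random points, approximated at rank 5
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
K = np.exp(-0.5 * sq)                                  # RBF (Gaussian) kernel matrix
K5 = low_rank_psd_approx(K, 5)
print(np.linalg.matrix_rank(K5))       # at most 5
```

In the paper's setting the point of such an approximation is that an IPM exploiting the factored form needs storage and per-iteration work that scale with the rank rather than with the full kernel dimension.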

- Publication
- IEICE TRANSACTIONS on Fundamentals Vol.E92-A No.4 pp.1209-1222

- Publication Date
- 2009/04/01

- Publicized

- Online ISSN
- 1745-1337

- DOI
- 10.1587/transfun.E92.A.1209

- Type of Manuscript
- PAPER

- Category
- Neural Networks and Bioengineering

The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.

Rameswar DEBNATH, Masakazu MURAMATSU, Haruhisa TAKAHASHI, "Implementation Issues of Second-Order Cone Programming Approaches for Support Vector Machine Learning Problems" in IEICE TRANSACTIONS on Fundamentals,
vol. E92-A, no. 4, pp. 1209-1222, April 2009, doi: 10.1587/transfun.E92.A.1209.

URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E92.A.1209/_p

@ARTICLE{e92-a_4_1209,

author={Rameswar DEBNATH and Masakazu MURAMATSU and Haruhisa TAKAHASHI},

journal={IEICE TRANSACTIONS on Fundamentals},

title={Implementation Issues of Second-Order Cone Programming Approaches for Support Vector Machine Learning Problems},

year={2009},

volume={E92-A},

number={4},

pages={1209-1222},

keywords={},

doi={10.1587/transfun.E92.A.1209},

ISSN={1745-1337},

month={April},}

TY - JOUR

TI - Implementation Issues of Second-Order Cone Programming Approaches for Support Vector Machine Learning Problems

T2 - IEICE TRANSACTIONS on Fundamentals

SP - 1209

EP - 1222

AU - Rameswar DEBNATH

AU - Masakazu MURAMATSU

AU - Haruhisa TAKAHASHI

PY - 2009

DO - 10.1587/transfun.E92.A.1209

JO - IEICE TRANSACTIONS on Fundamentals

SN - 1745-1337

VL - E92-A

IS - 4

JA - IEICE TRANSACTIONS on Fundamentals

Y1 - April 2009

ER -