Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation for combining domain knowledge with a learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and we develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method, which converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe method requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number that ensures an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.
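The idea in the abstract can be illustrated with a minimal sketch, not the paper's exact formulation: Frank-Wolfe applied to a squared-hinge linear SVM whose feasible set is a box that encodes the sign constraints (coordinates forced nonnegative have lower bound 0). Over a box, the linear minimization oracle is a coordinate-wise corner pick, each iteration costs O(nd), and the duality gap gives the termination criterion mentioned in the abstract. The function name, the box radius `R`, and the squared-hinge choice are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_sc_svm(X, y, sign_pos, R=1.0, lam=0.1, max_iter=300, eps=1e-3):
    """Frank-Wolfe sketch for a sign-constrained linear SVM (illustrative).

    Minimizes (1/n) sum_i max(0, 1 - y_i <x_i, w>)^2 + (lam/2)||w||^2
    over the box {w : 0 <= w_j <= R if sign_pos[j], else -R <= w_j <= R}.
    Returns (w, gap), where gap is the last Frank-Wolfe duality gap.
    """
    n, d = X.shape
    lo = np.where(sign_pos, 0.0, -R)   # nonnegativity where the sign is constrained
    hi = np.full(d, R)
    w = np.zeros(d)                    # feasible starting point
    gap = np.inf
    for t in range(max_iter):
        margin = 1.0 - y * (X @ w)
        active = np.maximum(margin, 0.0)
        grad = -(2.0 / n) * (X.T @ (y * active)) + lam * w
        # Linear minimization oracle over the box: pick the minimizing corner
        s = np.where(grad > 0, lo, hi)
        gap = grad @ (w - s)           # duality gap = clear termination criterion
        if gap <= eps:
            break
        gamma = 2.0 / (t + 2.0)        # standard Frank-Wolfe step size
        w = (1.0 - gamma) * w + gamma * s
    return w, gap
```

Every iterate stays feasible by construction, since it is a convex combination of feasible points, so the sign constraints hold at termination without any projection step.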
Kenya TAJIMA
Gunma University
Takahiko HENMI
Gunma University
Tsuyoshi KATO
Gunma University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Kenya TAJIMA, Takahiko HENMI, Tsuyoshi KATO, "Frank-Wolfe for Sign-Constrained Support Vector Machines" in IEICE TRANSACTIONS on Information,
vol. E105-D, no. 10, pp. 1734-1742, October 2022, doi: 10.1587/transinf.2022EDP7069.
Abstract: Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation for combining domain knowledge with a learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and we develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method, which converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe method requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number that ensures an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2022EDP7069/_p
@ARTICLE{e105-d_10_1734,
author={Kenya TAJIMA and Takahiko HENMI and Tsuyoshi KATO},
journal={IEICE TRANSACTIONS on Information},
title={Frank-Wolfe for Sign-Constrained Support Vector Machines},
year={2022},
volume={E105-D},
number={10},
pages={1734-1742},
abstract={Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation for combining domain knowledge with a learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and we develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method, which converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe method requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number that ensures an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.},
keywords={},
doi={10.1587/transinf.2022EDP7069},
ISSN={1745-1361},
month={October}
}
TY - JOUR
TI - Frank-Wolfe for Sign-Constrained Support Vector Machines
T2 - IEICE TRANSACTIONS on Information
SP - 1734
EP - 1742
AU - Kenya TAJIMA
AU - Takahiko HENMI
AU - Tsuyoshi KATO
PY - 2022
DO - 10.1587/transinf.2022EDP7069
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E105-D
IS - 10
JA - IEICE TRANSACTIONS on Information
Y1 - October 2022
AB - Domain knowledge is useful to improve the generalization performance of learning machines. Sign constraints are a handy representation for combining domain knowledge with a learning machine. In this paper, we consider constraining the signs of the weight coefficients in learning the linear support vector machine, and we develop an optimization algorithm for minimizing the empirical risk under the sign constraints. The algorithm is based on the Frank-Wolfe method, which converges sublinearly and possesses a clear termination criterion. We show that each iteration of the Frank-Wolfe method requires O(nd+d²) computational cost. Furthermore, we derive the explicit expression for the minimal iteration number that ensures an ε-accurate solution by analyzing the curvature of the objective function. Finally, we empirically demonstrate that the sign constraints are a promising technique when similarities to the training examples compose the feature vector.
ER -