
Following the formulation of Support Vector Regression (SVR), we consider a regression analogue of soft margin optimization over the feature space indexed by a hypothesis class *H*. More specifically, the problem is to find a linear model *w* ∈ ℝ
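For context, the ε-insensitive loss underlying the SVR formulation referenced above is standard; the sketch below shows the usual soft-margin regression objective it induces (the notation — samples (*x*ᵢ, *y*ᵢ), trade-off parameter *C*, tolerance ε — is illustrative and not necessarily the paper's own):

```latex
% Standard SVR primal with the \varepsilon-insensitive loss
% (illustrative notation; the paper works over a feature space
% indexed by a hypothesis class H rather than raw inputs x_i):
\min_{w,\, b} \;\; \frac{1}{2}\lVert w \rVert^2
  \;+\; C \sum_{i=1}^{m} \max\bigl\{0,\; \lvert w^\top x_i + b - y_i \rvert - \varepsilon \bigr\}
```

Errors smaller than ε incur no penalty, which is what makes the loss "insensitive"; the boosting approach in the paper optimizes a regression analogue of this soft-margin objective over hypotheses in *H*.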

- Publication
- IEICE TRANSACTIONS on Information Vol.E107-D No.3 pp.294-300

- Publication Date
- 2024/03/01

- Publicized
- 2023/11/15

- Online ISSN
- 1745-1361

- DOI
- 10.1587/transinf.2023FCP0004

- Type of Manuscript
- Special Section PAPER (Special Section on Foundations of Computer Science — Foundations of Computer Science and their New Trends —)

- Category

Ryotaro MITSUBOSHI

Kyushu University, RIKEN AIP

Kohei HATANO

Kyushu University, RIKEN AIP

Eiji TAKIMOTO

Kyushu University

The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.


Ryotaro MITSUBOSHI, Kohei HATANO, Eiji TAKIMOTO, "Solving Linear Regression with Insensitive Loss by Boosting" in IEICE TRANSACTIONS on Information,
vol. E107-D, no. 3, pp. 294-300, March 2024, doi: 10.1587/transinf.2023FCP0004.

Abstract: Following the formulation of Support Vector Regression (SVR), we consider a regression analogue of soft margin optimization over the feature space indexed by a hypothesis class *H*. More specifically, the problem is to find a linear model *w* ∈ ℝ

URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2023FCP0004/_p


@ARTICLE{e107-d_3_294,

author={Ryotaro MITSUBOSHI and Kohei HATANO and Eiji TAKIMOTO},

journal={IEICE TRANSACTIONS on Information},

title={Solving Linear Regression with Insensitive Loss by Boosting},

year={2024},

volume={E107-D},

number={3},

pages={294-300},

abstract={Following the formulation of Support Vector Regression (SVR), we consider a regression analogue of soft margin optimization over the feature space indexed by a hypothesis class H. More specifically, the problem is to find a linear model w ∈ ℝ},

keywords={},

doi={10.1587/transinf.2023FCP0004},

ISSN={1745-1361},

month={March},}


TY - JOUR

TI - Solving Linear Regression with Insensitive Loss by Boosting

T2 - IEICE TRANSACTIONS on Information

SP - 294

EP - 300

AU - Ryotaro MITSUBOSHI

AU - Kohei HATANO

AU - Eiji TAKIMOTO

PY - 2024

DO - 10.1587/transinf.2023FCP0004

JO - IEICE TRANSACTIONS on Information

SN - 1745-1361

VL - E107-D

IS - 3

JA - IEICE TRANSACTIONS on Information

Y1 - March 2024

AB - Following the formulation of Support Vector Regression (SVR), we consider a regression analogue of soft margin optimization over the feature space indexed by a hypothesis class H. More specifically, the problem is to find a linear model w ∈ ℝ

ER -