Classification tasks in computer vision and brain-computer interface research have given rise to several applications, such as biometrics and cognitive training. However, as in any other discipline, determining a suitable representation of the data remains challenging, and recent approaches have moved away from the familiar form of one vector per data sample. This paper considers a kernel between vector sets, the mean polynomial kernel, motivated by recent studies in which data are approximated by linear subspaces, in particular by methods formulated on Grassmann manifolds. The kernel takes a more general approach in that it also supports input data modeled as a vector sequence, without requiring the data to form a linear subspace. We discuss how the kernel can be associated with the Projection kernel, a Grassmann kernel. Experimental results on face image sequences and physiological signal data show that the mean polynomial kernel surpasses existing subspace-based methods on Grassmann manifolds in both predictive performance and efficiency.
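As its name suggests, the mean polynomial kernel compares two vector sequences by averaging a polynomial kernel over all pairs of vectors drawn from them. The following is a minimal sketch under that reading, not the paper's reference implementation: the sequences are assumed to be stored as NumPy arrays with one vector per row, and the function name and degree parameter d are illustrative choices of ours rather than the paper's notation.

import numpy as np

def mean_polynomial_kernel(X, Y, d=2):
    # X: (m, p) array, one vector of the first sequence per row.
    # Y: (n, p) array, one vector of the second sequence per row.
    # Returns the average of (x_i . y_j)**d over all m*n vector pairs.
    G = X @ Y.T            # (m, n) matrix of inner products x_i . y_j
    return float(np.mean(G ** d))

# Illustrative usage with two random "sequences" of 5 and 8 vectors in R^10.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 10))
Y = rng.standard_normal((8, 10))
print(mean_polynomial_kernel(X, Y, d=2))

For degree 2, if U and V hold orthonormal bases of the subspaces spanned by the two sequences, the unnormalized sum of (u_i . v_j)**2 equals the squared Frobenius norm of U^T V, which is the Projection kernel on the Grassmann manifold; this is the kind of association with Grassmann kernels mentioned above, although the paper's exact normalization and notation may differ.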
Raissa RELATOR
Gunma University
Yoshihiro HIROHASHI
Gunma University
Eisuke ITO
Gunma University
Tsuyoshi KATO
Gunma University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Raissa RELATOR, Yoshihiro HIROHASHI, Eisuke ITO, Tsuyoshi KATO, "Mean Polynomial Kernel and Its Application to Vector Sequence Recognition" in IEICE TRANSACTIONS on Information,
vol. E97-D, no. 7, pp. 1855-1863, July 2014, doi: 10.1587/transinf.E97.D.1855.
Abstract: Classification tasks in computer vision and brain-computer interface research have given rise to several applications, such as biometrics and cognitive training. However, as in any other discipline, determining a suitable representation of the data remains challenging, and recent approaches have moved away from the familiar form of one vector per data sample. This paper considers a kernel between vector sets, the mean polynomial kernel, motivated by recent studies in which data are approximated by linear subspaces, in particular by methods formulated on Grassmann manifolds. The kernel takes a more general approach in that it also supports input data modeled as a vector sequence, without requiring the data to form a linear subspace. We discuss how the kernel can be associated with the Projection kernel, a Grassmann kernel. Experimental results on face image sequences and physiological signal data show that the mean polynomial kernel surpasses existing subspace-based methods on Grassmann manifolds in both predictive performance and efficiency.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E97.D.1855/_p
@ARTICLE{e97-d_7_1855,
author={Raissa RELATOR and Yoshihiro HIROHASHI and Eisuke ITO and Tsuyoshi KATO},
journal={IEICE TRANSACTIONS on Information},
title={Mean Polynomial Kernel and Its Application to Vector Sequence Recognition},
year={2014},
volume={E97-D},
number={7},
pages={1855-1863},
abstract={Classification tasks in computer vision and brain-computer interface research have given rise to several applications, such as biometrics and cognitive training. However, as in any other discipline, determining a suitable representation of the data remains challenging, and recent approaches have moved away from the familiar form of one vector per data sample. This paper considers a kernel between vector sets, the mean polynomial kernel, motivated by recent studies in which data are approximated by linear subspaces, in particular by methods formulated on Grassmann manifolds. The kernel takes a more general approach in that it also supports input data modeled as a vector sequence, without requiring the data to form a linear subspace. We discuss how the kernel can be associated with the Projection kernel, a Grassmann kernel. Experimental results on face image sequences and physiological signal data show that the mean polynomial kernel surpasses existing subspace-based methods on Grassmann manifolds in both predictive performance and efficiency.},
keywords={},
doi={10.1587/transinf.E97.D.1855},
ISSN={1745-1361},
month={July},}
TY - JOUR
TI - Mean Polynomial Kernel and Its Application to Vector Sequence Recognition
T2 - IEICE TRANSACTIONS on Information
SP - 1855
EP - 1863
AU - Raissa RELATOR
AU - Yoshihiro HIROHASHI
AU - Eisuke ITO
AU - Tsuyoshi KATO
PY - 2014
DO - 10.1587/transinf.E97.D.1855
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E97-D
IS - 7
JA - IEICE TRANSACTIONS on Information
Y1 - July 2014
AB - Classification tasks in computer vision and brain-computer interface research have given rise to several applications, such as biometrics and cognitive training. However, as in any other discipline, determining a suitable representation of the data remains challenging, and recent approaches have moved away from the familiar form of one vector per data sample. This paper considers a kernel between vector sets, the mean polynomial kernel, motivated by recent studies in which data are approximated by linear subspaces, in particular by methods formulated on Grassmann manifolds. The kernel takes a more general approach in that it also supports input data modeled as a vector sequence, without requiring the data to form a linear subspace. We discuss how the kernel can be associated with the Projection kernel, a Grassmann kernel. Experimental results on face image sequences and physiological signal data show that the mean polynomial kernel surpasses existing subspace-based methods on Grassmann manifolds in both predictive performance and efficiency.
ER -