The mixture modeling framework is widely used in many applications. In this paper, we propose a *component reduction* technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning with the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.
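The paper's EM-based reduction algorithm is not reproduced here, but the core operation of any component reduction scheme, collapsing several Gaussian components into a single Gaussian while preserving the mixture's total weight, mean, and covariance, can be sketched as a generic moment-matching merge. This is a standard baseline technique, not the authors' method; the function name and interface below are illustrative.

```python
import numpy as np

def merge_gaussians(weights, means, covs):
    """Collapse Gaussian components (w_i, m_i, C_i) into one Gaussian
    that matches the mixture's zeroth, first, and second moments."""
    weights = np.asarray(weights, dtype=float)   # shape (k,)
    means = np.asarray(means, dtype=float)       # shape (k, d)
    covs = np.asarray(covs, dtype=float)         # shape (k, d, d)

    w = weights.sum()                            # merged weight
    m = weights @ means / w                      # weight-averaged mean

    # Merged covariance: within-component covariances plus the spread
    # of the component means around the merged mean.
    C = np.zeros_like(covs[0])
    for wi, mi, Ci in zip(weights, means, covs):
        d = mi - m
        C += wi * (Ci + np.outer(d, d))
    C /= w
    return w, m, C
```

For example, merging two equally weighted unit-covariance components centered at (0, 0) and (2, 0) yields a single Gaussian with mean (1, 0) and covariance diag(2, 1), the extra variance along the first axis reflecting the separation of the original means.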

- Publication
- IEICE TRANSACTIONS on Information Vol.E91-D No.12 pp.2846-2853

- Publication Date
- 2008/12/01

- Online ISSN
- 1745-1361

- DOI
- 10.1093/ietisy/e91-d.12.2846

- Type of Manuscript
- PAPER

- Category
- Pattern Recognition

The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.

Kumiko MAEBASHI, Nobuo SUEMATSU, Akira HAYASHI, "Component Reduction for Gaussian Mixture Models" in IEICE TRANSACTIONS on Information,
vol. E91-D, no. 12, pp. 2846-2853, December 2008, doi: 10.1093/ietisy/e91-d.12.2846.

Abstract: The mixture modeling framework is widely used in many applications. In this paper, we propose a *component reduction* technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning with the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.

URL: https://global.ieice.org/en_transactions/information/10.1093/ietisy/e91-d.12.2846/_p

@ARTICLE{e91-d_12_2846,

author={Kumiko MAEBASHI and Nobuo SUEMATSU and Akira HAYASHI},

journal={IEICE TRANSACTIONS on Information},

title={Component Reduction for Gaussian Mixture Models},

year={2008},

volume={E91-D},

number={12},

pages={2846-2853},

abstract={The mixture modeling framework is widely used in many applications. In this paper, we propose a component reduction technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning with the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.},

doi={10.1093/ietisy/e91-d.12.2846},

ISSN={1745-1361},

month={December},
}

TY - JOUR

TI - Component Reduction for Gaussian Mixture Models

T2 - IEICE TRANSACTIONS on Information

SP - 2846

EP - 2853

AU - Kumiko MAEBASHI

AU - Nobuo SUEMATSU

AU - Akira HAYASHI

PY - 2008

DO - 10.1093/ietisy/e91-d.12.2846

JO - IEICE TRANSACTIONS on Information

SN - 1745-1361

VL - E91-D

IS - 12

JA - IEICE TRANSACTIONS on Information

Y1 - 2008/12/01/

AB - The mixture modeling framework is widely used in many applications. In this paper, we propose a component reduction technique that collapses a Gaussian mixture model into a Gaussian mixture with fewer components. The EM (Expectation-Maximization) algorithm is usually used to fit a mixture model to data. Our algorithm is derived by extending mixture model learning with the EM algorithm. In this extension, a difficulty arises from the fact that some crucial quantities cannot be evaluated analytically. We overcome this difficulty by introducing an effective approximation. The effectiveness of our algorithm is demonstrated by applying it to a simple synthetic component reduction task and a phoneme clustering problem.

ER -