
As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to exploit the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform effectively speeds up convergence of the neural filter.

- Publication
- IEICE TRANSACTIONS on Fundamentals Vol.E83-A No.2 pp.367-370

- Publication Date
- 2000/02/25

- Type of Manuscript
- Special Section LETTER (Special Section on Intelligent Signal and Image Processing)

The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.


Isao NAKANISHI, Yoshio ITOH, Yutaka FUKUI, "Introduction of Orthonormal Transform into Neural Filter for Accelerating Convergence Speed" in IEICE TRANSACTIONS on Fundamentals,
vol. E83-A, no. 2, pp. 367-370, February 2000.

Abstract: As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to exploit the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform effectively speeds up convergence of the neural filter.

URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e83-a_2_367/_p


@ARTICLE{e83-a_2_367,

author={Isao NAKANISHI and Yoshio ITOH and Yutaka FUKUI},

journal={IEICE TRANSACTIONS on Fundamentals},

title={Introduction of Orthonormal Transform into Neural Filter for Accelerating Convergence Speed},

year={2000},

volume={E83-A},

number={2},

pages={367-370},

abstract={As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to exploit the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform effectively speeds up convergence of the neural filter.},

keywords={},

doi={},

ISSN={},

month={February},}


TY - JOUR

TI - Introduction of Orthonormal Transform into Neural Filter for Accelerating Convergence Speed

T2 - IEICE TRANSACTIONS on Fundamentals

SP - 367

EP - 370

AU - Isao NAKANISHI

AU - Yoshio ITOH

AU - Yutaka FUKUI

PY - 2000

DO -

JO - IEICE TRANSACTIONS on Fundamentals

SN -

VL - E83-A

IS - 2

JA - IEICE TRANSACTIONS on Fundamentals

Y1 - February 2000

AB - As a nonlinear adaptive filter, the neural filter is used to process nonlinear signals and systems. However, the neural filter requires a large number of iterations to converge. This letter presents a new structure for the multi-layer neural filter in which an orthonormal transform is introduced into all inter-layers to accelerate convergence. The proposed structure is called the transform domain neural filter (TDNF) for convenience. The weights are basically updated by the Back-Propagation (BP) algorithm, but the algorithm must be modified because the error back-propagates through the orthonormal transform. Moreover, a variable step size, normalized by the transformed signal power, is introduced into the BP algorithm to exploit the orthonormal transform. Computer simulations confirm that introducing the orthonormal transform effectively speeds up convergence of the neural filter.

ER -