
Author Search Result

[Author] Seiji MIYOSHI (8 hits)

1-8 of 8 hits
  • Statistical Mechanics of Adaptive Weight Perturbation Learning

    Ryosuke MIYOSHI  Yutaka MAEDA  Seiji MIYOSHI  

     
    LETTER

    Vol: E94-D No:10  Page(s): 1937-1940

    Weight perturbation learning was proposed as a learning rule in which perturbation is added to the variable parameters of learning machines. The generalization performance of weight perturbation learning was analyzed by statistical mechanical methods and was found to have the same asymptotic generalization property as perceptron learning. In this paper we consider the difference between perceptron learning and AdaTron learning, both of which are well-known learning rules. By applying this difference to weight perturbation learning, we propose adaptive weight perturbation learning. The generalization performance of the proposed rule is analyzed by statistical mechanical methods, and it is shown that the proposed learning rule has an outstanding asymptotic property equivalent to that of AdaTron learning.
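
    For intuition, here is a minimal Python sketch of the weight perturbation idea: one weight is perturbed at a time and the gradient of a per-example loss is estimated by finite differences. The teacher-student setup, the hinge-style loss, and all constants are illustrative assumptions, not the exact rule (or its adaptive variant) analyzed in the letter.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                              # input dimension (assumed)
B = rng.standard_normal(N)           # fixed teacher weights (assumed setup)
w = np.zeros(N)                      # student weights
eta, c = 0.1, 1e-3                   # learning rate and perturbation size (assumed)

def loss(w, x, y):
    """Per-example hinge-style loss of the student on (x, y)."""
    return max(0.0, -y * np.dot(w, x))

for t in range(2000):
    x = rng.standard_normal(N) / np.sqrt(N)       # random spherical input
    y = np.sign(np.dot(B, x))                     # teacher label
    base = loss(w, x, y)
    grad_est = np.empty(N)
    for i in range(N):                            # perturb one weight at a time
        w[i] += c
        grad_est[i] = (loss(w, x, y) - base) / c  # finite-difference slope
        w[i] -= c
    w -= eta * grad_est                           # weight perturbation update
```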

  • Statistical-Mechanical Analysis of Adaptive Volterra Filter with the LMS Algorithm Open Access

    Kimiko MOTONAKA  Tomoya KOSEKI  Yoshinobu KAJIKAWA  Seiji MIYOSHI  

     
    PAPER-Digital Signal Processing

    Publicized: 2021/06/01  Vol: E104-A No:12  Page(s): 1665-1674

    The Volterra filter is a digital filter that can describe nonlinearity. In this paper, we analyze the dynamic behaviors of an adaptive signal-processing system including the Volterra filter by a statistical-mechanical method. On the basis of the self-averaging property that holds when the tapped delay line is assumed to be infinitely long, we derive simultaneous differential equations in a deterministic and closed form, which describe the behaviors of macroscopic variables. We obtain the exact solution by solving the equations analytically. In addition, the validity of the derived theory is confirmed by comparison with numerical simulations.
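
    A rough sketch of the system being analyzed (not of the statistical-mechanical analysis itself): a second-order Volterra filter adapted with the LMS algorithm to identify an unknown Volterra system. The tap length, step size, and target kernels below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 8                                    # tap length of the delay line (assumed)
mu = 0.01                                # LMS step size (assumed)

# Unknown target: a fixed second-order Volterra system (assumed).
h1_true = 0.5 * rng.standard_normal(M)
h2_true = 0.1 * rng.standard_normal((M, M))

h1 = np.zeros(M)                         # adaptive linear kernel
h2 = np.zeros((M, M))                    # adaptive quadratic kernel
xbuf = np.zeros(M)                       # tapped delay line

for n in range(20000):
    xbuf = np.roll(xbuf, 1)
    xbuf[0] = rng.standard_normal()      # white input sample
    quad = np.outer(xbuf, xbuf)          # all products x(n-i) * x(n-j)
    d = h1_true @ xbuf + np.sum(h2_true * quad)   # desired (target system) output
    y = h1 @ xbuf + np.sum(h2 * quad)             # adaptive filter output
    e = d - y                                     # instantaneous error
    h1 += mu * e * xbuf                           # LMS update of the linear kernel
    h2 += mu * e * quad                           # LMS update of the quadratic kernel
```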

  • Estimation of Distribution Algorithm Incorporating Switching

    Kenji TSUCHIE  Yoshiko HANADA  Seiji MIYOSHI  

     
    LETTER-Fundamentals of Information Systems

    Vol: E93-D No:11  Page(s): 3108-3111

    We propose an "estimation of distribution algorithm" incorporating switching. The algorithm enables switching from the standard estimation of distribution algorithm (EDA) to the genetic algorithm (GA), or vice versa, on the basis of switching criteria. The algorithm shows better performance than GA and EDA in deceptive problems.
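
    As a toy illustration of the switching idea only, the sketch below runs a univariate-marginal EDA and a simple GA on a bit-string problem and switches between them when the best fitness stagnates. The fitness function and the stagnation-based switching criterion are stand-in assumptions, not those used in the letter.

```python
import numpy as np

rng = np.random.default_rng(2)
L, POP, SEL = 40, 100, 50            # string length, population size, selection size (assumed)

def fitness(pop):
    """Toy objective (count of ones); a stand-in for the deceptive problems in the letter."""
    return pop.sum(axis=1)

def eda_step(pop, fit):
    """UMDA-style EDA step: estimate bitwise marginals from the best half and resample."""
    best = pop[np.argsort(fit)[-SEL:]]
    p = best.mean(axis=0)
    return (rng.random((POP, L)) < p).astype(int)

def ga_step(pop, fit):
    """Simple GA step: truncation selection, uniform crossover, bit-flip mutation."""
    idx = np.argsort(fit)[-SEL:]
    parents = pop[rng.choice(idx, size=(POP, 2))]
    mask = rng.random((POP, L)) < 0.5
    child = np.where(mask, parents[:, 0], parents[:, 1])
    flip = rng.random((POP, L)) < 1.0 / L
    return np.where(flip, 1 - child, child)

pop = rng.integers(0, 2, (POP, L))
best_so_far, stall, use_ga = -1, 0, False
for gen in range(200):
    fit = fitness(pop)
    if fit.max() > best_so_far:
        best_so_far, stall = fit.max(), 0
    else:
        stall += 1
    if stall > 10:                   # hypothetical switching criterion: stagnation
        use_ga, stall = not use_ga, 0
    pop = ga_step(pop, fit) if use_ga else eda_step(pop, fit)
```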

  • Statistical-Mechanical Analysis of Adaptive Volterra Filter for Nonwhite Input Signals

    Koyo KUGIYAMA  Seiji MIYOSHI  

     
    PAPER

    Publicized: 2023/07/13  Vol: E107-A No:1  Page(s): 87-95

    The Volterra filter is a digital filter that can describe nonlinearity. In this paper, we analyze the dynamic behaviors of an adaptive signal-processing system with the Volterra filter for nonwhite input signals by a statistical-mechanical method. Assuming the self-averaging property with an infinitely long tapped-delay line, we derive simultaneous differential equations that describe the behaviors of macroscopic variables in a deterministic and closed form. We analytically solve the derived equations to reveal the effect of the nonwhiteness of the input signal on the adaptation process. The results for the second-order Volterra filter show that the nonwhiteness decreases the mean-square error (MSE) in the early stages of the adaptation process and increases it in the later stages.
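
    For intuition only, a nonwhite input of this kind can be produced by coloring white noise with a first-order autoregressive process; the sketch below is one such generator (the coefficient a controlling the nonwhiteness is an arbitrary assumption) that could replace the white input in the LMS Volterra sketch above.

```python
import numpy as np

def ar1_input(a=0.7, seed=3):
    """Yield a nonwhite input: x(n) = a*x(n-1) + sqrt(1 - a^2) * w(n), with w(n) white Gaussian."""
    rng = np.random.default_rng(seed)
    x = 0.0
    while True:
        x = a * x + np.sqrt(1.0 - a * a) * rng.standard_normal()
        yield x

# Hypothetical usage: draw samples with next(source) in place of white noise.
source = ar1_input(a=0.7)
samples = [next(source) for _ in range(5)]
```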

  • Statistical Mechanics of On-Line Learning Using Correlated Examples

    Kento NAKAO  Yuta NARUKAWA  Seiji MIYOSHI  

     
    LETTER

    Vol: E94-D No:10  Page(s): 1941-1944

    We consider a model composed of nonlinear perceptrons and analytically investigate its generalization performance using correlated examples in the framework of on-line learning by a statistical mechanical method. In Hebbian and AdaTron learning, the larger the number of examples used in an update, the slower the learning. In contrast, Perceptron learning does not exhibit such behavior, and the learning even becomes faster in some time regions.
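
    For reference, the three classical on-line rules mentioned above can be written for a simple teacher-student perceptron as in the sketch below; the normalization and the toy usage are assumptions, and the correlated-example setting analyzed in the letter is not reproduced here.

```python
import numpy as np

def update(w, x, v, rule, eta=1.0):
    """One on-line update of student weights w on input x, given the teacher's local field v."""
    u = np.dot(w, x)                        # student's local field
    s = np.sign(v)                          # teacher's label
    if rule == "hebbian":
        dw = s * x                          # always move toward the teacher's label
    elif rule == "perceptron":
        dw = s * x * (u * s < 0)            # update only on misclassification
    elif rule == "adatron":
        dw = -u * x * (u * s < 0)           # on a mistake, cancel the student's field
    else:
        raise ValueError(rule)
    return w + eta * dw

# Toy usage with a fixed teacher and spherical random inputs (assumed setup).
rng = np.random.default_rng(7)
N = 100
B = rng.standard_normal(N)
w = 0.01 * rng.standard_normal(N)
for t in range(5000):
    x = rng.standard_normal(N) / np.sqrt(N)
    w = update(w, x, np.dot(B, x), "adatron")
```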

  • Statistical-Mechanics Approach to Theoretical Analysis of the FXLMS Algorithm Open Access

    Seiji MIYOSHI  Yoshinobu KAJIKAWA  

     
    PAPER-Digital Signal Processing

    Vol: E101-A No:12  Page(s): 2419-2433

    We analyze the behaviors of the FXLMS algorithm using a statistical-mechanical method. The cross-correlation between a primary path and an adaptive filter and the autocorrelation of the adaptive filter are treated as macroscopic variables. We obtain simultaneous differential equations that describe the dynamical behaviors of the macroscopic variables under the condition that the tapped-delay line is sufficiently long. The obtained equations are deterministic and closed-form. We analytically solve the equations to obtain the correlations and finally compute the mean-square error. The obtained theory quantitatively predicts the behaviors observed in computer simulations for both white and nonwhite reference signals. The theory also gives the upper limit of the step size in the FXLMS algorithm.
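
    A compact sketch of the FXLMS update itself, in the usual active-noise-control form: the reference signal is filtered by an estimate of the secondary path before being used in the LMS update. The primary path, secondary path, filter length, and step size below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
L = 16                                   # adaptive filter length (assumed)
mu = 0.005                               # step size (assumed)
P = 0.3 * rng.standard_normal(L)         # primary path, FIR (assumed)
S = np.array([0.0, 0.8, 0.3])            # secondary path, FIR (assumed)
S_hat = S.copy()                         # secondary-path estimate (assumed perfect)

w = np.zeros(L)                          # adaptive control filter
xbuf = np.zeros(L)                       # reference-signal delay line
ybuf = np.zeros(len(S))                  # control-output delay line (through S)
xsbuf = np.zeros(len(S_hat))             # reference delay line for filtering by S_hat
fxbuf = np.zeros(L)                      # filtered-reference delay line

for n in range(30000):
    x = rng.standard_normal()            # white reference signal
    xbuf = np.roll(xbuf, 1)
    xbuf[0] = x
    d = P @ xbuf                         # disturbance reaching the error microphone
    y = w @ xbuf                         # control filter output
    ybuf = np.roll(ybuf, 1)
    ybuf[0] = y
    e = d - S @ ybuf                     # residual error after the secondary path
    xsbuf = np.roll(xsbuf, 1)
    xsbuf[0] = x
    fx = S_hat @ xsbuf                   # reference filtered by the secondary-path estimate
    fxbuf = np.roll(fxbuf, 1)
    fxbuf[0] = fx
    w += mu * e * fxbuf                  # FXLMS update
```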

  • Statistical Mechanical Analysis of Simultaneous Perturbation Learning

    Seiji MIYOSHI  Hiroomi HIKAWA  Yutaka MAEDA  

     
    LETTER-Neural Networks and Bioengineering

    Vol: E92-A No:7  Page(s): 1743-1746

    We show that simultaneous perturbation can be used as an algorithm for on-line learning, and we report a theoretical investigation of its generalization performance obtained with a statistical mechanical method. The asymptotic behavior of the generalization error under this algorithm is on the order of t^{-1/3}, where t is the learning time or the number of learning examples. This order is the same as that of well-known perceptron learning.
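
    A minimal sketch of simultaneous perturbation used as an on-line learning rule: all weights are perturbed at once by a random sign vector and updated from just two loss evaluations per example. The teacher-student setup, the loss, and the constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 100                                  # input dimension (assumed)
B = rng.standard_normal(N)               # fixed teacher weights (assumed setup)
w = np.zeros(N)                          # student weights
eta, c = 0.05, 1e-2                      # learning rate and perturbation size (assumed)

def loss(w, x, y):
    """Per-example hinge-style loss of the student on (x, y)."""
    return max(0.0, -y * np.dot(w, x))

for t in range(20000):
    x = rng.standard_normal(N) / np.sqrt(N)      # random spherical input
    y = np.sign(np.dot(B, x))                    # teacher label
    s = rng.choice([-1.0, 1.0], size=N)          # one simultaneous sign perturbation
    g = (loss(w + c * s, x, y) - loss(w, x, y)) / c
    w -= eta * g * s                             # all weights updated from two loss evaluations
```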

  • A Theoretical Analysis of On-Line Learning Using Correlated Examples

    Chihiro SEKI  Shingo SAKURAI  Masafumi MATSUNO  Seiji MIYOSHI  

     
    PAPER-Neural Networks and Bioengineering

    Vol: E91-A No:9  Page(s): 2663-2670

    In this paper we analytically investigate the generalization performance of learning using correlated inputs in the framework of on-line learning with a statistical mechanical method. We consider a model composed of linear perceptrons with Gaussian noise. First, we analyze the case of the gradient method. We analytically show that the larger the correlation among inputs or the larger the number of inputs, the stricter the condition the learning rate must satisfy and the slower the learning. Second, we treat block orthogonal projection learning as an alternative learning rule and derive the corresponding theory. In the noiseless case, the learning speed does not depend on the correlation and is proportional to the number of inputs used in an update; it is identical to that of the gradient method with uncorrelated inputs. When there is noise, on the other hand, the larger the correlation among inputs, the slower the learning and the larger the residual generalization error.
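
    To make the two update rules concrete, the sketch below contrasts a block gradient step with a minimum-norm (pseudoinverse) correction that fits a block of correlated, noisy teacher outputs exactly, one plausible reading of block orthogonal projection learning. The way the correlated inputs are generated and all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
N, K = 200, 5                      # input dimension and examples per update (assumed)
rho = 0.5                          # correlation among the K inputs of a block (assumed)
sigma = 0.1                        # teacher output noise (assumed)
B = rng.standard_normal(N)         # teacher weights of the linear perceptron
w = np.zeros(N)                    # student weights
eta = 0.5                          # learning rate for the gradient method (assumed)

C = np.full((K, K), rho) + (1.0 - rho) * np.eye(K)   # block correlation matrix
Lc = np.linalg.cholesky(C)

def draw_block():
    """K inputs (rows), mutually correlated with coefficient rho, each of unit norm on average."""
    Z = rng.standard_normal((K, N)) / np.sqrt(N)
    return Lc @ Z

for t in range(2000):
    X = draw_block()
    y = X @ B + sigma * rng.standard_normal(K)       # noisy teacher outputs
    # Gradient method on the block (alternative rule):
    #   w = w + (eta / K) * X.T @ (y - X @ w)
    # Minimum-norm block correction: fits the current block of examples exactly.
    w = w + np.linalg.pinv(X) @ (y - X @ w)
```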