Keyword Search Result

[Keyword] inverse function delayed model (4 hits)

1-4 of 4 hits
  • Avoidance of the Permanent Oscillating State in the Inverse Function Delayed Neural Network

    Akari SATO  Yoshihiro HAYAKAWA  Koji NAKAJIMA  

     
    PAPER-Neuron and Neural Networks

    Vol: E90-A No:10    Page(s): 2101-2107

    Many researchers have attempted to solve combinatorial optimization problems, which are NP-hard or NP-complete, using neural networks. Although the neural network approach has some advantages, the local minimum problem has not yet been solved. It has been shown that the Inverse Function Delayed (ID) model, a neuron model whose dynamics contain a negative resistance that can destabilize an intended region, can serve as a powerful tool for avoiding local minima. In our previous paper, we showed that the ID network can separate local minimum states from global minimum states in the case that the energy function of the embedded problem is zero, and that it achieves a 100% success rate on the N-Queen problem within a certain parameter region. For a wider parameter region, however, the ID network cannot reach a global minimum state even though all local minimum states are unstable. In this paper, we show that the ID network falls into a particular permanent oscillating state in this situation: several neurons in the network keep spiking, and hence the state transition never proceeds toward the global minima. We also clarify that this oscillating state is controlled by the parameter α, which affects the negative resistance region and the hysteresis property of the ID model. Consequently, there is a parameter region in which combinatorial optimization problems are solved with a 100% success rate.
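
    Note: the listing above does not reproduce the model equations. As a rough, hedged illustration of the mechanism the abstract describes, the Python sketch below integrates an assumed two-variable ID-style neuron in which the output lags the internal state through an inverse sigmoid, with a term scaled by a parameter alpha standing in for the negative-resistance region; the equations, the helper g_inv, and all parameter values are illustrative assumptions, not the paper's formulation.

    ```python
    import numpy as np

    def id_neuron_trajectory(alpha=0.0, drive=0.2, t_end=50.0, dt=0.01,
                             tau_u=1.0, tau_x=1.0):
        """Euler-integrate a single ID-style neuron (illustrative form only).

        Assumed dynamics (NOT taken from the paper):
            tau_u * du/dt = -u + drive                         # conventional internal state
            tau_x * dx/dt =  u - g_inv(x) + alpha * (x - 0.5)  # output lags via inverse sigmoid
        The alpha term mimics the negative-resistance / hysteresis region
        that the abstract attributes to the ID model.
        """
        eps = 1e-6

        def g_inv(x):  # inverse of the logistic sigmoid
            x = np.clip(x, eps, 1.0 - eps)
            return np.log(x / (1.0 - x))

        u, x = 0.0, 0.5
        xs = []
        for _ in range(int(t_end / dt)):
            du = (-u + drive) / tau_u
            dx = (u - g_inv(x) + alpha * (x - 0.5)) / tau_x
            u += dt * du
            x = float(np.clip(x + dt * dx, eps, 1.0 - eps))
            xs.append(x)
        return np.array(xs)

    if __name__ == "__main__":
        # Larger alpha widens the assumed negative-resistance term and shifts the settled output.
        for a in (0.0, 3.0, 6.0):
            traj = id_neuron_trajectory(alpha=a)
            print(f"alpha={a}: final x = {traj[-1]:.3f}")
    ```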

  • Temporal Sequences of Patterns with an Inverse Function Delayed Neural Network

    Johan SVEHOLM  Yoshihiro HAYAKAWA  Koji NAKAJIMA  

     
    PAPER-Control, Neural Networks and Learning

    Vol: E89-A No:10    Page(s): 2818-2824

    A network based on the Inverse Function Delayed (ID) model that can recall a temporal sequence of patterns is proposed. The classical problem, in which the network is forced to make long-distance jumps between strong attractors that have to be isolated from each other, is solved by the introduction of the ID neuron. The ID neuron has a negative resistance in its dynamics, which makes a gradual change from one attractor to another possible. It is then shown that a network structure consisting of paired conventional and ID neurons can perfectly recall a sequence.
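
    For context only: a standard (non-ID) way to set up the sequence-recall task mentioned in this abstract is temporal association with asymmetric Hebbian weights, where each stored pattern points to its successor. The sketch below uses that textbook construction with synchronous sign updates; it illustrates the task, not the paired conventional/ID architecture of the paper, and the sizes and noise level are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N, P = 100, 5                             # neurons, sequence length
    xi = rng.choice([-1, 1], size=(P, N))     # random bipolar patterns

    # Asymmetric Hebbian weights: pattern mu drives pattern mu+1 (cyclic).
    W = sum(np.outer(xi[(mu + 1) % P], xi[mu]) for mu in range(P)) / N

    # Start on a noisy copy of the first pattern and update synchronously.
    s = xi[0] * np.where(rng.random(N) < 0.1, -1, 1)   # flip about 10% of bits
    for step in range(1, P + 1):
        s = np.sign(W @ s)
        s[s == 0] = 1
        overlaps = xi @ s / N                 # overlap with every stored pattern
        print(f"step {step}: closest pattern = {int(np.argmax(overlaps))}, "
              f"overlap = {overlaps.max():+.2f}")
    ```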

  • Hardware Implementation of an Inverse Function Delayed Neural Network Using Stochastic Logic

    Hongge LI  Yoshihiro HAYAKAWA  Shigeo SATO  Koji NAKAJIMA  

     
    PAPER-Biocybernetics, Neurocomputing

    Vol: E89-D No:9    Page(s): 2572-2578

    In this paper, the authors present a new digital neuron-hardware circuit implemented on a field-programmable gate array (FPGA). The circuit realizes the Inverse Function Delayed (ID) neuron model, which includes the BVP model and has superior associative properties owing to its negative resistance. An associative memory based on the ID model with self-connections offers the possibility of improving both basin size and memory capacity. In order to decrease the circuit area, we employ stochastic logic. The proposed neuron circuit reproduces the stimulus-response output, and its retrieval property, enabled by the negative resistance, is superior to that of a conventional nonlinear model in terms of the basin size of an associative memory.
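
    As background on the stochastic-logic idea mentioned in this abstract (not the paper's circuit): in stochastic computing, a value in [0, 1] is encoded as the probability of a 1 in a random bit stream, so multiplying two independently encoded values reduces to a bitwise AND, and scaled addition reduces to a multiplexer. The sketch below demonstrates that encoding in Python; stream length and test values are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def encode(p, length):
        """Encode a value p in [0, 1] as a random bit stream of the given length."""
        return (rng.random(length) < p).astype(np.uint8)

    def decode(stream):
        """Recover the encoded value as the fraction of 1s in the stream."""
        return stream.mean()

    L = 10_000
    a, b = 0.7, 0.4
    sa, sb = encode(a, L), encode(b, L)

    product_stream = sa & sb   # AND of independent streams multiplies probabilities
    print(f"stochastic a*b   ~ {decode(product_stream):.3f} (exact {a * b:.3f})")

    # Scaled addition with a random select line (a multiplexer in hardware):
    sel = encode(0.5, L)
    sum_stream = np.where(sel == 1, sa, sb)
    print(f"stochastic (a+b)/2 ~ {decode(sum_stream):.3f} (exact {(a + b) / 2:.3f})")
    ```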

  • Retrieval Property of Associative Memory Based on Inverse Function Delayed Neural Networks

    Hongge LI  Yoshihiro HAYAKAWA  Koji NAKAJIMA  

     
    PAPER-Nonlinear Problems

    Vol: E88-A No:8    Page(s): 2192-2199

    Self-connections can enlarge the memory capacity of an associative memory based on a neural network; however, the basin size of the embedded memory states shrinks. This basin-size problem is related to undesirable spurious stable states, so if we can destabilize these spurious states, we can expect the basin size to improve. The Inverse Function Delayed (ID) model, which includes the Bonhoeffer-van der Pol (BVP) model, has a negative resistance in its dynamics. This negative resistance can destabilize the equilibrium states in certain regions of the conventional neural network. Therefore, an associative memory based on the ID model, with self-connections added to enlarge the memory capacity, has the potential to improve the basin size of the network. In this paper, we examine the fundamental characteristics of an associative memory based on the ID model by numerical simulation and show its improved performance compared with the conventional neural network.
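
    For comparison only, the sketch below builds a conventional Hopfield-style associative memory with Hebbian weights, once keeping the self-connection terms and once zeroing them, and estimates how often a noisy probe settles back onto its stored pattern. It is meant to make the basin-size question concrete; the ID-model dynamics themselves are not implemented, and all parameter values are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, P, trials, flips = 100, 7, 100, 20

    xi = rng.choice([-1, 1], size=(P, N))
    W = (xi.T @ xi) / N                     # Hebbian weights, self-connections included

    def recall_rate(W, updates=2000):
        """Fraction of noisy probes that settle back onto their stored pattern."""
        hits = 0
        for _ in range(trials):
            mu = rng.integers(P)
            s = xi[mu].copy()
            s[rng.choice(N, size=flips, replace=False)] *= -1   # corrupt some bits
            for _ in range(updates):                            # asynchronous sign updates
                i = rng.integers(N)
                s[i] = 1 if W[i] @ s >= 0 else -1
            hits += int(np.array_equal(s, xi[mu]))
        return hits / trials

    W_no_self = W.copy()
    np.fill_diagonal(W_no_self, 0.0)
    print(f"recall rate with self-connections:    {recall_rate(W):.2f}")
    print(f"recall rate without self-connections: {recall_rate(W_no_self):.2f}")
    ```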