
Author Search Result

[Author] Yasuji SAWADA (8 hits)

  • Fractal Connection Structure: A Simple Way to Improve Generalization in Nonlinear Learning Systems

    Basabi CHAKRABORTY  Yasuji SAWADA  

     
    PAPER-Neural Nets and Human Being

      Vol: E79-A No:10  Page(s): 1618-1623

    The capability of generalization is the most desirable property of a learning system. It is well known that to achieve good generalization, the complexity of the system should match the intrinsic complexity of the problem to be learned. In this work, the introduction of a fractal connection structure into nonlinear learning systems such as multilayer perceptrons, as a means of improving their generalization capability in classification problems, has been investigated via simulation on a sonar data set from an underwater target classification problem. It has been found that the fractally connected net has better generalization capability than a fully connected net or a randomly connected net of the same average connectivity, for a proper choice of the fractal dimension, which controls the average connectivity of the net.
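
    The construction below is a minimal sketch of the idea in this abstract, assuming a Cantor-set-style connection mask; the paper's exact fractal construction is not given here, so `cantor_mask` and its parameters are illustrative only.

    ```python
    import numpy as np

    def cantor_mask(n_in, n_out, depth=3):
        """Binary connection mask whose surviving entries follow a
        Cantor-dust pattern. The recursion depth plays the role of the
        fractal dimension, controlling the average connectivity.
        (Illustrative; the paper's exact rule is not in the abstract.)"""
        def keep(i, d):
            for _ in range(d):          # survive if no base-3 digit is 1
                if i % 3 == 1:
                    return False
                i //= 3
            return True
        rows = np.array([keep(i, depth) for i in range(n_in)])
        cols = np.array([keep(j, depth) for j in range(n_out)])
        return np.outer(rows, cols).astype(float)

    # Mask a weight matrix; re-apply the mask after each gradient step
    # so that pruned connections stay at zero during training.
    mask = cantor_mask(27, 27, depth=2)
    W = np.random.randn(27, 27) * mask
    print("average connectivity:", mask.mean())
    ```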

  • Switched Diffusion Analog Memory for Neural Networks with Hebbian Learning Function and Its Linear Operation

    Hyosig WON  Yoshihiro HAYAKAWA  Koji NAKAJIMA  Yasuji SAWADA  

     
    PAPER

      Vol: E79-A No:6  Page(s): 746-751

    We have fabricated a new analog memory for integrated artificial neural networks. Several attempts have been made to obtain a linear characteristic in floating-gate analog memories by means of feedback circuits, but such a learning chip must carry a large number of learning control circuits. In this paper, we propose a new analog memory, SDAM, with three cascaded TFTs. The new analog memory has a simple design, a small area occupancy, a fast switching speed and accurate linearity. To achieve this accurate linearity, we propose a new charge-transfer process. The device has a tunnel junction (poly-Si/poly-Si oxide/poly-Si sandwich structure), a thin-film transistor, two capacitors, and a floating-gate MOSFET. The diffusion of the charges injected through the tunnel junction is controlled by the source-follower operation of a thin-film transistor (TFT). In the proposed operation, the amount of transferred charge is constant, independent of the charge already held in the storage capacitor.
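
    A minimal behavioral sketch of the write scheme claimed here, assuming an idealized constant charge packet per pulse; `C_STORE` and `DQ` are invented device values, not taken from the paper.

    ```python
    # Each write pulse transfers a fixed charge packet, so the stored
    # voltage is linear in the pulse count -- the linearity claimed above.
    C_STORE = 1e-12   # storage capacitance [F] (assumed)
    DQ      = 5e-15   # charge transferred per pulse [C] (assumed)

    def write(v_stored, n_pulses, sign=+1):
        """Ideal constant-charge transfer: the voltage step per pulse is
        independent of the charge already stored."""
        return v_stored + sign * n_pulses * DQ / C_STORE

    v = 0.0
    for step in range(5):
        v = write(v, 10)
        print(f"after {10 * (step + 1)} pulses: {v * 1e3:.1f} mV")
    ```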

  • Limit Cycles of One-Dimensional Neural Networks with the Cyclic Connection Matrix

    Cheol-Young PARK  Yoshihiro HAYAKAWA  Koji NAKAJIMA  Yasuji SAWADA  

     
    PAPER

      Vol: E79-A No:6  Page(s): 752-757

    In this paper, a simple method for investigating the dynamics of continuous-time neural networks is described, based on the force (kinetic vector) derived from the equation of motion for the networks rather than on the energy function of the system. The number of equilibrium points and limit cycles of one-dimensional neural networks with an asymmetric cyclic connection matrix has been investigated experimentally by this method. Some types of equilibrium points and limit cycles have been analyzed theoretically. The relations between the properties of the limit cycles and the number of connections have also been discussed.
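
    As a rough illustration of the approach, the sketch below integrates a standard continuous-time network model, du/dt = -u + W tanh(u) (an assumption; the abstract does not give the paper's exact equation of motion), on a ring with an asymmetric cyclic connection matrix and checks whether the kinetic vector drives the state to an equilibrium point or a limit cycle.

    ```python
    import numpy as np

    # Odd-length ring with one-way inhibitory coupling: a classic
    # configuration that yields a limit cycle rather than a fixed point.
    N, g, dt = 5, 2.0, 0.01
    W = -g * np.roll(np.eye(N), 1, axis=1)   # asymmetric cyclic matrix

    u = 0.1 * np.random.randn(N)
    trace = []
    for _ in range(20000):
        force = -u + W @ np.tanh(u)          # the "kinetic vector"
        u = u + dt * force                   # Euler step
        trace.append(u[0])

    # A sustained oscillation at late times indicates a limit cycle.
    print("late-time swing of u[0]:", max(trace[-2000:]) - min(trace[-2000:]))
    ```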

  • Fluxoid Transmission Line Using Series Josephson Junctions

    Hiroshi TAMAYAMA  Tsutomu YAMASHITA  Yutaka ONODERA  Yasuji SAWADA  

     
    PAPER-Other Devices

      Vol: E64-E No:11  Page(s): 724-729

    A discrete Josephson-junction transmission line which has N series Josephson junctions in each loop is discussed by computer simulation. It is found that a single quantum in this line can be confined within about one loop even if the inductance of each loop becomes negligibly small, i.e., even as the contribution of flux to the fluxoid decreases. In this line, very small fluxoid quanta carrying little flux can be employed as information bits. Resistive Josephson transmission lines, whose loops have resistances in series, are also discussed. In the resistive lines various operations are possible because of the relaxation of the quantization conditions.
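
    For a feel of such a simulation, here is a minimal sketch using the standard discrete sine-Gordon model of a Josephson transmission line (a simplification: the paper's loops contain N series junctions, and all parameters below are illustrative). A single fluxoid quantum is set up as a 2π kink in the junction phases and stays localized over a few loops.

    ```python
    import numpy as np

    M, dt = 200, 0.02
    coupling, damping = 1.0, 0.05
    x = np.arange(M)
    phi = 4.0 * np.arctan(np.exp(-(x - 50) / 3.0))   # one 2*pi kink
    vel = np.zeros(M)

    for _ in range(5000):
        lap = np.zeros_like(phi)
        lap[1:-1] = phi[2:] - 2.0 * phi[1:-1] + phi[:-2]   # open-ended line
        acc = coupling * lap - np.sin(phi) - damping * vel
        vel += dt * acc
        phi += dt * vel

    # The kink centre is where phi crosses pi: one quantum, localized.
    print("kink position:", int(np.argmin(np.abs(phi - np.pi))))
    ```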

  • LSI Neural Chip of Pulse-Output Network with Programmable Synapse

    Shigeo SATO  Manabu YUMINE  Takayuki YAMA  Junichi MUROTA  Koji NAKAJIMA  Yasuji SAWADA  

     
    PAPER-Integrated Electronics

      Vol: E78-C No:1  Page(s): 94-100

    We have fabricated a microchip of a neural circuit with pulse representation. The neuron output is a voltage pulse train, and the synapse is a constant current source whose output is proportional to the duty ratio of the neuron output. The membrane potential is charged by the collected synaptic currents through an RC circuit, providing an analog operation similar to that of a biological neural system. We use a 4-bit SRAM as the memory for the synaptic weights. The expected I/O characteristics of the neurons and the synapses were confirmed experimentally. We have also demonstrated network operation, with the synaptic weights programmed to solve the A/D conversion problem.
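
    A behavioral sketch of the synapse-and-membrane scheme described above, assuming an ideal Euler-integrated RC membrane; `R`, `C` and `I_UNIT` are invented constants, not measured chip values.

    ```python
    import numpy as np

    R, C, dt = 1e6, 1e-9, 1e-6   # membrane RC and time step (assumed)
    I_UNIT = 1e-6                # synaptic current per unit weight at duty 1.0

    def membrane_step(v, duties, weights):
        """One Euler step: constant-current synapses scaled by the duty
        ratio of each presynaptic pulse train, integrated by the RC."""
        i_syn = I_UNIT * np.dot(weights, duties)
        return v + dt * (i_syn - v / R) / C

    # 4-bit signed weights, as stored in the chip's SRAM, scaled to [-1, 1):
    w = np.array([3, -5, 7]) / 8.0
    v = 0.0
    for _ in range(1000):
        v = membrane_step(v, duties=np.array([0.2, 0.9, 0.5]), weights=w)
    print(f"membrane potential after 1 ms: {v:.3f} V")
    ```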

  • Natural Laws and Information Processing

    Yasuji SAWADA  

     
    INVITED PAPER

      Vol: E76-C No:7  Page(s): 1064-1069

    We discuss possible new principles of information processing that utilize microscopic, semi-microscopic and macroscopic phenomena occurring in nature. We first discuss quantum-mechanical universal information processing in the microscopic world governed by quantum mechanics, then superconducting phenomena in mesoscopic systems, especially an information processing system using flux quanta. Finally, we discuss macroscopic self-organizing phenomena in biology and suggest the possibility of self-organizing devices.

  • Hardware Implementation of New Analog Memory for Neural Networks

    Koji NAKAJIMA  Shigeo SATO  Tomoyasu KITAURA  Junichi MUROTA  Yasuji SAWADA  

     
    PAPER-Integrated Electronics

      Vol: E78-C No:1  Page(s): 101-105

    We have fabricated a new analog memory with a floating gate as a key component for storing synaptic weights in integrated artificial neural networks. The new analog memory comprises a tunnel junction (poly-Si/poly-Si oxide/poly-Si sandwich structure), a thin-film transistor, two capacitors, and a floating-gate MOSFET. The diffusion of the charges injected through the tunnel junction is controlled by the switching operation of the thin-film transistor, and we therefore refer to the new analog memory as a switched diffusion analog memory (SDAM). The measured characteristics of the SDAM are a fast switching speed and an improved linearity between the potential of the floating gate and the number of pulse inputs. SDAM can be used in a neural network in which write/erase and read operations are performed simultaneously.
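
    The toy comparison below illustrates why the switched transfer matters for linearity, contrasting a self-limiting direct-injection write (the per-pulse charge shrinks as the gate charges up) with the constant-charge SDAM write; both curves use invented numbers.

    ```python
    import numpy as np

    pulses   = np.arange(0, 101)
    v_direct = 1.0 * (1 - np.exp(-pulses / 30.0))  # self-limiting injection
    v_sdam   = pulses / 100.0                      # constant charge per pulse

    for n in (10, 50, 100):
        print(f"n={n:3d}  direct={v_direct[n]:.2f} V  SDAM={v_sdam[n]:.2f} V")
    ```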

  • Fractal Neural Network Feature Selector for Automatic Pattern Recognition System

    Basabi CHAKRABORTY  Yasuji SAWADA  

     
    PAPER

      Vol: E82-A No:9  Page(s): 1845-1850

    Feature selection is an integral part of any pattern recognition system. Removal of redundant features improves the efficiency of a classifier and cuts down the cost of future feature extraction. Recently, neural network classifiers have become extremely popular compared to their counterparts from statistical theory, and some work on the use of artificial neural networks as feature selectors has already been reported. In this work a simple feature selection algorithm is proposed in which a fractal neural network, a modified version of the multilayer perceptron, is used as the feature selector. Experiments have been carried out by simulation on the IRIS and SONAR data sets. The results suggest that the algorithm with the fractal network architecture works well for removing redundant information, as measured by the classification rate. The fractal neural network requires less training time than the conventional multilayer perceptron, owing to its lower connectivity, while its performance is comparable. The ease of hardware implementation is also an attractive point in designing a feature selector with a fractal neural network.
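
    A compact sketch of such a selector, assuming a sparsely connected single-hidden-layer net and an input-weight-saliency ranking (a common criterion; the abstract does not state the paper's rule, and the data here are synthetic rather than IRIS/SONAR).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(150, 4))
    y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)   # only features 0, 2 matter

    n_hidden = 8
    mask = rng.random((4, n_hidden)) < 0.5            # sparse, fractal-like wiring
    W1 = rng.normal(size=(4, n_hidden)) * mask
    w2 = rng.normal(size=n_hidden)

    for _ in range(2000):                             # plain gradient descent
        h = np.tanh(X @ W1)
        p = 1.0 / (1.0 + np.exp(-(h @ w2)))           # sigmoid output
        g = (p - y) / len(y)                          # cross-entropy gradient
        W1 -= 0.5 * (X.T @ (np.outer(g, w2) * (1 - h**2))) * mask
        w2 -= 0.5 * (h.T @ g)

    saliency = np.abs(W1).sum(axis=1)                 # rank inputs by weight mass
    print("features ranked by saliency:", np.argsort(-saliency))
    ```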