
Author Search Result

[Author] Shun-ichi AMARI (11 hits)

1-11 of 11 hits
  • Differential and Algebraic Geometry of Multilayer Perceptrons

    Shun-ichi AMARI  Tomoko OZEKI  

     
    INVITED PAPER

    Vol: E84-A No:1  Page(s): 31-38

    Information geometry is applied to the manifold of neural networks known as multilayer perceptrons. It is important to study the entire family of networks as a geometrical manifold, because learning is represented by a trajectory in such a space. The manifold of perceptrons has a rich differential-geometrical structure, characterized by a Riemannian metric and by singularities, and an efficient learning method is proposed that exploits this structure. The parameter space of perceptrons contains many algebraic singularities, which affect the trajectories of learning. Such singularities are studied through simple models. This poses an interesting problem of statistical inference and learning in hierarchical models that include singularities.
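
    As a concrete illustration of such a singularity (a sketch with an assumed two-hidden-unit model, not the paper's analysis): in f(x) = v1*tanh(w1.x) + v2*tanh(w2.x), the Fisher information matrix that defines the Riemannian metric loses rank whenever the two hidden units coincide (w1 = w2). A minimal Python check, estimating the Fisher matrix by Monte Carlo under a Gaussian regression model:

        # Sketch: the Fisher metric of a two-hidden-unit perceptron degenerates
        # on the singular set w1 == w2 (illustrative model choice).
        import numpy as np

        def fisher_matrix(w1, w2, v1, v2, n_samples=20000, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.standard_normal((n_samples, len(w1)))
            a1, a2 = X @ w1, X @ w2
            s1, s2 = 1 - np.tanh(a1) ** 2, 1 - np.tanh(a2) ** 2   # tanh'(a)
            # gradient of the network output with respect to (w1, w2, v1, v2)
            G = np.hstack([v1 * s1[:, None] * X,
                           v2 * s2[:, None] * X,
                           np.tanh(a1)[:, None],
                           np.tanh(a2)[:, None]])
            return G.T @ G / n_samples                            # E[grad f grad f^T]

        w = np.array([1.0, -0.5])
        print(np.linalg.matrix_rank(fisher_matrix(w, w, 0.7, 0.3)))                     # singular: rank 3 of 6
        print(np.linalg.matrix_rank(fisher_matrix(w, np.array([0.2, 0.9]), 0.7, 0.3)))  # regular: rank 6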

  • A Cascade Neural Network for Blind Signal Extraction without Spurious Equilibria

    Ruck THAWONMAS  Andrzej CICHOCKI  Shun-ichi AMARI  

     
    PAPER-Neural Networks

    Vol: E81-A No:9  Page(s): 1833-1846

    We present a cascade neural network for blind source extraction. We propose a family of unconstrained optimization criteria, from which we derive a learning rule that can extract a single source signal from a linear mixture of source signals. To prevent the newly extracted source signal from being extracted again in the next processing unit, we propose another unconstrained optimization criterion that uses knowledge of this signal. From this criterion, we then derive a learning rule that deflates from the mixture the newly extracted signal. By virtue of blind extraction and deflation processing, the presented cascade neural network can cope with a practical case where the number of mixed signals is equal to or larger than the number of sources, with the number of sources not known in advance. We prove analytically that the proposed criteria both for blind extraction and deflation processing have no spurious equilibria. In addition, the proposed criteria do not require whitening of mixed signals. We also demonstrate the validity and performance of the presented neural network by computer simulation experiments.
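
    As a rough illustration of the extract-then-deflate cascade (a sketch only: a stand-in kurtosis rule on whitened data is used here, whereas the proposed criteria are different and require no whitening):

        # Illustrative cascade: extract one source, then deflate it from the
        # mixture so that the next unit cannot extract it again.
        import numpy as np

        def extract_one(x, n_iter=200, seed=0):
            """One-unit extraction from whitened mixtures x (channels x samples)
            using a simple kurtosis-based fixed-point rule."""
            rng = np.random.default_rng(seed)
            w = rng.standard_normal(x.shape[0])
            w /= np.linalg.norm(w)
            for _ in range(n_iter):
                y = w @ x
                w = (x * y ** 3).mean(axis=1) - 3.0 * w
                w /= np.linalg.norm(w)
            return w @ x

        def deflate(x, y):
            """Remove the extracted signal y from every mixture channel by
            least-squares regression."""
            b = (x @ y) / (y @ y)
            return x - np.outer(b, y)

        def cascade_extract(x, n_units):
            sources = []
            for _ in range(n_units):
                y = extract_one(x)
                sources.append(y)
                x = deflate(x, y)
            return np.array(sources)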

  • Single-Trial Magnetoencephalographic Data Decomposition and Localization Based on Independent Component Analysis Approach

    Jianting CAO  Noboru MURATA  Shun-ichi AMARI  Andrzej CICHOCKI  Tsunehiro TAKEDA  Hiroshi ENDO  Nobuyoshi HARADA  

     
    PAPER-Nonlinear Problems

    Vol: E83-A No:9  Page(s): 1757-1766

    Magnetoencephalography (MEG) is a powerful and non-invasive technique for measuring human brain activity with high temporal resolution. The motivation for studying MEG data analysis is to extract the essential features from the measured data and relate them to the underlying brain functions. In this paper, a novel MEG data analysis method based on the independent component analysis (ICA) approach, with pre-processing and post-processing multistage procedures, is proposed. Moreover, several kinds of ICA algorithms are investigated for analyzing single-trial MEG data recorded in a phantom experiment. The results are presented to illustrate the effectiveness and high performance of the method, both in source decomposition by ICA approaches and in source localization by the equivalent current dipole fitting method.
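
    A schematic of such a multistage procedure (illustrative only; scikit-learn's FastICA is used as a stand-in for the ICA algorithms investigated in the paper):

        # Pre-processing (centering), ICA decomposition, and hand-off of the
        # components to equivalent-current-dipole fitting (sketch).
        import numpy as np
        from sklearn.decomposition import FastICA

        def decompose_single_trial(trial, n_components=10):
            """trial: single-trial MEG data of shape (n_sensors, n_samples)."""
            x = trial - trial.mean(axis=1, keepdims=True)   # remove the DC offset
            ica = FastICA(n_components=n_components, whiten="unit-variance")
            sources = ica.fit_transform(x.T).T              # (n_components, n_samples)
            field_maps = ica.mixing_                        # one sensor pattern per component
            return sources, field_maps

        # Post-processing (sketch): rank the components by power, keep the strongest,
        # and fit an equivalent current dipole to each retained field map.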

  • Information Geometry of Neural Networks

    Shun-ichi AMARI  

     
    INVITED PAPER

    Vol: E75-A No:5  Page(s): 531-536

    Information geometry is a new and powerful method of the information sciences. Here it is applied to manifolds of neural networks of various architectures. A new theoretical approach is proposed to the manifold of feedforward neural networks, the manifold of Boltzmann machines, and the manifold of recurrent neural networks. This opens a new direction of study: a family of neural networks as a whole, rather than the behavior of single neural networks.

  • Approximate Maximum Likelihood Source Separation Using the Natural Gradient

    Seungjin CHOI  Andrzej CICHOCKI  Liqing ZHANG  Shun-ichi AMARI  

     
    PAPER-Digital Signal Processing

    Vol: E86-A No:1  Page(s): 198-205

    This paper addresses a maximum likelihood method for source separation in the case of overdetermined mixtures corrupted by additive white Gaussian noise. We consider an approximate likelihood which is based on the Laplace approximation and develop a natural gradient adaptation algorithm to find a local maximum of the corresponding approximate likelihood. We present a detailed mathematical derivation of the algorithm using the Lie group invariance. Useful behavior of the algorithm is verified by numerical experiments.
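
    For reference, the standard natural-gradient ICA update for the square, noiseless case, W <- W + eta*(I - phi(y)y^T)W with y = Wx, is sketched below in Python; the paper extends this setting to overdetermined mixtures with additive Gaussian noise through the Laplace approximation of the likelihood.

        # Batch natural-gradient ICA for the square, noiseless case (reference
        # sketch; the score function tanh assumes super-Gaussian sources).
        import numpy as np

        def natural_gradient_ica(x, eta=0.01, n_epochs=100):
            """x: mixtures of shape (n_channels, n_samples); returns the demixing matrix W."""
            n, T = x.shape
            W = np.eye(n)
            for _ in range(n_epochs):
                y = W @ x                          # current source estimates
                C = np.tanh(y) @ y.T / T           # sample average of phi(y) y^T
                W += eta * (np.eye(n) - C) @ W     # equivariant natural-gradient step
            return W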

  • Neural Network Models for Blind Separation of Time Delayed and Convolved Signals

    Andrzej CICHOCKI  Shun-ichi AMARI  Jianting CAO  

     
    PAPER

    Vol: E80-A No:9  Page(s): 1595-1603

    In this paper we develop a new family of on-line adaptive learning algorithms for blind separation of time-delayed and convolved sources. The algorithms are derived for feedforward and fully connected feedback (recurrent) neural networks on the basis of a modified natural gradient approach. The proposed algorithms can be considered a generalization and extension of existing algorithms for instantaneous mixtures of unknown source signals. Preliminary computer simulations confirm the validity and high performance of the proposed algorithms.
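
    The convolutive model addressed here can be summarized as x(t) = sum_p A_p s(t-p), separated by an FIR demixing network y(t) = sum_p W_p x(t-p); the sketch below shows only this demixing structure in Python (the modified natural-gradient learning of the filters W_p is not reproduced).

        # Feedforward FIR demixing network for convolved mixtures (structure only).
        import numpy as np

        def fir_demix(x, W):
            """x: mixtures, shape (n_channels, n_samples).
            W: demixing filters, shape (n_taps, n_outputs, n_channels).
            Returns y(t) = sum_p W[p] x(t - p), shape (n_outputs, n_samples)."""
            n_taps, n_out, _ = W.shape
            n_samples = x.shape[1]
            y = np.zeros((n_out, n_samples))
            for p in range(n_taps):
                y[:, p:] += W[p] @ x[:, :n_samples - p]    # contribution delayed by p samples
            return y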

  • Geometry of Admissible Parameter Region in Neural Learning

    Kazushi IKEDA  Shun-ichi AMARI  

     
    PAPER-Neural Networks

    Vol: E79-A No:6  Page(s): 938-943

    In general, a learning machine behaves better as the number of training examples increases, and it is important to know how fast and how much the behavior improves. The average prediction error, i.e., the average probability that the trained machine mispredicts the output signal, is one of the most popular criteria for assessing this behavior. However, it is not easy to evaluate the average prediction error even in the simplest case, the linear dichotomy (perceptron). When a continuous deterministic dichotomy machine is trained on t examples of input-output pairs produced by a realizable teacher, these examples limit the region of the parameter space that includes the true parameter: any parameter in the region can explain the input-output behavior of the examples. Such a region, called the admissible region, in general forms a (curved) polyhedron in the parameter space, and it becomes smaller and smaller as the number of examples increases. The present paper studies the shape and volume of the admissible region by a stochastic geometrical approach, using the fact that the region is dual to the convex hull that the examples form in the example space. Since the admissible region is related to the average prediction error of the linear dichotomy, we derive new upper and lower bounds on the average prediction error.
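
    A minimal numerical illustration of the admissible region (assumed notation; the paper derives its bounds analytically by stochastic geometry): for a linear dichotomy, the region consists of all weight vectors w on the unit sphere with sign(w . x_i) equal to the teacher's label for every example, and its relative volume can be estimated by Monte Carlo sampling.

        # Monte Carlo estimate of the relative volume of the admissible region:
        # the set of unit weight vectors consistent with all t training examples.
        import numpy as np

        def admissible_volume(X, labels, n_trials=100_000, seed=0):
            """Fraction of random unit vectors w with sign(X w) == labels."""
            rng = np.random.default_rng(seed)
            W = rng.standard_normal((n_trials, X.shape[1]))
            W /= np.linalg.norm(W, axis=1, keepdims=True)   # uniform on the unit sphere
            consistent = np.sign(W @ X.T) == labels         # (n_trials, t) agreement table
            return np.all(consistent, axis=1).mean()

        rng = np.random.default_rng(1)
        w_star = rng.standard_normal(3)                     # realizable teacher, d = 3
        X10 = rng.standard_normal((10, 3))                  # t = 10 examples
        X30 = rng.standard_normal((30, 3))                  # t = 30 examples
        print(admissible_volume(X10, np.sign(X10 @ w_star)))
        print(admissible_volume(X30, np.sign(X30 @ w_star)))   # smaller: the region shrinks with t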

  • FOREWORD

    Shun-ichi AMARI  

     
    FOREWORD

    Vol: E76-A No:5  Page(s): 677-677

  • Independent Component Analysis (ICA) and Method of Estimating Functions

    Shun-ichi AMARI  

     
    INVITED PAPER-Theories

    Vol: E85-A No:3  Page(s): 540-547

    Independent component analysis (ICA) is a new method of extracting independent components from multivariate data. It can be applied to various fields such as vision and auditory signal analysis, communication systems, and biomedical and brain engineering. A number of algorithms have been proposed. The present article shows that, from the statistical point of view, most of them use estimating functions, and it gives a unified theory, based on information geometry, that elucidates the efficiency and stability of the algorithms. This yields new efficient adaptive algorithms useful for various problems.
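
    A minimal sketch of the estimating-function viewpoint (with an assumed score function; the article analyzes the general class): the demixing matrix W is characterized as a root of E[F(x, W)] = 0 with F(x, W) = (I - phi(y) y^T) W and y = W x, and an adaptive algorithm follows F sample by sample.

        # Online (stochastic-approximation) update driven by the estimating function
        # F(x, W) = (I - phi(y) y^T) W, with phi = tanh as an illustrative choice.
        import numpy as np

        def estimating_function_step(W, x, eta=0.005):
            """One adaptive update of the demixing matrix W from a single sample x."""
            y = W @ x
            F = (np.eye(len(y)) - np.outer(np.tanh(y), y)) @ W
            return W + eta * F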

  • FOREWORD

    Tosio KOGA  Shun-ichi AMARI  

     
    FOREWORD

    Vol: E75-A No:5  Page(s): 529-530

  • FOREWORD

    Shun-ichi AMARI  

     
    FOREWORD

    Vol: E79-A No:10  Page(s): 1521-1521