
Keyword Search Result

[Keyword] information-spectrum (8 hits)

  • A Fundamental Inequality for Lower-Bounding the Error Probability for Classical and Classical-Quantum Multiple Access Channels and Its Applications

    Takuya KUBO  Hiroshi NAGAOKA  

     
    PAPER-Shannon Theory

      Vol:
    E98-A No:12
      Page(s):
    2376-2383

    In the study of the capacity problem for multiple access channels (MACs), a lower bound on the error probability obtained by Han plays a crucial role in the converse parts of several kinds of channel coding theorems in the information-spectrum framework. Recently, Yagi and Oohama showed a tighter bound than the Han bound by means of Polyanskiy's converse. In this paper, we give a new bound which generalizes and strengthens the Yagi-Oohama bound, and demonstrate that this bound plays a fundamental role in deriving extensions of several known bounds. In particular, the Yagi-Oohama bound is generalized in two different directions, i.e., to general input distributions and to general encoders. In addition, we extend these bounds to quantum MACs and apply them to converse problems in several information-spectrum settings.
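    For orientation only (standard background, not taken from the paper): the Han-type MAC bounds discussed above generalize the single-user Verdú-Han converse lemma, which lower-bounds the error probability of any $(n, M, \varepsilon)$ channel code as
    \[ \varepsilon \;\ge\; \Pr\!\left[ \frac{1}{n}\, i(X^n; Y^n) \le \frac{1}{n}\log M - \gamma \right] - e^{-n\gamma} \qquad \text{for every } \gamma > 0, \]
    where $i(x^n; y^n) = \log \frac{W^n(y^n \mid x^n)}{P_{Y^n}(y^n)}$ is the information density and $X^n$ is the input induced by the code. The MAC versions impose analogous constraints for each subset of senders.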

  • Redundancy-Optimal FF Codes for a General Source and Its Relationships to the Rate-Optimal FF Codes

    Mitsuharu ARIMURA  Hiroki KOGA  Ken-ichi IWATA  

     
    PAPER-Source Coding

      Vol:
    E96-A No:12
      Page(s):
    2332-2342

    In this paper we consider fixed-to-fixed length (FF) coding of a general source X with vanishing error probability and define two kinds of optimalities with respect to the coding rate and the redundancy, where the redundancy is defined as the difference between the coding rate and the symbolwise ideal codeword length. We first show that the infimum achievable redundancy coincides with the asymptotic width W(X) of the entropy spectrum. Next, we consider the two sets $\mathcal{C}_H(X)$ and $\mathcal{C}_W(X)$ and investigate relationships between them, where $\mathcal{C}_H(X)$ and $\mathcal{C}_W(X)$ denote the sets of all the optimal FF codes with respect to the coding rate and the redundancy, respectively. We give two necessary and sufficient conditions corresponding to $\mathcal{C}_H(X) \subseteq \mathcal{C}_W(X)$ and $\mathcal{C}_W(X) \subseteq \mathcal{C}_H(X)$, respectively. We can also show the existence of an FF code that is optimal with respect to both the redundancy and the coding rate.
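    A sketch of the presumably intended notation (standard in the information-spectrum literature, assumed here rather than quoted from the paper): the asymptotic width of the entropy spectrum is the gap between the spectral sup- and inf-entropy rates,
    \[ W(X) = \overline{H}(X) - \underline{H}(X), \qquad \overline{H}(X) = \operatorname*{p\text{-}limsup}_{n\to\infty} \frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}, \quad \underline{H}(X) = \operatorname*{p\text{-}liminf}_{n\to\infty} \frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}, \]
    where p-limsup and p-liminf denote the limit superior and inferior in probability.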

  • On the Achievable Rate Region in the Optimistic Sense for Separate Coding of Two Correlated General Sources

    Hiroki KOGA  

     
    PAPER-Source Coding

      Vol:
    E95-A No:12
      Page(s):
    2100-2106

    This paper is concerned with coding theorems in the optimistic sense for separate coding of two correlated general sources X1 and X2. We investigate the achievable rate region Ropt(X1,X2) under which the decoding error probability caused by the two encoders and one decoder can be made arbitrarily small infinitely often under a certain rate constraint. We give inner and outer bounds on Ropt(X1,X2), where the outer bound is described in terms of new information-theoretic quantities. We also give two simple sufficient conditions under which the inner bound coincides with the outer bound.
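    For reference (background from the information-spectrum literature, not a statement from this abstract), the ordinary, non-optimistic counterpart of this region is the Miyake-Kanaya generalization of the Slepian-Wolf region for general correlated sources:
    \[ R_1 \ge \overline{H}(X_1 \mid X_2), \qquad R_2 \ge \overline{H}(X_2 \mid X_1), \qquad R_1 + R_2 \ge \overline{H}(X_1, X_2), \]
    where $\overline{H}(\cdot \mid \cdot)$ and $\overline{H}(\cdot,\cdot)$ denote the spectral conditional and joint sup-entropy rates. The optimistic region Ropt(X1,X2) studied above instead requires reliability only along some subsequence of block lengths.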

  • Four Limits in Probability and Their Roles in Source Coding

    Hiroki KOGA  

     
    PAPER-Source Coding

      Vol:
    E94-A No:11
      Page(s):
    2073-2082

    In the information-spectrum methods proposed by Han and Verdú, quantities defined by using the limit superior (or inferior) in probability play crucial roles in many problems in information theory. In this paper, we introduce two nonconventional quantities defined in probabilistic ways. After clarifying basic properties of these quantities, we show that the two quantities have operational meaning in the ε-coding problem of a general source in both the ordinary and optimistic senses. The two quantities can be used not only for obtaining variations of the strong converse theorem but also for establishing upper and lower bounds on the width of the entropy spectrum. We also show that the two quantities are expressed in terms of the smooth Rényi entropy of order zero.
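    For context, the two conventional limits referred to here are the Han-Verdú limit superior and inferior in probability of a sequence of real-valued random variables {Zn} (standard background; the two nonconventional quantities are specific to the paper and not reproduced here):
    \[ \operatorname*{p\text{-}limsup}_{n\to\infty} Z_n = \inf\bigl\{\alpha : \lim_{n\to\infty}\Pr[Z_n > \alpha] = 0\bigr\}, \qquad \operatorname*{p\text{-}liminf}_{n\to\infty} Z_n = \sup\bigl\{\beta : \lim_{n\to\infty}\Pr[Z_n < \beta] = 0\bigr\}. \]
    Applied to $Z_n = \frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}$, these yield the spectral sup- and inf-entropy rates whose difference is the width of the entropy spectrum.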

  • A New Unified Method for Fixed-Length Source Coding Problems of General Sources

    Tomohiko UYEMATSU  

     
    PAPER-Source Coding

      Vol:
    E93-A No:11
      Page(s):
    1868-1877

    This paper establishes a new unified method for fixed-length source coding problems of general sources. Specifically, we introduce an alternative definition of the smooth Rényi entropy of order zero, and show a unified approach to expressing the fixed-length coding rate in terms of this information quantity. Our definition of the smooth Rényi entropy has a clear operational meaning, and hence is easy to calculate for finite block lengths. Further, we represent various ε-source coding rates and the strong converse property for general sources in terms of the smooth Rényi entropy, and compare them with the results obtained by Han and by Renner et al.
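    As a rough reminder (the standard Renner-Wolf definition, assumed here rather than quoted from the paper), the smooth Rényi entropy of order zero of $X^n$ with smoothing parameter δ is
    \[ H_0^{\delta}(X^n) = \min_{A \,:\, P_{X^n}(A) \ge 1-\delta} \log |A|, \]
    i.e., the logarithm of the smallest set carrying all but δ of the probability; normalizing by n and taking limits connects it to the ε-fixed-length coding rate discussed above.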

  • Large Deviation Theorems Revisited: Information-Spectrum Approach

    Te-Sun HAN  

     
    PAPER-Information Theory

      Vol:
    E91-A No:10
      Page(s):
    2704-2719

    In this paper we present a new look at large deviation theorems from the viewpoint of information-spectrum (IS) methods, which were first exploited in information theory, and also demonstrate a new basic formula for the general large deviation rate function, expressed as a pair of lower and upper IS rate functions. In particular, we are interested in establishing the general large deviation rate functions that are derivable as the Fenchel-Legendre transform of the cumulant generating function. The final goal is to show, under a mild condition, a necessary and sufficient condition for the IS rate function to be derivable as the Fenchel-Legendre transform of the cumulant generating function, i.e., to be a rate function of Gärtner-Ellis type.
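    For reference, the classical Gärtner-Ellis form mentioned at the end is (standard large-deviation background, not specific to this paper): with the normalized cumulant generating function
    \[ \Lambda(\theta) = \lim_{n\to\infty} \frac{1}{n}\log \mathbb{E}\!\left[ e^{n\theta Z_n} \right], \]
    the rate function of Gärtner-Ellis type is its Fenchel-Legendre transform
    \[ I(x) = \sup_{\theta \in \mathbb{R}} \bigl( \theta x - \Lambda(\theta) \bigr). \]
    The paper asks when the IS rate function admits exactly this representation.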

  • New Results on Optimistic Source Coding

    Naoki SATO  Hiroki KOGA  

     
    LETTER-Information Theory

      Vol:
    E87-A No:10
      Page(s):
    2577-2580

    Optimistic coding is coding in which we require only the existence of reliable codes for infinitely many block lengths. In this letter we consider optimistic source coding theorems for a general source Z from the information-spectrum approach. We first give a precise formulation of the problem. We then obtain the optimistic infimum achievable source coding rate Tε(Z) for the case where the decoding error probability εn is asymptotically less than or equal to an arbitrarily given ε ∈ [0,1). In fact, Tε(Z) turns out to be expressed in a form similar to the ordinary infimum achievable source coding rate. A new expression for Tε(Z) is also given. In addition, we investigate the case where εn = 0 for infinitely many n and obtain the infimum achievable coding rate.
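    For comparison, a rough sketch of the ordinary (non-optimistic) counterpart from the information-spectrum literature, stated here only up to boundary subtleties and not quoted from the letter: the ordinary infimum ε-achievable fixed-length coding rate of a general source Z is essentially
    \[ R(\varepsilon \mid Z) = \inf\Bigl\{ R : \limsup_{n\to\infty} \Pr\Bigl[ \tfrac{1}{n}\log\tfrac{1}{P_{Z^n}(Z^n)} > R \Bigr] \le \varepsilon \Bigr\}, \]
    and the abstract states that Tε(Z) admits an expression of a similar form.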

  • A Generalization of the Simmons' Bounds on Secret-Key Authentication Systems

    Hiroki KOGA  

     
    LETTER-Cryptography and Information Security

      Vol:
    E83-A No:10
      Page(s):
    1983-1986

    This paper analyzes a generalized secret-key authentication system from the viewpoint of information-spectrum methods. In the generalized secret-key authentication system, for each n ≥ 1 a legitimate sender transmits a cryptogram Wn to a legitimate receiver sharing a key En in the presence of an opponent who tries to cheat the legitimate receiver. Generalized versions of Simmons' bounds on the success probabilities of the impersonation attack and a certain kind of substitution attack are obtained.
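    For context, Simmons' classical bound being generalized here (standard background, not quoted from the letter) lower-bounds the success probability P_I of the impersonation attack via the mutual information between the cryptogram W and the key E:
    \[ P_I \;\ge\; 2^{-I(W;E)}. \]
    The letter derives information-spectrum analogues of this bound, and of a substitution-attack bound, for the generalized system with sequences Wn and En.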