Keyword Search Result

[Keyword] smooth Renyi entropy (2 hits)

Showing hits 1-2 of 2
  • Four Limits in Probability and Their Roles in Source Coding

    Hiroki KOGA

    PAPER-Source Coding
    Vol: E94-A, No: 11, Page(s): 2073-2082

    In information-spectrum methods proposed by Han and Verdu, quantities defined by using the limit superior (or inferior) in probability play crucial roles in many problems in information theory. In this paper, we introduce two nonconventional quantities defined in probabilistic ways. After clarifying basic properties of these quantities, we show that they have operational meanings in the ε-coding problem of a general source in the ordinary and optimistic senses. The two quantities can be used not only for obtaining variations of the strong converse theorem but also for establishing upper and lower bounds on the width of the entropy spectrum. We also show that the two quantities can be expressed in terms of the smooth Renyi entropy of order zero.
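
    For reference, the limits in probability that this abstract builds on are, in the standard information-spectrum notation of Han, usually written as follows; this is only a sketch of the conventional definitions, not the two nonconventional quantities the paper itself introduces:

    $$\text{p-}\limsup_{n\to\infty} Z_n := \inf\Bigl\{\alpha : \lim_{n\to\infty}\Pr\{Z_n > \alpha\} = 0\Bigr\}, \qquad \text{p-}\liminf_{n\to\infty} Z_n := \sup\Bigl\{\beta : \lim_{n\to\infty}\Pr\{Z_n < \beta\} = 0\Bigr\}.$$

    Applied to the normalized self-information $Z_n = \tfrac{1}{n}\log\tfrac{1}{P_{X^n}(X^n)}$ of a general source, these yield the spectral sup- and inf-entropy rates that appear throughout source coding theorems of this type.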

  • A New Unified Method for Fixed-Length Source Coding Problems of General Sources

    Tomohiko UYEMATSU

    PAPER-Source Coding
    Vol: E93-A, No: 11, Page(s): 1868-1877

    This paper establishes a new unified method for fixed-length source coding problems of general sources. Specifically, we introduce an alternative definition of the smooth Renyi entropy of order zero, and show a unified approach that expresses the fixed-length coding rate in terms of this information quantity. Our definition of the smooth Renyi entropy has a clear operational meaning, and hence it is easy to calculate for finite block lengths. Further, we express various ε-source coding rates and the strong converse property for general sources in terms of the smooth Renyi entropy, and compare the results with those obtained by Han and by Renner et al.
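
    For reference, the smooth Renyi entropy of order zero referred to in both abstracts is commonly defined, following Renner and Wolf, as below; the paper's own alternative definition may differ in form, so this is only the standard version for orientation:

    $$H_0^{\varepsilon}(X) := \min_{\substack{\Omega \subseteq \mathcal{X} \\ \Pr\{X \in \Omega\} \,\ge\, 1-\varepsilon}} \log \bigl|\{x \in \Omega : P_X(x) > 0\}\bigr|,$$

    i.e., the logarithm of the size of the smallest support set capturing probability at least $1-\varepsilon$. Its usual operational meaning is that the minimal length of a fixed-length code recovering $X$ with error probability at most $\varepsilon$ equals $H_0^{\varepsilon}(X)$ up to small additive terms, which is why coding rates of general sources can be written in terms of this quantity.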