
Author Search Result

[Author] Tetsunao MATSUTA (14 hits)

1-14 of 14 hits
  • Joint Channel Coding and Intrinsic Randomness

    Tomohiko UYEMATSU  Tetsunao MATSUTA  

     
    PAPER-Shannon theory

      Vol:
    E101-A No:12
      Page(s):
    2091-2098

    This paper considers joint channel coding and random number generation from the channel output. Specifically, we want to transmit a message to a receiver reliably and, at the same time, have the receiver extract pure random bits that are independent of the channel input. We call this problem the joint channel coding and intrinsic randomness problem. For general channels, we clarify the trade-off between the coding rate and the rate of random bits extracted from the channel output by means of the achievable rate region, in which both the probability of decoding error and the approximation error of the random bits asymptotically vanish. We also reveal the achievable rate regions for stationary memoryless channels, additive channels, symmetric channels, and mixed channels. (The two achievability requirements are sketched below.)
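
    As a hedged illustration of the two requirements (the notation here is ours, not taken from the paper): for block length n, write M for the message, X^n for the channel input, Y^n for the channel output, \hat{M} for the decoded message, and K = f(Y^n) for the extracted random number of rate R. A common way to state the requirements, with the variational distance d(\cdot,\cdot) as the approximation measure, is

        \Pr\{\hat{M} \neq M\} \to 0, \qquad d\bigl(P_{K X^n},\, P_{\mathrm{unif}} \times P_{X^n}\bigr) \to 0 \quad (n \to \infty),

    where P_{\mathrm{unif}} is the uniform distribution on \{1, \dots, 2^{nR}\}; the second condition captures both the purity of the extracted bits and their independence of the channel input.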

  • Non-Asymptotic Bounds and a General Formula for the Rate-Distortion Region of the Successive Refinement Problem

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon theory

      Vol:
    E101-A No:12
      Page(s):
    2110-2124

    In the successive refinement problem, a fixed-length sequence emitted from an information source is encoded into two codewords by two encoders in order to produce two reconstructions of the sequence. One reconstruction is obtained from one of the two codewords, and the other is obtained from both codewords. For this coding problem, we give non-asymptotic inner and outer bounds on the pairs of numbers of codewords of the two encoders such that each probability that the distortion exceeds a given distortion level is less than a given probability level (this requirement is written out below). We also give a general formula for the rate-distortion region for general sources, where the rate-distortion region is the set of rate pairs of the two encoders such that each maximum value of possible distortions is less than a given distortion level.
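
    In symbols (our notation, for illustration only): with source sequence X^n, reconstructions \hat{X}_1^n and \hat{X}_2^n, distortion measures d_1 and d_2, distortion levels D_1 and D_2, and probability levels \varepsilon_1 and \varepsilon_2, the non-asymptotic requirement reads

        \Pr\{ d_1(X^n, \hat{X}_1^n) > D_1 \} \le \varepsilon_1, \qquad \Pr\{ d_2(X^n, \hat{X}_2^n) > D_2 \} \le \varepsilon_2,

    and the inner and outer bounds constrain the attainable pairs (M_1, M_2) of numbers of codewords of the two encoders under these constraints.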

  • Parametric Forms of the Achievable Rate Region for Source Coding with a Helper

    Tetsunao MATSUTA  Tomohiko UYEMATSU  Ryutaroh MATSUMOTO  

     
    LETTER-Information Theory

      Vol:
    E95-A No:12
      Page(s):
    2493-2497

    Source coding with a helper is one of the most fundamental fixed-length source coding problems for correlated sources. For this problem, Wyner and Ahlswede-Korner showed the achievable rate region, which is the set of rate pairs of the encoders such that the probability of error can be made arbitrarily small for sufficiently large block length. However, their expression of the achievable rate region consists of the sum of infinitely many sets, so it is not useful for computing the region. This paper deals with correlated sources whose conditional distribution is related by a binary-input output-symmetric channel, and gives a parametric form of the achievable rate region so that the region can be computed easily. (The classical characterization is recalled below.)
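
    For reference, the classical Wyner-Ahlswede-Korner characterization for stationary memoryless correlated sources (X, Y), where X is to be reproduced losslessly at rate R_1 and the helper observing Y sends at rate R_2, is the set of pairs (R_1, R_2) satisfying, for some auxiliary random variable U,

        R_1 \ge H(X \mid U), \qquad R_2 \ge I(Y; U), \qquad U \leftrightarrow Y \leftrightarrow X \ \text{(Markov chain)}.

    The union over all admissible choices of U is what makes the region hard to compute directly, which is the difficulty the parametric form addresses.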

  • Equivalences among Some Information Measures for Individual Sequences and Their Applications for Fixed-Length Coding Problems

    Tomohiko UYEMATSU  Tetsunao MATSUTA  

     
    PAPER-Source Coding and Data Compression

      Publicized:
    2023/08/16
      Vol:
    E107-A No:3
      Page(s):
    393-403

    This paper proposes three new information measures for individual sequences and clarifies their properties. The new measures are called the non-overlapping max-entropy, the overlapping smooth max-entropy, and the non-overlapping smooth max-entropy, respectively. These measures are related to the fixed-length coding of individual sequences. We investigate these measures and show the following three properties: (1) the non-overlapping max-entropy coincides with the topological entropy; (2) the overlapping smooth max-entropy and the non-overlapping smooth max-entropy coincide with the Ziv-entropy; (3) when an individual sequence is drawn from an ergodic source, the overlapping smooth max-entropy and the non-overlapping smooth max-entropy coincide with the entropy rate of the source. Further, we apply these information measures to the fixed-length coding of individual sequences and propose some new universal coding schemes that are asymptotically optimal. (A standard definition of the topological entropy of a sequence is recalled below.)
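
    As background, one standard definition of the topological entropy of an individual sequence x = x_1 x_2 ... over a finite alphabet uses the number c_x(\ell) of distinct substrings of length \ell appearing in x:

        H_{\mathrm{top}}(x) = \lim_{\ell \to \infty} \frac{1}{\ell} \log_2 c_x(\ell),

    where the limit exists by Fekete's lemma since c_x(\ell + m) \le c_x(\ell) \, c_x(m). The paper's own definitions should be consulted for the exact form used there.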

  • New Non-Asymptotic Bounds on Numbers of Codewords for the Fixed-Length Lossy Compression

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Source Coding and Data Compression

      Vol:
    E99-A No:12
      Page(s):
    2116-2129

    In this paper, we deal with fixed-length lossy compression, where a fixed-length sequence emitted from the information source is encoded into a codeword, and the source sequence is reproduced from the codeword within a certain distortion. We give lower and upper bounds on the minimum number of codewords such that the probability of exceeding a given distortion level is less than a given probability. These bounds are characterized by using the α-mutual information of order infinity (one common definition is recalled below). Further, for i.i.d. binary sources, we provide numerical examples of tight upper bounds which are computable in polynomial time in the block length.
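
    For orientation only (the paper may use a particular variant), one common definition of the α-mutual information, due to Sibson, and its order-infinity limit are

        I_\alpha(X; Y) = \frac{\alpha}{\alpha - 1} \log \sum_{y} \Bigl( \sum_{x} P_X(x) \, P_{Y|X}(y|x)^{\alpha} \Bigr)^{1/\alpha}, \qquad I_\infty(X; Y) = \log \sum_{y} \max_{x:\, P_X(x) > 0} P_{Y|X}(y|x).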

  • Random-Coding Exponential Error Bounds for Channels with Action-Dependent States

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon Theory

      Vol:
    E96-A No:12
      Page(s):
    2324-2331

    Weissman introduced a coding problem for channels with action-dependent states. In this coding problem, there are two encoders and a decoder. One encoder outputs an action that affects the state of the channel. Then, the other encoder, using the channel state, encodes the message into a codeword and sends it over the channel. The decoder receives a noisy observation of the codeword and reconstructs the message. In this paper, we show an exponential error bound for channels with action-dependent states based on the random coding argument.

  • Achievable Rate Regions for Source Coding with Delayed Partial Side Information Open Access

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon Theory

      Vol:
    E102-A No:12
      Page(s):
    1631-1641

    In this paper, we consider source coding with side information that is partially used at the decoder through a codeword. We assume that there is a relative delay (or gap) in the correlation between the source sequence and the side information. We also assume that the delay is unknown but that the maximum possible delay is known to the two encoders and the decoder, where we allow this maximum to vary with the block length. For this source coding problem, we give inner and outer bounds on the achievable rate region, where the achievable rate region is the set of rate pairs of the encoders such that the decoding error probability vanishes as the block length tends to infinity. Furthermore, we clarify that the inner bound coincides with the outer bound when the maximum delay per block length converges to a constant.

  • An Information-Theoretical Analysis of the Minimum Cost to Erase Information

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon theory

      Vol:
    E101-A No:12
      Page(s):
    2099-2109

    We normally hold a lot of confidential information in hard disk drives and solid-state drives. When we want to erase such information to prevent leakage, we have to overwrite the sequence of information with a sequence of symbols independent of the information. Overwriting is needed only at positions where the overwritten symbols differ from the original symbols. Thus, the cost of overwriting, such as the number of overwritten symbols needed to erase information, is important. In this paper, we clarify the minimum cost, such as the minimum number of overwrites, to erase information under weak and strong independence criteria (written out below). The former (resp. the latter) criterion requires that the mutual information between the original sequence and the overwritten sequence, normalized (resp. not normalized) by the length of the sequences, be less than a given desired value.
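
    In symbols (our notation, for illustration): writing X^n for the original sequence, Y^n for the overwritten sequence, and \delta for the desired value, the two criteria are

        \text{weak: } \frac{1}{n} I(X^n; Y^n) \le \delta, \qquad \text{strong: } I(X^n; Y^n) \le \delta,

    and a natural cost is the number of overwritten positions, i.e., the Hamming distance d_H(X^n, Y^n) between the two sequences.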

  • Universal Slepian-Wolf Source Codes Using Low-Density Parity-Check Matrices

    Tetsunao MATSUTA  Tomohiko UYEMATSU  Ryutaroh MATSUMOTO  

     
    PAPER-Source Coding

      Vol:
    E93-A No:11
      Page(s):
    1878-1888

    Low-density parity-check (LDPC) codes have become very popular in channel coding, since they can achieve performance close to that of maximum-likelihood (ML) decoding with complexity linear in the block length. Recently, Muramatsu et al. proposed a code using LDPC matrices for Slepian-Wolf source coding, and showed that their code can achieve any point in the achievable rate region of Slepian-Wolf source coding. However, since they employed ML decoding, their decoder needs to know the probability distribution of the source. Hence, it has been an open problem whether there exists a universal code using LDPC matrices, where a universal code means one whose error probability vanishes as the block length tends to infinity for every source whose achievable rate region contains the rate pair of the encoders. In this paper, we show the existence of a universal Slepian-Wolf source code using LDPC matrices for stationary memoryless sources. (The classical rate region for such sources is recalled below.)
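
    For reference, the achievable rate region in question for a stationary memoryless pair of correlated sources (X, Y), encoded separately at rates R_1 and R_2, is the classical Slepian-Wolf region:

        R_1 \ge H(X \mid Y), \qquad R_2 \ge H(Y \mid X), \qquad R_1 + R_2 \ge H(X, Y).

    Universality means that a single LDPC-matrix code works, without knowledge of the source distribution, for every such source whose region contains the chosen rate pair (R_1, R_2).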

  • A General Formula of the Capacity Region for Multiple-Access Channels with Deterministic Feedback

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Channel Coding

      Vol:
    E94-A No:11
      Page(s):
    2105-2120

    The multiple-access channel (MAC) has become very popular in various communication systems, because multi-terminal communication systems are widely used in practice, e.g., mobile phones and P2P networks. For some MACs, it is known that feedback can enlarge the capacity region, where the capacity region is the set of rate pairs such that the error probability can be made arbitrarily small for sufficiently large block length. The capacity region for general MACs, which are not required to satisfy ergodicity and stationarity, with perfect feedback was first shown by Tatikonda and Mitter without proof, where perfect feedback means that the channel output is perfectly fed back to the senders. In this paper, we generalize Tatikonda and Mitter's result to the case of deterministic feedback, where the values of deterministic functions of past channel outputs are fed back to the senders. We show that the capacity region for general MACs with deterministic feedback can be represented by the information-spectrum formula introduced by Han and Verdu and the directed information introduced by Massey (recalled below). We also investigate the compound MAC problem, the ε-coding problem, the strong converse property, and the cost constraint problem for general MACs with deterministic feedback.
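
    For reference, Massey's directed information from the input sequence X^n to the output sequence Y^n is

        I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),

    which accumulates only the causal dependence of each output Y_i on the inputs up to time i; this causality is what makes directed information the appropriate quantity for channels with feedback.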

  • Achievable Rate Regions of Cache-Aided Broadcast Networks for Delivering Content with a Multilayer Structure

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon Theory

      Vol:
    E100-A No:12
      Page(s):
    2629-2640

    This paper deals with a broadcast network with a server and many users. The server has content files such as music and videos, and each user requests one of these files, where each file consists of several separate layers, like a file encoded by scalable video coding. Each user has a local memory, and a part of the information of the files is cached (i.e., stored) in these memories in advance of the users' requests. Using the cached information as side information, the server encodes the files based on the users' requests. It then sends a codeword through an error-free shared link over which all users receive a common codeword from the server without error. We assume that the server transmits the layers up to a certain level of the requested files, with a different transmission rate (i.e., codeword length per file size) for each level. In this paper, we focus on the region of tuples of these rates such that the layers up to any level of the requested files are recovered at the users with an arbitrarily small error probability. We give inner and outer bounds on this region.

  • An Equivalent Expression for the Wyner-Ziv Source Coding Problem Open Access

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Information Theory

      Publicized:
    2021/09/09
      Vol:
    E105-A No:3
      Page(s):
    353-362

    We consider the coding problem for lossy source coding with side information at the decoder, which is known as the Wyner-Ziv source coding problem. The goal is to find the minimum rate such that the probability of exceeding a given distortion threshold is less than a desired level. We give an equivalent expression of this minimum rate by using the chromatic number and notions of covering of a set. This allows us to analyze the coding problem in terms of graph coloring and covering. (For background, the classical asymptotic Wyner-Ziv formula is recalled below.)
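
    As background only (this is not the equivalent expression derived in the paper), the classical asymptotic Wyner-Ziv rate-distortion function for an i.i.d. source X with side information Y at the decoder, under the average-distortion criterion, is

        R_{\mathrm{WZ}}(D) = \min I(X; U \mid Y),

    where the minimum is over auxiliary random variables U with U \leftrightarrow X \leftrightarrow Y and reproduction functions g such that \mathbb{E}[d(X, g(U, Y))] \le D.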

  • Second-Order Intrinsic Randomness for Correlated Non-Mixed and Mixed Sources

    Tomohiko UYEMATSU  Tetsunao MATSUTA  

     
    PAPER-Shannon Theory

      Vol:
    E100-A No:12
      Page(s):
    2615-2628

    We consider the intrinsic randomness problem for correlated sources. Specifically, there are three correlated sources, and we want to extract two mutually independent random numbers by using two separate mappings, where each mapping converts the output sequence of one of two correlated sources into a random number. In addition, we require that the obtained pair of random numbers also be independent of the output sequence of the third source. We first show the δ-achievable rate region, which a rate pair of the two mappings must satisfy in order to keep the approximation error within δ ∈ [0,1), and the second-order achievable rate region for correlated general sources (the setup is sketched below). Then, we apply our results to non-mixed and mixed independently and identically distributed (i.i.d.) correlated sources, and reveal that the second-order achievable rate region for these sources can be represented in terms of the sum of normal distributions.
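
    In symbols (illustrative notation; the exact criterion in the paper may differ): let X^n, Y^n, Z^n be the output sequences of the three correlated sources, and let the mappings \varphi_1 and \varphi_2, of rates R_1 and R_2, produce K_1 = \varphi_1(X^n) and K_2 = \varphi_2(Y^n). A common way to state the requirement, with the variational distance d(\cdot,\cdot), is

        d\bigl( P_{K_1 K_2 Z^n},\ P_{U_1} \times P_{U_2} \times P_{Z^n} \bigr) \le \delta,

    where P_{U_1} and P_{U_2} are uniform distributions on \{1, \dots, 2^{nR_1}\} and \{1, \dots, 2^{nR_2}\}; this single condition captures the uniformity of the two random numbers, their mutual independence, and their independence of the third source.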

  • On the Wyner-Ziv Source Coding Problem with Unknown Delay

    Tetsunao MATSUTA  Tomohiko UYEMATSU  

     
    PAPER-Shannon Theory

      Vol:
    E97-A No:12
      Page(s):
    2288-2299

    In this paper, we consider the lossy source coding problem with delayed side information at the decoder. We assume that the delay is unknown but that the maximum delay is known to the encoder and the decoder, where we allow this maximum to vary with the block length. For this coding problem, we show an upper bound and a lower bound on the rate-distortion (RD) function, where the RD function is the infimum of rates of codes for which the distortion between the source sequence and the reproduction sequence satisfies a certain distortion level. We also clarify that the upper bound coincides with the lower bound when the maximum delay per block length converges to a constant. We then give a necessary and sufficient condition under which the RD function is equal to that for the case without delay. Furthermore, we give an example of a source that does not satisfy this condition.