Youhei ISHIKAWA, Toshihiro NOMOTO, Takekazu OKADA, Satoru SHINMURA, Fumio KANAYA, Shinichiro ICHIGUCHI, Toshihito UMEGAKI
A signal-to-noise enhancer with a bandwidth six times as wide as that of the conventional type is presented. A new circuit configuration, the combination of two MSSW filters having the same insertion loss over the broad band and two 90° hybrids, is effective in remarkably extending the bandwidth. The enhancement amounts to 20 dB over the operating frequency range of 1.9 GHz ± 150 MHz at temperatures from 0 to 60 degrees centigrade. This enhancer achieves FM threshold extension because the S/N is improved by 1 to 7 dB below a C/N of 9 dB. It was demonstrated that this new enhancer is effective for noise reduction in practical DBS reception.
Fumio KANAYA, Masao NAKAGAWA, Osamu HIROTA
In this paper, we define the distortion at a given complexity level, which is the dual quantity of the distortion-complexity. We prove a theorem dual to the one we previously gave on the asymptotic property of the distortion-complexity. We also give a universal data-base for fixed-rate data compression with distortion and prove its asymptotic optimality.
Let {X_k}_{k=-∞}^{∞} be a stationary and ergodic information source, where each X_k takes values in a standard alphabet A with a distance function d: A × A → [0, ∞) defined on it. For each sample sequence x = (…, x_{-1}, x_0, x_1, …) and D > 0, let the approximate D-match recurrence time be defined by R_n(x, D) = min{m ≥ n : d_n(x_1^n, x_{m+1}^{m+n}) ≤ D}, where x_i^j denotes the string x_i x_{i+1} … x_j and d_n: A^n × A^n → [0, ∞) is the metric on A^n induced by d for each n. Let R(D) be the rate-distortion function of the source {X_k}_{k=-∞}^{∞} relative to the fidelity criterion {d_n}. Then it is shown that lim sup_{n→∞} (1/n) log R_n(X, D) ≤ R(D/2) a.s.
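As a concrete illustration of the quantity the theorem bounds, the minimal Python sketch below (not taken from the paper) computes R_n(x, D) on a finite sample path; it assumes that the induced metric d_n is the per-letter average of the single-letter distance d, and the function and variable names are chosen here for illustration only.

# Hedged sketch: approximate D-match recurrence time on a finite sample path,
# assuming d_n(x^n, y^n) = (1/n) * sum_i d(x_i, y_i).
def recurrence_time(x, n, D, d):
    first_block = x[:n]                          # x_1^n
    m = n
    while m + n <= len(x):                       # candidate block x_{m+1}^{m+n}
        candidate = x[m:m + n]
        if sum(d(a, b) for a, b in zip(first_block, candidate)) / n <= D:
            return m                             # smallest m >= n with a D-match
        m += 1
    return None                                  # no D-match within this sample

hamming = lambda a, b: 0 if a == b else 1
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1]
print(recurrence_time(x, n=4, D=0.25, d=hamming))    # -> 6

The theorem then says that, along almost every sample path, log R_n(x, D) grows no faster than n · R(D/2).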
Decision tree design is an important issue because decision trees have many applications, for example in fault diagnosis and character recognition. This paper describes an algorithm for designing a probabilistic decision tree, in which a test applied to a state produces different outcomes with non-zero probability. The algorithm strictly minimizes the tree cost, defined as the weighted sum of the expected loss and the expected test execution time. A lower bound on the cost is utilized to reduce the excessive computing time needed to search for the exact optimum; this lower bound is derived from information theory and is transformed into an easily computable form by introducing a network model. The results of a computational experiment confirm the time efficiency of the proposed method.
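To make the cost criterion concrete, the hedged Python sketch below evaluates the cost of a given probabilistic decision tree as the weighted sum of expected loss and expected test execution time. The data structures (nested tuples for nodes and the dictionaries prior, p_outcome, test_time, loss, plus the weights w_loss and w_time) are hypothetical, and the sketch covers only cost evaluation, not the paper's exact minimization with the information-theoretic lower bound.

# Hedged sketch: cost of a probabilistic decision tree.
# A node is either ('leaf', decision) or ('test', t, {outcome: subtree}).
def tree_cost(node, prior, p_outcome, test_time, loss, w_loss=1.0, w_time=1.0):
    if node[0] == 'leaf':
        decision = node[1]                      # terminal decision: expected loss
        return w_loss * sum(prior[s] * loss[decision][s] for s in prior)
    _, t, children = node
    cost = w_time * test_time[t]                # time spent executing test t
    for outcome, subtree in children.items():
        # probability of this outcome and the renormalized posterior over states
        joint = {s: prior[s] * p_outcome[t][s].get(outcome, 0.0) for s in prior}
        p_branch = sum(joint.values())
        if p_branch == 0.0:
            continue
        posterior = {s: joint[s] / p_branch for s in joint}
        cost += p_branch * tree_cost(subtree, posterior, p_outcome,
                                     test_time, loss, w_loss, w_time)
    return cost

# Toy example: two states, one noisy binary test, then a decision at each leaf.
prior = {'ok': 0.7, 'faulty': 0.3}
p_outcome = {'t1': {'ok': {'pass': 0.9, 'fail': 0.1},
                    'faulty': {'pass': 0.2, 'fail': 0.8}}}
test_time = {'t1': 2.0}
loss = {'accept': {'ok': 0.0, 'faulty': 10.0},
        'reject': {'ok': 5.0, 'faulty': 0.0}}
tree = ('test', 't1', {'pass': ('leaf', 'accept'), 'fail': ('leaf', 'reject')})
print(tree_cost(tree, prior, p_outcome, test_time, loss))

An exact design algorithm would search over such trees for the one minimizing this cost, using a lower bound to prune branches of the search.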
We define the complexity and the distortion-complexity of an individual finite-length string from a finite set. Assuming that the string is produced by a stationary ergodic source, we prove that the distortion-complexity per source letter and its expectation approximate the rate-distortion function of this source arbitrarily closely as the length of the string grows. Furthermore, we apply this property to construct a universal data compression scheme with distortion.
A novel interpretation is given to the information-theoretic meaning of Fano's inequality from the viewpoint of the rate-risk function, which expresses the relationship between the information rate of observed data and the Bayes risk in a decision-making context.
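For reference, Fano's inequality, whose meaning is reinterpreted here, can be stated as

\[
  H(X \mid Y) \;\le\; h(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr),
  \qquad P_e = \Pr\{\hat{X}(Y) \neq X\},
\]

where h(·) is the binary entropy function and |\mathcal{X}| is the size of the alphabet of X; in this form it relates the error probability of estimating X from the observed data Y, a special case of Bayes risk, to the equivocation H(X | Y).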
Tomohiko UYEMATSU, Fumio KANAYA
This paper considers the universal coding problem for stationary ergodic sources with countably infinite alphabets. We propose modified versions of the LZ77 and LZ78 codes for sources with countably infinite alphabets. Then, we show that for any source µ with E_µ[log X_1] < ∞, both codes are asymptotically optimal, i.e., the code length per input symbol approaches the entropy rate of the source with probability one. Further, we show that the LZ77 and LZ78 codes can be modified so that both are asymptotically optimal for a family of ergodic sources satisfying Kieffer's condition.
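The Python sketch below is a plain LZ78 incremental parser over nonnegative integers, given only to illustrate the setting; it is not the paper's modified code. Each phrase is emitted as a pair (index of the longest previously parsed prefix, new symbol); over a countably infinite alphabet the new symbol itself must still be represented by an integer code such as an Elias code, which is where a moment condition of the form E_µ[log X_1] < ∞ becomes relevant.

# Hedged sketch: standard LZ78 incremental parsing over integer symbols.
def lz78_parse(seq):
    dictionary = {(): 0}          # phrase -> index; the empty phrase has index 0
    phrase = ()
    out = []
    for sym in seq:
        extended = phrase + (sym,)
        if extended in dictionary:
            phrase = extended     # keep extending the current match
        else:
            out.append((dictionary[phrase], sym))
            dictionary[extended] = len(dictionary)
            phrase = ()
    if phrase:                    # flush an unfinished final phrase
        out.append((dictionary[phrase[:-1]], phrase[-1]))
    return out

print(lz78_parse([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]))
# -> [(0, 3), (0, 1), (0, 4), (2, 5), (0, 9), (0, 2), (0, 6), (0, 5), (1, 5)]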
A data-base for data compression is universal if no prior knowledge of the source distribution is assumed in its construction, and optimal if, when the reference index into the data-base is encoded, the encoding rate achieves the optimal encoding rate for any given source: the entropy rate in the noiseless case and the rate-distortion function of the source in the semifaithful case. In the present paper, we construct a universal data-base for all stationary ergodic sources and prove the optimality of the data-base thus constructed for two typical methods of referring to the data-base: one is a block-shift type reference and the other is a single-shift type reference.
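The hedged Python sketch below illustrates one plausible reading of the two referencing methods (an assumption of this note, not the paper's construction): a source block is matched against a database string within a per-letter distortion D, scanning the database either in steps of the block length ("block-shift") or one position at a time ("single-shift"); in either case only the index of the matching position would be encoded.

# Hedged sketch: referring a source block to a database under distortion D.
def find_reference(block, database, D, d, step):
    n = len(block)
    for idx, start in enumerate(range(0, len(database) - n + 1, step)):
        window = database[start:start + n]
        if sum(d(a, b) for a, b in zip(block, window)) / n <= D:
            return idx                         # index to be encoded
    return None                                # no D-match in this database

hamming = lambda a, b: 0 if a == b else 1
db = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0]
print(find_reference([1, 0, 1, 1], db, D=0.0, d=hamming, step=4))   # block-shift -> 1
print(find_reference([1, 0, 1, 1], db, D=0.0, d=hamming, step=1))   # single-shift -> 4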
A function R(L), called the rate-risk function, is introduced in the field of statistical decision theory. It specifies the minimal permissible rate R at which information about the underlying uncertainties must be conveyed to the decision maker in order to achieve a prescribed value L of the Bayes risk. Fundamental properties of the rate-risk function are also clarified.
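One standard formalization of such a function, stated here only as a hedged sketch by analogy with the rate-distortion function (it is not quoted from the paper), is

\[
  R(L) \;=\; \inf\Bigl\{\, I(X;Y) \;:\; P_{Y\mid X}\ \text{such that}\
      \min_{\delta}\ \mathbb{E}\bigl[\ell\bigl(X,\delta(Y)\bigr)\bigr] \le L \,\Bigr\},
\]

where X is the underlying uncertain state with a known prior, Y is the data conveyed to the decision maker through a channel P_{Y|X}, δ ranges over decision rules, ℓ is the loss function defining the Bayes risk, and I(X;Y) is the mutual information.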
Slepian, Wolf, and Wyner proved famous source coding theorems for correlated i.i.d. sources. On the other hand, Han and Verdú have recently established source and channel coding theorems for general sources and channels whose statistics may be arbitrary, that is, on which no assumption such as stationarity or ergodicity is imposed. We prove source coding theorems for correlated general sources by using the method that Han and Verdú developed to prove their theorems. Also, through an example, we show some new results which are essentially different from those already obtained for the i.i.d. case.