Hisao YAMAMOTO Takeo ABE Shinya NOGAMI Hironobu NAKANISHI
This paper describes carrier-scale control of IP traffic, especially VoIP traffic, and proposes algorithms for it. It examines a case that has already been introduced in the United States and discusses the trend of standardization for this control. Control techniques that will be introduced into IP networks in the future are considered from the viewpoints of both the "quality" that users receive and the "control" that carriers perform.
Patrick BRINDEL Bruno DANY Delphine ROUVILLAIN Bruno LAVIGNE Patricia GUERBER Elodie BALMEFREZOL Olivier LECLERC
In this paper, we review recent developments in the field of optical regeneration for both ultra-long-haul transmission and terrestrial networking applications. Different techniques (2R/3R) exploiting the nonlinear properties of materials and/or devices are presented, such as saturable absorbers and InP-based interferometer structures with regenerative capabilities. Principles of operation as well as system experiments are described.
Human tissues conduct electricity about as well as semiconductors. However, there are large differences between tissues, which have recently been shown to be determined mainly by tissue structure. For example, the impedance spectrum of a layered tissue such as skin is very different from that of the underlying tissues. The way in which the cells are arranged and the size of the nucleus are both important. Some of the recent developments in measurement and modelling techniques are described, and the relationship between tissue structures and impedance spectra is outlined. The illustrations and examples look at the effect of premalignant changes on localised impedance spectra measured from cervical tissues. Electrical impedance tomographic measurements on lung tissue are used to show the maturational changes of lung structure in neonates. The conclusion contains some speculation as to what further research outcomes might occur over the next few years.
This paper describes an improvement of the absorbing boundary conditions for triangle-hexagonal dual-cell grids in the time-domain method. The magnetic field components, which are evaluated from the electric fields at the circumcenters of the triangle cells, are conformed to Bérenger's perfectly matched layer absorbing boundary conditions. The electric field is linearly interpolated from the fields at the vertices. Lower reflection coefficients over the frequency range are demonstrated for both equilateral and non-equilateral triangle cells.
Akira OTSUKA Goichiro HANAOKA Junji SHIKATA Hideki IMAI
We introduce the first electronic cash scheme with unconditional security. That is, even malicious users with unlimited computational ability can neither forge a coin nor alter the user's identity secretly embedded in each coin. Meanwhile, the spender's anonymity is preserved by our new blind signature scheme based on the unconditionally secure signature proposed in [7]. The anonymity, however, is preserved only computationally, under the assumption that the Decisional Diffie-Hellman problem is intractable.
Werapon CHIRACHARIT Kosin CHAMNONGTHAI
This paper presents a method for the detection of calcification, an important early sign of breast cancer, in mammograms. Since calcifications are embedded in an inhomogeneous, noisy background, they are hard to detect. The method uses the wavelet packet transform (WPT) to eliminate the background image associated with the low-frequency components. However, very high-frequency noise coexists with the calcifications and is hard to suppress. Since calcification locations appear as vertical, horizontal, and diagonal edges in the time-frequency domain, edges in the spatial domain can be used as a filter for noise suppression. The image obtained from the inverse transform then contains only the required information. A free-response operating characteristic (FROC) curve is used to evaluate the performance of the proposed method on thirty images of calcifications. The results show an 82.19 percent true-positive detection rate at a cost of 6.73 false positives per image.
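The background-removal step described above can be illustrated with a single-branch Haar decomposition, a stand-in for the full wavelet packet transform. The signal, the number of levels, and the spike are made-up illustrative data, not from the paper: the low-frequency band is zeroed and the signal is reconstructed, leaving the sharp calcification-like feature.

```python
import math

# Single-branch Haar sketch of low-frequency background removal
# (illustrative stand-in for the paper's wavelet packet transform).
def haar_fwd(s):
    # one transform level: approximation (low-pass) and detail (high-pass)
    a = [(s[2*i] + s[2*i+1]) / math.sqrt(2) for i in range(len(s) // 2)]
    d = [(s[2*i] - s[2*i+1]) / math.sqrt(2) for i in range(len(s) // 2)]
    return a, d

def haar_inv(a, d):
    # exact inverse of one haar_fwd level
    s = []
    for ai, di in zip(a, d):
        s += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return s

def suppress_background(s, levels):
    a, details = list(s), []
    for _ in range(levels):
        a, d = haar_fwd(a)
        details.append(d)
    a = [0.0] * len(a)              # drop the low-frequency background band
    for d in reversed(details):
        a = haar_inv(a, d)
    return a

# flat background with one sharp calcification-like spike (made-up data)
signal = [10.0] * 8
signal[3] += 5.0
residual = suppress_background(signal, 3)
```

After zeroing the final approximation band, the reconstruction equals the signal minus its low-frequency component, so the spike at index 3 stands out against a near-zero residual background.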
Jeffrey C. BAMBER Paul E. BARBONE Nigel L. BUSH David O. COSGROVE Marvin M. DOYELY Frank G. FUECHSEL Paul M. MEANEY Naomi R. MILLER Tsuyoshi SHIINA Francois TRANQUART
A digest is provided of work carried out at the Institute of Cancer Research to develop freehand elastography and apply it to breast investigations. Topics covered include the development of freehand elastography and its relationship to other methods, a description of the system for off-line clinical evaluation of the freehand method, comparison of the physical performances of freehand and mechanical elastography, early clinical results on 70 breast tumours, real-time imaging, quantitative elastography and psychophysical aspects of the detection and assessment of elastic lesions. Progress in developing this new medical imaging modality is occurring rapidly throughout the world and its future looks promising.
Sunao IWAKI Mitsuo TONOIKE Shoogo UENO
In this paper, we propose a method to reconstruct current distributions in the human brain from neuromagnetic measurements. The proposed method is based on the weighted lead-field synthetic (WLFS) filtering technique with the weighting factors calculated from the results of previous source space scanning. In this method, in addition to the depth normalization technique, weighting factors of the WLFS are determined by the cost values previously calculated based on the multiple signal classification (MUSIC) scan. We performed computer simulations of this method under noisy measurement conditions and compared the results to those obtained with the conventional WLFS method. The results of the simulations indicate that the proposed method is effective for the reconstruction of the current distributions in the human brain using magnetoencephalographic (MEG) measurements, even if the signal-to-noise ratio of the measured data is relatively low. We applied the proposed method to the magnetoencephalographic data obtained during a mental image processing task that included object recognition and mental rotation operations. The results suggest that the proposed method can extract the neural activity in the extrastriate visual region and the parietal region. These results are in agreement with the results of previous positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies.
Yoshiaki HORI Takeshi IKENAGA Yuji OIE
We have focused on the RIO queueing mechanism in the statistical bandwidth allocation service, which uses AF-PHB. We have studied the parameterization of RIO to achieve both high throughput and low delay, and were able to parameterize RIO for that purpose in terms of both min_th and max_p used in dropping OUT packets. Furthermore, we have also examined the parameterization of the EWMA (exponentially weighted moving average), i.e., the weight factor w_q^out, and have shown that the dropping of OUT packets should depend on the queue length without much delay, unlike in RED. Our simulation results show that our parameterization provides high throughput and also confines the queue length to a narrow range more effectively.
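The roles of min_th, max_p, and the EWMA weight can be sketched with the standard RED-style drop function; the parameter values below are hypothetical, not the tuned values from the paper.

```python
# RED/RIO-style drop function and EWMA update (illustrative parameters).
def drop_prob(avg, min_th, max_th, max_p):
    # probability of dropping an (OUT) packet given the average queue length:
    # zero below min_th, ramping linearly up to max_p at max_th
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

def ewma_update(avg, q, w):
    # exponentially weighted moving average of the instantaneous queue q;
    # a weight w close to 1 makes the average track q with little delay,
    # which is the behaviour the paper argues OUT-packet dropping needs
    return (1 - w) * avg + w * q
```

With w = 1 the average equals the instantaneous queue length, so OUT-packet dropping reacts immediately, in contrast to the slow averaging typical of RED.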
Hiroyoshi MIWA Kazunori KUMAGAI Shinya NOGAMI Takeo ABE Hisao YAMAMOTO
The explosive growth of World Wide Web usage is causing a number of performance problems, including slow response times, network congestion, and denial of service. A Web site that receives a huge number of accesses and requires a high quality of service, such as a site offering hosting or content delivery services, usually uses a cache server to reduce the load on the original server offering the original content. To increase the throughput of the caching process and to improve service availability, multiple cache servers are often positioned in front of the original server. This requires a switch to direct incoming requests to one of the multiple cache servers. In this paper, we propose a routing algorithm for such a switch in front of clustered multiple cache servers and evaluate its performance by simulation. The results show that our routing algorithm is effective when content has request locality and a short period of validity, for example, news, map data, road traffic data, or weather information. We also identify points to consider when the proposed algorithm is applied to a real system.
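The abstract does not give the routing algorithm itself, so the sketch below is a generic content-hash dispatcher illustrating the idea such a switch exploits: with request locality, mapping the same URL to the same cache server keeps each content item cached on only one server in the cluster.

```python
import hashlib

# Hypothetical content-hash request dispatcher (not the paper's algorithm):
# the same content URL always maps to the same cache server.
def route(url, servers):
    h = int(hashlib.sha256(url.encode()).hexdigest(), 16)
    return servers[h % len(servers)]

servers = ["cache1", "cache2", "cache3"]
chosen = route("/news/today", servers)
```

Deterministic hashing avoids duplicating hot items across servers; a real switch must additionally handle server failures and content expiry, which is where the paper's evaluation of validity periods comes in.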
Yuan-Sun CHU Ruey-Bin YANG Cheng-Shong WU Ming-Cheng LIANG
In a shared buffer packet switch, a good buffer management scheme is needed to reduce the overall packet loss probability and improve the fairness between different users. In this paper, a novel buffer control scheme called partial sharing and partial partitioning (PSPP) is proposed. PSPP is an adaptive scheme that can be dynamically adjusted to changing traffic conditions while remaining simple to implement. The key idea of PSPP is that part of the buffer space, proportional to the number of inactive output ports, is reserved for sharing between inactive output ports. This portion of the buffer is called the PS buffer. The residual buffer space, called the PP buffer, is partitioned and distributed equally among the active output ports. The analysis shows that only a small amount of PS buffer space needs to be reserved to obtain good performance for the entire system. Computer simulation shows that PSPP control is very robust and performs very close to the pushout (PO) buffer management scheme, which is considered optimal in terms of fairness and total loss ratio but is too complicated to implement.
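The partitioning rule described above can be sketched as follows. The proportionality factor `alpha` and the exact adaptation rule are assumptions for illustration; the abstract does not specify them.

```python
# Sketch of the PSPP buffer split: a PS pool proportional to the number
# of inactive ports is shared, and the remaining PP buffer is divided
# equally among the active ports.  `alpha` is a hypothetical scale factor.
def pspp_allocate(total, n_ports, active, alpha=1.0):
    inactive = n_ports - len(active)
    ps = int(alpha * total * inactive / n_ports)   # shared PS buffer
    pp = total - ps                                # partitioned PP buffer
    per_port = pp // max(len(active), 1)           # equal split among active ports
    return ps, {port: per_port for port in active}

ps, shares = pspp_allocate(1000, n_ports=8, active=[0, 1])
```

When every port is active the PS pool shrinks to zero and the scheme degenerates to pure partitioning, which matches the adaptive behaviour the abstract describes.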
Gergely SERES Arpad SZLAVIK Janos ZATONYI Jozsef BÍRÓ
The provisioning of QoS in the Internet is gaining increasing attention; thus the importance of methods capable of estimating the bandwidth requirement of traffic flows is constantly growing. This information can be used for a wide range of purposes: admission control, QoS routing, and load sharing all need the same basic information in order to make decisions. This paper describes a number of methods that can be used to arrive at precise estimates of the bandwidth requirement, focusing on those based on the theory of large deviations. A methodology is presented that allows earlier solutions based on estimating some form of overflow probability to be reformulated so that their output becomes a bandwidth-type quantity, the format preferred by Internet control applications. The methodology provides two tracks for the conversion: an indirect method that encapsulates the overflow-probability approach as an embedded calculation, and a direct method that immediately yields the estimate of the bandwidth requirement. The paper introduces a novel method for the direct computation of the bandwidth requirement of Internet traffic flows using the many-sources asymptotic regime of large deviations theory. The direct bandwidth estimator reduces the computational complexity of the calculations, since it yields the bandwidth requirement directly, allowing the omission of the frequent and costly computation of the buffer overflow probability. The savings arising from this reduction in computational complexity are demonstrated in a numerical example.
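The bandwidth-type quantity that large-deviation methods produce can be illustrated with the textbook effective-bandwidth function of a single on-off source. The source model and numbers below are illustrative assumptions, not the paper's many-sources estimator.

```python
import math

# Effective bandwidth alpha(s) = (1/s) * log E[exp(s*X)] of a two-state
# (on-off) source emitting at `peak` with probability `p_on`, else 0.
# This is the bandwidth-type quantity that large-deviation based
# estimators produce; `s` is the space parameter encoding how strict
# the overflow constraint is (illustrative model, not the paper's).
def effective_bandwidth(s, peak, p_on):
    return math.log(1 - p_on + p_on * math.exp(s * peak)) / s

eb = effective_bandwidth(0.5, peak=10.0, p_on=0.3)
```

The function always lies between the mean rate (as s approaches 0, i.e., a loose constraint) and the peak rate (as s grows, i.e., a strict one), which is exactly what makes it directly usable by admission control.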
In this paper we discuss how one can delegate the power to authenticate or sign documents to others, who, in turn, can delegate that power to someone else. A practical cryptographic solution is to issue a certificate that consists of one's signature; the final verifier then verifies the chain of these certificates. This paper provides an efficient and provably secure scheme that is suitable for such a delegation chain. We prove the security of our scheme against an adaptive chosen-message attack in the random oracle model. Though our primary application is agent systems in which some agents work on behalf of a user, some other applications and variants are discussed as well. One of the variants enjoys a threshold feature whereby one can delegate one's power to a group so that its members have less chance to abuse that power. Another application is an identity-based signature scheme that provides faster verification and lower communication complexity than those provided by existing certificate-based public key infrastructures.
Deukjo HONG Jaechul SUNG Shiho MORIAI Sangjin LEE Jongin LIM
In this paper, we discuss impossible differential cryptanalysis of the block cipher Zodiac. The main design principles of Zodiac are simplicity and efficiency. However, the diffusion layer in its round function is too simple to offer enough security, and impossible differential cryptanalysis exploits this weakness. Our attack, using a 14-round impossible characteristic, derives the 128-bit master key of the full 16-round Zodiac faster than exhaustive search. The efficiency of the attack relative to exhaustive search increases as the key size increases.
Masataka SUZUKI Tsutomu MATSUMOTO
We describe a scheme for secret communication over the Internet that utilizes the potential of the TCP/IP protocol suite in a non-standard way. Except for the sender and the receiver of the secret communication, it does not need any entities installed with special software. Moreover, it does not require them to share any key beforehand. These features of the scheme stem from the use of IP datagrams with spoofed source addresses and the related Internet Control Message Protocol (ICMP) error messages induced by artificial faults. Countermeasures against IP spoofing are deployed in various places, since spoofing is often used in attacks such as distributed denial of service (DDoS) and spam mailing. Thus we examine the environments in which the scheme works as intended, and also clarify the conditions that render the scheme obsolete. Furthermore, we estimate the amount of data that can be communicated secretly by the scheme, the storage requirements for the receivers, and those for observers who monitor the traffic to detect the very existence of such secret communication. We also discuss various issues including the sender anonymity achieved by the scheme.
Goichiro HANAOKA Junji SHIKATA Yuliang ZHENG Hideki IMAI
This paper addresses the problem of designing an unconditionally secure conference system that fulfills the requirements of both traceability and dynamic sender. In a so-called conference system, a common key is shared among all authorized users, and messages are encrypted using the shared key. It is known that a straightforward implementation of such a system may present a number of security weaknesses. Our particular concern lies in the possibility that unauthorized users may be able to acquire the shared key by illegal means, say from one or more authorized but dishonest users (called traitors). An unauthorized user who has successfully obtained the shared key can now decrypt scrambled messages without leaving any evidence on who the traitors were. To solve this problem, in this paper we propose a conference system that admits dynamic sender traceability. The new solution can detect traitors, even if the sender of a message is dynamically determined after a shared key is distributed to authorized users. We also prove that this scheme is unconditionally secure.
Masaharu HYODO Kazi SARWAR ABEDIN Noriaki ONODERA Kamal K. GUPTA Masayoshi WATANABE
Fourier synthesis of ultrafast optical-pulse trains was demonstrated using a simplified experimental configuration consisting of three independent continuous-wave lasers and a semiconductor optical amplifier (SOA) used as a four-wave mixer. When the three lasers were phase-locked, ultrafast optical-pulse trains were successfully generated at repetition frequencies ranging from 504 GHz to 1.8 THz with high waveform stability.
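The principle of Fourier synthesis from phase-locked CW lasers can be sketched with a simple comb-superposition model: three equally spaced, phase-locked lines beat into a pulse train whose repetition rate equals the line spacing. The 504-GHz spacing below matches the lowest repetition frequency quoted above; the optical carrier is factored out since it does not affect the intensity envelope.

```python
import math

# Intensity envelope of `lines` phase-locked comb lines spaced by df
# (the common optical carrier is factored out; it has no effect on |E|^2).
def intensity(t, df, lines=3):
    re = sum(math.cos(2 * math.pi * k * df * t) for k in range(lines))
    im = sum(math.sin(2 * math.pi * k * df * t) for k in range(lines))
    return re * re + im * im

df = 504e9          # 504-GHz line spacing -> 504-GHz pulse repetition rate
peak = intensity(0.0, df)
```

The peaks of height lines^2 recur with period 1/df, so increasing the spacing of the phase-locked lasers directly raises the repetition frequency, toward the THz regime reported above.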
Soichi FURUYA Dai WATANABE Yoichi SETO Kazuo TAKARAGI
In many cryptographic protocols, common-key encryption is used to provide a secure data-transmission channel. More precisely, the general idea is to have the encryption provide data authenticity as well as data confidentiality. In fact, quite a few ways to provide both forms of security are known; however, none of them is optimized enough to be efficient. We present a new encryption mode that uses a random number generator (RNG). Assuming the security of the RNG, we can prove not only perfect secrecy but also message authentication. The proven probability of a successful forgery is (n-1)/(2^b-1), where b is the number of bits in a block and n is the number of ciphertext blocks. The proposed scheme achieves very high practicality due to its potential efficiency advantages. When we use a computationally secure RNG, such as a pseudorandom number generator (PRNG), we gain further efficiency: in addition to parallel computation of the PRNG, the scheme requires only a single pass over the data stream, so that even limited hardware can encrypt a very long data stream. We demonstrate the practicality of our scheme by showing a realistic parameter set and evaluating its performance.
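The forgery bound stated above is easy to evaluate numerically; the block size and message length below are illustrative choices, not the paper's parameter set.

```python
# The abstract's proven forgery bound (n-1)/(2^b - 1) for b-bit blocks
# and n ciphertext blocks, evaluated at illustrative parameters.
def forgery_bound(b, n):
    return (n - 1) / (2**b - 1)

# e.g. 64-bit blocks and a message of 2**27 blocks (1 GiB of data)
bound = forgery_bound(64, 2**27)
```

Even for gigabyte-scale messages the bound stays far below 2^-36, which is why the mode remains practical for very long data streams.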
Atsuko MIYAJI Masao NONAKA Yoshinori TAKII
Various attacks against RC5 have been analyzed intensively. No known-plaintext attack has been reported that works on as many rounds as the best chosen-plaintext attacks, but a known-plaintext attack can be mounted more efficiently and practically. In this paper, we investigate a known-plaintext attack against RC5 by improving a correlation attack. As for known-plaintext attacks against RC5, the best known result is a linear cryptanalysis. Its authors reported that RC5-32 with 10 rounds can be broken with 2^64 plaintexts, under the heuristic assumption that RC5-32 with r rounds can be broken with a success probability of 90% by using 2^(6r+4) plaintexts. However, their assumption seems to be highly optimistic. Our known-plaintext correlation attack can break RC5-32 with 10 rounds (20 half-rounds) in a stricter sense with a success probability of 90% by using 2^63.67 plaintexts. Furthermore, our attack can break RC5-32 with 21 half-rounds with a success probability of 30% by using 2^63.07 plaintexts.
Katsuyuki OKEYA Kouichi SAKURAI
We present a scalar multiplication algorithm with recovery of the y-coordinate on a Montgomery-form elliptic curve over any non-binary field. Previous algorithms for scalar multiplication on the Montgomery form do not consider how to recover the y-coordinate. Thus, although they are applicable to certain restricted schemes (e.g., ECDH and ECDSA-S), other schemes (e.g., ECDSA-V and MQV) require scalar multiplication with recovery of the y-coordinate. We compare our proposed scalar multiplication algorithm with traditional scalar multiplication algorithms (including window methods on the Weierstrass form), and compare the Montgomery form with the Weierstrass form in terms of implementation performance for several elliptic curve schemes (including ECES, ECDSA, and ECMQV). Our results clarify the advantage of using Montgomery-form elliptic curves cryptographically in constrained environments such as mobile devices and smart cards.
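A toy sketch of the technique: an x-only Montgomery ladder computes x(kP) and x((k+1)P), and one published form of the recovery formula then reconstructs the y-coordinate of kP. The curve parameters, base point, and scalar below are illustrative assumptions, and the field is far too small for real cryptography.

```python
# Toy sketch (illustrative parameters, not from the paper): x-only
# Montgomery ladder plus y-coordinate recovery on the curve
# B*y^2 = x^3 + A*x^2 + x over F_p.
p = 2**61 - 1                 # a Mersenne prime, chosen for illustration
A = 3
x, y = 7, 11                  # base point P = (x, y); B is set so P lies on the curve
B = (x**3 + A*x**2 + x) * pow(y*y, p - 2, p) % p

def inv(a):
    return pow(a, p - 2, p)   # modular inverse via Fermat's little theorem

def x_dbl(xn):
    # x-coordinate of 2Q from the x-coordinate of Q
    return (xn*xn - 1)**2 * inv(4*xn*(xn*xn + A*xn + 1)) % p

def x_add(xm, xn, xd):
    # differential addition: x(Q+R) from x(Q), x(R) and x(Q-R)
    return (xm*xn - 1)**2 * inv(xd*(xm - xn)**2) % p

def ladder(k):
    # Montgomery ladder; returns (x(kP), x((k+1)P))
    r0, r1 = x, x_dbl(x)
    for bit in bin(k)[3:]:    # scan the bits of k below the leading 1
        if bit == '0':
            r0, r1 = x_dbl(r0), x_add(r0, r1, x)
        else:
            r0, r1 = x_add(r0, r1, x), x_dbl(r1)
    return r0, r1

def recover_y(x1, x2):
    # y-coordinate recovery from x(kP) = x1 and x((k+1)P) = x2
    num = (x1*x + 1)*(x1 + x + 2*A) - 2*A - (x1 - x)**2 * x2
    return num * inv(2*B*y) % p

def affine_add(P1, P2):
    # reference affine group law for checking (None is the point at infinity)
    if P1 is None: return P2
    if P2 is None: return P1
    (a1, b1), (a2, b2) = P1, P2
    if a1 == a2 and (b1 + b2) % p == 0:
        return None
    if P1 == P2:
        lam = (3*a1*a1 + 2*A*a1 + 1) * inv(2*B*b1) % p
    else:
        lam = (b2 - b1) * inv(a2 - a1) % p
    a3 = (B*lam*lam - A - a1 - a2) % p
    return (a3, (lam*(a1 - a3) - b1) % p)

k = 13
x1, x2 = ladder(k)
kP = (x1, recover_y(x1, x2))   # full point k*P recovered from x-coordinates only
ref = None
for _ in range(k):
    ref = affine_add(ref, (x, y))
```

Recovering y from the two x-coordinates the ladder already maintains is what makes the ladder usable for schemes such as ECDSA-V and MQV that need the full point, without giving up the ladder's speed and regularity.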