
Keyword Search Result

[Keyword] Ti (30728 hits)

Results 20921-20940 of 30728

  • Over 40-Gbit/s InP HEMT ICs for Optical Communication Systems

    Toshihide SUZUKI  Yasuhiro NAKASHA  Hideki KANO  Masaru SATO  Satoshi MASUDA  Ken SAWADA  Kozo MAKIYAMA  Tsuyoshi TAKAHASHI  Tatsuya HIROSE  Naoki HARA  Masahiko TAKIGAWA  

     
    INVITED PAPER

      Vol:
    E86-C No:10
      Page(s):
    1916-1922

    In this paper, we describe circuits capable of operating at more than 40 Gbit/s that we have developed using InP HEMT technology. For example, we succeeded in obtaining 43-Gbit/s operation for a full-rate 4:1 multiplexer (MUX), 50-Gbit/s operation for a demultiplexer (DEMUX), 50-Gbit/s operation for a D-type flip-flop (D-FF), and a preamplifier with a bandwidth of 40 GHz. In addition, the achievement of 90-Gbit/s operation for a 2:1 MUX and a distributed amplifier with over 110-GHz bandwidth indicates that InP HEMT technology is promising for system operation at over 100 Gbit/s. To achieve these results, we also developed several design techniques that improve the frequency response above 80 GHz, including a symmetric, separated layout of the differential elements in the basic SCFL gate and an inverted microstrip line.

  • Adaptive Rekeying for Secure Multicast

    Sandeep KULKARNI  Bezawada BRUHADESHWAR  

     
    PAPER-Security

      Vol:
    E86-B No:10
      Page(s):
    2957-2965

    In this paper, we focus on the problem of secure multicast in dynamic groups, where a group of users communicates using a shared group key. Because these groups are dynamic, preserving secrecy requires changing the group key whenever the group membership changes. While the group key is being changed, group communication must be interrupted until the rekeying is complete. This interruption is especially necessary when the rekeying is done because a user has left (or been removed). We split the rekeying cost into two parts: the cost of the critical path, in which each user receives the new group key, and the cost of the non-critical path, in which each user receives any other keys it needs to obtain. We present a family of algorithms that exhibit the tradeoff between the cost of the critical path and the cost of the non-critical path, as sketched below. Our solutions allow the group controller to choose the appropriate key distribution algorithm by considering the requirements on critical and non-critical cost, and to dynamically change the algorithm to adapt to changing application requirements. Moreover, we argue that our solutions allow the group controller to effectively manage heterogeneous groups in which users have different requirements and capabilities.
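
    As background only (the paper's own algorithm family is not reproduced here), the following Python fragment illustrates the split the abstract describes on a logical-key-hierarchy-style key tree: when a member leaves, every key on its path to the root is replaced; delivering the new root (group) key is the critical path, and the remaining path keys form the non-critical part. All names are hypothetical.

      import os

      class Node:
          """One vertex of the key tree; each subtree shares this key."""
          def __init__(self):
              self.key = os.urandom(16)
              self.left = self.right = self.parent = None

      def build_tree(depth):
          root = Node()
          if depth > 0:
              for side in ("left", "right"):
                  child = build_tree(depth - 1)
                  child.parent = root
                  setattr(root, side, child)
          return root

      def rekey_on_leave(leaf):
          """Replace every key from the leaving leaf's parent up to the root."""
          changed = []
          node = leaf.parent
          while node is not None:
              node.key = os.urandom(16)
              changed.append(node)
              node = node.parent
          return changed[-1], changed[:-1]   # (critical group key, non-critical keys)

      root = build_tree(3)                   # 8 users at the leaves
      group_key, aux_keys = rekey_on_leave(root.left.left.left)
      print(f"1 critical key, {len(aux_keys)} non-critical keys")   # -> 1 and 2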

  • Performance of a Decoding Algorithm for LDPC Codes Based on the Concave-Convex Procedure

    Tomoharu SHIBUYA  Kohichi SAKANIWA  

     
    LETTER-Coding Theory

      Vol:
    E86-A No:10
      Page(s):
    2601-2606

    In this letter, we show the effectiveness of a double-loop algorithm based on the concave-convex procedure (CCCP) in decoding linear codes. For this purpose, we numerically compare the error performance of the CCCP-based decoding algorithm with that of a conventional iterative decoding algorithm based on belief propagation (BP). We also investigate the computational complexity and its relation to the error performance.
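
    For readers unfamiliar with the procedure, the generic CCCP step may help; this is standard background, not the letter's specific decoding construction. The cost function is split into convex and concave parts, and the update solves

      E(\mathbf{x}) = E_{\mathrm{vex}}(\mathbf{x}) + E_{\mathrm{cave}}(\mathbf{x}),
      \qquad
      \nabla E_{\mathrm{vex}}\big(\mathbf{x}^{(t+1)}\big) = -\nabla E_{\mathrm{cave}}\big(\mathbf{x}^{(t)}\big),

    which decreases E monotonically; in the decoding context, the double loop applies this to a Bethe-type free energy under the code's parity constraints.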

  • Detailedly Represented Irregular Low-Density Parity-Check Codes

    Kenta KASAI  Tomoharu SHIBUYA  Kohichi SAKANIWA  

     
    PAPER-Coding Theory

      Vol:
    E86-A No:10
      Page(s):
    2435-2444

    Richardson and Urbanke developed density evolution, a powerful method that determines, for various channels, the capacity of irregular low-density parity-check (LDPC) code ensembles. We develop a generalized density evolution for minutely represented ensembles and show that it includes the conventional representation as a special case. Furthermore, we present an example of code ensembles, used over the binary erasure channel and the binary-input additive white Gaussian noise channel, that have better thresholds than highly optimized ensembles with the conventional representation.
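
    The conventional recursion that such a generalization extends is, for the binary erasure channel with erasure probability ε and the usual edge-perspective degree distributions λ, ρ (standard notation, assumed here):

      x_{\ell+1} = \varepsilon\,\lambda\!\big(1 - \rho(1 - x_\ell)\big), \qquad x_0 = \varepsilon,

    and the ensemble threshold is the largest ε for which x_ℓ → 0.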

  • Method to Generate Images for a Motion-Base in an Immersive Display Environment

    Toshio MORIYA  Haruo TAKEDA  

     
    PAPER-Image Processing, Image Pattern Recognition

      Vol:
    E86-D No:10
      Page(s):
    2231-2239

    We propose an image generation method for an immersive multi-screen environment that contains a motion ride. To allow a player to look around freely in a virtual world, a method to generate an image for an arbitrary direction is required; this technology has already been established. In our environment, the displayed images must also be updated according to the movement of the motion ride in order to keep the player's viewpoint consistent with the virtual world. In this paper, we show that this updating can be performed by a method similar to the one used to generate look-around images, and that the same data format is applicable. We then discuss the format in terms of data size and the amount of calculation needed to achieve adequate performance in our display environment, and we propose new image formats that improve on widely used formats such as the perspective and fish-eye formats.
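
    As a hedged illustration of the two widely used formats the abstract mentions (the paper's proposed formats are not reproduced), this sketch maps a viewing direction to perspective and equidistant fish-eye image coordinates:

      import numpy as np

      def perspective_project(d, f=1.0):
          """Pinhole model; valid only for directions in front of the camera (d[2] > 0)."""
          d = d / np.linalg.norm(d)
          return f * d[0] / d[2], f * d[1] / d[2]

      def fisheye_project(d, f=1.0):
          """Equidistant fish-eye: image radius is proportional to the off-axis angle."""
          d = d / np.linalg.norm(d)
          theta = np.arccos(np.clip(d[2], -1.0, 1.0))   # angle from the optical axis
          phi = np.arctan2(d[1], d[0])                  # azimuth around the axis
          return f * theta * np.cos(phi), f * theta * np.sin(phi)

      d = np.array([0.3, 0.1, 0.9])
      print(perspective_project(d), fisheye_project(d))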

  • Deformation of the Brillouin Gain Spectrum Caused by Parabolic Strain Distribution and Resulting Measurement Error in BOTDR Strain Measurement System

    Hiroshi NARUSE  Mitsuhiro TATEDA  Hiroshige OHNO  Akiyoshi SHIMADA  

     
    PAPER-Optoelectronics

      Vol:
    E86-C No:10
      Page(s):
    2111-2121

    In a Brillouin optical time domain reflectometer (BOTDR) strain measurement system, we theoretically derive the shape of the Brillouin gain spectrum produced in an optical fiber under a parabolic strain distribution, such as that formed in a uniformly loaded beam. Based on the derived result, we investigate how the parameters of the parabolic strain distribution and the measurement conditions, such as the launched pulse width and the measurement position on the beam, deform the shape of the Brillouin backscattered-light power spectrum. In addition, we investigate the strain measurement error resulting from this deformation by analyzing the peak-power frequency at which the power spectrum is maximized.
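
    For orientation, a common first-order model (standard BOTDR background, with notation assumed here rather than taken from the paper) relates the local Brillouin frequency shift linearly to strain, and models the measured spectrum at position z as a superposition of Lorentzians over the fiber section lit by the pulse:

      \nu_B(\varepsilon) = \nu_B(0) + C_\varepsilon\,\varepsilon,
      \qquad
      P(\nu, z) \propto \int_{z}^{z+\Delta z}
      \frac{(\Delta\nu_B/2)^2}{\big[\nu - \nu_B(\varepsilon(u))\big]^2 + (\Delta\nu_B/2)^2}\,du,

    so a parabolic ε(u) across the pulse-illuminated length Δz broadens and skews the spectrum, shifting its peak frequency away from the value implied by the strain at the nominal measurement position.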

  • Normalizing Syntactic Structures Using Part-of-Speech Tags and Binary Rules

    Seongyong KIM  Kong-Joo LEE  Key-Sun CHOI  

     
    PAPER

      Vol:
    E86-D No:10
      Page(s):
    2049-2056

    We propose a normalization scheme for syntactic structures that uses a binary phrase structure grammar with composite labels. The normalization adopts binary rules so that the dependency between two sub-trees can be represented in the label of the tree. The label of a tree is composed of two attributes, one extracted from each sub-tree, so that it represents the compositional information of the tree. The composite labels are generated from part-of-speech tags using an automatic labelling algorithm. Since the proposed normalization scheme is binary and uses only part-of-speech information, it can readily be used to compare the results of different syntactic analyses independently of their syntactic descriptions, and it can be applied to other languages as well. It can also be used for syntactic analysis, where it performs better than the previous syntactic description for a Korean corpus. We implemented a tool that transforms a syntactic description into a normalized one based on the proposed scheme. It can help construct a unified syntactic corpus and extract syntactic information from various types of syntactic corpora in a uniform way.
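
    A toy sketch of the composite-label idea follows; the paper's labelling algorithm is automatic and is not reproduced, and the head-attribute choice below is purely an assumption for illustration.

      class Tree:
          """Binary node whose label pairs one attribute from each sub-tree."""
          def __init__(self, left, right):
              self.left, self.right = left, right
              self.label = (attribute(left), attribute(right))

      def attribute(t):
          """A POS tag for a leaf; for an inner node, (assumed here) the
          first component of its composite label."""
          return t if isinstance(t, str) else t.label[0]

      # "a small dog", POS-tagged DT JJ NN, binarized as (DT (JJ NN))
      inner = Tree("JJ", "NN")
      outer = Tree("DT", inner)
      print(outer.label)   # -> ('DT', 'JJ')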

  • Ultrahigh-Speed InP/InGaAs DHBTs with Very High Current Density

    Minoru IDA  Kenji KURISHIMA  Noriyuki WATANABE  

     
    INVITED PAPER

      Vol:
    E86-C No:10
      Page(s):
    1923-1928

    We describe InP-based double heterojunction bipolar transistors with a 150-nm-thick collector and two types of thin pseudomorphic base. The emitter and collector layers are designed for high collector current operation. Collector current blocking is suppressed by the compositionally step-graded collector structure even at JC over 500 kA/cm2, with practical breakdown characteristics. An HBT with a 20-nm-thick base achieves a high fT of 351 GHz at a high JC of 667 kA/cm2, and a 30-nm-base HBT achieves a high value of 329 GHz for both fT and fmax at a JC of 583 kA/cm2. An equivalent circuit analysis suggests that the extremely small carrier transit delay contributes to the ultrahigh fT.
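
    As a consistency check using only textbook relations (not the paper's equivalent-circuit extraction), the cutoff frequency is the reciprocal of the total emitter-to-collector delay,

      f_T = \frac{1}{2\pi\,\tau_{EC}},
      \qquad
      \tau_{EC} = \tau_B + \tau_C + \frac{kT}{qI_C}\,(C_{BE} + C_{BC}) + (R_E + R_C)\,C_{BC},

    so fT = 351 GHz corresponds to τEC = 1/(2π × 351 GHz) ≈ 0.45 ps, which is why thin bases, a thin collector, and very high JC (small kT/qIC charging term) are pursued together.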

  • Internal-State Reconstruction of a Stream Cipher RC4

    Yoshiaki SHIRAISHI  Toshihiro OHIGASHI  Masakatu MORII  

     
    LETTER-Information Security

      Vol:
    E86-A No:10
      Page(s):
    2636-2638

    Knudsen et al. proposed an efficient method, based on a tree-search algorithm with a recursive process, for reconstructing the internal state of the RC4 stream cipher. However, the method becomes infeasible for word size n > 5 because its time complexity for reconstructing the internal state is too large. This letter proposes a more efficient method than theirs. Our method can reconstruct the internal state from fewer pre-known internal-state entries than their method requires.
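
    The internal state in question is the standard RC4 state: a permutation S of {0, ..., 2^n - 1} plus two indices. The keystream generator below (the standard RC4 PRGA, shown for a toy word size) is what leaks the partial information that tree-search reconstruction exploits.

      def rc4_keystream(S, n=8):
          """Yield RC4 keystream words from internal state S, a permutation of 0..2**n-1."""
          N = 1 << n
          assert sorted(S) == list(range(N))
          i = j = 0
          while True:
              i = (i + 1) % N
              j = (j + S[i]) % N
              S[i], S[j] = S[j], S[i]          # the state evolves as it outputs
              yield S[(S[i] + S[j]) % N]

      ks = rc4_keystream(list(range(8)), n=3)  # toy word size n = 3
      print([next(ks) for _ in range(5)])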

  • JR East Contact-less IC Card Automatic Fare Collection System "Suica"

    Yasutomo SHIRAKAWA  Akio SHIIBASHI  

     
    INVITED PAPER

      Vol:
    E86-D No:10
      Page(s):
    2070-2076

    Suica is the nickname of our contact-less IC card: Super Urban Intelligent CArd. There are two types of card: the Suica IO (SF, stored fare) card, and the Suica Commuter Pass, which combines the functions of a stored-fare card and a commuter pass. There are 6.54 million Suica holders (about 3.33 million Suica Commuter Pass holders and 3.21 million Suica IO card holders) as of June 16, 2003.

  • Radiation Pattern of the Rectangular Microstrip Antenna on Anisotropy Substrates with an Air Gap and Dielectric Superstrate

    Joong Han YOON  Hwa Choon LEE  Kyung Sup KWAK  

     
    LETTER-Electromagnetic Theory

      Vol:
    E86-C No:10
      Page(s):
    2145-2150

    This study investigates the rectangular microstrip patch antenna on anisotropic substrates with a superstrate and an air gap, based on a rigorous full-wave analysis and Galerkin's moment method. The results show how the radiation patterns vary with the air gap, the permittivities of the superstrate and substrate, and the thickness of the superstrate.

  • Solution of Eigenvalue Integral Equation with Exponentially Oscillating Covariance Function

    Vitaly KOBER  Josue ALVAREZ-BORREGO  Tae Sun CHOI  

     
    LETTER-Digital Signal Processing

      Vol:
    E86-A No:10
      Page(s):
    2690-2692

    The Karhunen-Loeve (KL) transform is optimal for many signal detection, communication, and filtering applications. We propose an explicit solution of the KL integral equation for the practical case in which the covariance function of a stationary process is exponentially oscillating.
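
    Concretely (with notation assumed here, not taken from the letter), the problem is the homogeneous Fredholm equation of the KL expansion with an exponentially oscillating kernel:

      \int_{-T}^{T} R(t - s)\,\varphi(s)\,ds = \lambda\,\varphi(t),
      \qquad
      R(\tau) = \sigma^2 e^{-\alpha|\tau|}\cos\beta\tau,

    whose eigenfunctions φ and eigenvalues λ give the optimal decorrelating basis for the process.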

  • A Hierarchical Routing Protocol Based on Autonomous Clustering in Ad Hoc Networks

    Tomoyuki OHTA  Munehiko FUJIMOTO  Shinji INOUE  Yoshiaki KAKUDA  

     
    PAPER-Mobile Ad Hoc Networks

      Vol:
    E86-B No:10
      Page(s):
    2902-2911

    Recently, hierarchical structure has been introduced into wired networks to improve management and routing, and we introduce a hierarchical structure into ad hoc networks to achieve the same goal. However, introducing a hierarchical structure is difficult because all mobile hosts keep moving around the network, so we previously proposed a clustering scheme for constructing the hierarchical structure. In this paper, we propose a new hierarchical routing protocol, called Hi-TORA, based on that clustering scheme, and we evaluate Hi-TORA experimentally with respect to the number of control packets, packet delivery accuracy, and hop counts, in comparison with TORA.

  • Color Image Segmentation Using a Gaussian Mixture Model and a Mean Field Annealing EM Algorithm

    Jong-Hyun PARK  Wan-Hyun CHO  Soon-Young PARK  

     
    PAPER-Image Processing, Image Pattern Recognition

      Vol:
    E86-D No:10
      Page(s):
    2240-2248

    In this paper we present an unsupervised color image segmentation algorithm based on statistical models. We adopt the Gaussian mixture model to represent the distribution of color feature vectors. A novel deterministic annealing EM algorithm and mean field theory from statistical mechanics are used to compute the posterior probability distribution of each pixel and to estimate the parameters of the Gaussian mixture model. We describe a non-contextual segmentation algorithm that uses the deterministic annealing approach and a contextual segmentation algorithm that uses mean field theory. The experimental results show that deterministic annealing EM and mean field theory provide a globally optimal solution for the maximum likelihood estimators and that these algorithms can efficiently segment real images.
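
    A minimal sketch of the annealed E-step, the standard deterministic-annealing EM device; variable names and the temperature schedule are illustrative, not the paper's.

      import numpy as np
      from scipy.stats import multivariate_normal

      def annealed_responsibilities(X, pis, mus, covs, T):
          """E-step at temperature T: posterior proportional to (pi_k N(x|k))**(1/T)."""
          log_p = np.stack([np.log(pi) + multivariate_normal.logpdf(X, mu, cov)
                            for pi, mu, cov in zip(pis, mus, covs)], axis=1) / T
          log_p -= log_p.max(axis=1, keepdims=True)   # numerical stabilization
          p = np.exp(log_p)
          return p / p.sum(axis=1, keepdims=True)

      # two 3-D "color" clusters; anneal from high T toward T = 1 (plain EM)
      X = np.vstack([np.random.randn(50, 3), np.random.randn(50, 3) + 4.0])
      R = annealed_responsibilities(X, [0.5, 0.5],
                                    [np.zeros(3), np.full(3, 4.0)],
                                    [np.eye(3)] * 2, T=4.0)
      print(R.shape)   # (100, 2); M-step updates then proceed as in ordinary EM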

  • Elliptic Curve Cryptosystem on Smart Card Access with Threshold Scheme

    Shyi-Tsong WU  

     
    PAPER-Information Security

      Vol:
    E86-A No:10
      Page(s):
    2569-2576

    Elliptic curve cryptosystems (ECC) have gained more and more attention. ECC retains the required security level with a smaller key size and lower memory requirements, which can be a crucial factor in smart card systems. In this paper, an ECC-based implementation of security schemes in a smart card system that controls access to the door of a confidential place is proposed. Such a place, for example a coffer or a strong room in a bank, stores treasures as well as cash, and mutual vigilance may be required there. For safety, entering the coffer is permitted not to a single person but only to a group of authorized people, which raises the problem of secret sharing. The adopted solution is a threshold scheme: every participant possesses a secret shadow, which is saved on the smart card. After the shared secret has been correctly reconstructed, access through the coffer's door is permitted. To resist dishonest participants, cheating detection and cheater identification are included. Users can freely change the passwords of their smart cards and, unlike in traditional ID-based schemes, need not memorize lengthy assigned passwords and shadows, which makes our implementation much more user-friendly.
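
    A minimal sketch of the (k, n) threshold primitive such a scheme builds on, namely Shamir secret sharing over a prime field; the paper's elliptic curve operations, cheater detection, and card protocol are omitted, and the field below is chosen only for the demo.

      import random

      P = 2**127 - 1   # prime modulus for the demo field

      def make_shares(secret, k, n):
          """Split secret into n shadows; any k of them reconstruct it."""
          coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
          poly = lambda x: sum(c * pow(x, e, P) for e, c in enumerate(coeffs)) % P
          return [(x, poly(x)) for x in range(1, n + 1)]

      def reconstruct(shares):
          """Lagrange interpolation at x = 0 recovers the secret."""
          secret = 0
          for i, (xi, yi) in enumerate(shares):
              num = den = 1
              for j, (xj, _) in enumerate(shares):
                  if i != j:
                      num = num * (-xj) % P
                      den = den * (xi - xj) % P
              secret = (secret + yi * num * pow(den, P - 2, P)) % P
          return secret

      shares = make_shares(123456789, k=3, n=5)
      assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 shadows suffice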

  • Autonomous Integration and Optimal Allocation of Heterogeneous Information Services for High-Assurance in Distributed Information Service System

    Xiaodong LU  Kinji MORI  

     
    PAPER-Agent-Based Systems

      Vol:
    E86-D No:10
      Page(s):
    2087-2094

    Information service provision and utilization is an important infrastructure in a high-assurance distributed information service system. To cope with rapidly evolving situations and the heterogeneous requirements of providers and users, an autonomous information service system called the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that recursively provides the most popular information, in a demand-oriented way, closer to the users in order to trade off the cost of service allocation against the cost of access. In this system, users' requests are autonomously driven by pull mobile agents in charge of finding the relevant service. In the case of a single-service request, the system is designed to reduce the time users need to access the information and to preserve the consistency of the replicas. However, when a user requests a joint selection of multiple services, the system has to assure synchronization of atomic actions and timeliness. In this paper, the relationships among the contents, properties, and access ratios of information services are clarified. Based on these factors, the correlation ratio and the degree of satisfaction are defined, and the autonomous integration and optimal allocation of information services across heterogeneous FIFs are proposed to provide one-stop service for users' multi-service requests. The effectiveness of the proposed technology is shown through evaluation; the results show that the integrated services reduce the total user access time and increase service consumption compared with separate systems.

  • Autonomous Step-by-Step System Construction Technique Based on Assurance Evaluation

    Kazuo KERA  Keisuke BEKKI  Kinji MORI  

     
    PAPER-Reliability and Availability

      Vol:
    E86-D No:10
      Page(s):
    2145-2153

    Recent real-time systems need to be expandable with heterogeneous functions and operations, and high assurance is very important for such systems. To realize high assurance, we study an autonomous step-by-step construction technique based on assurance evaluation. In this paper we propose the average functional reliability as the best index of assurance performance during system construction. We also propose an autonomous step-by-step construction technique that decides the construction sequence so as to maximize the assurance performance.

  • An Integrated Approach for Implementing Imprecise Computations

    Hidenori KOBAYASHI  Nobuyuki YAMASAKI  

     
    PAPER

      Vol:
    E86-D No:10
      Page(s):
    2040-2048

    The imprecise computation model is one of the flexible computation models used to construct real-time systems. It is especially useful when worst-case execution times are difficult to estimate or execution times vary widely. Although there are several ways to implement this model, they have not attracted much attention from real-world application programmers to date because of their unrealistic assumptions and high dependency on the execution environment. In this paper, we present an integrated approach to implementing the imprecise computation model. In particular, our research covers three aspects. First, we present a new imprecise computation model that consists of a mandatory part, an optional part, and a second mandatory part called the wind-up part; this wind-up part allows application programmers to explicitly incorporate into their programs the exact operations needed for safe performance degradation when resources run short (see the sketch below). Second, we describe a scheduling algorithm called Mandatory-First with Wind-up Part (M-FWP), which is based on the Earliest Deadline First strategy. Unlike scheduling algorithms developed for the classical imprecise computation model, this algorithm is capable of scheduling a mandatory portion after an optional portion. Third, we present a dynamic priority server method for an efficient implementation of the M-FWP algorithm, and we show that at most one such server is needed per node. To estimate the performance of the proposed approach, we implemented a real-time operating system called RT-Frontier. The experimental analyses proved its ability to execute tasks based on the imprecise computation model without requiring any knowledge of the execution time of the optional part, and also showed a performance gain over the traditional checkpointing technique.
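
    A hypothetical sketch of the task shape in this extended model (not the M-FWP algorithm itself, which schedules such tasks under EDF): the scheduler must leave room for the wind-up part before the deadline whenever it grants time to the optional part.

      from dataclasses import dataclass

      @dataclass
      class ImpreciseTask:
          mandatory: float   # worst-case time of the leading mandatory part
          optional: float    # time the optional refinement would like to use
          wind_up: float     # mandatory cleanup/degradation part
          deadline: float    # absolute deadline

          def optional_budget(self, now):
              """Time the optional part may consume while still leaving
              room for the wind-up part before the deadline."""
              return max(0.0, self.deadline - now - self.wind_up)

      t = ImpreciseTask(mandatory=2.0, optional=5.0, wind_up=1.0, deadline=9.0)
      now = 2.0                                  # the mandatory part just finished
      grant = min(t.optional_budget(now), t.optional)
      print(f"optional part gets {grant:.1f} time units, then wind-up runs")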

  • REX: A Reconfigurable Experimental System for Evaluating Parallel Computer Systems

    Yuetsu KODAMA  Toshihiro KATASHITA  Kenji SAYANO  

     
    PAPER

      Vol:
    E86-D No:10
      Page(s):
    2016-2024

    REX is a reconfigurable experimental system for evaluating and developing parallel computer systems. It consists of large-scale FPGAs and enables systems to be reconfigured, from the processors to the network topology, to support their evaluation and development. We evaluated REX using several implementations of parallel computer systems and showed that it has sufficient scalability in gates, memory throughput, and network throughput. We also showed that REX is an effective development tool because of its emulation speed and reconfigurability.

  • The Theory of Software Reliability Corroboration

    Bojan CUKIC  Erdogan GUNEL  Harshinder SINGH  Lan GUO  

     
    PAPER-Testing

      Vol:
    E86-D No:10
      Page(s):
    2121-2129

    Software certification is a notoriously difficult problem. From a software reliability engineering perspective, the certification process must provide evidence that the program meets or exceeds the required level of reliability. When the reliability of a high assurance system is being certified, very few failures, if any, are observed during testing. In statistical estimation theory, the probability of an event is estimated by determining the proportion of times it occurs in a fixed number of trials. In the absence of failures, the number of required certification tests becomes impractically large. We suggest that subjective reliability estimation from the development lifecycle, based on observed behavior or the reflection of one's belief in the system quality, be included in certification. In statistical terms, we hypothesize that the system fails with the hypothesized probability. The presumed reliability then needs to be corroborated by statistical testing during the reliability certification phase. As evidence relevant to the hypothesis accumulates, we change the degree of belief in the hypothesis, and depending on the corroborating evidence, the system is either certified or rejected. The advantage of the proposed theory is an economically acceptable number of required certification tests, even for high assurance systems so far considered impossible to certify.
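
    The "impractically large" remark can be made concrete with the standard zero-failure demonstration bound (textbook background, not the paper's Bayesian corroboration procedure): to demonstrate per-demand reliability R with confidence C, one needs

      n \;\ge\; \frac{\ln(1 - C)}{\ln R}

    failure-free tests. For R = 0.9999 and C = 0.99 this is already about 46,000 tests, and the count grows roughly in inverse proportion to the failure probability being demonstrated.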
