Masaaki IIZUKA Masakazu NAKAMURA Kazuhiro KUDO Kuniaki TANAKA
We investigated the electrical properties of hole transport materials such as TPD, α-NPD and m-MTDATA using in-situ field-effect measurements. TPD, α-NPD and m-MTDATA films showed p-type semiconducting properties, and their electrical parameters, such as conductivity, carrier mobility and carrier concentration, were obtained. We also examined the effects of the substrate temperature during vacuum deposition and of post-deposition thermal treatment on the electrical parameters of the films. The experimental results showed that conductivity and carrier mobility decreased as the substrate temperature increased beyond the glass transition temperature. These decreases in conductivity and carrier mobility caused by thermal treatment appear to be strongly related to the degradation mechanism of organic electroluminescent devices.
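For context, in such field-effect measurements the carrier mobility is commonly extracted from the transconductance in the linear regime via μ = (L / (W·Ci·Vd))·gm. The snippet below applies this standard textbook relation; all numerical values are purely illustrative and are not data from the paper.

```python
# Standard linear-regime FET relation (not specific to this paper):
#   mu = (L / (W * Ci * Vd)) * gm
# L, W: channel length/width [cm]; Ci: gate capacitance per area [F/cm^2];
# Vd: drain voltage [V]; gm: transconductance [S].
def field_effect_mobility(gm_S, L_cm, W_cm, Ci_F_per_cm2, Vd_V):
    """Field-effect mobility in cm^2/(V*s) from linear-regime transconductance."""
    return (L_cm / (W_cm * Ci_F_per_cm2 * Vd_V)) * gm_S

# Illustrative numbers giving an order of magnitude typical of organic
# hole-transport films (~1e-4 cm^2/Vs):
mu = field_effect_mobility(gm_S=1e-9, L_cm=5e-3, W_cm=0.5,
                           Ci_F_per_cm2=1e-8, Vd_V=10)
print(mu)
```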
Jian YANG Yingning PENG Yoshio YAMAGUCHI Wolfgang-Martin BOERNER
The concept of the equi-phase curve is introduced for the cross-polarized channel case. It is proved that the equi-phase curves are a series of half circles on the Poincaré sphere and that all these curves share two common end points. Based on the introduced concept, this letter demonstrates the distribution of the phases of the received voltage on the Poincaré sphere. In addition, it is shown theoretically that the cross-polarized phase of the off-diagonal elements of a scattering matrix is unstable for most natural targets. Therefore, cross-polarized phase information cannot be used for extracting target characteristics in polarimetric radar remote sensing.
Young-yeol CHOO Yungoo HUH Cheeha KIM
The IETF Mobile IP defines two multicast options: remote subscription (RS) and bi-directional tunneling (BT). In order to combine the strong points of these two IETF multicast options, we propose a hybrid approach, mMOM, which selectively uses the two options based on the mobility of mobile hosts. Whenever a mobile host makes its first registration with a certain foreign agent, that foreign agent starts the service using the BT option. Afterwards, if the host requests re-registration with the same foreign agent, the foreign agent considers it to be relatively immobile and continues the service using the RS option. We also propose a new metric to compare heterogeneous algorithms. Simulation results show that our approach outperforms the others.
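The selection rule above can be sketched in a few lines. The following Python fragment is a hypothetical illustration (function and identifier names are ours, not from the paper): a foreign agent serves a newly arrived mobile host via BT, then switches to RS once the host re-registers at the same agent.

```python
# Hypothetical sketch of the mMOM option-selection rule: serve a host's first
# registration at a foreign agent (FA) with bi-directional tunneling (BT);
# on re-registration at the same FA, treat the host as relatively immobile
# and switch to remote subscription (RS). All names are illustrative.
def select_multicast_option(seen_hosts, host_id, fa_id):
    """Return 'BT' on a host's first registration at this FA, 'RS' afterwards."""
    key = (host_id, fa_id)
    if key in seen_hosts:          # re-registration to the same FA
        return "RS"                # immobile -> remote subscription
    seen_hosts.add(key)            # first registration at this FA
    return "BT"                    # mobile -> bi-directional tunneling

seen = set()
print(select_multicast_option(seen, "MH1", "FA_A"))  # BT (first registration)
print(select_multicast_option(seen, "MH1", "FA_A"))  # RS (re-registration)
print(select_multicast_option(seen, "MH1", "FA_B"))  # BT (moved to a new FA)
```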
Kiejin PARK Hiroki MINAMI Toshihiro UEHARA Haruo OKUDA Sungsoo KIM
To understand the characteristics of a multimedia service, such as the large volume of data transfer and real-time constraints, it is necessary to have a performance evaluation tool for an HDD. Our HDD simulator runs on a PC under the FreeBSD UNIX OS. We first investigate the seek time and sustained rate of HDDs and then evaluate the performance of an HDD for an experimental VOD system. Applying the experimental results, we identify the bottleneck of an HDD and suggest which HDDs should be selected for a VOD system.
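As a rough illustration of the kind of bottleneck reasoning involved (this simple model and all figures are our own assumptions, not the paper's simulator), the number of concurrent VOD streams an HDD can sustain can be estimated from its sustained transfer rate, average seek time and per-stream bit rate:

```python
# Back-of-the-envelope model (ours, not the paper's): in a round-robin disk
# schedule, each stream costs one seek plus one block transfer per round, and
# the round must finish before any stream's previous block runs out.
# All parameter values below are made up for illustration.
def max_streams(sustained_MBps, avg_seek_ms, block_MB, stream_Mbps):
    """Streams supported when each round services every stream once."""
    read_s = block_MB / sustained_MBps             # transfer time per block
    cycle_s = avg_seek_ms / 1000.0 + read_s        # seek + transfer per stream
    block_playback_s = block_MB * 8 / stream_Mbps  # how long one block lasts
    return int(block_playback_s // cycle_s)

print(max_streams(sustained_MBps=20, avg_seek_ms=10, block_MB=1, stream_Mbps=4))
```

The model makes the seek overhead explicit: a faster sustained rate helps little once per-stream seeks dominate the round time.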
Measuring traffic dynamics over intervals of a few seconds is important in managing network performance. If the distribution of the average traffic volume over a few seconds is measured, an administrator can manage network quality using the α-percentile of the distribution. We propose a method of estimating the distribution of traffic volume over short intervals, such as a few seconds, using only traffic information from the management information base (MIB) of routers or switches. This estimation method is based on traffic characteristics observed in measurements on actual networks. It imposes little additional load on routers or switches, and the computation time required to estimate the distribution is also short. Numerical examples using actual traffic data are also given.
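For illustration, the α-percentile of a measured traffic-volume distribution can be computed with a simple nearest-rank rule. The sample values below are invented; the paper's estimation method itself is not reproduced here.

```python
import math

# Nearest-rank alpha-percentile of per-interval traffic-volume samples
# (illustration only; the sample data are made up).
def percentile(samples, alpha):
    """Return the alpha-percentile (0 < alpha <= 100) by the nearest-rank rule."""
    s = sorted(samples)
    k = max(1, math.ceil(alpha / 100.0 * len(s)))  # 1-based rank
    return s[k - 1]

# e.g. measured traffic volumes (kB per 5-second interval):
volumes = [120, 95, 130, 110, 180, 105, 160, 125, 140, 100]
print(percentile(volumes, 95))  # -> 180, usable as a quality threshold
```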
Koji CHIDA Shigenori UCHIYAMA Taiichi SAITO
Since the invention of the RSA scheme, many public-key encryption and signature schemes based on the intractability of integer factoring have been proposed. Most employ integers of the form N = pq, as the RSA scheme does, but some employ integers of the form N = p^r q. It has been reported that RSA decryption speed can be greatly improved by using N = p^r q integers for large r. On the other hand, Boneh et al. proposed a novel method for factoring integers of the form N = p^r q for large r. This factoring algorithm, the so-called Lattice Factoring Method, is based on the LLL algorithm. This paper proposes a new method for factoring integers of the form N = p^r q for large r and gives a new characterization of the r for which factoring integers N = p^r q is easier. More precisely, the proposed method strongly depends on the size and smoothness of the exponent r. The theoretical analysis and the implementation of our method presented in this paper show that, if r satisfies a certain condition, our method is faster than both the Elliptic Curve Method and the Lattice Factoring Method. In particular, the theoretical analysis in this paper mainly employs the techniques described in the excellent paper by Adleman, Pomerance and Rumely that addresses primality testing.
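Since the smoothness of the exponent r plays a central role, it can be checked directly. The sketch below is our own helper, not part of the proposed algorithm: it tests whether r is B-smooth, i.e. whether every prime factor of r is at most B.

```python
# Our own helper (not the paper's algorithm): test whether the exponent r is
# B-smooth, i.e. whether every prime factor of r is at most B, by trial division.
def is_smooth(r, bound):
    """Return True if all prime factors of r are <= bound."""
    d = 2
    while d * d <= r:
        while r % d == 0:
            r //= d        # strip out the factor d completely
        d += 1
    return r == 1 or r <= bound  # any remaining cofactor is prime

print(is_smooth(144, 3))      # 144 = 2^4 * 3^2 is 3-smooth -> True
print(is_smooth(2 * 97, 10))  # 194 has the prime factor 97 -> False
```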
System specifications should be refined to meet stakeholders' requirements as much as possible, because the first specification does not, in general, satisfy all stakeholders. This paper presents a procedure for refining a behavioral specification to satisfy stakeholders. Non-functional requirements are used to check stakeholders' satisfaction. With this procedure, stakeholder dissatisfaction can be reduced, and new possibilities to satisfy or dissatisfy other stakeholders can be found, since a modification that cancels one dissatisfaction can sometimes influence the satisfaction of others.
A communication model and a computer-assisted communication method are introduced. With this model, miscommunication between humans is explained, and a method for leading to successful communication with computer assistance is then illustrated. This method improves the quality of communication and can be applied to cooperative work. On the basis of the communication method, we have been developing a cooperative visual software requirements definition method over a network, with a visual requirements language named VRDL. Our method will improve the quality of software requirements specifications (SRS).
Kimihiro TAJIMA Ryuichi KOBAYASHI Nobuo KUWABARA Masamitsu TOKUDA
An electric field sensor using Mach-Zehnder interferometers has been designed to operate at frequencies above 10 GHz. The velocity of the optical wave in the waveguide is investigated to determine the electrode length, and the frequency response characteristics are analyzed using the moment method to determine the sensor element length. An electrode length of 1 mm and an element length of 8 mm were determined by these investigations. An isotropic electric field sensor is constructed using three such sensors. The minimum detectable electric field strength is 22 mV/m at a frequency bandwidth of 100 Hz, which is about 100 times better than that of a conventional electric field sensor using a similar element. The sensitivity deviation is within 3 dB when the temperature changes from 0 to 40 degrees. The deviation of directivity can be tuned to within 1 dB by calibrating the sensitivity of each element. The sensitivity degradation is within 6 dB up to 5 GHz and within 10 dB up to 10 GHz, which almost agrees with the calculated results. The sensor can measure almost the same waveform as an applied electric field pulse whose width is 6 ns and whose rise time is less than 2.5 ns.
Narayan D. KATARIA Mukul MISRA
The measurement sensitivity for the microwave surface resistance, Rs, of high-temperature superconducting (HTS) thin films, using a half-wavelength microstrip resonator with copper and HTS ground planes, is analyzed for the fundamental and higher-order modes of the resonator. The estimated sensitivity of the Rs measurement is at least an order of magnitude greater at the fundamental resonant frequency than when measured using the higher-order harmonic modes.
Martin STEINBAUER Huseyin OZCELIK Helmut HOFSTETTER Christoph F. MECKLENBRAUKER Ernst BONEK
This contribution discusses what information can be derived from the directions of arrival (DOAs) and directions of departure (DODs) estimated in a multiple-input multiple-output (MIMO) radio system, and establishes two new parameters describing the multipath spread at both link ends. We find that the multipath component separation, MCS, combines delay, (double-) angular and Doppler dispersion, as appropriate. MCS provides a system-independent radio characterization of propagation environments and aids in selecting optimum positions for smart-antenna deployment. Evaluation of double-directional measurements (with antenna arrays at both link ends) in indoor environments shows the usefulness and the limits of the multipath component separation concept.
A design method is proposed that yields the optimum remote pre-amplifier (RPRA) parameters considering cable repair, the effects of which include increased cable loss and insertion-position uncertainty. The optimum RPRA location is given by the intersection point of the optical SNR (OSNR) vs. RPRA location curves for two cases: the total cable repair loss assumed to be inserted at the transmitter end, and at the receiver end. These RPRA parameters give the maximum OSNR in the worst case of loss insertion due to cable repair.
Kiyoko KATAYANAGI Yasuyuki MURAKAMI Masao KASAHARA
Recently, Kasahara and Murakami proposed new product-sum-type public-key cryptosystems based on the Chinese remainder theorem, Methods B-II and B-IV. They also proposed a new technique for a selectable encryption key, referred to as the 'Home Page Method (HP Method).' In this paper, we first describe Methods B-II and B-IV. Second, we propose an effective attack against Method B-II and discuss the security of Methods B-II and B-IV. Third, applying the HP Method to Methods B-II and B-IV, we propose new product-sum-type PKCs with a selectable encryption key. Moreover, we discuss the security of the proposed cryptosystems.
Sunao IWAKI Mitsuo TONOIKE Shoogo UENO
In this paper, we propose a method to reconstruct current distributions in the human brain from neuromagnetic measurements. The proposed method is based on the weighted lead-field synthetic (WLFS) filtering technique with the weighting factors calculated from the results of previous source space scanning. In this method, in addition to the depth normalization technique, weighting factors of the WLFS are determined by the cost values previously calculated based on the multiple signal classification (MUSIC) scan. We performed computer simulations of this method under noisy measurement conditions and compared the results to those obtained with the conventional WLFS method. The results of the simulations indicate that the proposed method is effective for the reconstruction of the current distributions in the human brain using magnetoencephalographic (MEG) measurements, even if the signal-to-noise ratio of the measured data is relatively low. We applied the proposed method to the magnetoencephalographic data obtained during a mental image processing task that included object recognition and mental rotation operations. The results suggest that the proposed method can extract the neural activity in the extrastriate visual region and the parietal region. These results are in agreement with the results of previous positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies.
Yusuke KAWASAKI Naotaka NITTA Tsuyoshi SHIINA
Techniques for measuring 3-D velocity vector components are important for the correct diagnosis of blood flow patterns and the quantitative assessment of intratumor perfusion. However, present equipment based on the ultrasonic Doppler method cannot provide the true 3-D velocity. To overcome this problem, we previously proposed a new method of 3-D velocity vector measurement. The method uses a 2-D array probe and enables us to obtain the three components of the velocity vector in real time by integrating the Doppler phase shift on each element, using a relatively small single aperture compared with conventional methods. The basic performance of the method had been evaluated by computer simulation. In this paper, to evaluate the feasibility of the proposed method, an experimental investigation using a simple ring array probe and a phantom was carried out. The three components of the velocity vector were measured for different velocity magnitudes and flow directions. The experimental results validated the method's ability to measure 3-D velocity and its feasibility.
Aki AWATA Yuji KATO Koichi SHIMIZU
A technique was developed to reconstruct the cross-sectional image of the absorption distribution in a diffuse medium using backscattered light. In this technique, we illuminate an object with an ultra-short pulse and measure the time-resolved pulse shape of the light backscattered from the object. The absorption distribution of the scattering object can be estimated using the propagation-path distribution of photons at each detection time and the optical impulse response of the backscattered light. In a simulation, the effectiveness of this technique was verified for the cases of a layered absorber and a three-dimensional absorber. The nonlinear relationship between the depth of the probing region and the propagation time was clarified. The accuracy of the image reconstruction was significantly improved by aperiodic sampling of the backscattered impulse response according to this nonlinear relationship. The feasibility of the proposed technique was verified in an experiment with a model phantom.
Satoru UEHARA Osamu MIZUNO Tohru KIKUNO
In this paper we discuss the estimation of the effort needed to update program code according to given design specification changes. In Object-Oriented incremental development (OOID), requirement changes occur frequently and regularly. When a requirement change occurs, the design specification is changed accordingly, and the program code is then updated to reflect the design specification change. In order to construct the development plan dynamically, a simple and fast method of estimating the effort for code updating is strongly required by both developers and managers. However, existing estimation methods cannot be applied to the OOID. We therefore propose a straightforward approach to estimating the effort for code updating that reflects the specific properties of the OOID. We identify the following factors in effort estimation for the OOID: (1) updating activities consist of creation, deletion, and modification; (2) the target to be updated has four kinds of types (void type, basic type, library type, and custom type); (3) the degree of information hiding is classified into private, protected and public; and (4) the degree of inheritance affects the updating effort. We then propose a new formula E(P,σ) to calculate the effort needed to update a program P according to a set of design specification changes σ. The formula E(P,σ) includes the weighting parameters Wupd, Wtype, Winf-h and Winht, corresponding to characteristics (1), (2), (3) and (4), respectively. Finally, we conduct experimental evaluations by applying the formula E(P,σ) to actual project data from a certain company. The evaluation results statistically showed the validity of the proposed approach to some extent.
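The abstract does not give the exact form of E(P,σ). The sketch below is a hypothetical weighted sum over the change set σ using the four weight families named above (Wupd, Wtype, Winf-h, Winht); all weight values are invented for illustration.

```python
# Hypothetical reading of E(P, sigma) as a weighted sum over the change set.
# The weight families follow the four factors named in the abstract; every
# numerical value here is made up for illustration.
W_UPD  = {"create": 1.0, "delete": 0.3, "modify": 0.7}          # activity
W_TYPE = {"void": 0.2, "basic": 0.5, "library": 0.8, "custom": 1.0}  # target type
W_INFH = {"private": 0.5, "protected": 0.8, "public": 1.0}      # info hiding

def effort(changes, w_inht=1.1):
    """Sum weighted contributions of (activity, target type, visibility, depth)."""
    total = 0.0
    for activity, target_type, visibility, inh_depth in changes:
        total += (W_UPD[activity] * W_TYPE[target_type]
                  * W_INFH[visibility] * (w_inht ** inh_depth))
    return total

sigma = [("create", "custom", "public", 0), ("modify", "basic", "private", 2)]
print(effort(sigma))
```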
Masato TSURU Tetsuya TAKINE Yuji OIE
In the Internet, because of its huge scale and distributed administration, it is of practical importance to infer network-internal characteristics that cannot be measured directly. In this paper, based on a general framework we proposed previously, we present a feasible method of inferring the packet loss rates of individual links from end-to-end measurements of unicast probe packets. Compared with methods using multicast probes, unicast-based inference methods are more flexible and widely applicable, although they suffer from imperfect correlation in concurrent events on paths. Our method can infer link loss rates despite this problem and is applicable to various path topologies, including trees, inverse trees and their combinations. We also show simulation results that indicate the potential of our unicast-based method.
In this paper, a network dimensioning approach suitable for the Internet is discussed. Unlike in traditional telephone networks, it is difficult to guarantee QoS for end-users, even in a statistical sense, due to the essential nature of the Internet's end-to-end communication architecture. We should therefore adopt another approach, based on traffic measurement. In this approach, traffic measurement is performed to monitor the end-to-end QoS, and the network then adaptively controls the link capacities to meet the users' QoS demands. For this purpose, the underlying network should support the capability to flexibly adjust link capacities. Finally, an example network provisioning scenario is illustrated with a WDM network as the underlying network.
Very reliable mode-locked semiconductor lasers have been developed. These devices provide high signal-to-noise-ratio optical clock pulses of a few picoseconds temporal width in the 1.5-micrometer wavelength region. Potential applications of these lasers to high-bit-rate optical communication systems operating at over 40 Gbps, including all-optical signal processing, and to very high-speed measurement systems are described.