Ozgur ERCETIN Ozgur GURBUZ Kerem BULBUL Ertugrul CIFTCIOGLU Aylin AKSU
The recent progress in sensor and wireless communication technologies has enabled the design and implementation of new applications such as sensor telemetry, the use of wireless sensors to gather fine-grained information from products, people, and places. In this work, we consider a realistic telemetry application in which an area is periodically monitored by a sensor network that gathers data from equally spaced sample points. The objective is to maximize the lifetime of the network by jointly selecting the sensing nodes, the node transmission powers, and the route to the base station from each sensing node. We develop an optimization-based algorithm, OPT-RE, and a low-complexity algorithm, SP-RE, for this purpose and analyze their dynamics through extensive numerical studies. Our results indicate that SP-RE is a promising algorithm with performance comparable to that of the more computationally intensive OPT-RE algorithm. Energy consumption is also significantly affected by the channel access method, and in this paper we compare the effects of collision-free TDMA and contention-based CSMA/CA. We propose practical enhancements to CSMA/CA that reduce the energy consumed by collisions. Our simulation results indicate that, with the proposed enhancements, contention-based channel access can provide performance comparable to that of collision-free methods.
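The abstract does not detail SP-RE, but a residual-energy-aware shortest-path routing of this kind can be sketched as a Dijkstra search whose link costs penalize energy-poor transmitters. The link-weight formula (transmission energy divided by the sender's residual energy) and the data layout below are illustrative assumptions, not the paper's exact metric:

```python
import heapq

def sp_re_route(nodes, links, residual, source, sink):
    """Energy-aware shortest path: each hop u->v costs
    tx_energy(u, v) / residual_energy(u), so routes avoid relays that
    are close to depletion. `links[u]` maps neighbor v to the hop's
    transmission energy. (Illustrative weighting, not the paper's.)"""
    dist = {n: float("inf") for n in nodes}
    prev = {}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == sink:
            break
        if d > dist[u]:
            continue  # stale heap entry
        for v, e_tx in links.get(u, {}).items():
            cost = e_tx / residual[u]  # favor energy-rich transmitters
            if d + cost < dist[v]:
                dist[v] = d + cost
                prev[v] = u
                heapq.heappush(heap, (dist[v], v))
    path, n = [sink], sink
    while n != source:  # reconstruct route from predecessor map
        n = prev[n]
        path.append(n)
    return list(reversed(path))
```

With two otherwise identical relays, the route goes through the one with more residual energy, which is the intuition behind rotating relay duty to extend network lifetime.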
Ziv-Lempel incremental parsing [1] is a fundamental algorithm for lossless data compression. There is a simple enumerative implementation [7] that preserves a duality between the encoder and the decoder. However, due to its compactness, the implementation, when combined with a complete integer code, allows only input sequences whose lengths are consistent with the parsing boundaries. In this letter, we propose a simple additional mechanism for post-processing a binary file of arbitrary length, provided the file punctuation is externally managed.
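For reference, Ziv-Lempel incremental parsing (LZ78) splits the input into phrases, each being a previously seen phrase extended by one symbol; the boundary issue above arises because the input may end mid-phrase. A minimal sketch of the parser:

```python
def lz78_parse(data):
    """Incrementally parse `data` into (phrase_index, symbol) pairs.
    Index 0 denotes the empty phrase; each emitted phrase is the longest
    previously seen phrase extended by one more symbol."""
    dictionary = {"": 0}
    phrases = []
    current = ""
    for ch in data:
        if current + ch in dictionary:
            current += ch  # keep extending the current match
        else:
            phrases.append((dictionary[current], ch))
            dictionary[current + ch] = len(dictionary)
            current = ""
    if current:  # input ended mid-phrase: emit the partial match
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases
```

The final `if current:` branch is exactly the awkward case the letter addresses: an input whose length does not align with a parsing boundary leaves a dangling partial phrase.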
Qi WANG Kazunori SHIMIZU Takeshi IKENAGA Satoshi GOTO
In this paper we introduce an area- and power-efficient fully parallel LDPC decoder design, which maintains BER performance while consuming fewer hardware resources and less power than conventional decoders. For this decoder, we first propose two improved simplified min-sum algorithms, which reduce the hardware implementation complexity and area: the hardware consumption of the check operation module is reduced by 40%, with negligible performance loss compared with the general min-sum algorithm. To reduce the power dissipation of the decoder, we also propose a power-saving strategy, in which message evolution halts as soon as the parity-check condition is satisfied. This strategy reduces power by more than 50% under good channel conditions. Synthesis results in 0.18 µm CMOS technology show that our decoder, based on the (648,540) irregular LDPC code of the WLAN (802.11n) standard, achieves 810 Mbps throughput with 283 mW power consumption.
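For context, the generic min-sum check-node update and the parity-based early-stopping test that such a power-saving strategy relies on can be sketched as follows (this is the standard min-sum rule, not the paper's simplified variants):

```python
def check_node_minsum(incoming):
    """Min-sum check-node update: the message sent back on each edge is
    the product of the signs of all *other* incoming LLRs times the
    minimum of their magnitudes."""
    out = []
    for i in range(len(incoming)):
        others = incoming[:i] + incoming[i + 1:]
        sign = 1
        for m in others:
            if m < 0:
                sign = -sign
        out.append(sign * min(abs(m) for m in others))
    return out

def parity_satisfied(H, hard_bits):
    """Early-termination test: stop message passing once H*x = 0 (mod 2),
    i.e. every parity check is satisfied by the current hard decisions."""
    return all(sum(h * b for h, b in zip(row, hard_bits)) % 2 == 0
               for row in H)
```

In hardware, gating the message-update logic behind `parity_satisfied` is what halts message evolution early and saves power when the channel is good and convergence is fast.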
Naoki HAYASHI Toshimitsu USHIO Fumiko HARADA Atsuko OHNO
This paper addresses a discrete-time consensus problem with nonlinear performance functions over dynamically changing communication topologies. Each agent has a performance value based on its internal information state and exchanges the performance value with other agents to achieve consensus. We derive sufficient conditions for global consensus using algebraic graph theory.
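A discrete-time consensus update of this kind, in the illustrative special case of identity performance functions and a fixed doubly stochastic weight matrix, looks like:

```python
def consensus_step(x, W):
    """One consensus round: each agent replaces its value with a weighted
    average of its neighbors' values, x_i <- sum_j W[i][j] * x_j."""
    n = len(x)
    return [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]

def run_consensus(x, W, rounds):
    """Iterate the update; with doubly stochastic W on a connected graph,
    all values converge to the average of the initial states."""
    for _ in range(rounds):
        x = consensus_step(x, W)
    return x
```

The paper's setting is more general (nonlinear performance functions, switching topologies), but this linear fixed-topology case shows the basic averaging dynamics.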
This letter presents an adaptive H∞ array beamforming scheme based on a generalized sidelobe canceller with a lower computational load. It is shown that the adaptive H∞-based beamformer offers faster convergence, insensitivity to dynamic estimation modeling error, and less sensitivity to pointing error compared with the conventional adaptive H∞ algorithm. Simulations confirm that the proposed technique achieves array performance similar to that of the adaptive H∞-based algorithm [4].
Zhou SU Masato OGURO Jiro KATTO Yasuhiko YASUDA
A content delivery network improves end-user performance by replicating Web contents on a group of geographically distributed sites interconnected over the Internet. However, as content distribution systems come to manage dynamically changing files, an important issue to be resolved is consistency management: the cached replicas on different sites must be updated when the originals change. In this paper, based on an analytical formulation of object freshness, Web access distribution, and network topology, we derive a novel algorithm as follows: (1) for a given content that has been changed on its original server, only a limited number of its replicas, instead of all replicas, are updated; (2) after a replica has been selected for update, the latest version is sent from an algorithm-decided site instead of from the original server. Simulation results verify that the proposed algorithm provides better consistency management than conventional methods, reducing both the old hit ratio and network traffic.
Yasuhiro SUZUKI Hiroya TAKAMURA Manabu OKUMURA
In this paper, we present a method to automatically acquire a large-scale vocabulary of evaluative expressions from a large corpus of blogs. To this end, we present a semi-supervised method for classifying evaluative expressions, that is, tuples of subjects, their attributes, and evaluative words that indicate either favorable or unfavorable opinions towards a specific subject. Our semi-supervised method can classify evaluative expressions in a corpus by their polarities, starting from a very small set of seed training examples and using contextual information in the sentences the expressions belong to. Our experimental results on real Weblog data show that this bootstrapping approach can improve the accuracy of methods for classifying favorable and unfavorable opinions. We also show that a substantial number of evaluative expressions can indeed be acquired.
Tri-Thanh NGUYEN Akira SHIMAZU
Named entities play an important role in many Natural Language Processing applications. Currently, most named entity recognition systems rely on a small set of general named entity (NE) types. Though some efforts have been made to expand the hierarchy of NE types, the number of NE types remains fixed. In real applications, such as question answering or semantic search systems, users may be interested in more diverse, specific NE types. This paper proposes a method to extract categories of person named entities from text documents. Based on the Dual Iterative Pattern Relation Extraction method, we develop a model better suited to our problem and explore the generation of different pattern types. A method for validating candidate categories is proposed to improve performance, and experiments on the Wall Street Journal corpus give promising results.
Shangce GAO Hongwei DAI Gang YANG Zheng TANG
The Clonal Selection Algorithm (CSA) models the way the natural immune system shapes its response to an antigenic stimulus. According to Burnet's clonal selection principle, the antigen imposes a selective pressure on the antibody population by allowing only those cells that specifically recognize the antigen to be selected for proliferation and differentiation. However, ongoing investigations indicate that receptor editing, the process whereby antigen receptor engagement leads to a secondary somatic gene rearrangement and an alteration of receptor specificity, is occasionally found in the affinity maturation process. In this paper, we extend the traditional CSA by incorporating receptor editing; the resulting algorithm, named RECSA, is applied to the Traveling Salesman Problem. Thus, both the somatic hypermutation (HM) of clonal selection theory and receptor editing (RE) are utilized to improve antibody affinity. Simulation results and comparisons with other general algorithms show that RECSA can effectively enhance search efficiency and greatly improve search quality within a reasonable number of generations.
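In a TSP setting, the two affinity-maturation operators are commonly interpreted as a small local change (hypermutation) versus a larger rearrangement of the receptor, i.e., the tour (receptor editing). One such interpretation is sketched below; the specific operators are illustrative, not necessarily the paper's exact choices:

```python
import random

def hypermutate(tour, rng):
    """Somatic hypermutation: a small local change -- reverse one
    randomly chosen segment of the tour (a 2-opt style move)."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def receptor_edit(tour, rng):
    """Receptor editing: a larger rearrangement -- reshuffle a random
    segment, modeling the secondary gene rearrangement that can move an
    antibody to a distant region of the affinity landscape."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    segment = tour[i:j + 1]
    rng.shuffle(segment)
    return tour[:i] + segment + tour[j + 1:]
```

Hypermutation exploits the neighborhood of a good tour, while receptor editing lets the population escape local optima, which is the intuition behind combining the two.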
Zhi-Ren TSAI Jiing-Dong HWANG Yau-Zen CHANG
This study introduces the fuzzy Lyapunov function to fuzzy PID control systems (modified fuzzy systems) with optimized robust tracking performance. We propose a compound search strategy, the conditional linear matrix inequality (CLMI) approach, which combines the proposed improved random optimal algorithm (IROA) with the simplex method to solve the linear matrix inequality (LMI) problem. If solutions for a specific system exist, the scheme finds more than one solution at a time, and these fixed potential solutions, together with variable PID gains, are then available for tracking performance optimization. The effectiveness of the proposed control scheme is demonstrated by a numerical example of a cart-pole system.
Jongsub CHA Keonkook LEE Joonhyuk KANG
In this paper, a computationally efficient stack-based iterative detection algorithm is proposed for V-BLAST systems. To minimize the receiver's effort, the proposed scheme employs an iterative tree search for complexity reduction and storage saving. After an M-ary tree structure is constructed by QR decomposition of the channel matrix, the full tree depth is divided into the first depth and the remaining ones. At a tree depth of one, the proposed algorithm finds M candidate symbols. Based on these symbols, it iteratively searches the remaining symbols from the second depth to the last, until an optimal symbol sequence is found. Simulation results demonstrate that the proposed algorithm yields performance close to that of sphere detection (SD) with significant savings in complexity and storage.
It has been shown that the output information produced by the soft output Viterbi algorithm (SOVA) is too optimistic; to compensate, the output information should be normalized. This letter proposes a simple normalization technique that extends the existing sign difference ratio (SDR) criterion. The new technique counts the sign differences between the a priori information and the extrinsic information, and then adaptively determines the corresponding normalization factor for each data block. Simulations comparing the new technique with other well-known normalization techniques show that it achieves about 0.2 dB of coding gain on average while saving up to about half an iteration in decoding.
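The sign-counting step can be sketched as below. The mapping from the sign-difference ratio to a normalization factor is an assumed linear interpolation for illustration, not the letter's exact rule:

```python
def normalize_extrinsic(apriori, extrinsic, f_min=0.6, f_max=1.0):
    """Count sign differences between a-priori and extrinsic LLRs over a
    block, then scale the extrinsic values by a factor derived from that
    ratio: many disagreements -> less reliable SOVA output -> stronger
    down-scaling. (The linear map between f_min and f_max is a
    placeholder for the letter's adaptive rule.)"""
    diffs = sum(1 for a, e in zip(apriori, extrinsic) if a * e < 0)
    ratio = diffs / len(extrinsic)
    factor = f_max - (f_max - f_min) * ratio
    return [factor * e for e in extrinsic], factor
```

Computing one factor per data block, rather than a fixed global constant, is what makes the normalization adaptive to the current decoding conditions.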
Supawan ANNANAB Tetsuki TANIGUCHI Yoshio KARASAWA
We introduce a novel configuration for a multi-user Multiple-Input Multiple-Output (MIMO) system in mobile communication over fast fading channels using space-time block coding (STBC) and an adaptive array. The proposed scheme adopts simultaneous transmission of data and pilot signals, which reduces control errors caused by the delay in obtaining channel state information (CSI). Data and pilot signals are encoded with a space-time block code and transmitted from two transmit antennas. To overcome the fast fading problem, an adaptive array using the recursive least squares (RLS) algorithm is implemented at the base station. Computer simulations show that the proposed scheme can overcome Doppler spread at higher frequencies and suppress co-channel interference from up to N-1 users with N receiving antennas.
Cheng-Wei QIU Hai-Ying YAO Shah-Nawaz BUROKUR Said ZOUHDI Le-Wei LI
Electromagnetic scattering properties of metamaterial cylinders excited by a line source are studied by a multilayer algorithm based on eigenfunction expansion. Closed forms of the electric and magnetic fields are formulated. Both the fields inside the cylinder and those in the outer space are plotted for different cylinder sizes. The focusing phenomena and wave propagation in the presence of metamaterial cylinders are investigated. Electromagnetic field distributions are presented for subwavelength metamaterial cylinders and for cylinders fabricated from magnetoelectric materials, and resonant scattering and focusing properties are reported. Special designs of scatterer cloaking that can reduce scattering cross sections are proposed and calculated with the multilayer algorithm.
Daiki KOIZUMI Naoto KOBAYASHI Toshiyasu MATSUSHIMA Shigeichi HIRASAWA
Reliability-based hybrid ARQ (RB-HARQ) is a recently introduced form of incremental-redundancy ARQ. In RB-HARQ, the receiver returns both a NAK signal and a set of unreliable bit indices if the received sequence is not accepted. Since each unreliable bit index is determined from the bitwise posterior probability, a better approximation of that probability becomes crucial as the number of retransmissions increases. Assuming a systematic code for the initial transmission, the proposed RB-HARQ scheme has the following two features: (a) the sender retransmits newly encoded and interleaved parity bits corresponding to the unreliable information bits; (b) the retransmitted parity bits, together with the initially received sequence, are combined to perform message-passing decoding, i.e., suboptimal MAP decoding. Finally, simulation results are shown to evaluate these two features.
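The receiver-side selection of unreliable bits can be sketched as picking the positions whose posterior log-likelihood ratios have the smallest magnitudes (using LLR magnitude as the reliability measure is a common convention; the paper works with bitwise posterior probabilities directly):

```python
def unreliable_bit_indices(posterior_llrs, k):
    """Return the k bit positions with the smallest posterior LLR
    magnitudes -- the least reliable decisions, to be reported back to
    the sender along with the NAK."""
    order = sorted(range(len(posterior_llrs)),
                   key=lambda i: abs(posterior_llrs[i]))
    return sorted(order[:k])
```

The sender then re-encodes and interleaves parity bits covering exactly these positions, so each retransmission spends redundancy only where the decoder is struggling.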
Mitsuyoshi KISHIHARA Isao OHTA Kuniyoshi YAMANE
This paper proposes a new type of compact waveguide directional coupler, constructed from two crossed E-plane rectangular waveguides with two metallic posts in the square junction and one metallic post at each port. The metallic posts in the square junction are placed symmetrically along a diagonal line to obtain the directivity properties. The metallic post inserted at each input/output waveguide port realizes a matched state. Tight coupling of 0.79-6 dB is realized by optimizing the dimensions of the junction and the positions and radii of the posts. The design results are verified by an EM simulator (Ansoft HFSS) and by experiments.
Iakovos OURANOS Petros STEFANEAS Panayiotis FRANGOS
We present MobileOBJ, a formal framework for specifying and verifying mobile systems. Based on hidden algebra, the components of a mobile system are specified as behavioral objects, or Observational Transition Systems, a kind of transition system enriched with special action and observation operators that capture the distinct characteristics of mobile computing systems. The whole system arises as the concurrent composition of these components. The abstract model is implemented in CafeOBJ, an executable, industrial-strength algebraic specification language, and the specification can be visualized using CafeOBJ graphical notation. In addition, invariant and behavioral properties of mobile systems can be proved through theorem-proving techniques, such as structural induction and coinduction, that are fully supported by the CafeOBJ system. The application of the proposed framework is illustrated by modeling a mobile computing environment and the services it needs to support.
Hiroyuki ISHIDA Tomokazu TAKAHASHI Ichiro IDE Yoshito MEKADA Hiroshi MURASE
We present a novel training method for recognizing traffic sign symbols. Symbol images captured by a car-mounted camera suffer from various forms of image degradation, so similarly degraded images should be used as training data. Our method artificially generates such training data from original templates of traffic sign symbols: degradation models and a GA-based algorithm that simulate actual captured images are established. The proposed method enables us to obtain training data for all categories without exhaustively collecting them. Experimental results show the effectiveness of the proposed method for traffic sign symbol recognition.
An audio-based shot classification method for audiovisual indexing is proposed in this paper. The proposed method consists of two parts: an audio analysis part and a shot classification part. In the audio analysis part, the method utilizes both principal component analysis (PCA) and the Mahalanobis generalized distance (MGD). Effective features for the analysis are obtained automatically with PCA, and these features are analyzed based on MGD, which takes into account the correlations of the data set. Thus, accurate analysis results are obtained by the combined use of PCA and MGD. In the shot classification part, the method utilizes a fuzzy algorithm, with which the mixing rate of the multiple audio sources can be roughly measured, enabling accurate shot classification. Experiments applying the proposed method to actual audiovisual materials verify its effectiveness.
This paper proposes a new computational optimization method modified from the dynamic encoding algorithm for searches (DEAS). Despite the successful optimization performance of DEAS on both benchmark functions and parameter identification, its exponential computation time becomes serious as the problem dimension increases. The proposed method, named univariate DEAS (uDEAS), reduces the computation time by using a univariate local search scheme. To verify its feasibility for global optimization, several benchmark test functions are optimized. Despite its simpler structure and shorter code length, the function optimization results show that uDEAS is capable of fast and reliable global search even for high-dimensional problems.
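A univariate local search of this kind amounts to optimizing one coordinate at a time while holding the others fixed, so per-sweep cost grows linearly rather than exponentially with dimension. The sketch below uses a simple shrinking-step coordinate search as a stand-in for uDEAS's binary-encoding-based bisectional search:

```python
def univariate_search(f, x, step=1.0, shrink=0.5, iters=50):
    """Minimize f by sweeping coordinates one at a time: try +/-step
    moves on each variable separately, keep any improvement, and halve
    the step whenever a full sweep yields none. (Illustrative stand-in
    for uDEAS's encoding-based univariate search.)"""
    x = list(x)
    best = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta  # perturb only coordinate i
                val = f(trial)
                if val < best:
                    x, best, improved = trial, val, True
        if not improved:
            step *= shrink  # refine the search resolution
    return x, best
```

Each sweep evaluates the objective only 2n times for an n-dimensional problem, which is the source of the computation-time saving the abstract claims.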