
IEICE TRANSACTIONS on Information

  • Impact Factor: 0.59
  • Eigenfactor: 0.002
  • Article Influence: 0.1
  • CiteScore: 1.4


Volume E86-D No.6 (Publication Date: 2003/06/01)

    Regular Section
  • Semantics of Normal Goals as Acquisitors Caused by Negation as Failure

    Susumu YAMASAKI  

     
    PAPER-Theory and Models of Software
    Page(s): 993-1000

We are concerned with semantic views on an extended version of SLD resolution with negation as failure (SLDNF resolution) for normal logic programs, which Eshghi and Kowalski (1989) presented by making SLDNF resolution capable of keeping negated predicates in memory and of extracting abducible predicates. This paper deals with its formal representation in relational form, for the purpose of interpreting a normal goal as an acquisitor of the negated predicates stored in memory. A set acquired by the derivations that the normal goal evokes is defined to be a semantics of the goal, under the constraint that the set is as large as possible and does not violate consistency in model theory. The semantics is discussed in relation to 3-valued logic model theory, where the model theory is represented by alternating fixpoint semantics (Van Gelder, 1993). For simplicity of treatment, this paper is concerned with normal logic programs in propositional logic.
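The acquisitor view can be illustrated with a toy propositional sketch. This is only a generic negation-as-failure evaluator with a loop check, not the paper's relational-form semantics or the Eshghi-Kowalski procedure; the data representation is invented for the illustration. The goal "acquires" every atom whose negation was assumed during the derivation:

```python
def solve(program, goals, acquired, visiting=frozenset()):
    """program: dict mapping an atom to a list of rule bodies (each body a
    list of literals; a negative literal is written ('not', atom)).
    Returns True if all goals succeed; every atom whose negation was
    assumed along the way is added to `acquired`."""
    if not goals:
        return True
    first, rest = goals[0], goals[1:]
    if isinstance(first, tuple) and first[0] == "not":
        # negation as failure: 'not atom' succeeds iff atom finitely fails
        if not solve(program, [first[1]], set(), visiting):
            acquired.add(first[1])        # the goal "acquires" this atom
            return solve(program, rest, acquired, visiting)
        return False
    if first in visiting:                 # loop check stands in for finite failure
        return False
    for body in program.get(first, []):
        if solve(program, list(body) + rest, acquired, visiting | {first}):
            return True
    return False
```

For the program {p <- not q} with no rule for q, the goal p succeeds and acquires {q}.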

  • WebSiteGen: Web-Based Database Application Generator

    Doohun EUM  Toshimi MINOURA  

     
    PAPER-Software Engineering
    Page(s): 1001-1010

    We can easily access a remote database as well as a local database with HTML forms. Although implementing a database application with HTML forms is much simpler than implementing it with a proprietary graphical user-interface system, HTML forms and CGI programs still must be coded. We implemented a software tool that automatically generates the SQL statements that create a database for an application, the forms that are used as a user interface, and the Java servlets that retrieve the data requested through the forms. The database tables to be created and the forms to be generated are determined by the class diagram for the application. Our software tool, which we call WebSiteGen, thus simplifies the implementation of a Web-based database application.
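As a rough illustration of the generation step (the class name, attribute list, type mapping, and servlet URL below are invented for this sketch and are not WebSiteGen's actual output), one class taken from a class diagram can drive both a CREATE TABLE statement and a bare HTML entry form:

```python
# Hypothetical sketch of the WebSiteGen idea: derive a CREATE TABLE
# statement and a minimal HTML entry form from one class of a class
# diagram.  The type mapping below is an illustrative assumption.
TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "date": "DATE"}

def create_table_sql(cls_name, attrs):
    cols = [f"  {name} {TYPE_MAP[t]}" for name, t in attrs]
    cols.insert(0, "  id INTEGER PRIMARY KEY")
    return f"CREATE TABLE {cls_name} (\n" + ",\n".join(cols) + "\n);"

def entry_form_html(cls_name, attrs):
    fields = "\n".join(
        f'  {name}: <input type="text" name="{name}"><br>' for name, _ in attrs
    )
    return (f'<form action="/servlet/{cls_name}Insert" method="post">\n'
            f"{fields}\n"
            f'  <input type="submit" value="Save">\n</form>')

book = [("title", "string"), ("published", "date"), ("pages", "int")]
print(create_table_sql("Book", book))
print(entry_form_html("Book", book))
```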

  • Design Pattern Specification Language: Definition and Application

    Woochang SHIN  Chisu WU  

     
    PAPER-Software Engineering
    Page(s): 1011-1023

Design patterns can be regarded as an approach to encapsulating and reusing good design practices. However, most design patterns are specified using informal text and examples. To obtain all of the benefits of patterns, formal specification and tool support are indispensable. This paper proposes a Design Pattern Specification Language (DPSL) that is both manageable and effective. The DPSL gives software developers the capability to treat design patterns as concrete design units without lowering abstraction. To demonstrate the usability of the DPSL and its application to design modeling, we have developed a prototype tool that supports the DPSL in UML diagrams. The prototype shows both the tool support that the DPSL makes possible and the usability of patterns in software development.

  • Web Community Chart: A Tool for Navigating the Web and Observing Its Evolution

    Masashi TOYODA  Masaru KITSUREGAWA  

     
    PAPER-Databases
    Page(s): 1024-1031

We propose a web community chart, a tool for navigating the Web and for observing its evolution through web communities. A web community is a set of web pages created by individuals or associations with a common interest in a topic. Recent research shows that such communities can be extracted by link analysis. Our web community chart is a graph of whole communities, in which relevant communities are connected by edges. Using this chart, we can navigate through related communities. Moreover, we can answer historical queries about topics on the Web and understand the sociology of web community creation by observing when and how communities emerged and evolved. We observe the evolution of communities by comparing three charts built from Japanese web archives crawled in 1999, 2000, and 2001. Several metrics, such as growth rate and novelty, are introduced for measuring the degree of community evolution. Finally, we develop a web community evolution viewer that allows us to extract evolving communities using the relevance and these metrics. Several examples of evolution are shown using this viewer.

  • A Dimensionality Reduction Method for Efficient Search of High-Dimensional Databases

    Zaher AGHBARI  Kunihiko KANEKO  Akifumi MAKINOUCHI  

     
    PAPER-Databases
    Page(s): 1032-1041

    In this paper, we present a novel approach for efficient search of high-dimensional databases, such as video shots. The idea is to map feature vectors from the high-dimensional feature space into a point in a low-dimensional distance space. Then, a spatial access method, such as an R-tree, is used to cluster these points based on their distances in the low-dimensional space. Our mapping method, called topological mapping, guarantees no false dismissals in the result of a query. However, the result of a query might contain some false alarms. Hence, two refinement steps are performed to remove these false alarms. Comparative experiments on a database of video shots show the superior efficiency of the topological mapping method over other known methods.
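The no-false-dismissal guarantee can be illustrated with a generic filter-and-refine sketch. The single-pivot distance mapping below is an assumption made for the illustration, not necessarily the paper's topological mapping: each vector is mapped to its distance from a fixed pivot, the triangle inequality makes |d(q,p) - d(x,p)| a lower bound on d(q,x), so pruning with it dismisses no true answers, and an exact refinement step removes the remaining false alarms.

```python
import math

def dist(a, b):
    # Euclidean distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def range_search(db, pivot, query, radius):
    # Filter step: one-dimensional keys d(x, pivot); by the triangle
    # inequality, |key(x) - key(query)| <= d(x, query), so any vector with
    # |key(x) - key(query)| > radius can be safely pruned.
    keys = {i: dist(v, pivot) for i, v in enumerate(db)}
    qkey = dist(query, pivot)
    candidates = [i for i, k in keys.items() if abs(k - qkey) <= radius]
    # Refinement step: exact distances remove the false alarms.
    return [i for i in candidates if dist(db[i], query) <= radius]
```

In a real system the one-dimensional keys would be organized by a spatial access method such as an R-tree; a plain dict stands in for it here.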

  • Design for Two-Pattern Testability of Controller-Data Path Circuits

    Md. ALTAF-UL-AMIN  Satoshi OHTAKE  Hideo FUJIWARA  

     
    PAPER-Fault Tolerance
    Page(s): 1042-1050

This paper introduces a design for testability (DFT) scheme for delay faults of a controller-data path circuit. The scheme makes use of both scan and non-scan techniques. First, the data path is transformed into a hierarchically two-pattern testable (HTPT) data path based on a non-scan approach. Then an enhanced scan (ES) chain is inserted on the control lines and the status lines. The ES chain is extended via the state register of the controller. If necessary, the data path is further modified. Then a test controller is designed and integrated into the circuit. Our approach is mostly based on the path delay fault model. However, the multiplexer (MUX) select lines and register load lines are tested as register transfer level (RTL) segments. For a given circuit, the area overhead incurred by our scheme decreases substantially as the bit-width of the circuit's data path increases. The proposed scheme supports hierarchical test generation and can achieve fault coverage similar to that of the ES approach.

  • An Adaptive Visual Attentive Tracker with HMM-Based TD Learning Capability for Human Intended Behavior

    Minh Anh Thi HO  Yoji YAMADA  Yoji UMETANI  

     
    PAPER-Artificial Intelligence, Cognitive Science
    Page(s): 1051-1058

In this study, we build a system called Adaptive Visual Attentive Tracker (AVAT) for the purpose of developing a non-verbal communication channel between the system and an operator who presents intended movements. In the system, we constructed an HMM (Hidden Markov Model)-based TD (Temporal Difference) learning algorithm to track and zoom in on an operator's behavioral sequence that represents his/her intention. AVAT extracts intended movements from ordinary walking behavior based on the following two algorithms: the first models the movements of human body parts using HMMs, and the second learns the model of the tracker's action using a model-based TD learning algorithm. In this paper, we describe the integrated algorithm of the above two methods, whose linkage is established by assigning the state transition probability in the HMM as a reward in TD learning. Experimental results of extracting an operator's hand-sign action sequence during her natural walking motion are shown, which demonstrate the function of AVAT as developed within the framework of perceptual organization. Identification of the sign gesture context through wavelet analysis autonomously provides a reward value for optimizing AVAT's action patterns.

  • Improved Phoneme-History-Dependent Search Method for Large-Vocabulary Continuous-Speech Recognition

    Takaaki HORI  Yoshiaki NODA  Shoichi MATSUNAGA  

     
    PAPER-Speech and Hearing
    Page(s): 1059-1067

This paper presents an improved phoneme-history-dependent (PHD) search algorithm. This method is an optimum algorithm under the assumption that the starting time of a recognized word depends on only a few preceding phonemes (the phoneme history). The computational cost and the number of recognition errors can be reduced if the phoneme-history-dependent search uses re-selection of the preceding word and an appropriate length of phoneme history. These improvements increase the speed of decoding and help to ensure that the resulting word graph contains the correct word sequence. In a 65k-word domain-independent Japanese read-speech dictation task and a 1000-word spontaneous-speech airline-ticket-reservation task, the improved PHD search was 1.2-1.8 times faster than a traditional word-dependent search under the condition of equal word accuracy. The improved search reduced the number of errors by a maximum of 21% under the condition of equal processing time. The results also show that our search can generate more compact and accurate word graphs than the original PHD search method. In addition, we investigated the optimum length of the phoneme history in the search.

  • Vector Quantization Codebook Design Using the Law-of-the-Jungle Algorithm

    Hiroyuki TAKIZAWA  Taira NAKAJIMA  Kentaro SANO  Hiroaki KOBAYASHI  Tadao NAKAMURA  

     
    PAPER-Image Processing, Image Pattern Recognition
    Page(s): 1068-1077

The equidistortion principle [1] has recently been proposed as a basic principle for the design of an optimal vector quantization (VQ) codebook. The equidistortion principle adjusts all codebook vectors such that they contribute equally to the quantization error. This paper introduces a novel VQ codebook design algorithm based on the equidistortion principle. The proposed algorithm is a variant of the law-of-the-jungle algorithm (LOJ), which duplicates useful codebook vectors and removes useless ones. Due to the LOJ mechanism, the proposed algorithm can establish the equidistortion condition without wasting learning steps. This is significantly effective in preventing the performance degradation caused when the initial states of the codebook vectors are unsuitable for finding an optimal codebook. Therefore, even in the case of improper initialization, the proposed algorithm can minimize the quantization error based on the equidistortion principle. The performance of the proposed algorithm is discussed through experimental results.
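A minimal one-dimensional sketch of the duplicate-and-remove mechanism (this is an illustration of the LOJ idea, not the paper's exact algorithm or update schedule): the codevector contributing most to the quantization error is duplicated with a small perturbation, and the one contributing least is removed, pushing the codebook toward the equidistortion condition.

```python
def nearest(cb, x):
    # index of the codevector closest to sample x (squared-error metric)
    return min(range(len(cb)), key=lambda i: (cb[i] - x) ** 2)

def loj_step(codebook, data, eps=1e-3):
    # Accumulate each codevector's contribution to the quantization error.
    contrib = [0.0] * len(codebook)
    for x in data:
        i = nearest(codebook, x)
        contrib[i] += (codebook[i] - x) ** 2
    worst = max(range(len(codebook)), key=contrib.__getitem__)
    best = min(range(len(codebook)), key=contrib.__getitem__)
    # Remove the least useful vector, duplicate the most loaded one.
    cb = [c for i, c in enumerate(codebook) if i != best]
    cb.append(codebook[worst] + eps)
    return cb
```

A codevector sitting between two clusters and serving no samples is quickly removed, while an overloaded codevector is split in two.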

  • Subspace Method for Efficient Face Recognition Using a Combination of Radon Transform and KL Expansion

    Tran Thai SON  Seiichi MITA  Le Hai NAM  

     
    PAPER-Image Processing, Image Pattern Recognition
    Page(s): 1078-1086

This paper describes an efficient face recognition method using a combination of the Radon transform and the KL expansion. Each facial image is transformed into many sets of line integrals resulting from the Radon transform in 2D space. Based on this transformation, a new face recognition method is proposed that uses many subspaces generated from the vector spaces of the Radon transform. The efficiency of the proposed method is demonstrated by a classification rate of 100% in the experimental results and by the reduction of the operation complexity of the KL (Karhunen-Loeve) expansion from O(n^6) to O(n^4), where n is the size of the sample images.
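A minimal sketch of the Radon-transform front end (the KL-expansion subspaces built on top of it are omitted, and the two-angle discretization is a simplifying assumption): on a small discrete image, the projections at 0 and 90 degrees are just the row and column sums, i.e. the line integrals along those two directions.

```python
def radon_0_90(image):
    # 0-degree projection: integrate along each row
    rows = [sum(row) for row in image]
    # 90-degree projection: integrate along each column
    cols = [sum(col) for col in zip(*image)]
    return rows, cols
```

Every projection preserves the total mass of the image, so the sums of the two projections are equal.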

  • An Adaptive DCT Coding with Geometrical Edge Representation

    Yuji ITOH  

     
    PAPER-Image Processing, Image Pattern Recognition
    Page(s): 1087-1094

Discrete cosine transform (DCT) coding has been proven to be an efficient means of image compression. Many efforts have been made to improve the coding efficiency of DCT-based coding. This paper presents an adaptive DCT coding scheme based on geometrical edge representation. The scheme is designed to exploit the correlation between edge direction and the distribution of DCT coefficients. Edges are first extracted from the original images. Then, a sub-optimal block size and scanning order are determined for each block based on the extracted edges. In this way, an adaptive DCT scheme that takes account of the local characteristics of the image can be achieved. Simulations show that the proposed algorithm outperforms a conventional coding scheme in coding efficiency by 10-15%.
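The scan-order adaptation can be sketched as follows (the decision rule and scan tables are illustrative assumptions, not the paper's actual design): a block dominated by horizontal edges concentrates energy in vertically varying coefficients, so a column-priority scan reaches them sooner; vertical edges favor a row-priority scan; otherwise the conventional zigzag scan is used.

```python
N = 8  # DCT block size

def zigzag(n=N):
    # Standard JPEG-style zigzag order over an n x n coefficient block:
    # anti-diagonals in order of u+v, alternating traversal direction.
    return sorted(((u, v) for u in range(n) for v in range(n)),
                  key=lambda p: (p[0] + p[1],
                                 p[0] if (p[0] + p[1]) % 2 else p[1]))

def scan_order(edge_direction):
    if edge_direction == "horizontal":   # energy in column frequencies
        return [(u, v) for v in range(N) for u in range(N)]
    if edge_direction == "vertical":     # energy in row frequencies
        return [(u, v) for u in range(N) for v in range(N)]
    return zigzag()                      # no dominant edge: default zigzag
```

Front-loading the significant coefficients shortens the zero runs at the tail of the scan, which is where the entropy-coding gain comes from.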

  • Outlier Removal for Motion Tracking by Subspace Separation

    Yasuyuki SUGAYA  Kenichi KANATANI  

     
    PAPER-Image Processing, Image Pattern Recognition
    Page(s): 1095-1102

    Many feature tracking algorithms have been proposed for motion segmentation, but the resulting trajectories are not necessarily correct. In this paper, we propose a technique for removing outliers based on the knowledge that correct trajectories are constrained to be in a subspace of their domain. We first fit an appropriate subspace to the detected trajectories using RANSAC and then remove outliers by considering the error behavior of actual video tracking. Using real video sequences, we demonstrate that our method can be applied if multiple motions exist in the scene. We also confirm that the separation accuracy is indeed improved by our method.
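A toy sketch of the outlier-removal idea (in the paper the trajectories lie in a low-dimensional subspace of a high-dimensional domain; here the "subspace" is simplified to a line through the origin in the plane, and the thresholding is a plain residual test rather than the paper's error model): RANSAC repeatedly fits the subspace to a random sample, keeps the fit with the most inliers, and discards the trajectories that fall outside it.

```python
import random

def ransac_line(points, threshold=0.1, iters=100, seed=0):
    """Fit a line y = slope * x through the origin by RANSAC and return
    the inlier points; everything else is treated as an outlier."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        px, py = rng.choice(points)       # minimal sample: one point
        if px == 0:
            continue
        slope = py / px                   # candidate 1-D subspace
        norm = (1 + slope * slope) ** 0.5
        inliers = [p for p in points
                   if abs(p[1] - slope * p[0]) / norm <= threshold]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers        # keep the best-supported fit
    return best_inliers
```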

  • Performance Analysis and Comparison of Non-embedded and Embedded Wavelet Coders

    Hyun Joo SO  Young Jun JUNG  Jong Seog KOH  Nam Chul KIM  

     
    PAPER-Image Processing, Image Pattern Recognition
    Page(s): 1103-1109

In this paper, we analyze wavelet-based coding in a rate-distortion (R-D) sense by using Laplacian and Markov models, and we verify the results against the performance of the typical embedded coders, EZW and SPIHT, and a non-embedded coder implemented here. The Laplacian model represents the probability density function (pdf) of the wavelet coefficients, and the Markov model represents the statistical dependency within and among subbands. The models allow us to easily understand the behavior of the thresholding and quantization part and the lossless coding part, and to relate the embedded coders to the non-embedded coder, which is the main point of this paper. The analytic results are shown to coincide well with the actual coding results.

  • Compression of 3D Models by Remesh on Texture Images

    Masahiro OKUDA  Kyoko NAGATOMO  Masaaki IKEHARA  Shin-ichi TAKAHASHI  

     
    PAPER-Computer Graphics
    Page(s): 1110-1115

Due to the rapid development of computer and information technology, 3D modeling and rendering capabilities are becoming increasingly important in many applications, including industrial design, architecture, CAD/CAM, video games, and medical imaging. Since 3D mesh models often contain huge amounts of data, they are time-consuming to retrieve from a storage device or to download from a network. Most 3D viewing applications need the entire file of a 3D model in order to display it, even when the user is interested only in a low-resolution version of the model. Therefore, progressive coding that enables multiresolution transmission of 3D models is desired. In this paper, we propose a progressive coding scheme for 3D meshes with texture, in which we convert irregular meshes to semi-regular meshes using texture coordinates, map them onto planes, and apply a 2D image coding algorithm to mesh compression. As our method uses the wavelet transform, the encoded bitstream is progressive in nature. We achieve a high compression rate with the same visual quality as the original models.

  • Heart Sound Recognition through Analysis of Wavelet Transform and Neural Network

    Jun-Pyo HONG  Jung-Jun LEE  Sang-Bong JUNG  Seung-Hong HONG  

     
    PAPER-Medical Engineering
    Page(s): 1116-1121

Heart sound is an acoustic wave generated by the mechanical movement of the heart and blood flow, and is a complicated, non-stationary signal composed of many signal sources. It can be divided into normal heart sounds and heart murmurs. Murmurs are abnormal signals that appear over wider frequency ranges than normal heart sounds, and they are generated at random spots over the whole period of the heart sound. Heart sound recognition differentiates heart murmurs through the patterns that appear according to the time at which the murmurs are generated. In this paper, a group of heart sounds was classified into normal heart sounds, pre-systolic murmurs, early systolic murmurs, late systolic murmurs, early diastolic murmurs, and continuous murmurs. In the suggested algorithm, heart sounds were standardized by re-sampling and then fed to the neural network through the wavelet transform. The neural network used the error back-propagation algorithm, a representative learning method, and the number of hidden layers and the learning rate were controlled for optimal construction of the networks. The suggested algorithm obtained a higher recognition rate than existing methods; the best result was an average recognition rate of 88% with 15 hidden layers. The suggested algorithm is considered effective for automatic heart sound diagnosis programs.

  • VLSI Implementation for Fuzzy Membership-Function Generator

    Pei-Yin CHEN  

     
    LETTER-VLSI Systems
    Page(s): 1122-1125

Correct and quick generation of a membership function is the key point in implementing a real-time fuzzy logic controller. In this Letter, we present two efficient VLSI architectures, one that generates triangle-shaped and one that generates trapezoid-shaped membership functions. Simulation results show that our designs require lower hardware cost while achieving a faster working rate.

  • Mitigating Data Fragmentation for Small File Accesses

    Woo Hyun AHN  Daeyeon PARK  

     
    LETTER-Software Systems
    Page(s): 1126-1133

In traditional file systems, data clustering and grouping have improved small-file performance. These schemes allow file systems to use large data transfers when accessing small files, reducing disk I/Os. However, as file systems age, disks become too fragmented to support the grouping and clustering. As a solution to this problem, we describe a De-Fragmented File System (DFFS), which gradually alleviates the fragmentation of small files. Using data cached in memory, DFFS dynamically relocates the blocks of small fragmented files, clustering them contiguously on disk. In addition, DFFS relocates small related files in the same directory, grouping them at contiguous disk locations.

  • Empirical Study on the Improvement of the Usability of a Touch Panel for the Elderly--Comparison of Usability between a Touch Panel and a Mouse--

    Hirokazu IWASE  Atsuo MURATA  

     
    LETTER-Software Engineering
    Page(s): 1134-1138

    In this study, we clarified the differences in the pointing time required when using a touch panel and a PC mouse for three age groups: young, middle-aged, and elderly. We constructed a performance model for a touch panel operation (Experiment 1). Moreover, we investigated the visual interference caused by a multi-target presentation (Experiment 2). The delay caused by visual interference for the right-hand target was longer than that for the left-hand target, and that for the upper target was longer than that for the lower target.

  • Relational Interface for Natural Language-Based Information Sources

    Zenshiro KAWASAKI  Keiji SHIBATA  Masato TAJIMA  

     
    LETTER-Databases
    Page(s): 1139-1143

This paper presents an extension of the database query language SQL to include queries against a database with natural language annotations. The proposed scheme is based on the Concept Coupling Model, a language model for handling natural language sentence structures. Integration of the language model with the conventional relational data model provides a unified environment for manipulating information sources comprised of relational tables and natural language texts.

  • A Motion Compensated Filter for Channel Equalization and Video Restoration

    Mohammed ELHASSOUNI  El Hassane IBNELAHJ  Driss ABOUTAJDINE  

     
    LETTER-Image Processing, Image Pattern Recognition
    Page(s): 1144-1148

An important area in visual communications is the restoration of image sequences degraded by the channel and noise. Since a nonlinearity is commonly involved in the image transmission procedure, an adaptive nonlinear equalizer is required. In this paper, we address this problem by proposing a 3D adaptive nonlinear filter, namely a 3D adaptive Volterra filter with an LMS-type adaptation algorithm. This adaptive filter is used to equalize an unknown 2D channel with some point-wise nonlinearity and to restore image sequences degraded by this channel. Prior to filtering, motion is estimated from the sequence and compensated for; for this purpose, a robust region-recursive Higher Order Statistics (HOS) based motion estimation method is employed. The overall combination adequately removes the undesired effects of the communication channel and noise, as experimental results on real image sequences demonstrate.

  • Microwave Radio-Thermometry Based on Material Characteristic Estimation for Measuring Subcutaneous Temperature

    Tae-Woo KIM  Jeong-Hwan LEE  Gilwon YOON  

     
    LETTER-Medical Engineering
    Page(s): 1149-1153

This paper presents a modified microwave radio-thermometer (MRTM) with a material characteristic estimator and multiple temperature conversion tables for measuring the subcutaneous temperature of a living body. The estimator provides a temperature retrieval unit with material characteristics such as the permittivity, conductivity, thickness, and geometry of the living body. The temperature retrieval unit selects one of the conversion tables and computes the temperature value corresponding to the measured radiation power. In the experiments, it was shown that the radio-thermometer could reduce measurement errors by about 0.82 to 7.68 for distilled water and mixed liquid #5 with thicknesses of 29.5 cm and 9.5 cm at a temperature of 37.