
IEICE TRANSACTIONS on Information and Systems

  • Impact Factor: 0.59
  • Eigenfactor: 0.002
  • Article Influence: 0.1
  • CiteScore: 1.4


Volume E81-D No.12 (Publication Date: 1998/12/25)

    Special Issue on Knowledge-Based Software Engineering
  • FOREWORD

    Seiichi KOMIYA

    FOREWORD
    Page(s): 1321-1322
  • A Program Normalization to Improve Flexibility of Knowledge-Based Program Understander

    Haruki UENO

    PAPER-Theory and Methodology
    Page(s): 1323-1329

    This paper discusses the experimental evaluation of the knowledge-based program understander ALPUS and methods of program normalization, derived from that evaluation, to improve the flexibility of the system. ALPUS comprehends students' buggy Pascal programs using four kinds of programming knowledge, detects logical bugs, infers users' intentions, and gives advice for fixing bugs. By combining the pattern-matching technique with the HPG-based formalism of programming knowledge, in addition to program normalization, high comprehension performance has been achieved for relatively complex programs such as Quicksort. The experimental evaluation showed that program normalization would resolve some 55% of the student programs that were not comprehended successfully. Program normalization has contributed both to decreasing the number of knowledge patterns and to increasing flexibility. This paper proposes a five-step normalization procedure that works well in an experimental setting.
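
    The abstract does not spell out the five normalization steps, so the following is only a hypothetical illustration of what one such rewrite rule can look like: canonicalizing negated `if` conditions so that a single knowledge pattern matches both surface forms.

```python
# Hypothetical single normalization rule (an invented example, not one of
# ALPUS's actual five steps): rewrite "if not C then A else B" into the
# canonical "if C then B else A" on a tiny tuple-based AST.

def normalize_if(node):
    """node: ('if', cond, then_branch, else_branch); cond may be ('not', expr)."""
    kind, cond, then_branch, else_branch = node
    assert kind == "if"
    if isinstance(cond, tuple) and cond[0] == "not":
        return ("if", cond[1], else_branch, then_branch)  # drop negation, swap
    return node

student_version = ("if", ("not", "a <= b"), "swap(a, b)", "skip")
print(normalize_if(student_version))
# -> ('if', 'a <= b', 'skip', 'swap(a, b)')
```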

  • The Underlying Ontology of a DSS Generator for Transportation Demand Forecasting

    Cristina FIERBINTEANU  Toshio OKAMOTO  Naotugu NOZUE

    PAPER-Theory and Methodology
    Page(s): 1330-1338

    We introduce an ontology for transportation-system demand forecasting and its implementation in a decision support system (DSS) generator. The term ontology, as we use it here, means a collection of building blocks necessary and sufficient to construct the skeleton of a specific DSS, that is, a task ontology. The ontology is specified in constraint logic, which also ensures good support for modularity.

  • MALL: A Multi-Agent Learning Language for Competitive and Uncertain Environments

    Sidi O. SOUEINA  Behrouz Homayoun FAR  Teruaki KATSUBE  Zenya KOONO

    PAPER-Theory and Methodology
    Page(s): 1339-1349

    A Multi-Agent Learning Language (MALL) is needed by agents in environments where they encounter crucial situations in which they have to learn about the environment and other parties' moves and strategies, and then construct an optimal plan. The language is based on two major factors: the level of certainty in fully monitoring (surveying) the agents and the environment, and optimal plan construction in an autonomous way. Most work related to software agents is based on the assumption that other agents are trustworthy; in the growing Internet environment this may not be true. The proposed learning language allows agents to learn about the environment and the strategies of their opponents while devising their own plans. The language is being tested in our project of software agents for Electronic Commerce that operates in various security zones. The language is flexible and adaptable to a variety of agent applications.

  • A Model for Recording Software Design Decisions and Design Rationale

    Seiichi KOMIYA

    PAPER-Theory and Methodology
    Page(s): 1350-1363

    For the improvement of software quality and productivity, the author aims at realizing a software development environment that exploits the merits of group work. Since networking is necessary for collaborative software development, the author has developed a distributed environment for collaborative software development. In this environment, discussions about software design are held through a communication network, and their contents are recorded as software design decisions and decision rationale. One feature of this environment is that the contents of discussions can be recorded on-line in real time and reused without reconstructing the recorded information. This paper clarifies the essential conditions for realizing this environment and proposes an information structure model for recording the contents of discussions that achieves the above-mentioned feature. The effectiveness of the proposed model is demonstrated through an example of its application to software design discussions.

  • A Metric for Class Structural Complexity Focusing on Relationships among Class Members

    Hirohisa AMAN  Torao YANARU  Masahiro NAGAMATSU  Kazunori MIYAMOTO

    PAPER-Theory and Methodology
    Page(s): 1364-1373

    In this paper, we represent a class structure as a directed graph in which each node corresponds to a member of the class. To quantify the dependence relationships among members, we define a weighted closure. Using this quantified relationship together with the effort equation proposed by M. H. Halstead, we propose a metric for class structural complexity.
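
    The abstract gives the ingredients (a member-dependence digraph, a weighted closure, Halstead's effort) without their exact formulas, so the sketch below is a guess at one plausible combination rather than the paper's definition: path strength as a product of edge weights, closed under max, then scaled by a per-member effort value.

```python
# Speculative sketch of a weighted-closure class metric (the paper's exact
# definitions are not given in the abstract). Edge weights in (0, 1] express
# dependence strength between class members; closure = strongest path product.

from itertools import product

def weighted_closure(members, edges):
    """edges: dict mapping (src, dst) -> weight in (0, 1]."""
    w = {(a, b): 0.0 for a, b in product(members, repeat=2)}
    w.update(edges)
    # Floyd-Warshall-style relaxation; k must be the outermost loop variable.
    for k, i, j in product(members, repeat=3):
        w[(i, j)] = max(w[(i, j)], w[(i, k)] * w[(k, j)])
    return w

def class_complexity(members, edges, effort):
    """Sum closure weights, scaled by a Halstead-style effort per member."""
    w = weighted_closure(members, edges)
    return sum(w[(i, j)] * effort[j] for i in members for j in members if i != j)

members = ["size", "push", "pop"]
edges = {("push", "size"): 0.8, ("pop", "size"): 0.8, ("pop", "push"): 0.5}
effort = {"size": 12.0, "push": 40.0, "pop": 55.0}  # illustrative values only
print(class_complexity(members, edges, effort))
```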

  • An Integrated Reasoning and Learning Environment for WWW Based Software Agents for Electronic Commerce

    Behrouz Homayoun FAR  Sidi O. SOUEINA  Hassan HAJJI  Shadan SANIEPOUR  Anete Hiromi HASHIMOTO

    PAPER-System
    Page(s): 1374-1386

    A major topic in the field of networks and telecommunications is doing business on the World Wide Web (WWW), which is called Electronic Commerce (EC). Another major topic is blending Artificial Intelligence (AI) techniques with the WWW. In the Ex-W-Pert Project we have proposed an agent model for EC components that blends the traditional expert-system reasoning engine with a multi-layer knowledge base and with communication and documentation engines. In this project, EC is viewed as a society of software agents, such as customer, search, catalog, manufacturer, dealer, delivery, and banker agents, interacting and negotiating with each other. Each agent has a knowledge base and a reasoning engine, a communication engine, and a documentation engine. The knowledge base is organized in three layers: skill layer, rule layer, and knowledge layer (S-R-K layers). For each EC agent, we identify the class of problems to be solved and build the knowledge base gradually for each layer. We believe that using this multi-layer knowledge base will speed up reasoning and ultimately reduce operating costs.
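
    A minimal sketch of the usual reading of such S-R-K layering (the project's real engines are of course far richer): answer from the cheapest, most compiled layer first, and fall through to deeper reasoning only on a miss.

```python
# Minimal three-layer (skill / rule / knowledge) dispatch sketch; the layer
# contents and caching policy here are invented for illustration.

class SRKAgent:
    def __init__(self):
        self.skills = {}     # skill layer: precompiled query -> answer
        self.rules = []      # rule layer: (condition, action) pairs
        self.knowledge = {}  # knowledge layer: deep domain model

    def answer(self, query):
        if query in self.skills:               # skill layer: O(1) recall
            return self.skills[query]
        for condition, action in self.rules:   # rule layer: pattern -> action
            if condition(query):
                result = action(query)
                self.skills[query] = result    # compile the result into a skill
                return result
        return self.knowledge.get(query, "no answer")  # slowest path

agent = SRKAgent()
agent.rules.append((lambda q: q.endswith("?"), lambda q: "querying catalog..."))
print(agent.answer("price of item 42?"))   # rule fires, then cached as a skill
```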

  • A Process-Centered Software Engineering Environment Using Ontologies

    Takahira YAMAGUCHI  Satoshi KOMORI  Kaori MORI  Tomohiko SHIOZAWA

    PAPER-System
    Page(s): 1387-1393

    In order to build a process-centered software engineering environment using ontologies, we present a methodology for manually constructing the following ontologies: an object ontology, built from the constituent elements that make up objects (products), and a process ontology, built from the relationships between inputs and outputs. Using the constructed ontologies, the environment then generates software process plans that fit user queries, with both user interaction and constraint satisfaction, by the generate-and-test paradigm. Furthermore, case studies show that the environment works well in generating software process plans for a query about the intermediate stage of development, between basic design and detailed design.
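
    A toy rendering of the generate-and-test idea over a process ontology, with an invented three-process example (the paper's ontologies and constraint handling are much richer): processes are chained whenever their input products are available, and a candidate chain is accepted once it yields the queried product.

```python
# Generate-and-test process planning over a toy process ontology; the process
# table below is invented for illustration.

PROCESSES = {  # process name -> (input products, output products)
    "basic_design":    ({"requirements"}, {"basic_spec"}),
    "detailed_design": ({"basic_spec"}, {"detailed_spec"}),
    "coding":          ({"detailed_spec"}, {"source_code"}),
}

def plan(available, goal, used=()):
    if goal in available:                  # test: the goal product is produced
        return list(used)
    for name, (inputs, outputs) in PROCESSES.items():
        if name not in used and inputs <= available:   # generate: legal step
            result = plan(available | outputs, goal, used + (name,))
            if result is not None:
                return result
    return None

print(plan({"requirements"}, "source_code"))
# -> ['basic_design', 'detailed_design', 'coding']
```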

  • Flage: A Programming Language for Adaptive Software

    Fumihiro KUMENO  Akihiko OHSUGA  Shinichi HONIDEN

    PAPER-System
    Page(s): 1394-1403

    We propose a programming language, Flage, for building software systems that dynamically adapt to changing local situations. In our language, applications are constructed from agents: concurrent mobile objects with a metalevel architecture. Metalevel programming facilities realize self-control of an agent's actions and autonomous adaptation to changes. We also introduce another kind of program element called a field. A field represents a local situation around agents. For example, one field may represent a virtual place for getting local information in a network environment, and another a virtual place where agents do cooperative work. When an agent enters a field, it acquires the programs and shared information in that field. By moving from field to field, an agent can change its program composition by itself and thus adapt to changing local situations. In this paper, we describe the language specification of Flage and the implementation of the Flage programming platform, and show some program examples.
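
    The following is an invented, Python-flavored caricature of the field mechanism (actual Flage syntax and semantics are defined in the paper): an agent that enters a field acquires the field's programs and shared information, so its behavior composition changes as it migrates.

```python
# Invented illustration of agents acquiring programs from fields; this is not
# actual Flage code.

class Field:
    def __init__(self, name, programs, shared):
        self.name = name
        self.programs = programs  # behavior name -> function
        self.shared = shared      # information visible inside this field

class Agent:
    def __init__(self, name):
        self.name = name
        self.behaviors = {}
        self.context = {}

    def enter(self, field):
        self.behaviors.update(field.programs)  # acquire the field's programs
        self.context.update(field.shared)      # ...and its shared information

    def do(self, behavior):
        return self.behaviors[behavior](self.context)

market = Field("market", {"quote": lambda ctx: ctx["price"]}, {"price": 42})
agent = Agent("a1")
agent.enter(market)
print(agent.do("quote"))   # -> 42
```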

  • Workload Management Facilities for Software Project Management

    Atsuo HAZEYAMA  Seiichi KOMIYA

    PAPER-System
    Page(s): 1404-1414

    Workers involved in software projects, unlike those working on a production line in manufacturing, are usually engaged in plural work concurrently (that is, not only the main development work but also various other work). Such other work can put pressure on the schedule of the whole project. Therefore, to manage the whole project, not only the main development work but also the various other work should be treated as management objects, and workers' workload should be taken into consideration (that is, who is doing what work, at what workload, at what time). This paper proposes a framework of workload management facilities for managing software projects. The framework relates not only the main development work but also the various other work, and each work step within cooperative work, to the workers. This paper also shows the behavior of the facilities with an example and demonstrates their usefulness through the application of a prototype system. Using this system, users can assign work to workers by simulating workers' workload. These facilities help managers grasp workers' workload, and help workers grasp their assigned work.
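
    A toy version of the workload check such an assignment simulation implies, with an invented data model (the prototype's actual schema also covers cooperative work steps):

```python
# Toy workload-aware assignment: every work item, main development work or
# otherwise, is charged against a worker's daily capacity before committing.

CAPACITY = 8.0   # hours per worker per day (assumed)

def workload(assignments, worker, day):
    return sum(h for w, d, _task, h in assignments if w == worker and d == day)

def try_assign(assignments, worker, day, task, hours):
    if workload(assignments, worker, day) + hours > CAPACITY:
        return False                     # would overload this worker that day
    assignments.append((worker, day, task, hours))
    return True

plan = [("alice", 1, "coding", 6.0), ("alice", 1, "review meeting", 1.5)]
print(try_assign(plan, "alice", 1, "bug triage", 1.0))   # -> False (8.5 > 8)
print(try_assign(plan, "alice", 2, "bug triage", 1.0))   # -> True
```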

  • A Meta-Model of Work Structure of Software Project and a Framework for Software Project Management Systems

    Seiichi KOMIYA  Atsuo HAZEYAMA

    PAPER-System
    Page(s): 1415-1428

    Development of large-scale software is usually conducted as a project that unites a work force. Moreover, no matter what life cycle model is employed, a development plan is required for a software development project in order for the united work force to function effectively. For the project to be successful, it is also necessary to set management objectives based on this plan and to confirm that they are achieved. This method is considered effective, but actually running a software development project and tracking the achievement of the management objectives at each step is not easy, because predicting the necessary amount of work and the risks that the project involves is difficult in software development. It is therefore necessary to develop a system that supports software project management, so that the project manager can manage the entire project with a reduced work load. This paper proposes a meta-model of the work structure of software development projects, for project management, using an object-oriented database with constraints, as well as a framework for software project management systems based on this meta-model. Through the example of a system that analyzes repercussions on the progress of a software development project, the meta-model and framework are also shown to be effective in software project management.

  • A Support Tool for Specifying Requirements Using Structures of Documents

    Tomofumi UETAKE  Morio NAGATA

    PAPER-Application
    Page(s): 1429-1438

    The software requirements specification process consists of three steps: requirements capture and analysis; requirements definition and specification; and requirements validation. At the beginning of the second step, on which this paper focuses, there are several types of massive documents generated in the first step. Since the developers and the clients/users of the new software system may not share common knowledge of the field the system deals with, it is difficult for the developers to produce a correct requirements specification from these documents, and little research has addressed this problem. The authors have developed a support tool that helps produce a correct requirements specification by arranging and restructuring those documents into clearly understandable forms. In the second step, the developers must specify the functions of the new system and their constraints from those documents. Analyzing developers' real activities to guide the design of the support tool, the authors model this step as the following four activities. To specify the functions of the new system, the developers must collect the sentences, scattered across those documents, that may suggest functions. To define the details of each function, the developers must gather the paragraphs that describe it. To verify the correctness of each function, the developers must survey all related documents. To perform the above activities successfully, the developers must manage the various versions of those documents correctly. Following these four types of activities, the authors propose effective ways to support the developers by arranging those documents. This paper presents algorithms based on this model that use the structures of the documents and keywords that may suggest functions or constraints. To examine the feasibility of the proposal, the authors implemented a prototype tool that extracts relevant information scattered across those documents. The effectiveness of the proposal is demonstrated by experiments.

  • Software Creation: An Intelligent CASE Tool Featuring Automatic Design for Structured Programming

    Hui CHEN  Nagayasu TSUTSUMI  Hideki TAKANO  Zenya KOONO

    PAPER-Application
    Page(s): 1439-1449

    This paper reports on an intelligent CASE tool applicable to the structured programming phase, from detailed design to coding. It automates the bottom level of the hierarchical design process of detailed design and coding, where the largest share of man-hours is consumed. The main idea is that human designers use a CASE tool for the initial design of a software system, and design knowledge is automatically acquired from the structured charts and stored in the knowledge base. The acquired design knowledge may then be reused: by reusing it, a similar software system may be designed automatically. It has been shown that knowledge acquired in this way exhibits a logarithmic learning effect, and on this basis a quantitative evaluation of productivity is made. By accumulating design experiences (e.g., 10 times), more than 80% of the detailing designs are performed automatically, and productivity increases by up to 4 times. The tool features universality, an essentially zero start-up cost for automatic design, and a substantial increase in software productivity after enough experience has been accumulated. This paper proposes the basic idea and its implementation, together with a quantitative evaluation applying techniques from industrial engineering, which demonstrates the effectiveness of the proposed system.
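
    The "logarithmic learning effect" is the classic industrial-engineering learning curve; a minimal sketch assuming Wright's formulation with an assumed 80% learning rate (the paper's fitted model and rate may differ):

```python
# Wright's learning curve: unit cost after n repetitions is c(n) = c1 * n**b,
# with b = log2(r) for learning rate r. The 80% rate is an assumption here.

import math

def unit_cost(c1, n, rate=0.8):
    b = math.log2(rate)          # rate 0.8 -> b is about -0.322
    return c1 * n ** b

c1 = 100.0                       # cost of the first design (arbitrary units)
for n in (1, 2, 5, 10):
    print(n, round(unit_cost(c1, n), 1))
# By the 10th similar design, per-design cost has fallen to roughly 48% under
# these assumptions; the paper reports up to a 4x gain with design reuse.
```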

  • Internet/Intranet Application Development System WebBASE and Its Evaluation

    Shuichiro YAMAMOTO  Ryuji KAWASAKI  Toshihiro MOTODA  Koji TOKUMARU

    PAPER-Application
    Page(s): 1450-1457

    There is increasing demand for corporate information systems that have a simple human interface and are easy to access via WWW browsers. This paper proposes WebBASE, which integrates the WWW with relational databases. Experimental evaluation shows that WebBASE offers superior performance compared to existing products. Field studies of actual WebBASE applications show that it can improve the productivity of software developers in intranet application development.

  • Patterned Versus Conventional Object-Oriented Analysis Methods: A Group Project Experiment

    Shuichiro YAMAMOTO  Hiroaki KUROKI

    PAPER-Experiment
    Page(s): 1458-1465

    Object-oriented analysis methods can be grouped into data-driven and behavior-driven approaches. With data-driven approaches, object models are developed from a list of objects and their inter-relationships, which describe a static view of the real world. With behavior-driven approaches, a system usage scenario is analyzed before the object models are developed. Although qualitative comparisons of these two types of methods have been made, no statistical study had evaluated them in controlled experiments. This paper proposes the patterned object-oriented method, POOM, which is a behavior-driven approach, and compares it to OMT, a data-driven approach, in small-team experiments. The effectiveness of POOM is shown in terms of productivity and homogeneity.

  • A Boolean Factorization Using an Extended Boolean Matrix

    Oh-Hyeong KWON  Sung Je HONG  Jong KIM

    PAPER-Computer Hardware and Design
    Page(s): 1466-1472

    Factorization, which produces a factored form, is an extremely important part of multi-level logic synthesis. The number of literals in a factored form is a good estimate of the complexity of a logic function and can be translated directly into the number of transistors required for implementation. Factored forms are described as either algebraic or Boolean, according to the trade-off between run time and optimization. A Boolean factored form contains fewer literals than an algebraic one. In this paper, we present a new method for Boolean factorization. The key idea is to build an extended Boolean matrix using cokernel/kernel pairs and kernel/kernel pairs together. The extended Boolean matrix makes it possible to yield a Boolean factored form. We also propose a heuristic method for covering the extended Boolean matrix. Experimental results on various benchmark circuits show improvements in literal counts over algebraic factorization based on Brayton's Boolean matrix.
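
    For background, the sketch below computes textbook cokernel/kernel pairs of a small sum-of-products expression by single-literal division; these pairs are the raw material from which an extended Boolean matrix like the paper's would be built (the matrix construction and covering heuristic themselves are the paper's contribution and are not reproduced here).

```python
# Textbook cokernel/kernel extraction by single-literal division. Cubes are
# frozensets of literal names; dividing f by a cube keeps the cubes containing
# it, with that cube removed.

from functools import reduce

def divide(cubes, cube):
    return {c - cube for c in cubes if cube <= c}

def kernels(cubes):
    """Map each discovered cokernel cube to its (cube-free) kernel."""
    pairs = {}
    literals = {lit for c in cubes for lit in c}
    for lit in sorted(literals):
        quotient = divide(cubes, frozenset([lit]))
        if len(quotient) < 2:
            continue                                  # kernels need 2+ cubes
        common = reduce(frozenset.__and__, quotient)  # strip the common cube
        cokernel = frozenset([lit]) | common
        pairs[cokernel] = divide(cubes, cokernel)
    return pairs

# f = a*d + b*c*d + b*e has kernel (a + b*c) with cokernel d, and (c*d + e)
# with cokernel b.
f = {frozenset({"a", "d"}), frozenset({"b", "c", "d"}), frozenset({"b", "e"})}
for cokernel, kernel in kernels(f).items():
    print(sorted(cokernel), [sorted(c) for c in kernel])
```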

  • Buddy Coherence: An Adaptive Granularity Handling Scheme for Page-Based DSM

    Sangbum LEE  Inbum JUNG  Joonwon LEE

    PAPER-Computer Systems
    Page(s): 1473-1482

    Page-based DSM systems suffer from false sharing because they use a large page as the coherence unit. The optimal page size is dynamically affected by application characteristics, so a fixed-size page cannot satisfy all applications, even one as small as a cache line. In this paper we present a software-only coherence protocol called BCP (Buddy Coherence Protocol) that supports multiple page sizes, which vary adaptively according to the behavior of each application at run time. In BCP, the address of a remote access is compared with the address of the most recent local access. If they fall in different halves of a page, BCP treats this as false sharing and demotes the page to two subpages of equal size. If two contiguous pages belong to the same node, BCP promotes them to a superpage to reduce the number of subsequent coherence actions. We also suggest a mechanism that detects data-sharing patterns to optimize the protocol: it detects and keeps the sharing pattern of each page through a state transition mechanism. By referring to those patterns, BCP demotes pages selectively and increases the effectiveness of demotion. Self-invalidation of migratorily shared pages is also employed to reduce the number of invalidations. Our simulations show that the optimized BCP outperforms almost all the best cases of write-invalidate protocols using fixed-size pages, improving performance by 42.2% for some applications compared with the fixed-size page case.
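
    The demotion and promotion decisions described above reduce to simple address arithmetic on power-of-two pages; a schematic sketch with invented data structures (the real protocol also tracks states, owners, and sharing patterns):

```python
# Schematic BCP-style demotion/promotion decisions; data structures invented.

def is_false_sharing(page_base, page_size, local_addr, remote_addr):
    """Remote access and latest local access fall in different page halves."""
    half = page_size // 2
    return (local_addr - page_base < half) != (remote_addr - page_base < half)

def demote(page_base, page_size):
    """Split the page into two equal buddy subpages."""
    half = page_size // 2
    return [(page_base, half), (page_base + half, half)]

def promote(left, right, owner_of):
    """Merge two contiguous buddies owned by the same node into a superpage."""
    base, size = left
    if right == (base + size, size) and owner_of[left] == owner_of[right]:
        return (base, 2 * size)
    return None

page = (0x0000, 4096)
if is_false_sharing(0x0000, 4096, local_addr=0x0100, remote_addr=0x0F00):
    print(demote(*page))   # -> [(0, 2048), (2048, 2048)]
```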

  • Signature Pattern Recognition Using Moments Invariant and a New Fuzzy LVQ Model

    Payam NASSERY  Karim FAEZ

    PAPER-Image Processing, Computer Graphics and Pattern Recognition
    Page(s): 1483-1493

    In this paper we introduce a new method for signature pattern recognition that takes advantage of image moment transformations combined with a fuzzy logic approach. For this purpose, we first try to model the noise embedded inherently in signature patterns and to separate it from environmental effects. Based on the results of this first step, we perform a mapping onto the unit circle, using the least mean square (LMS) error criterion, to get rid of the variations caused by shifting or scaling. We then derive some orientation-invariant moments introduced in earlier reports and study their statistical properties in our particular input space. Next, we define a fuzzy complex space, together with a fuzzy complex similarity measure on this space, and construct a new training algorithm based on the fuzzy learning vector quantization (FLVQ) method. A comparison method is also proposed so that any input pattern can be compared to the learned prototypes through the pre-defined fuzzy similarity measure. Each set of the above image moments was used by the fuzzy classifier separately, and the mis-classifications were counted as a measure of error magnitude. The efficiency of the proposed FLVQ model is shown numerically in comparison with the conventional FLVQs reported so far. Finally, satisfactory results are derived and a comparison is made between the image transformations considered.
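
    As one concrete instance of such orientation-invariant moments, the classical Hu invariants can be computed from scale-normalized central moments; the paper's chosen moment sets may differ, so this is background rather than a reproduction of the method.

```python
# First two Hu moment invariants of a binary image, computed from normalized
# central moments with plain NumPy; a standard example of orientation-
# invariant moments.

import numpy as np

def central_moment(img, p, q):
    ys, xs = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xbar, ybar = (xs * img).sum() / m00, (ys * img).sum() / m00
    return ((xs - xbar) ** p * (ys - ybar) ** q * img).sum()

def hu_first_two(img):
    m00 = img.sum()
    def eta(p, q):  # scale-normalized central moment
        return central_moment(img, p, q) / m00 ** (1 + (p + q) / 2)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2

img = np.zeros((64, 64))
img[20:40, 10:50] = 1.0          # stand-in for a binarized signature
print(hu_first_two(img))
print(hu_first_two(img.T))       # transposed image: invariants are unchanged
```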

  • New Performance Evaluation of Parallel Thinning Algorithms Based on PRAM and MPRAM Models

    Phill-Kyu RHEE  Che-Woo LA

    PAPER-Image Processing, Computer Graphics and Pattern Recognition
    Page(s): 1494-1506

    The objective of thinning is to reduce the amount of information in image patterns to the minimum needed for recognition. A thinned image aids the extraction of important features such as end points, junction points, and connections from image patterns. The ultimate goal of parallel thinning algorithms is to minimize execution time while producing a high-quality thinned image. Although much research has been devoted to parallel thinning algorithms, there has been no systematic approach to comparing their execution speed. Several rough comparisons have been made in terms of iteration counts, but such comparisons can be misleading, since the time required per iteration varies from one algorithm to another. This paper proposes a formal method for analyzing the performance of parallel thinning algorithms based on the PRAM (Parallel Random Access Machine) model. The quality of skeletons, robustness to boundary noise, and execution speed are also considered. Six parallel algorithms showing relatively high performance are selected and analyzed with the proposed method. Experiments show that the proposed analysis method is sufficiently accurate to evaluate the performance of parallel thinning algorithms.
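
    To make concrete what one "iteration" of a parallel thinning algorithm involves (and why per-iteration cost differs across algorithms), here is the classical Zhang-Suen sub-iteration; it is shown as a familiar example, not necessarily one of the six algorithms analyzed in the paper.

```python
# One Zhang-Suen thinning sub-iteration on a 0/1 NumPy image: deletable border
# pixels are flagged first and removed together, which is what makes the
# algorithm parallel.

import numpy as np

def zhang_suen_pass(img, first_pass):
    h, w = img.shape
    to_delete = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if img[y, x] == 0:
                continue
            # Neighbors P2..P9, clockwise starting from north.
            p = [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
                 img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]
            b = sum(p)                                   # nonzero neighbors
            a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
            if first_pass:
                cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
            else:
                cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
            if 2 <= b <= 6 and a == 1 and cond:
                to_delete.append((y, x))
    for y, x in to_delete:
        img[y, x] = 0
    return bool(to_delete)

def thin(img):
    changed = True
    while changed:   # one iteration = two parallel sub-passes
        changed = zhang_suen_pass(img, True) | zhang_suen_pass(img, False)
    return img
```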

  • A New Constructive Compound Neural Networks Using Fuzzy Logic and Genetic Algorithm: Application to Artificial Life

    Jianjun YAN  Naoyuki TOKUDA  Juichi MIYAMICHI

    LETTER-Bio-Cybernetics and Neurocomputing
    Page(s): 1507-1516

    This paper presents a new compound constructive algorithm for neural networks, in which fuzzy logic is used as an efficient learning technique to construct an optimal network from an initial simple three-layer network, while a genetic algorithm helps design an improved network through evolution. Numerical simulations on artificial life demonstrate that, compared with existing network design methods such as constructive algorithms, pruning algorithms, and fixed static-architecture algorithms, the present algorithm, called FuzGa, is efficient in both time complexity and network performance. The improved time complexity comes from the sufficiently small three-layer design and from the genetic algorithm: partly because the small number of layers facilitates the use of an efficient steepest-descent method for narrowing down the solution space of the fuzzy logic, and partly because the genetic algorithm avoids trapping in local minima, together yielding considerable savings in network learning and connection time. Compared with 54.8 minutes for MLPs with 65 hidden neurons, 63.1 minutes for FlexNet, and 96.0 minutes for pruning, our simulation results on artificial life show that the CPU time for FuzGa to reach the target fitness value of 100 food elements eaten improved to 42.3 minutes, on a SUN SPARCstation-10 with a 40 MHz SuperSPARC, for example. The role of hidden neurons in improving the performance of the neural networks of the various schemes developed for artificial life applications is elucidated, as is the effect of population size on the performance of FuzGa.

  • Ultrasonic Closing Click of the Prosthetic Cardiac Valve

    Jun HASEGAWA  Kenji KOBAYASHI  Hiroshi MATSUMOTO

    LETTER-Bio-Cybernetics and Neurocomputing
    Page(s): 1517-1521

    Mechanical prosthetic cardiac valves generate not only the widely recognized audible closing clicks but also ultrasonic closing clicks, as we have previously reported. A personal-computer-based measurement and analysis system with a bandwidth of 625 kHz has been developed to clarify the characteristics of these ultrasonic closing clicks. Fifty cases in total were assessed clinically, including cases with tilting-disk valves, bileaflet valves, and flat-disk valves. The ultrasonic closing clicks are damped vibrations lasting about two milliseconds, and their frequency range was confirmed to extend from 8 kHz to 625 kHz, whereas that of the audible click reached only 8 kHz. Although the sensitivity of the sensor decreased by approximately 30 dB at 625 kHz, effective power of the ultrasonic closing click was confirmed at this frequency. Moreover, the signal power at 625 kHz was, surprisingly, still at the same level as that at around 100 kHz. These wide-bandwidth signal components exist independently of the type of mechanical valve, but the spectral pattern shows some dependence on the valve type.