Unlike existing mobile networks, future mobile networks will provide a variety of services with different quality requirements, so no single group of service users can characterize the whole traffic distribution in the system. At the network design stage, the population of service subscribers is estimated and base stations are located accordingly. As the service market evolves, the user population may grow, or the distribution of users across the multiple services and the deployed cells may change. In this case, two questions are of interest: how much growth in the user population, and what change in the user distribution, can be accommodated by the current cell configuration? If such shifts cannot be admitted, the current frequency and base station allocations should be expanded or reallocated. In this paper, we provide a framework that decides whether the present network configuration can admit the changes of interest. Admissibility decision rules are presented with proofs.
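The admissibility question above can be pictured with a minimal sketch. This is an illustration only, not the paper's actual decision rules: the per-cell linear capacity model, the function names, and all numbers are assumptions introduced here.

```python
# Illustrative sketch only: a simple per-cell admissibility check.
# The linear capacity model and all names/values are assumptions,
# not the decision rules derived in the paper.

def projected_load(users, weight):
    """Per-cell load: sum over services of users[cell][service] * weight[service]."""
    return [sum(u * w for u, w in zip(row, weight)) for row in users]

def admissible(cell_load, capacity):
    """True if every cell's projected load fits within its capacity."""
    return all(load <= cap for load, cap in zip(cell_load, capacity))

# Two cells, two service classes with different per-user resource weights.
users = [[100, 20], [80, 50]]      # forecast users per cell, per service
weight = [1.0, 2.5]                # resource units consumed per user of each service
capacity = [160.0, 220.0]          # resource units available per cell

load = projected_load(users, weight)
print(load)                        # [150.0, 205.0]
print(admissible(load, capacity))  # True: this shift in user distribution is admitted
```

Under such a model, "how much growth can be accommodated" amounts to scaling `users` until some cell's load first exceeds its capacity; the paper's framework addresses this question rigorously rather than by enumeration.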
Akira HIRABAYASHI Hidemitsu OGAWA Akiko NAKASHIMA
In supervised learning, one of the major learning methods is memorization learning (ML). Since it reduces only the training error, ML does not guarantee good generalization capability in general. Nevertheless, good generalization capability is expected when ML is used. One of the present authors, H. Ogawa, interpreted this usage of ML as a means of realizing a 'true objective learning' that directly takes generalization capability into account, and introduced the concept of admissibility. If a learning method can provide the same generalization capability as a true objective learning, the objective learning is said to admit the learning method. Hence, when admissibility does not hold, making it hold becomes important. In this paper, we introduce the concept of realization of admissibility, and devise a method for realizing the admissibility of ML with respect to projection learning, which directly takes generalization capability into account.
Akira HIRABAYASHI Hidemitsu OGAWA Yukihiko YAMASHITA
In the learning of feed-forward neural networks, the so-called 'training error' is often minimized. This error, however, is not directly related to the generalization capability, which is one of the major goals of learning. Minimizing the training error can instead be interpreted as a substitute for another learning method that does consider generalization capability. Admissibility is a concept for discussing whether one learning method can serve as a substitute for another. In this paper, we discuss the case where learning that minimizes the training error is used as a substitute for projection learning, which considers generalization capability, in the presence of noise. Moreover, we give a method for choosing a training set that satisfies the admissibility.
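The gap between training error and generalization capability that motivates this line of work can be demonstrated numerically. The sketch below is not the paper's formulation (which concerns projection learning in an operator-theoretic setting); it merely shows that a fit achieving essentially zero training error on noisy samples can still generalize poorly. The target function, noise level, and polynomial model are all assumptions chosen for illustration.

```python
# Illustrative sketch (not the paper's formulation): near-zero training
# error does not imply good generalization. We interpolate noisy samples
# of sin(x) with a degree-7 polynomial and compare training vs. test error.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, np.pi, 8)
y_train = np.sin(x_train) + 0.05 * rng.standard_normal(8)

# 8 points, 8 polynomial coefficients: the least-squares fit interpolates,
# so the training error is zero up to numerical round-off.
V = np.vander(x_train, 8)
coeffs, *_ = np.linalg.lstsq(V, y_train, rcond=None)

train_err = np.mean((np.vander(x_train, 8) @ coeffs - y_train) ** 2)

x_test = np.linspace(0.0, np.pi, 200)
test_err = np.mean((np.vander(x_test, 8) @ coeffs - np.sin(x_test)) ** 2)

print(train_err)  # essentially zero (round-off only)
print(test_err)   # larger: the fit has memorized the noise
```

Choosing the training set so that the training-error minimizer coincides with the generalization-oriented estimator is, informally, the kind of condition the admissibility analysis makes precise.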