Author Search Result

[Author] Yuichi TAKANO (3 hits)

Results 1-3 of 3
  • Temperature Dependence of Andreev Reflection Current of N–I–S Junction

    Shigeru YOSHIMORI  Masanori SUEYOSHI  Ryuichi TAKANO  Akiko FUJIWARA  Mitsuo KAWAMURA  

     
    LETTER

    Vol: E77-A No:11
    Page(s): 1954-1956

    Precise measurements of the temperature dependence of the Andreev reflection current in N–I–S junctions were carried out. Au and Pb were used as N (normal metal) and S (superconducting material), respectively. The experimental results agreed with analyses based on the Arnold theory.

  • Feature Subset Selection for Ordered Logit Model via Tangent-Plane-Based Approximation

    Mizuho NAGANUMA  Yuichi TAKANO  Ryuhei MIYASHIRO  

     
    PAPER-Fundamentals of Information Systems

    Publicized: 2019/02/21
    Vol: E102-D No:5
    Page(s): 1046-1053

    This paper is concerned with a mixed-integer optimization (MIO) approach to selecting a subset of relevant features from among many candidates. For ordinal classification, a sequential logit model and an ordered logit model are often employed. For feature subset selection in the sequential logit model, Sato et al. [22] recently proposed a mixed-integer linear optimization (MILO) formulation. In their MILO formulation, a univariate nonlinear function contained in the sequential logit model was represented by a tangent-line-based approximation. We extend this MILO formulation toward the ordered logit model, which is more commonly used for ordinal classification than the sequential logit model is. Making use of tangent planes to approximate a bivariate nonlinear function involved in the ordered logit model, we derive an MILO formulation for feature subset selection in the ordered logit model. Our computational results verify that the proposed method is superior to the L1-regularized ordered logit model in terms of solution quality.
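
    The sketch below is a minimal illustration of the tangent-plane idea, not the paper's exact MILO formulation: it over-approximates the concave bivariate log-likelihood term log(F(u) - F(v)) of the ordered logit model (with F assumed to be the logistic CDF) by the pointwise minimum of tangent planes built on a grid. The function names, grid, and test point are illustrative assumptions.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def f(u, v):
        # Ordered logit log-likelihood term log(F(u) - F(v)),
        # concave on the region u > v when F is the logistic CDF.
        return np.log(sigmoid(u) - sigmoid(v))

    def tangent_plane(u0, v0):
        # Returns (a, b, c) such that T(u, v) = a*u + b*v + c is the
        # tangent plane of f at (u0, v0). Concavity gives f <= T
        # everywhere, so each plane is a linear over-estimator that a
        # MILO model can impose as a linear constraint.
        s_u, s_v = sigmoid(u0), sigmoid(v0)
        denom = s_u - s_v
        a = s_u * (1.0 - s_u) / denom    # df/du at (u0, v0)
        b = -s_v * (1.0 - s_v) / denom   # df/dv at (u0, v0)
        c = f(u0, v0) - a * u0 - b * v0
        return a, b, c

    # Build planes on a coarse grid (illustrative choice of points).
    grid = [(u0, v0)
            for u0 in np.linspace(-4.0, 4.0, 9)
            for v0 in np.linspace(-4.0, 4.0, 9)
            if u0 > v0 + 0.5]
    planes = [tangent_plane(u0, v0) for u0, v0 in grid]

    # The pointwise minimum of the planes upper-bounds f and is tight
    # at the grid points.
    u, v = 1.0, -1.0
    bound = min(a * u + b * v + c for a, b, c in planes)
    print(f"f = {f(u, v):.4f}, tangent-plane bound = {bound:.4f}")
    ```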

  • Stochastic Discrete First-Order Algorithm for Feature Subset Selection

    Kota KUDO  Yuichi TAKANO  Ryo NOMURA  

     
    PAPER-Artificial Intelligence, Data Mining

    Publicized: 2020/04/13
    Vol: E103-D No:7
    Page(s): 1693-1702

    This paper addresses the problem of selecting a significant subset of candidate features to use for multiple linear regression. Bertsimas et al. [5] recently proposed the discrete first-order (DFO) algorithm to efficiently find near-optimal solutions to this problem. However, this algorithm is unable to escape from locally optimal solutions. To resolve this, we propose a stochastic discrete first-order (SDFO) algorithm for feature subset selection. In this algorithm, random perturbations are added to a sequence of candidate solutions as a means to escape from locally optimal solutions, which broadens the range of discoverable solutions. Moreover, we derive the optimal step size in the gradient-descent direction to accelerate convergence of the algorithm. We also make effective use of the L2-regularization term to improve the predictive performance of a resultant subset regression model. The simulation results demonstrate that our algorithm substantially outperforms the original DFO algorithm. Our algorithm was superior in predictive performance to lasso and forward stepwise selection as well.
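
    The sketch below is a minimal illustration of the SDFO idea (perturbed gradient steps with hard thresholding), assuming a ridge-regularized least-squares objective. The exact-line-search step size, the perturbation scale sigma, and the support-refitting step are illustrative choices, not the authors' exact derivation; all names are hypothetical.

    ```python
    import numpy as np

    def sdfo(X, y, k, lam=1e-3, sigma=0.1, n_iter=200, seed=0):
        # Select at most k features for linear regression by iterating:
        # gradient step -> random perturbation -> hard thresholding.
        rng = np.random.default_rng(seed)
        _, p = X.shape
        beta = np.zeros(p)
        best_beta, best_obj = beta.copy(), np.inf

        def objective(b):
            r = y - X @ b
            return 0.5 * (r @ r) + 0.5 * lam * (b @ b)

        for _ in range(n_iter):
            # Gradient of 0.5*||y - Xb||^2 + 0.5*lam*||b||^2.
            g = X.T @ (X @ beta - y) + lam * beta
            d = -g
            # Exact line search along d for this quadratic objective.
            Xd = X @ d
            denom = Xd @ Xd + lam * (d @ d)
            step = (g @ g) / denom if denom > 1e-12 else 0.0
            beta = beta + step * d
            # Random perturbation to escape locally optimal supports.
            beta = beta + sigma * rng.standard_normal(p)
            # Hard thresholding: keep the k largest-magnitude entries.
            mask = np.zeros(p, dtype=bool)
            mask[np.argsort(np.abs(beta))[-k:]] = True
            # Refit the ridge coefficients on the selected support.
            Xs = X[:, mask]
            beta = np.zeros(p)
            beta[mask] = np.linalg.solve(Xs.T @ Xs + lam * np.eye(k), Xs.T @ y)
            obj = objective(beta)
            if obj < best_obj:
                best_obj, best_beta = obj, beta.copy()
        return best_beta

    # Toy usage: recover a 3-feature support from synthetic data.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 20))
    true_beta = np.zeros(20)
    true_beta[:3] = [2.0, -1.5, 1.0]
    y = X @ true_beta + 0.1 * rng.standard_normal(100)
    print(np.flatnonzero(sdfo(X, y, k=3)))   # ideally prints [0 1 2]
    ```

    The perturbation only matters through which support survives the thresholding step; the refit on that support then plays the role of the L2-regularized polishing the abstract mentions.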