
Keyword Search Result

[Keyword] sparse modeling (2 hits)

Hits 1-2 of 2
  • Bridging between Soft and Hard Thresholding by Scaling

    Katsuyuki HAGIWARA  

     
    PAPER-Artificial Intelligence, Data Mining

    Publicized: 2022/06/09
    Vol: E105-D No:9
    Page(s): 1529-1536

    This study considered an extension of sparse regularization with scaling, focusing on thresholding methods, which are simple and typical examples of sparse modeling. In the setting of a non-parametric orthogonal regression problem, we developed and analyzed a thresholding method in which soft thresholding estimators are independently expanded by empirical scaling values. The scaling values share a common hyper-parameter, which is the order of expansion of an ideal scaling value that achieves hard thresholding. We simply refer to this estimator as a scaled soft thresholding estimator. The scaled soft thresholding method bridges the soft and hard thresholding methods. In the orthogonal case, the new estimator is in fact equivalent to an adaptive LASSO estimator; it thus gives another derivation of the adaptive LASSO estimator. It is a general method that includes soft thresholding and the non-negative garrote as special cases. We then derived the degrees of freedom of scaled soft thresholding for calculating Stein's unbiased risk estimate and found that they decompose into the degrees of freedom of soft thresholding and a remainder term connected to hard thresholding. Because the degrees of freedom reflect the degree of over-fitting, this implies that scaled soft thresholding has another source of over-fitting in addition to the number of unremoved components. The theoretical result was verified with a simple numerical example. In this process, we also examined the non-monotonicity of the above remainder term of the degrees of freedom and found that, in a sparse and large-sample setting, it is mainly caused by useless components that are unrelated to the target function.
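
The abstract above describes a bridge between soft and hard thresholding obtained by expanding soft-thresholded coefficients with empirical scaling values. The following Python sketch illustrates that idea on a toy sparse signal; the particular scaling factor (|z|/(|z|-λ))^a, with a=0 giving soft and a=1 giving hard thresholding, is an assumed parametrization chosen for illustration and is not necessarily the exact estimator defined in the paper.

```python
# A minimal numerical sketch of thresholding noisy coefficients in an
# orthogonal-regression-style setting. The "scaled soft" rule below uses an
# assumed illustrative scaling (|z|/(|z|-lam))**a, not necessarily the
# paper's exact estimator.
import numpy as np

def soft_threshold(z, lam):
    """Soft thresholding: shrink every coefficient toward zero by lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard_threshold(z, lam):
    """Hard thresholding: keep coefficients with |z| > lam unchanged."""
    return np.where(np.abs(z) > lam, z, 0.0)

def scaled_soft_threshold(z, lam, a):
    """Bridge between soft (a = 0) and hard (a = 1) thresholding: expand each
    surviving soft-threshold estimate by (|z| / (|z| - lam))**a."""
    s = soft_threshold(z, lam)
    keep = np.abs(z) > lam
    scale = np.ones_like(z)
    scale[keep] = (np.abs(z[keep]) / (np.abs(z[keep]) - lam)) ** a
    return s * scale

rng = np.random.default_rng(0)
beta = np.concatenate([np.full(5, 3.0), np.zeros(95)])  # sparse true signal
z = beta + rng.normal(scale=1.0, size=beta.size)        # noisy observations
lam = 1.5
for a in (0.0, 0.5, 1.0):
    est = scaled_soft_threshold(z, lam, a)
    print(f"a={a:3.1f}  MSE={np.mean((est - beta) ** 2):.4f}  "
          f"nonzeros={np.count_nonzero(est)}")
```

Varying a between 0 and 1 moves the surviving estimates continuously from heavily shrunk (soft) to unshrunk (hard) values, which is the bridging behavior the abstract refers to.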

  • Radio Techniques Incorporating Sparse Modeling (Open Access)

    Toshihiko NISHIMURA  Yasutaka OGAWA  Takeo OHGANE  Junichiro HAGIWARA  

     
    INVITED SURVEY PAPER-Digital Signal Processing

    Publicized: 2020/09/01
    Vol: E104-A No:3
    Page(s): 591-603

    Sparse modeling is one of the most active research areas in engineering and science. The technique provides solutions from far fewer samples by exploiting sparsity, that is, the property that most of the data are zero. This paper reviews sparse modeling in radio techniques. The first half of the paper introduces direction-of-arrival (DOA) estimation from signals received by multiple antennas. The estimation is carried out using compressed sensing, an effective tool for sparse modeling that produces solutions to an underdetermined linear system with a sparse regularization term. The DOA estimation performance is compared among three compressed sensing algorithms. The second half reviews channel state information (CSI) acquisition in multiple-input multiple-output (MIMO) systems. In time-varying environments, CSI estimated with pilot symbols may be outdated at the actual transmission time. We describe CSI prediction based on sparse DOA estimation and show that it yields excellent precoding performance. The other topic in the second half is sparse Bayesian learning (SBL)-based channel estimation. In a massive MIMO system, a base station (BS) has many antennas. A major obstacle to using massive MIMO in frequency-division duplex mode is the overhead of downlink CSI acquisition, because many pilot symbols must be sent from the BS and fed back from the user equipment. An SBL-based channel estimation method can mitigate this issue. We outline the method and show that it can reduce the number of downlink pilot symbols.
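
To make the compressed-sensing view of DOA estimation reviewed above concrete, the following Python sketch recovers source directions on an angular grid by solving an l1-regularized least-squares problem with a simple iterative soft-thresholding (ISTA) loop. The array geometry (16-element half-wavelength uniform linear array), the 1-degree grid, the regularization weight, and the choice of ISTA itself are illustrative assumptions; they are not the specific algorithms or settings compared in the survey.

```python
# A minimal sketch of grid-based DOA estimation via l1-regularized sparse
# recovery (ISTA). Array geometry, grid, and regularization weight are
# illustrative assumptions, not the survey's exact algorithms or settings.
import numpy as np

def steering_matrix(angles_deg, n_antennas):
    """Steering vectors of a half-wavelength-spaced uniform linear array."""
    m = np.arange(n_antennas)[:, None]
    theta = np.deg2rad(np.asarray(angles_deg))[None, :]
    return np.exp(-1j * np.pi * m * np.sin(theta))

def ista_complex_lasso(A, y, lam, n_iter=500):
    """Iterative soft thresholding for min_x 0.5*||y - A x||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the quadratic term
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        r = x + (A.conj().T @ (y - A @ x)) / L   # gradient step
        mag = np.abs(r)
        x = r * np.maximum(mag - lam / L, 0.0) / np.maximum(mag, 1e-12)
    return x

rng = np.random.default_rng(1)
n_ant = 16
grid = np.arange(-90.0, 91.0, 1.0)               # 1-degree angular grid
A = steering_matrix(grid, n_ant)
x_true = np.zeros(grid.size, dtype=complex)
for doa in (-20.0, 5.0, 40.0):                   # three on-grid sources
    x_true[np.argmin(np.abs(grid - doa))] = np.exp(1j * rng.uniform(0, 2 * np.pi))
y = A @ x_true + 0.05 * (rng.normal(size=n_ant) + 1j * rng.normal(size=n_ant))
x_hat = ista_complex_lasso(A, y, lam=0.2)
peaks = grid[np.abs(x_hat) > 0.3 * np.abs(x_hat).max()]
print("estimated DOAs (deg):", peaks)
```

Any of the compressed-sensing solvers compared in the survey could replace the ISTA routine here; the essential ingredient is the sparse (l1) regularization over the angular grid, which selects only a few active directions in the underdetermined system.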