
Keyword Search Results

[Keyword] quadratic function (2 hits)

Results 1-2 of 2
  • Two Classes of Linear Codes with Two or Three Weights

    Guangkui XU, Xiwang CAO, Jian GAO, Gaojun LUO

    PAPER-Coding Theory

    Vol: E101-A No:12  Page(s): 2366-2373

    Many linear codes with two or three weights have recently been constructed because of their applications in consumer electronics, communication, data storage systems, secret sharing, authentication codes, association schemes, and strongly regular graphs. In this paper, two classes of p-ary linear codes with two or three weights are presented. The first class, with two or three weights, is obtained from a certain non-quadratic function. The second class, with two weights, is obtained from the images of a certain function on $\mathbb{F}_{p^m}$. In some cases, the resulting linear codes are optimal in the sense that they meet the Griesmer bound.
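
    For reference, the Griesmer bound invoked here states that every $[n, k, d]$ linear code over $\mathbb{F}_p$ satisfies $n \geq \sum_{i=0}^{k-1} \lceil d/p^{i} \rceil$; a code is optimal in this sense when it attains the bound with equality, i.e., no shorter code with the same dimension and minimum distance exists.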

  • Dictionary Learning with Incoherence and Sparsity Constraints for Sparse Representation of Nonnegative Signals

    Zunyi TANG, Shuxue DING

    PAPER-Biocybernetics, Neurocomputing

    Vol: E96-D No:5  Page(s): 1192-1203

    This paper presents a method for learning an overcomplete, nonnegative dictionary and for obtaining the corresponding coefficients so that a group of nonnegative signals can be sparsely represented by them. This is accomplished by posing the learning as a problem of nonnegative matrix factorization (NMF) that maximizes both the incoherence of the dictionary and the sparsity of the coefficients. By incorporating a dictionary-incoherence penalty and a sparsity penalty into the NMF formulation and then adopting a hierarchically alternating optimization strategy, we show that the problem can be cast as two sequential quadratic optimization problems, each of which can be solved in closed form. The whole problem can therefore be solved efficiently, which leads to the proposed algorithm, sparse hierarchical alternating least squares (SHALS). The SHALS algorithm iterates between the two subproblems, which correspond to learning the dictionary and estimating the coefficients for reconstructing the signals. Numerical experiments demonstrate that the new algorithm outperforms the nonnegative K-SVD (NN-KSVD) algorithm and several other well-known algorithms, and that its computational cost is markedly lower than theirs.
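
    The abstract describes the structure of SHALS but not its update rules. Below is a minimal, hypothetical Python sketch of a hierarchically alternating NMF of this kind, assuming an L1 sparsity penalty lam on the coefficients and unit-norm dictionary atoms; the function name sparse_hals_nmf is illustrative, and the paper's dictionary-incoherence penalty is omitted for brevity, so this is not the authors' exact formulation.

        import numpy as np

        def sparse_hals_nmf(X, r, lam=0.1, n_iter=200, seed=0):
            # Sketch: hierarchical alternating least squares NMF with an
            # L1 sparsity penalty on H, factoring X ~ W @ H with W, H >= 0.
            rng = np.random.default_rng(seed)
            m, n = X.shape
            W = rng.random((m, r))   # nonnegative dictionary (columns = atoms)
            H = rng.random((r, n))   # nonnegative sparse coefficients
            eps = 1e-12              # guards against division by zero
            for _ in range(n_iter):
                R = X - W @ H        # current reconstruction residual
                for j in range(r):
                    # Residual with atom j's own contribution added back.
                    Rj = R + np.outer(W[:, j], H[j])
                    # Coefficient row j: closed-form solution of a nonnegative
                    # quadratic problem; the L1 penalty shifts the numerator.
                    num = W[:, j] @ Rj - lam
                    H[j] = np.maximum(num / (W[:, j] @ W[:, j] + eps), 0.0)
                    # Dictionary atom j: nonnegative least-squares direction,
                    # renormalized to unit length.
                    wj = np.maximum(Rj @ H[j], 0.0)
                    W[:, j] = wj / (np.linalg.norm(wj) + eps)
                    R = Rj - np.outer(W[:, j], H[j])
            return W, H

    Each inner step solves one small quadratic subproblem explicitly, which is what makes HALS-style schemes cheap per iteration and mirrors the alternating structure described above.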