IEICE TRANSACTIONS on Electronics

Improvement of Data Utilization Efficiency for Cache Memory by Compressing Frequent Bit Sequences

Ryotaro KOBAYASHI, Ikumi KANEKO, Hajime SHIMADA

Summary:

In recent processor designs, memory access latency is shortened by adopting a memory hierarchy. In this configuration, the memory consists of a main memory, which comprises dynamic random-access memory (DRAM), and a cache memory, which consists of static random-access memory (SRAM). The cache memory, which is now used in increasingly large volumes, accounts for a large proportion of the energy consumption of the overall processor. There are two ways to reduce the energy consumption of the cache memory: decreasing the number of accesses, and minimizing the energy consumed per access. In this study, we reduce the size of the L1 cache by compressing frequent bit sequences, thus cutting the energy consumed per access. A "frequent bit sequence" is a specific bit pattern that often appears in the high-order bits of data retained in the cache memory. According to measurements using a software simulator, our proposed mechanism cuts energy consumption by 41.0% on average compared with conventional mechanisms.
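The general idea behind this kind of compression can be sketched as follows. This is an illustrative example only, not the authors' actual mechanism: the pattern table, field widths, and tag encoding below are assumptions chosen for clarity (e.g., treating all-zero and all-one high halves, which arise from small signed integers, as the frequent bit sequences).

```python
# Illustrative sketch: compress a 32-bit word when its high-order 16 bits
# match a "frequent bit sequence". The pattern set and widths here are
# assumptions for illustration, not the paper's design.

# Map each frequent high-order pattern to a short tag (1 bit here).
FREQUENT_HIGH_PATTERNS = {0x0000: 0b0, 0xFFFF: 0b1}
TAG_TO_PATTERN = {t: h for h, t in FREQUENT_HIGH_PATTERNS.items()}

def compress(word):
    """Return (compressed, payload). If the high 16 bits match a frequent
    pattern, store only a 1-bit tag plus the low 16 bits (17 bits instead
    of 32); otherwise store the word uncompressed."""
    high = (word >> 16) & 0xFFFF
    low = word & 0xFFFF
    if high in FREQUENT_HIGH_PATTERNS:
        tag = FREQUENT_HIGH_PATTERNS[high]
        return True, (tag << 16) | low
    return False, word

def decompress(compressed, payload):
    """Reconstruct the original 32-bit word from its stored form."""
    if not compressed:
        return payload
    high = TAG_TO_PATTERN[(payload >> 16) & 0b1]
    return (high << 16) | (payload & 0xFFFF)

# Small signed values (0x0000... or 0xFFFF... high halves) compress;
# other words are stored as-is.
for w in (0x0000002A, 0xFFFFFFF5, 0x12345678):
    c, p = compress(w)
    assert decompress(c, p) == w
```

Because compressible words occupy fewer bits, a cache line holding mostly small values can be stored in a physically narrower SRAM array, which is where the per-access energy saving comes from.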

Publication
IEICE TRANSACTIONS on Electronics Vol.E99-C No.8 pp.936-946
Publication Date
2016/08/01
Online ISSN
1745-1353
DOI
10.1587/transele.E99.C.936
Type of Manuscript
Special Section PAPER (Special Section on Low-Power and High-Speed Chips)

Authors

Ryotaro KOBAYASHI
  Toyohashi University of Technology
Ikumi KANEKO
  Toyohashi University of Technology
Hajime SHIMADA
  Nagoya University

Keyword