Improving Residue-Level Sparsity in RNS-based Neural Network Hardware Accelerators via Regularization

  • E. Kavvousanos
  • V. Sakellariou
  • I. Kouretas
  • V. Paliouras
  • T. Stouraitis

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    The Residue Number System (RNS) has recently attracted interest for the hardware implementation of inference in machine-learning systems, as it provides promising trade-offs in the area, time, and power dissipation space. In this paper, we introduce a technique that uses regularization during training to increase the percentage of residues that are zero when the parameters of an artificial neural network (ANN) are expressed in an RNS. The proposed technique can also be used as a post-processing stage, allowing the optimization of pre-trained models for RNS implementation. By increasing the number of residues that are zero, i.e., residue-level sparsity, the proposed technique facilitates new hardware architectures for RNS-based inference, enabling new trade-offs and improving performance over prior art without practically compromising accuracy. The introduced method increases residue sparsity by a factor of 4× to 6× in certain cases.
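    The notion of residue-level sparsity can be illustrated with a small sketch. This is not the authors' implementation; the moduli set {3, 5, 7} and the example weights below are arbitrary assumptions chosen for demonstration only:

    ```python
    # Illustrative sketch (assumed moduli and weights, not from the paper):
    # measure residue-level sparsity of integer ANN weights under an RNS base.

    MODULI = (3, 5, 7)  # assumed pairwise-coprime RNS base

    def residues(w, moduli=MODULI):
        """RNS representation of integer w: one residue per modulus."""
        return tuple(w % m for m in moduli)

    def residue_sparsity(weights, moduli=MODULI):
        """Fraction of all residues that are zero across the weight set."""
        total = len(weights) * len(moduli)
        zeros = sum(r == 0 for w in weights for r in residues(w, moduli))
        return zeros / total

    # Weights that are multiples of a modulus contribute zero residues,
    # which is what the paper's regularization encourages during training.
    weights = [0, 3, 5, 7, 15, 21, 35, 104]
    print(residues(15))                 # -> (0, 0, 1)
    print(residue_sparsity(weights))    # -> 0.5
    ```

    In this framing, a training-time regularizer would penalize weights whose residues are non-zero, nudging parameters toward multiples of the moduli so that the zero residues can be skipped by the hardware datapath.
    
    
    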

    Original language: British English
    Title of host publication: Proceedings - 2023 IEEE 30th Symposium on Computer Arithmetic, ARITH 2023
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 102-109
    Number of pages: 8
    ISBN (Electronic): 9798350319224
    State: Published - 2023
    Event: 30th IEEE Symposium on Computer Arithmetic, ARITH 2023 - Portland, United States
    Duration: 4 Sep 2023 - 6 Sep 2023

    Publication series

    Name: Proceedings - Symposium on Computer Arithmetic

    Conference

    Conference: 30th IEEE Symposium on Computer Arithmetic, ARITH 2023
    Country/Territory: United States
    City: Portland
    Period: 4/09/23 - 6/09/23

    Keywords

    • hardware acceleration
    • neural networks
    • residue number system
    • sparsity
