Convergent time-stepping schemes for analog ReLU networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

With the phenomenal growth of deep learning paradigms based on the rectified linear unit (ReLU) as the activation function, and the importance attached to the hardware acceleration of such learning approaches, there is a pressing need for numerical simulation algorithms tailored to the specific context of analog ReLU networks. In this paper, we propose two time-stepping schemes for the transient analysis of analog ReLU networks and provide rigorous proofs of their convergence under mild conditions on the ReLU network connectivity matrix. Simulation examples illustrate the numerical stability of these schemes and contrast their convergence rates.
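The paper's two schemes and their exact convergence conditions are not reproduced in this abstract. As a rough orientation only, the sketch below shows what time-stepping an analog ReLU network can look like, assuming (hypothetically) dynamics of the form dv/dt = -v + W·ReLU(v) + u and using a generic forward-Euler step and a generic semi-implicit step; these are illustrative stand-ins, not the schemes proposed in the paper.

```python
# Illustrative sketch only: the model form, step rules, and parameters here
# are assumptions for demonstration, not the paper's actual schemes.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def explicit_step(v, W, u, h):
    """Forward-Euler step: v_{k+1} = v_k + h * (-v_k + W @ relu(v_k) + u)."""
    return v + h * (-v + W @ relu(v) + u)

def semi_implicit_step(v, W, u, h):
    """Semi-implicit step: the linear leak is treated implicitly, the ReLU
    coupling explicitly:  (1 + h) * v_{k+1} = v_k + h * (W @ relu(v_k) + u)."""
    return (v + h * (W @ relu(v) + u)) / (1.0 + h)

# Tiny transient simulation with a random (weakly coupled) connectivity matrix.
rng = np.random.default_rng(0)
n, h, steps = 8, 0.01, 2000
W = 0.1 * rng.standard_normal((n, n))
u = rng.standard_normal(n)
v = np.zeros(n)
for _ in range(steps):
    v = semi_implicit_step(v, W, u, h)
print("steady-state estimate:", v)
```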

Original language: British English
Title of host publication: 2021 IEEE International Symposium on Circuits and Systems, ISCAS 2021 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728192017
DOIs
State: Published - 2021
Event: 53rd IEEE International Symposium on Circuits and Systems, ISCAS 2021 - Daegu, Korea, Republic of
Duration: 22 May 2021 – 28 May 2021

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
Volume: 2021-May
ISSN (Print): 0271-4310

Conference

Conference: 53rd IEEE International Symposium on Circuits and Systems, ISCAS 2021
Country/Territory: Korea, Republic of
City: Daegu
Period: 22/05/21 – 28/05/21

Keywords

  • Analog networks
  • Circuit simulation
  • Neural networks
  • Numerical stability
  • ReLU function
  • Transient analysis
