On the Stability of Analog ReLU Networks

Research output: Contribution to journal › Article › peer-review

Abstract

Rectified linear unit (ReLU) networks are widely used in machine learning and automated neural-network inference, and various hardware accelerators based on ReLU networks are under development. This brief addresses the stability problem in analog ReLU networks. Using Lyapunov stability theory, it is shown that the origin of an unforced analog ReLU dynamical system is globally asymptotically stable if the induced Euclidean norm of its connectivity matrix is less than one. An example demonstrates that this upper bound is the best that can be achieved. In particular, the stability result holds for a nonsymmetric connectivity matrix, as may occur in some mathematical models of neurobiology.
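
A minimal sketch, not taken from the brief itself: it assumes a standard continuous-time model dx/dt = -x + W ReLU(x) for the unforced analog ReLU system (the abstract does not give the exact system equations, so this model form is an assumption) and illustrates the stated criterion by drawing a nonsymmetric connectivity matrix W, scaling it so that its induced Euclidean (spectral) norm is below one, and integrating the dynamics to observe decay of the state toward the origin.

# Minimal sketch (not the brief's own code): checks the spectral-norm
# criterion ||W||_2 < 1 and simulates an assumed continuous-time model
#   dx/dt = -x + W * ReLU(x)
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# Nonsymmetric connectivity matrix, scaled so its induced Euclidean
# (spectral) norm is below one -- the sufficient condition stated in the abstract.
W = rng.standard_normal((6, 6))
W *= 0.9 / np.linalg.norm(W, 2)          # now ||W||_2 = 0.9 < 1
print("spectral norm of W:", np.linalg.norm(W, 2))

def relu(x):
    return np.maximum(x, 0.0)

def dynamics(t, x):
    # Assumed unforced analog ReLU dynamics; the paper's exact model may differ.
    return -x + W @ relu(x)

x0 = 5.0 * rng.standard_normal(6)        # arbitrary initial state
sol = solve_ivp(dynamics, (0.0, 30.0), x0, dense_output=True)

print("initial ||x||:", np.linalg.norm(x0))
print("final   ||x||:", np.linalg.norm(sol.y[:, -1]))  # should be near zero

If the scaling line is changed so that the spectral norm reaches or exceeds one, convergence is no longer guaranteed, which is consistent with the abstract's statement that the unit bound is the best that can be achieved.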

Original language: British English
Pages (from-to): 2426-2430
Number of pages: 5
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Volume: 40
Issue number: 11
DOIs
State: Published - 1 Nov 2021

Keywords

  • Analog networks
  • Lyapunov theory
  • rectified linear unit (ReLU) activation
  • stability analysis
