On the Stability of Analog ReLU Networks

Research output: Contribution to journal › Article › peer-review

9 Scopus citations


Rectified linear unit (ReLU) networks have become widely used in machine learning and automated inference using neural networks. Various forms of hardware accelerators based on ReLU networks have also been under development. In this brief, the stability problem in analog ReLU networks is addressed. Using the Lyapunov stability theory, it is shown that the origin of an unforced, analog ReLU dynamical system is globally asymptotically stable if the induced Euclidean norm of its connectivity matrix is less than one. An example is given to demonstrate that this upper bound is the best that can be achieved. In particular, the stability result holds for the case of a nonsymmetric connectivity matrix as may occur in some mathematical models of neurobiology.
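The abstract's stability criterion — the origin is globally asymptotically stable when the induced Euclidean norm (largest singular value) of the connectivity matrix is below one — can be illustrated numerically. The sketch below assumes the common continuous-time model dx/dt = -x + W·relu(x); the paper's exact dynamics may differ, and the matrix `W` here is a randomly generated, nonsymmetric example scaled to satisfy the norm condition.

```python
import numpy as np

# Hypothetical nonsymmetric connectivity matrix (the result does not
# require symmetry), scaled so its induced 2-norm is 0.9 < 1.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W *= 0.9 / np.linalg.norm(W, 2)

def relu(x):
    return np.maximum(x, 0.0)

def simulate(W, x0, dt=0.01, steps=10000):
    # Forward-Euler integration of the assumed unforced dynamics
    # dx/dt = -x + W @ relu(x).
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (-x + W @ relu(x))
    return x

x0 = np.array([1.0, -2.0, 0.5, 3.0])
x_final = simulate(W, x0)
print(np.linalg.norm(W, 2))      # induced Euclidean norm of W, below 1
print(np.linalg.norm(x_final))   # state has decayed toward the origin
```

Since ||relu(x)|| ≤ ||x||, the condition ||W||₂ < 1 makes ||x(t)||² a Lyapunov-style decreasing quantity under these assumed dynamics, so the trajectory contracts to the origin regardless of the initial state.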

Original language: British English
Pages (from-to): 2426-2430
Number of pages: 5
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Issue number: 11
State: Published - 1 Nov 2021


  • Analog networks
  • Lyapunov theory
  • rectified linear unit (ReLU) activation
  • stability analysis


