Optimal Sparsity Tradeoff in l0-NLMS Algorithm

Abdullah Al-Shabili, Luis Weruaga, Shihab Jimaa

Research output: Contribution to journal › Article › peer-review



The l0-normalized least mean squares (l0-NLMS) is arguably the reference gradient adaptive algorithm for sparse system estimation. However, like all sparse gradient adaptive algorithms, the performance of l0-NLMS is sensitive to the adequate selection of the tradeoff parameter. This letter highlights the existence of two convergence modes, linked respectively to the negligible and to the significant taps, which paves the way for a convergence analysis that results in a set of nonlinear (quadratic) convergence equations. From these, minimizing the steady-state misalignment yields the optimal tradeoff, which turns out to relate to the NLMS step size, the filter length, the plant sparsity, and the noise level in an extremely compact fashion. Exhaustive simulation experiments show strong agreement between the analytical predictions and the empirical performance.
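The update described by the abstract can be illustrated with the widely used zero-attraction form of l0-(N)LMS, in which the l0 norm is approximated by a sum of terms 1 - exp(-beta|w_i|). This is a minimal sketch under that common approximation, not the letter's exact formulation; the parameter values `mu`, `kappa`, and `beta` are illustrative, not the optimal tradeoff derived in the paper.

```python
import numpy as np

def l0_nlms_update(w, x, d, mu=0.5, kappa=1e-4, beta=10.0, eps=1e-8):
    """One l0-NLMS iteration: a normalized LMS step plus a zero-attraction
    term (illustrative parameter values, not the paper's optimal tradeoff)."""
    e = d - x @ w                           # a priori estimation error
    w = w + mu * e * x / (x @ x + eps)      # NLMS gradient step
    # Zero-attraction term from the l0-norm surrogate 1 - exp(-beta|w|),
    # linearized so it acts only on small (negligible) taps |w_i| <= 1/beta.
    g = beta * np.sign(w) * np.maximum(1.0 - beta * np.abs(w), 0.0)
    return w - kappa * g, e

# Toy sparse system identification: a 64-tap plant with 3 active taps.
rng = np.random.default_rng(0)
N = 64
w_true = np.zeros(N)
w_true[[3, 17, 40]] = [1.0, -0.5, 0.8]      # sparse plant
w = np.zeros(N)
for _ in range(5000):
    x = rng.standard_normal(N)              # white input
    d = x @ w_true + 1e-3 * rng.standard_normal()  # noisy plant output
    w, e = l0_nlms_update(w, x, d)

misalignment = np.linalg.norm(w - w_true) / np.linalg.norm(w_true)
```

The zero-attraction term is what creates the two convergence modes mentioned in the abstract: taps with magnitude below 1/beta are actively shrunk toward zero, while the significant taps evolve essentially under the plain NLMS recursion.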

Original language: British English
Article number: 7503121
Pages (from-to): 1121-1125
Number of pages: 5
Journal: IEEE Signal Processing Letters
Issue number: 8
State: Published - Aug 2016


  • l0-norm
  • NLMS algorithm
  • sparsity tradeoff

