Abstract
The l0-normalized least mean squares (l0-NLMS) algorithm is arguably the reference gradient adaptive algorithm for sparse system estimation. However, like all sparse gradient adaptive algorithms, the performance of l0-NLMS is sensitive to an adequate selection of the sparsity tradeoff parameter. As highlighted in this letter, the existence of two convergence modes, linked respectively to the negligible and to the significant taps, paves the way for the convergence analysis, which results in a set of nonlinear (quadratic) convergence equations. From these, minimizing the steady-state misalignment yields the optimal tradeoff, which relates to the NLMS step size, filter length, plant sparsity, and noise level in a remarkably compact fashion. Exhaustive simulation experiments show strong agreement between the analytical predictions and the empirical performance.
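The algorithm the abstract refers to combines a standard NLMS gradient step with a zero-attraction term derived from an exponential approximation of the l0 norm. The sketch below illustrates this structure; the parameter names (`mu` for the NLMS step size, `kappa` for the sparsity tradeoff, `beta` for the l0-approximation shape) and their values are illustrative assumptions, not the letter's derived optimal settings.

```python
import numpy as np

def l0_nlms(x, d, L, mu=0.5, kappa=1e-4, beta=5.0, eps=1e-6):
    """Sketch of an l0-NLMS adaptive filter (illustrative parameters).

    x: input signal, d: desired signal, L: filter length,
    mu: NLMS step size, kappa: sparsity tradeoff parameter,
    beta: shape of the exponential approximation to the l0 norm.
    """
    w = np.zeros(L)
    for n in range(L - 1, len(x)):
        u = x[n - L + 1:n + 1][::-1]       # regressor, most recent sample first
        e = d[n] - u @ w                   # a priori estimation error
        w += mu * e * u / (u @ u + eps)    # normalized gradient (NLMS) step
        # zero attraction: pulls negligible taps toward zero, leaves
        # significant taps (|w| >> 1/beta) almost untouched
        w -= kappa * beta * np.sign(w) * np.exp(-beta * np.abs(w))
    return w

# Identify a sparse plant with two significant taps.
rng = np.random.default_rng(0)
L = 16
h = np.zeros(L)
h[[2, 9]] = [1.0, -0.5]
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
w = l0_nlms(x, d, L)
```

The two convergence modes mentioned in the abstract are visible here: the NLMS term dominates on the significant taps, while the zero-attraction term dominates on the negligible ones.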
| Original language | British English |
|---|---|
| Article number | 7503121 |
| Pages (from-to) | 1121-1125 |
| Number of pages | 5 |
| Journal | IEEE Signal Processing Letters |
| Volume | 23 |
| Issue number | 8 |
| DOIs | |
| State | Published - Aug 2016 |
Keywords
- l0-norm
- NLMS algorithm
- sparsity tradeoff