Robust subspace learning: Robust PCA, robust subspace tracking, and robust subspace recovery

Namrata Vaswani, Thierry Bouwmans, Sajid Javed, Praneeth Narayanamurthy

Research output: Contribution to journal › Article › peer-review

Abstract

Principal component analysis (PCA) is one of the most widely used dimension reduction techniques. A related, easier problem is termed subspace learning or subspace estimation. Given relatively clean data, both are easily solved via singular value decomposition (SVD). The problem of subspace learning or PCA in the presence of outliers is called robust subspace learning (RSL) or robust PCA (RPCA). For long data sequences, if one tries to use a single lower-dimensional subspace to represent the data, the required subspace dimension may end up being quite large. For such data, a better model is to assume that the data lie in a low-dimensional subspace that can change over time, albeit gradually. The problem of tracking such data (and the subspaces) while being robust to outliers is called robust subspace tracking (RST). This article provides a magazine-style overview of the entire field of RSL and tracking.

Original language: British English
Pages (from-to): 32-55
Number of pages: 24
Journal: IEEE Signal Processing Magazine
Volume: 35
Issue number: 4
State: Published - Jul 2018
