TY - JOUR
T1 - A Unified Framework for Sparse Relaxed Regularized Regression
T2 - SR3
AU - Zheng, Peng
AU - Askham, Travis
AU - Brunton, Steven L.
AU - Kutz, J. Nathan
AU - Aravkin, Aleksandr Y.
N1 - Funding Information:
The work of P. Zheng and A. Y. Aravkin was supported by the Washington Research Foundation Data Science Professorship. The work of T. Askham and J. N. Kutz was supported by the Air Force Office of Scientific Research under Grant FA9550-17-1-0329. The work of S. L. Brunton was supported by the Army Research Office through the Young Investigator Program under Grant W911NF-17-1-0422.
Publisher Copyright:
© 2013 IEEE.
PY - 2019
Y1 - 2019
N2 - Regularized regression problems are ubiquitous in statistical modeling, signal processing, and machine learning. Sparse regression, in particular, has been instrumental in scientific model discovery, including compressed sensing applications, variable selection, and high-dimensional analysis. We propose a broad framework for sparse relaxed regularized regression, called SR3. The key idea is to solve a relaxation of the regularized problem, which has three advantages over the state-of-the-art: 1) solutions of the relaxed problem are superior with respect to errors, false positives, and conditioning; 2) relaxation allows extremely fast algorithms for both convex and nonconvex formulations; and 3) the methods apply to composite regularizers, essential for total variation (TV) as well as sparsity-promoting formulations using tight frames. We demonstrate the advantages of SR3 (computational efficiency, higher accuracy, faster convergence rates, and greater flexibility) across a range of regularized regression problems with synthetic and real data, including applications in compressed sensing, LASSO, matrix completion, TV regularization, and group sparsity. Following standards of reproducible research, we also provide a companion MATLAB package that implements these examples.
AB - Regularized regression problems are ubiquitous in statistical modeling, signal processing, and machine learning. Sparse regression, in particular, has been instrumental in scientific model discovery, including compressed sensing applications, variable selection, and high-dimensional analysis. We propose a broad framework for sparse relaxed regularized regression, called SR3. The key idea is to solve a relaxation of the regularized problem, which has three advantages over the state-of-the-art: 1) solutions of the relaxed problem are superior with respect to errors, false positives, and conditioning; 2) relaxation allows extremely fast algorithms for both convex and nonconvex formulations; and 3) the methods apply to composite regularizers, essential for total variation (TV) as well as sparsity-promoting formulations using tight frames. We demonstrate the advantages of SR3 (computational efficiency, higher accuracy, faster convergence rates, and greater flexibility) across a range of regularized regression problems with synthetic and real data, including applications in compressed sensing, LASSO, matrix completion, TV regularization, and group sparsity. Following standards of reproducible research, we also provide a companion MATLAB package that implements these examples.
KW - LASSO
KW - nonconvex optimization
KW - compressed sensing
KW - matrix completion
KW - sparse regression
KW - total variation regularization
UR - http://www.scopus.com/inward/record.url?scp=85058637987&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85058637987&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2018.2886528
DO - 10.1109/ACCESS.2018.2886528
M3 - Article
AN - SCOPUS:85058637987
SN - 2169-3536
VL - 7
SP - 1404
EP - 1423
JO - IEEE Access
JF - IEEE Access
M1 - 8573778
ER -