TY - JOUR
T1 - Robust Spline Adaptive Filter with Whiplash Accelerated Gradient Descent
AU - Yang, Yihui
AU - Guan, Sihai
AU - Zhang, Chuanwu
AU - Biswal, Bharat
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2025.
PY - 2025
Y1 - 2025
N2 - In traditional research, stochastic gradient descent (SGD), the mainstream learning strategy in adaptive linear/nonlinear optimal estimation, is prone to severe oscillations near local minima and saddle points when analyzing linear/nonlinear systems, resulting in a significant decline in convergence efficiency. Although combining SGD with the spline adaptive filter (SAF) has brought some improvements, it remains difficult to escape local minima and saddle points, which significantly restricts the algorithm’s performance. To address this issue, this paper proposes the SAF-WSGD algorithm, which is based on the Whiplash SGD method (WSGD). The algorithm leverages the unique acceleration mechanism of WSGD to effectively bypass saddle-point obstacles that traditional methods struggle to overcome, thereby significantly accelerating convergence. Through theoretical analysis, numerical modeling experiments, and verification on public datasets, the algorithm demonstrates excellent convergence performance in both time-varying and time-invariant systems, and under both non-Gaussian and Gaussian noise. It provides a reliable solution for estimating nonlinear systems.
AB - In traditional research, stochastic gradient descent (SGD), the mainstream learning strategy in adaptive linear/nonlinear optimal estimation, is prone to severe oscillations near local minima and saddle points when analyzing linear/nonlinear systems, resulting in a significant decline in convergence efficiency. Although combining SGD with the spline adaptive filter (SAF) has brought some improvements, it remains difficult to escape local minima and saddle points, which significantly restricts the algorithm’s performance. To address this issue, this paper proposes the SAF-WSGD algorithm, which is based on the Whiplash SGD method (WSGD). The algorithm leverages the unique acceleration mechanism of WSGD to effectively bypass saddle-point obstacles that traditional methods struggle to overcome, thereby significantly accelerating convergence. Through theoretical analysis, numerical modeling experiments, and verification on public datasets, the algorithm demonstrates excellent convergence performance in both time-varying and time-invariant systems, and under both non-Gaussian and Gaussian noise. It provides a reliable solution for estimating nonlinear systems.
KW - Local saddle point
KW - Nonlinear systems
KW - SAF
KW - WSGD
UR - https://www.scopus.com/pages/publications/105017025646
U2 - 10.1007/s00034-025-03321-4
DO - 10.1007/s00034-025-03321-4
M3 - Article
AN - SCOPUS:105017025646
SN - 0278-081X
JO - Circuits, Systems, and Signal Processing
JF - Circuits, Systems, and Signal Processing
ER -