Robust Spline Adaptive Filter with Whiplash Accelerated Gradient Descent

Research output: Contribution to journal › Article › peer-review

Abstract

In traditional adaptive linear/nonlinear optimal estimation, stochastic gradient descent (SGD), the mainstream learning strategy, is prone to severe oscillation near local minima and saddle points, which sharply degrades convergence efficiency. Combining SGD with the spline adaptive filter (SAF) brings some improvement, but the resulting algorithms still struggle to escape saddle points, which significantly limits their performance. To address this issue, this paper proposes SAF-WSGD, an algorithm based on the Whiplash SGD method (WSGD). SAF-WSGD leverages the acceleration mechanism of WSGD to bypass saddle points that are difficult for traditional methods to escape, thereby markedly accelerating convergence. Theoretical analysis, numerical modeling experiments, and verification on public datasets show that the algorithm converges well in both time-varying and time-invariant systems, under both Gaussian and non-Gaussian noise, providing a reliable solution for nonlinear system estimation.
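The abstract does not spell out the WSGD update rule, so the sketch below is illustrative only. It pairs a Wiener-type spline adaptive filter (a linear filter followed by a Catmull-Rom spline nonlinearity, a common SAF structure) with a generic momentum update plus a heuristic restart that zeroes momentum when it opposes the gradient, standing in for the paper's whiplash acceleration. The knot spacing, filter length, step sizes, restart rule, and toy identification target are all assumptions, not the authors' method.

```python
# Minimal sketch of a spline adaptive filter (SAF) trained with a
# momentum-style accelerated update. NOT the paper's SAF-WSGD: the
# restart heuristic below merely stands in for whiplash acceleration.
import numpy as np

rng = np.random.default_rng(0)

# Catmull-Rom basis matrix, a common choice for the SAF spline section.
C = 0.5 * np.array([[-1.0,  3.0, -3.0,  1.0],
                    [ 2.0, -5.0,  4.0, -1.0],
                    [-1.0,  0.0,  1.0,  0.0],
                    [ 0.0,  2.0,  0.0,  0.0]])

DX = 0.2                                  # knot spacing (assumed)
KNOTS = np.arange(-2.0, 2.0 + DX, DX)     # uniform abscissae
M = 8                                     # linear-filter length (assumed)

w = np.zeros(M); w[0] = 1.0               # linear weights
q = KNOTS.copy()                          # control points init to identity
vw = np.zeros_like(w); vq = np.zeros_like(q)   # momentum buffers
MU_W, MU_Q, BETA = 0.01, 0.01, 0.9        # step sizes, momentum (assumed)

def spline_forward(s):
    """Locate the spline span for s; return (index, position/deriv bases)."""
    i = int(np.clip(np.floor((s - KNOTS[0]) / DX), 1, len(KNOTS) - 3))
    u = (s - KNOTS[0]) / DX - i
    uv = np.array([u**3, u**2, u, 1.0])       # position basis
    duv = np.array([3 * u**2, 2 * u, 1.0, 0.0])  # derivative basis
    return i, uv, duv

# Toy identification loop: unknown system = linear filter + tanh (assumed).
w_true = rng.standard_normal(M) * 0.3
x_buf = np.zeros(M)
for n in range(5000):
    x_buf = np.roll(x_buf, 1); x_buf[0] = rng.standard_normal()
    d = np.tanh(w_true @ x_buf) + 0.01 * rng.standard_normal()

    s = w @ x_buf                          # linear section output
    i, uv, duv = spline_forward(s)
    qi = q[i - 1:i + 3]                    # 4 local control points
    y = uv @ C @ qi                        # spline section output
    e = d - y

    g_w = -e * (duv @ C @ qi) / DX * x_buf  # gradient w.r.t. weights
    g_q = -e * (C.T @ uv)                   # gradient w.r.t. local ctrl pts

    # Heuristic "whiplash" restart: drop momentum that points uphill.
    if vw @ g_w > 0: vw[:] = 0.0
    if vq[i - 1:i + 3] @ g_q > 0: vq[i - 1:i + 3] = 0.0

    vw = BETA * vw - MU_W * g_w
    vq[i - 1:i + 3] = BETA * vq[i - 1:i + 3] - MU_Q * g_q
    w += vw
    q[i - 1:i + 3] += vq[i - 1:i + 3]

print(f"final squared error: {e**2:.2e}")
```

The momentum buffers carry the update across iterations so the filter can coast through flat regions; the restart test is one simple way to avoid the overshoot oscillations the abstract attributes to plain SGD, though the paper's actual WSGD rule may differ.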

Original language: English (US)
Journal: Circuits, Systems, and Signal Processing
DOIs
State: Accepted/In press - 2025

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Applied Mathematics

Keywords

  • Local saddle point
  • Nonlinear systems
  • SAF
  • WSGD
