Abstract
Variable time delay estimation in radar and sonar systems is considered. An iterative algorithm for variable time delay estimation (TDE), based on the minimum mean squared error criterion, is presented. The source, delay, and noise signals are assumed to be sample functions of random processes with known covariance functions. An iterative prediction-correction scheme is devised by iterating between successive delay estimates. In addition, extrapolation between iterations is accomplished using variable gains obtained by minimizing the trace of the error covariance matrix of the delay vector. Simulations were run to test the performance of the algorithm, and their results are presented.
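The abstract does not give the algorithm's equations, but the general shape of a prediction-correction delay tracker with a variance-minimizing variable gain can be sketched. The following is a minimal illustration, not the authors' method: the delay is measured per block from the cross-correlation peak, and a scalar Kalman-style recursion (all noise variances `q`, `r`, the initial state, and the drift model are assumptions made here for the example) supplies the variable gain that minimizes the scalar error variance at each step.

```python
import numpy as np

rng = np.random.default_rng(0)

def block_delay_measurement(x, y):
    """Coarse delay of y relative to x: lag of the cross-correlation peak."""
    c = np.correlate(y, x, mode="full")
    return np.argmax(c) - (len(x) - 1)

# Simulated source and a noisy received copy whose delay drifts slowly
# (an assumed signal model, chosen only to exercise the tracker).
n_blocks, block, offset = 40, 256, 50
true_delay = np.round(10 + 0.1 * np.arange(n_blocks)).astype(int)  # in samples
src = rng.standard_normal(offset + n_blocks * block)

d_hat, p = 0.0, 25.0   # initial delay estimate and its error variance (assumed)
q, r = 0.05, 4.0       # process / measurement noise variances (assumed)
est = []
for k in range(n_blocks):
    i = offset + k * block
    x = src[i : i + block]
    y = src[i - true_delay[k] : i - true_delay[k] + block] \
        + 0.3 * rng.standard_normal(block)
    z = block_delay_measurement(x, y)          # noisy delay measurement
    # Prediction: delay modeled as locally constant, so only the
    # error variance grows by the process noise.
    p = p + q
    # Correction: the variable gain minimizes the posterior error variance.
    gain = p / (p + r)
    d_hat = d_hat + gain * (z - d_hat)
    p = (1.0 - gain) * p
    est.append(d_hat)
```

After the initial transient, `est` tracks the drifting `true_delay` to within roughly a sample; the gain shrinks from its large initial value toward a steady state set by the ratio of `q` to `r`, which is the scalar analogue of minimizing the trace of the error covariance matrix mentioned in the abstract.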
Original language | English (US) |
---|---|
Pages (from-to) | 273-278 |
Number of pages | 6 |
Journal | Oceans Conference Record (IEEE) |
State | Published - Sep 1990 |
Externally published | Yes |
Event | Conference Proceedings - OCEANS '90 - Washington, DC, USA |
Duration | Sep 24 1990 → Sep 26 1990 |
All Science Journal Classification (ASJC) codes
- Ocean Engineering
- Oceanography