AR Algorithms

23.08.2021

The autoregressive estimation algorithms are used in the AR (AutoRegressive) spectral procedure.

The AR coefficients can be computed in a variety of ways. The coefficients can be computed from autocorrelation estimates, from partial autocorrelation (reflection) coefficients, and from least-squares matrix procedures. Further, an AR model using the autocorrelation method will depend on the truncation threshold (maximum lag) used to compute the correlations. The partial autocorrelation method will depend on the specific definition for the reflection coefficient. The least-squares methods will also yield results that are a function of how data are treated at the bounds (matrix size) as well as whether the data matrix or normal equations are fitted.

Autocorrelation Method

The single autocorrelation-based algorithm, Autocorrelation, offers the lowest resolution. Endpoint effects arise from the lag-truncated autocorrelation sequence estimate (the infinite ACS is approximated by a finite subset). This is the AR algorithm found in many time-series and statistics packages; it is sometimes referred to as the Yule-Walker method. The biased autocorrelation estimate used here always results in a stable AR filter (all roots lie within the unit circle). Since a recursion (Levinson-Durbin) is used, the coefficients for all lower orders are also computed.

Versions of the Autocorrelation AR procedure can be found in the STARPAC public domain time series package, and also in the Marple (p. 239) and Kay (p. 260) references. FlexPro's Autocorrelation procedure follows the STARPAC implementation.
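
For illustration, the sketch below (in Python with NumPy; not FlexPro's STARPAC-derived implementation, and the function name is hypothetical) computes the AR coefficients from the biased autocorrelation estimate using the Levinson-Durbin recursion, yielding the lower-order models along the way:

```python
import numpy as np

def yule_walker_ar(x, order):
    """Autocorrelation (Yule-Walker) AR fit via Levinson-Durbin."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Biased autocorrelation estimate: divide by n, not n - k.
    # The bias is what guarantees a stable all-pole model.
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0] = 1.0                     # convention: A(z) = 1 + a1*z^-1 + ...
    err = r[0]                     # order-0 prediction-error power
    for m in range(1, order + 1):
        # Reflection coefficient (partial autocorrelation) at order m.
        k = -(r[m] + np.dot(a[1:m], r[m - 1:0:-1])) / err
        # Levinson-Durbin order update; lower-order fits appear en route.
        a[1:m] = a[1:m] + k * a[m - 1:0:-1]
        a[m] = k
        err *= (1.0 - k * k)
    return a, err
```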

Maximum Entropy Method

The Burg algorithm is probably the most widely known AR procedure. Because it was derived in the context of maximum entropy methods, the algorithm is sometimes designated "MEM". The procedure computes the AR coefficients directly from the data by estimating the reflection coefficients (partial autocorrelations) at successive orders. Since each computed coefficient is the harmonic mean of the forward and backward partial autocorrelation estimates, the Burg procedure is also known as the "Harmonic" algorithm. The algorithm exhibits some bias in estimating the center frequencies of sine components, and higher-order fits are notorious for "splitting", a phenomenon in which multiple spectral peaks are generated where only a single feature is present. The Burg method also produces stable AR coefficients, with roots inside or on the unit circle. It is recursive as well, computing the coefficients for all lower orders en route to the solution.

Versions of the Burg procedure can be found in the CMLIB public domain time series package, and also in the Marple (p. 240) and Kay (p. 265) references. A version of the Burg procedure can be found in the Press reference. FlexPro's Burg procedure follows the CMLIB algorithm.
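
A minimal sketch of the Burg recursion (illustrative only; not the CMLIB code, and the name burg_ar is made up) shows the harmonic-mean reflection coefficient and the order-recursive coefficient update described above:

```python
import numpy as np

def burg_ar(x, order):
    """Burg (maximum entropy) AR fit."""
    x = np.asarray(x, dtype=float)
    f = x[1:]                      # forward prediction errors
    b = x[:-1]                     # backward prediction errors
    a = np.array([1.0])
    err = np.dot(x, x) / len(x)    # order-0 prediction-error power
    for _ in range(order):
        # Reflection coefficient: harmonic mean of the forward and
        # backward partial autocorrelation estimates.
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        # Levinson-style order update of the coefficient vector.
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        err *= (1.0 - k * k)
        # Update the error sequences; the usable window shrinks by one.
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, err
```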

Least-Squares Algorithms

The remaining methods involve simultaneous least-squares estimation of all AR coefficients in the model. The normal equations models implement the normal equations used in typical least-squares estimation. Square matrices are formed from sums of products of forward or backward data elements.

Normal Equations FB is the forward-backward prediction variant of the least-squares normal equations approach. The sums incur a loss of precision that can result in reduced resolution (compared with direct data methods) and reduced stability of the coefficients. When some noise is present, however, the normal equations generally yield viable results. These algorithms compute the coefficients only for the order specified.
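
A sketch of the forward-backward normal equations approach might look as follows (Python/NumPy; an illustration under the usual linear-prediction conventions, not FlexPro's original algorithm):

```python
import numpy as np

def normal_equations_fb(x, p):
    """Forward-backward AR fit via the least-squares normal equations."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Forward equations: predict x[t] from x[t-1] ... x[t-p].
    F = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    yf = x[p:]
    # Backward equations: predict x[t] from x[t+1] ... x[t+p].
    B = np.column_stack([x[k:n - p + k] for k in range(1, p + 1)])
    yb = x[:n - p]
    # Normal equations: (F'F + B'B) a = -(F'yf + B'yb). Forming the
    # products matrix squares the condition number of the problem --
    # the precision loss noted in the text.
    R = F.T @ F + B.T @ B
    r = F.T @ yf + B.T @ yb
    a = np.linalg.solve(R, -r)
    return np.concatenate([[1.0], a])
```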

The Data Matrix FB models implement the "modified covariance" method of linear prediction. The covariance in this context has nothing to do with fitting a covariance matrix; rather, the full data (trajectory) matrix, usually rectangular, is fitted directly to the vector of data elements. Data Matrix FB is a fast algorithm that exploits the symmetry of the data matrix. It produces an optimum least-squares AR fit without the precision losses characteristic of the normal equations approach. Among the basic AR methods, the Data Matrix FB algorithm is likely to be the most accurate frequency estimator for undamped sinusoids.

Versions of the "modified covariance" method can be found in the Marple (p.248) and Kay (p. 262) books. FlexPro's Data Matrix FB procedure follows the Marple algorithm. The normal equations procedures are original algorithms authored for FlexPro.

SVD Based Least-Squares Algorithms

Within FlexPro's AR procedures, there is no count or threshold selection of peaks. The power spectrum of an AR model is an all-pole rational function; peaks occur at the frequencies corresponding to the roots of the model polynomial.
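
The spectrum evaluation itself is straightforward. A minimal sketch (assuming a coefficient vector a with a[0] = 1 and a prediction-error power err, as returned by the sketches above):

```python
import numpy as np

def ar_psd(a, err, nfft=4096):
    """All-pole AR spectrum P(f) = err / |A(e^{j2*pi*f})|^2, where
    A(z) = 1 + a[1]*z^-1 + ... + a[p]*z^-p. Peaks appear where the
    roots of A(z) approach the unit circle."""
    A = np.fft.rfft(a, nfft)          # evaluate A(z) around the unit circle
    f = np.arange(len(A)) / nfft      # frequency in cycles per sample
    return f, err / np.abs(A) ** 2
```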

FlexPro's AR procedures have been designed to facilitate the use of the SVD (singular value decomposition) routines. For non-SVD algorithms, all of the poles are treated as relevant peaks. To pare the spectrum down to only the signal components of interest, an SVD algorithm must be used.

Specify a signal space of 2 for each narrowband harmonic to be treated as signal. For example, to model a signal containing six sinusoids and noise, a signal space of 12 is needed to capture the harmonics, and a total model order of 40-80 will be needed to effectively capture the noise components.

Apart from longer processing times, there are no disadvantages to using an SVD procedure, and the advantages are numerous when extracting harmonics is the primary aim of the modeling. A full signal space SVD fit, one where the signal space equals the model order, produces the same results as the non-SVD algorithms.

For sinusoidal retrieval, the Data Matrix FB SVD is a superb algorithm. It is similar to the Principal Component AR (PCAR) procedure. For large data sets, however, this procedure can be quite slow. If noise is present (which is why SVD is used in the first place), its normal equations equivalent, Normal Equations FB SVD, will produce similar results far more swiftly.
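
One way to realize such a fit (a sketch under the same conventions as the blocks above, not FlexPro's implementation) is to apply a truncated-SVD pseudoinverse to the forward-backward data matrix, keeping only the signal-space singular values:

```python
import numpy as np

def data_matrix_fb_svd(x, p, signal_space):
    """Forward-backward data-matrix AR fit with SVD truncation: only
    the signal_space largest singular values are kept, discarding the
    noise subspace. With signal_space == p this reduces to the plain
    (full signal space) Data Matrix FB fit."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    F = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    B = np.column_stack([x[k:n - p + k] for k in range(1, p + 1)])
    X = np.vstack([F, B])
    y = np.concatenate([x[p:], x[:n - p]])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    m = signal_space
    # Truncated pseudoinverse: a = -V_m diag(1/s_m) U_m' y.
    a = -(Vt[:m].T @ ((U[:, :m].T @ y) / s[:m]))
    return np.concatenate([[1.0], a])

# For the six-sinusoid example above: data_matrix_fb_svd(x, 60, 12)
# uses a model order in the 40-80 range with a signal space of 12.
```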

Most of the AR algorithms in FlexPro are least-squares procedures, since these produce the best spectral estimates. Least-squares methods that offer in-situ separation of signal and noise through singular value decomposition (SVD) are the most robust of FlexPro's AR methods. These algorithms are built into the AR procedures (there is, for example, no separate Principal Component AutoRegressive or PCAR option).

All of the SVD-based procedures are original algorithms authored for FlexPro.

References

Excellent coverage of AR spectral algorithms can be found in the following references:

S. Lawrence Marple, Jr., "Digital Spectral Analysis with Applications", Prentice-Hall, 1987, pp. 172-284.

Steven M. Kay, "Modern Spectral Estimation", Prentice Hall, 1988, pp. 153-270.

Also of interest may be the Burg algorithm information in:

W. H. Press et al., "Numerical Recipes in C", Cambridge University Press, 1992, pp. 564-575.

See Also

Spectral Analysis Option

Spectral Estimators Analysis Object - AR Spectral Estimator

Autoregressive Modeling

ARMA Algorithms

Spectral Estimators Tutorial
