ARMA Algorithms

23.08.2021

ARMA (AutoRegressive Moving Average) estimation algorithms are used in the ARMA spectral option.

Linear Methods

The simplest and fastest ARMA estimators are linear sequential algorithms. In a sequential linear procedure, an AR model is first fitted to the data, and an MA model is then fitted to the residuals. Because the AR and MA coefficients are not estimated simultaneously, the overall solution is suboptimal. For this reason, FlexPro offers only the optimal non-linear procedures.
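The two-stage idea can be sketched as follows. This is a minimal illustration of the sequential principle, not FlexPro's code; the function names and the use of ordinary least squares for both stages are assumptions for the example.

```python
import numpy as np

def fit_ar_ls(x, p):
    """Fit AR(p) coefficients by ordinary least squares (hypothetical helper)."""
    n = len(x)
    # Design matrix of lagged values: x[t] ~ sum_k a[k] * x[t-k]
    X = np.column_stack([x[p - k: n - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def fit_arma_sequential(x, p, q):
    """Two-stage linear ARMA fit: AR model first, then MA on the residuals."""
    a = fit_ar_ls(x, p)
    # Residuals of the AR stage approximate the driving noise sequence
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    resid = x[p:] - X @ a
    # MA coefficients via least squares on lagged residuals
    R = np.column_stack([resid[q - k: len(resid) - k] for k in range(1, q + 1)])
    b, *_ = np.linalg.lstsq(R, resid[q:], rcond=None)
    return a, b
```

Because the MA stage sees only the residuals of an already-fixed AR stage, no joint adjustment of the two coefficient sets takes place, which is exactly why the result is suboptimal.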

Non-Linear Methods

The four non-linear ARMA procedures consist of full iterative Levenberg-Marquardt minimizations. Unlike many ARMA implementations, the FlexPro ARMA filter first proceeds toward the initial data element with backward prediction/averaging and then forward across the full data sequence. Both the ARMA model and a partial derivative for each parameter must be computed point by point at each iteration, so the fitting process can be very slow with large data sets and high AR and MA model orders.
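The per-iteration cost can be seen in a simplified sketch of the residual and derivative computation. This is an assumed, forward-only illustration (FlexPro also runs a backward pass first), and finite differences stand in for whatever derivative scheme the actual implementation uses; all names are hypothetical.

```python
import numpy as np

def arma_residuals(params, x, p, q):
    """One-step forward ARMA(p, q) prediction errors for a parameter
    vector [a_1..a_p, b_1..b_q] (simplified forward-only sketch)."""
    a, b = params[:p], params[p:]
    n = len(x)
    e = np.zeros(n)
    for t in range(p, n):
        pred = a @ x[t - p: t][::-1]          # AR part
        for k in range(1, q + 1):             # MA part on past errors
            if t - k >= p:
                pred += b[k - 1] * e[t - k]
        e[t] = x[t] - pred
    return e[p:]

def numerical_jacobian(params, x, p, q, h=1e-6):
    """Finite-difference partials of every residual w.r.t. every
    parameter, as needed at each Levenberg-Marquardt iteration."""
    r0 = arma_residuals(params, x, p, q)
    J = np.zeros((len(r0), len(params)))
    for j in range(len(params)):
        dp = params.copy()
        dp[j] += h
        J[:, j] = (arma_residuals(dp, x, p, q) - r0) / h
    return J
```

Each iteration touches every data point once per parameter, so the work scales roughly with (data count) x (p + q) per iteration, which is why large data sets and high model orders are slow.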

The four NL ARMA procedures are original algorithms authored for FlexPro.

Non-Linear

The Non-Linear algorithm imposes no constraints; all parameters are allowed to vary freely.

Non-Linear Spectral Factorization

The Non-Linear Spectral Factorization algorithm adds full spectral factorization to the problem. The ARMA filter will be minimum phase, as both the AR and MA roots will lie within the unit circle. Although the unconstrained Non-Linear algorithm can sometimes achieve a better goodness of fit, the Non-Linear procedure with spectral factorization is often close statistically. Despite the overhead of the spectral factorization, the algorithm can sometimes be faster, since a good portion of an unconstrained non-linear ARMA fit is spent with parameters wandering about in regions of instability. The resultant constrained fit will usually have poorer goodness-of-fit statistics, however, as the true global least-squares minimum often has one or more roots outside the unit circle.
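A standard way to enforce the minimum-phase constraint is to reflect any root outside the unit circle to its conjugate reciprocal, which preserves the spectral magnitude shape up to a gain factor. The sketch below shows this root-reflection step; it is an assumed illustration of the technique, not FlexPro's factorization code.

```python
import numpy as np

def make_minimum_phase(coeffs):
    """Reflect roots outside the unit circle to their conjugate
    reciprocals, yielding a minimum-phase polynomial.
    `coeffs` are [c_1..c_m] of 1 + c_1 z^-1 + ... + c_m z^-m."""
    poly = np.concatenate(([1.0], np.asarray(coeffs, dtype=float)))
    roots = np.roots(poly)
    # Any root with |r| > 1 is mapped to 1/conj(r), which lies inside
    reflected = np.where(np.abs(roots) > 1.0, 1.0 / np.conj(roots), roots)
    new_poly = np.real(np.poly(reflected))
    return new_poly[1:] / new_poly[0]
```

Applying this to both the AR and MA polynomials after each update keeps the filter stable and invertible, at the price of excluding candidate minima whose roots lie outside the unit circle.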

Non-Linear SVD and Non-Linear Spectral Factorization SVD

FlexPro also offers the Non-Linear and the Non-Linear Spectral Factorization versions with SVD. Just as in FlexPro's AR SVD options, a signal space is selected that should contain the principal singular values of the least-squares problem. Although one use of ARMA models is to characterize observation noise, there are still benefits to truncating eigenmodes with SVD. If considerable fitting time is spent wandering about in n-dimensional space fitting weak noise components, faster fits can be achieved by discarding these eigenmodes. Also, deep nulls and sharp peaks are treated equally in the least-squares problem: a principal eigenmode may be associated with a null if that MA component significantly impacts the least-squares fit merit function. SVD thus retains the nulls that significantly impact the model.
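The truncation step itself is standard: solve the least-squares subproblem using only the largest singular values and zero out the rest. A minimal sketch, assuming a plain truncated-SVD pseudoinverse (the function name and interface are hypothetical):

```python
import numpy as np

def svd_truncated_lstsq(A, y, n_signal):
    """Solve A x ~ y keeping only the n_signal largest singular values
    (the 'signal space'); the weaker eigenmodes are discarded."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:n_signal] = 1.0 / s[:n_signal]   # singular values are sorted descending
    return Vt.T @ (s_inv * (U.T @ y))
```

Note that truncation is ranked purely by singular value magnitude, so an eigenmode belonging to a deep null survives whenever its contribution to the merit function is large, exactly as described above for sharp peaks.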

ARMA Filter

In the ARMA procedure, the filter used to minimize the least-squares merit function applies backward prediction/averaging from an initial data position downward, and forward prediction/averaging from the model order upward. The initial position for the backward prediction/averaging is the smaller of (1) the data count minus the AR model order and (2) the maximum of 100 and the sum of the AR and MA model orders. This approach conserves the degrees of freedom, as an estimate is made for each of the input data elements (there is no gap at one end of the data stream).
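The rule for the initial backward position translates directly into code. This one-liner restates the formula given above (the function name is invented for illustration):

```python
def backward_start_index(n, p, q):
    """Initial position for backward prediction/averaging:
    the smaller of (data count - AR order) and max(100, AR order + MA order)."""
    return min(n - p, max(100, p + q))
```

For example, with 1000 data points and orders p = 10, q = 5, the backward pass starts at index 100; with only 50 points, it starts at 50 - 10 = 40.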

References

Excellent coverage of ARMA spectral algorithms can be found in the following references:

S. Lawrence Marple, Jr., "Digital Spectral Analysis with Applications", Prentice-Hall, 1987, pp. 172-284.

Steven M. Kay, "Modern Spectral Estimation", Prentice Hall, 1988, pp. 153-270.

See Also

Spectral Analysis Option

Spectral Estimators Analysis Object - ARMA Spectral Estimator

Autoregressive Modeling

AR Algorithms

Spectral Estimators Tutorial
