
Learning low-complexity autoregressive models via proximal alternating minimization

Abstract

We consider the estimation of the state transition matrix in vector autoregressive models when time-sequence data is limited but non-sequence steady-state data is abundant. To leverage both sources of data, we formulate a least-squares minimization problem regularized by a Lyapunov penalty. We impose cardinality or rank constraints to reduce the complexity of the autoregressive model. The resulting nonconvex, nonsmooth problem is solved using the proximal alternating linearized minimization (PALM) algorithm. We prove that PALM converges globally to a critical point and that the estimation error decreases monotonically. Explicit formulas are obtained for the proximal operators to facilitate the implementation of PALM. We demonstrate the effectiveness of the developed method through numerical experiments.
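The abstract does not give the full problem formulation, so the sketch below is only illustrative: a simplified, single-block proximal-gradient loop in the spirit of PALM for a cardinality-constrained, Lyapunov-regularized least-squares fit of the transition matrix. The data matrices X and Y, the steady-state covariance Sigma, the penalty weight lam, the step size, and the cardinality budget k are all assumed quantities, not the paper's exact splitting, penalty, or step-size rules (the rank-constrained case, handled in the paper via its own proximal operator, is not shown).

```python
import numpy as np

def hard_threshold(A, k):
    """Project onto the cardinality constraint ||A||_0 <= k by keeping
    the k largest-magnitude entries (prox of the constraint's indicator).
    Ties at the cutoff may retain slightly more than k entries."""
    flat = np.abs(A).ravel()
    if k >= flat.size:
        return A
    cutoff = np.partition(flat, -k)[-k]
    return np.where(np.abs(A) >= cutoff, A, 0.0)

def palm_style_sparse_var(X, Y, Sigma, lam=1.0, k=50, n_iter=200):
    """Illustrative proximal-gradient loop (hypothetical formulation) for
        min_A  ||Y - A X||_F^2 + lam * ||A Sigma A^T - Sigma||_F^2
        s.t.   ||A||_0 <= k,
    where X holds state snapshots, Y the one-step-shifted snapshots, and
    Sigma an assumed steady-state covariance. Each iteration takes a
    gradient step on the smooth terms followed by hard thresholding."""
    n = X.shape[0]
    A = np.zeros((n, n))
    # Rough Lipschitz-type estimate for a conservative step size.
    L = 2.0 * np.linalg.norm(X @ X.T, 2) + 1e-8
    step = 1.0 / (L * (1.0 + lam))
    for _ in range(n_iter):
        # Gradient of the least-squares fit term ||Y - A X||_F^2.
        grad = 2.0 * (A @ X - Y) @ X.T
        # Gradient of the Lyapunov-type penalty ||A Sigma A^T - Sigma||_F^2.
        R = A @ Sigma @ A.T - Sigma
        grad += lam * 4.0 * R @ A @ Sigma
        # Proximal (projection) step enforcing the cardinality budget.
        A = hard_threshold(A - step * grad, k)
    return A
```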

Publication

Systems and Control Letters
