### A Bayesian approach to sparse dynamic network identification

#### Abstract

Modeling and identification of high-dimensional data sets (i.e. signals with many components) pose severe challenges to off-the-shelf system identification techniques. This is particularly so when relatively small data sets, as compared to the number of signal components, have to be used. It is often the case that each component of the measured signal can be described in terms of a few other measured variables, and these dependencies can be encoded graphically via so-called "Dynamic Bayesian Networks". The problem of finding the interconnection structure as well as estimating the dynamic models can be posed as a system identification problem involving variable selection. While this selection could be performed via standard techniques, computational complexity may be a critical issue, being combinatorial in the number of inputs and outputs. In this paper we introduce two new nonparametric techniques which borrow ideas from a recently introduced kernel estimator called "stable spline" as well as from sparsity-inducing priors based on $\ell_1$-type penalties. Numerical experiments on the estimation of large-scale sparse ARMAX models show that these techniques provide a definite advantage over a group LAR algorithm and over state-of-the-art parametric identification techniques based on prediction error minimization.
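The abstract combines two ingredients: a stable-spline kernel prior that favors smooth, exponentially decaying impulse responses, and an $\ell_1$-type (group) sparsity penalty that selects which measured inputs actually enter each output model. The following NumPy sketch illustrates that combination on a toy problem; it is not the paper's estimator. All signals, dimensions, hyperparameters, and the proximal-gradient group-lasso solver are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical toy setup (not from the paper): p candidate inputs,
# only a few of which actually drive the output. ---
T, p, m = 300, 6, 15          # samples, candidate inputs, FIR length
active = [0, 3]               # true interconnection structure
beta = 0.8                    # stable-spline decay hyperparameter

U = rng.standard_normal((T, p))
g_true = np.zeros((p, m))
for j in active:
    g_true[j] = beta ** np.arange(m)   # smooth, exponentially decaying IR

def lagged(u, m):
    """Toeplitz-style matrix of lagged input samples (one block per input)."""
    Phi = np.zeros((T, m))
    for k in range(m):
        Phi[k:, k] = u[:T - k]
    return Phi

blocks = [lagged(U[:, j], m) for j in range(p)]
y = sum(blocks[j] @ g_true[j] for j in active) + 0.1 * rng.standard_normal(T)

# First-order stable-spline kernel K(s,t) = beta^{max(s,t)} and its
# symmetric square root (K is positive semidefinite).
s = np.arange(m)
K = beta ** np.maximum.outer(s, s)
w, V = np.linalg.eigh(K)
K_half = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

# Reparametrize g_j = K^{1/2} a_j so the penalty also favors stable,
# smooth impulse responses, then stack one block per candidate input.
Phi = np.hstack([B @ K_half for B in blocks])   # T x (p*m)

# Group lasso via proximal gradient: block soft-thresholding per input
# zeroes out entire impulse responses, performing variable selection.
lam = 20.0
L = np.linalg.norm(Phi, 2) ** 2                 # Lipschitz constant of the gradient
a = np.zeros(p * m)
for _ in range(2000):
    z = a - Phi.T @ (Phi @ a - y) / L
    for j in range(p):
        blk = z[j * m:(j + 1) * m]
        nrm = np.linalg.norm(blk)
        scale = max(0.0, 1 - (lam / L) / nrm) if nrm > 0 else 0.0
        a[j * m:(j + 1) * m] = scale * blk

g_hat = np.array([K_half @ a[j * m:(j + 1) * m] for j in range(p)])
support = [j for j in range(p) if np.linalg.norm(g_hat[j]) > 1e-3]
print("estimated active inputs:", support)
```

On this toy data the block penalty drives the impulse responses of the irrelevant inputs exactly to zero, so the estimated support reveals the network structure without any combinatorial search over input subsets.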
Files for this record:

Autosparse_v12_Bayesian_SENT.pdf

Open access

Type: Preprint (submitted version)
License: Free access
Size: 741.83 kB
Use this identifier to cite or link to this record: https://hdl.handle.net/11577/2494977