Recurrent Neural Networks Modelling based on Riemannian Symmetric Positive Definite Manifold
Abstract
State estimation with Kalman Filters (KF) regularly involves covariance matrices that are unknown or determined empirically, leading to suboptimal performance. To lift these uncertainties, estimation techniques are opening up to hybridizations of the KF with deep learning methods. Inferring covariance matrices from neural networks, however, requires enforcing symmetric positive definite outputs. In this work, a new Recurrent Neural Network (RNN) model is explored, based on the geometric properties of the Riemannian Symmetric Positive Definite (SPD) manifold. To this end, a neuron function is defined through the Riemannian exponential map, parameterized by unknown weights lying in the tangent space of the manifold. A Riemannian cost function is then derived, enabling the weights to be learned as Euclidean parameters with a conventional Gauss-Newton algorithm, which involves the computation of a closed-form Jacobian. Through optimization on a simulated covariance dataset, we demonstrate the potential of this new approach for RNNs.
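As a minimal illustration of the exponential-map construction described above, the sketch below implements the Riemannian exponential map on the SPD manifold under the affine-invariant metric, $\mathrm{Exp}_P(V) = P^{1/2}\exp(P^{-1/2} V P^{-1/2})P^{1/2}$, and uses it in a toy recurrent update. The neuron form `spd_neuron`, the scalar-input scaling, and all names are illustrative assumptions rather than the paper's exact model; the point is only that mapping a tangent-space weight through the exponential map yields an SPD output at every step.

```python
import numpy as np
from scipy.linalg import expm

def spd_sqrt(P):
    """Matrix square root of an SPD matrix via eigendecomposition."""
    w, U = np.linalg.eigh(P)
    return (U * np.sqrt(w)) @ U.T

def spd_exp(P, V):
    """Riemannian exponential map on the SPD manifold (affine-invariant
    metric): maps a symmetric tangent vector V at base point P onto the
    manifold, so the result is guaranteed SPD.
    """
    P_half = spd_sqrt(P)
    P_half_inv = np.linalg.inv(P_half)
    return P_half @ expm(P_half_inv @ V @ P_half_inv) @ P_half

def spd_neuron(P_prev, W, x):
    """Hypothetical recurrent update (not the paper's exact neuron):
    the Euclidean weight W is symmetrized into a tangent vector, scaled
    by a scalar input feature x, and mapped back onto the manifold.
    """
    V = 0.5 * (W + W.T) * x  # symmetrize so V lies in the tangent space
    return spd_exp(P_prev, V)

# Toy usage: propagate an SPD state through a few input steps.
rng = np.random.default_rng(0)
P = np.eye(3)                    # initial SPD state
W = rng.standard_normal((3, 3))  # unconstrained Euclidean weight
for x in [0.1, -0.2, 0.3]:
    P = spd_neuron(P, W, x)
    # eigenvalues stay strictly positive, i.e. P remains SPD
    assert np.all(np.linalg.eigvalsh(P) > 0)
```

Because the unconstrained weight `W` lives in a Euclidean space (the tangent space), standard least-squares optimizers such as Gauss-Newton can be applied to it directly, which is the property the abstract's learning procedure exploits.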