Table of Links

Abstract and 1. Introduction
Preliminaries
Proposed Approach
3.1 Notation
3.2 Neural Networks on SPD Manifolds
3.3 MLR in Structure Spaces
3.4 Neural Networks on Grassmann Manifolds
Experiments
Conclusion and References
A. Notations
B. MLR in Structure Spaces
C. Formulation of MLR from the Perspective of Distances to Hyperplanes
D. Human Action Recognition
E. Node Classification
F. Limitations of Our Work
G. Some Related Definitions
H. Computation of Canonical Representation
I. Proof of Proposition 3.2
J. Proof of Proposition 3.4
K. Proof of Proposition 3.5
L. Proof of Proposition 3.6
M. Proof of Proposition 3.11
N. Proof of Proposition 3.12

F LIMITATIONS OF OUR WORK

Our SPD network GyroSpd++ relies on different Riemannian metrics across its layers: the convolutional layer is based on Affine-Invariant metrics, while the MLR layer is based on Log-Euclidean metrics. Although our experiments show that GyroSpd++ achieves good performance on all the datasets compared to state-of-the-art methods, it is not clear whether this design is optimal for the human action recognition task. When building a deep SPD architecture, it would therefore be useful to have insights into which Riemannian metric to use for each network block in order to obtain good performance on a target task.

In our Grassmann network Gr-GCN++, the feature transformation, bias, and nonlinearity operations are performed on Grassmann manifolds, while the aggregation operation is performed in tangent spaces. Previous works on HNNs (Dai et al., 2021; Chen et al., 2022) have shown that such hybrid methods limit the modeling ability of networks. It is therefore desirable to develop GCNs in which all operations are formulated on Grassmann manifolds.
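To make the hybrid design mentioned above concrete, here is a minimal sketch, not the authors' Gr-GCN++ implementation, of tangent-space aggregation on a Grassmann manifold Gr(p, n): neighbour subspaces, represented by n x p matrices with orthonormal columns, are mapped to the tangent space at a reference point with the logarithmic map, combined there by a weighted sum, and mapped back with the exponential map. The exponential and logarithm use the standard closed-form expressions for orthonormal-basis representatives; the function names, the uniform weights, and the toy dimensions are illustrative assumptions, not part of the paper.

```python
import numpy as np

def gr_log(X, Y):
    """Log map on Gr(p, n): tangent vector at X pointing towards Y."""
    # With M = (I - X X^T) Y (X^T Y)^{-1} and thin SVD M = U S V^T,
    # Log_X(Y) = U arctan(S) V^T (assumes X^T Y is invertible).
    M = (Y - X @ (X.T @ Y)) @ np.linalg.inv(X.T @ Y)
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.arctan(S)) @ Vt

def gr_exp(X, H):
    """Exp map on Gr(p, n): point reached from X along tangent vector H."""
    # With thin SVD H = U S V^T, Exp_X(H) = X V cos(S) V^T + U sin(S) V^T.
    U, S, Vt = np.linalg.svd(H, full_matrices=False)
    Y = X @ Vt.T @ np.diag(np.cos(S)) @ Vt + U @ np.diag(np.sin(S)) @ Vt
    Q, _ = np.linalg.qr(Y)  # re-orthonormalise for numerical safety
    return Q

def tangent_space_aggregation(X_ref, neighbours, weights):
    """Aggregate neighbour subspaces in the tangent space at X_ref."""
    H = sum(w * gr_log(X_ref, Y) for w, Y in zip(weights, neighbours))
    return gr_exp(X_ref, H)

# Toy usage: aggregate two random 2-dimensional subspaces of R^5.
rng = np.random.default_rng(0)
X = np.linalg.qr(rng.standard_normal((5, 2)))[0]
neighbours = [np.linalg.qr(rng.standard_normal((5, 2)))[0] for _ in range(2)]
Z = tangent_space_aggregation(X, neighbours, weights=[0.5, 0.5])
print(np.allclose(Z.T @ Z, np.eye(2)))  # True: the result is again an orthonormal basis
```

Only the aggregation step passes through the tangent space here; in a layer built this way, the feature transformation, bias, and nonlinearity would act on the manifold representatives directly, which is exactly the hybrid split that the paragraph above identifies as a modeling limitation.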
Authors:

(1) Xuan Son Nguyen, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France (xuan-son.nguyen@ensea.fr);

(2) Shuo Yang, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France (son.nguyen@ensea.fr);

(3) Aymeric Histace, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France (aymeric.histace@ensea.fr).

This paper is available on arxiv under CC BY 4.0 DEED (Attribution 4.0 International) license.