
Researchers Unlock Advanced Building Blocks for Neural Networks on Matrix Manifolds

by Hyperbole, December 1st, 2024

Too Long; Didn't Read

This section reviews SPDNet and GrNet, landmark neural networks for SPD and Grassmann manifolds, and highlights gyrovector space approaches as key to generalizing DNN architectures on these manifolds.

Abstract and 1. Introduction

  2. Preliminaries

  3. Proposed Approach

    3.1 Notation

    3.2 Neural Networks on SPD Manifolds

    3.3 MLR in Structure Spaces

    3.4 Neural Networks on Grassmann Manifolds

  4. Experiments

  5. Conclusion and References

A. Notations

B. MLR in Structure Spaces

C. Formulation of MLR from the Perspective of Distances to Hyperplanes

D. Human Action Recognition

E. Node Classification

F. Limitations of our work

G. Some Related Definitions

H. Computation of Canonical Representation

I. Proof of Proposition 3.2

J. Proof of Proposition 3.4

K. Proof of Proposition 3.5

L. Proof of Proposition 3.6

M. Proof of Proposition 3.11

N. Proof of Proposition 3.12

2 PRELIMINARIES

2.1 SPD MANIFOLDS

2.2 GRASSMANN MANIFOLDS

2.3 NEURAL NETWORKS ON SPD AND GRASSMANN MANIFOLDS

2.3.1 NEURAL NETWORKS ON SPD MANIFOLDS


The work in Huang & Gool (2017) introduces SPDNet, which has become one of the most successful architectures in the field, with three novel layers: BiMap, ReEig, and LogEig (sketched below). In Brooks et al. (2019), the authors further improve SPDNet by developing Riemannian versions of batch normalization layers. Building on these works, several papers (Nguyen et al., 2019; Nguyen, 2021; Wang et al., 2021; Kobler et al., 2022; Ju & Guan, 2023) design variants of the BiMap and batch normalization layers for SPD neural networks. The work in Chakraborty et al. (2020) presents a different approach based on intrinsic operations on SPD manifolds, and the proposed layers enjoy nice theoretical properties. A common limitation of the above works is that they do not provide the mathematical tools necessary for constructing many essential building blocks of DNNs on SPD manifolds. Recently, some works (Nguyen, 2022a;b; Nguyen & Yang, 2023) have taken a gyrovector space approach that enables natural generalizations of some building blocks of DNNs, e.g., MLR, to SPD neural networks.
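To make the three layers concrete, here is a minimal NumPy sketch of an SPDNet-style forward pass. It follows the layer definitions in Huang & Gool (2017) (BiMap: X ↦ WXWᵀ; ReEig: eigenvalue rectification; LogEig: matrix logarithm), but the random semi-orthogonal weight and the dimensions are hypothetical stand-ins; the actual network learns W by optimization on a Stiefel manifold.

```python
import numpy as np

def bimap(X, W):
    # BiMap layer: bilinear mapping X -> W X W^T; with a full-row-rank W,
    # an SPD input stays SPD while its dimension is reduced.
    return W @ X @ W.T

def reeig(X, eps=1e-4):
    # ReEig layer: rectify small eigenvalues, U max(S, eps) U^T,
    # a non-linearity playing a role analogous to ReLU.
    S, U = np.linalg.eigh(X)
    return U @ np.diag(np.maximum(S, eps)) @ U.T

def logeig(X):
    # LogEig layer: matrix logarithm U log(S) U^T, which flattens the
    # SPD manifold so a standard Euclidean classifier can be attached.
    S, U = np.linalg.eigh(X)
    return U @ np.diag(np.log(S)) @ U.T

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
X = A @ A.T + 1e-3 * np.eye(8)                    # random SPD input
Q, _ = np.linalg.qr(rng.standard_normal((8, 5)))  # 8x5, orthonormal columns
W = Q.T                                           # 5x8 semi-orthogonal weight
Y = logeig(reeig(bimap(X, W)))                    # 5x5 symmetric feature matrix
print(Y.shape)
```

After LogEig, the output lives in the (Euclidean) space of symmetric matrices, which is why SPDNet can end with an ordinary fully connected classifier; the gyrovector-space works cited above instead generalize operations such as MLR directly on the manifold.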


2.3.2 NEURAL NETWORKS ON GRASSMANN MANIFOLDS


In Huang et al. (2018), the authors propose GrNet, which relies on the same matrix backpropagation rules (Ionescu et al., 2015) as SPDNet; a sketch of its core layers follows. Several existing works (Wang & Wu, 2020; Souza et al., 2020) are also inspired by GrNet. Like their SPD counterparts, most existing Grassmann neural networks are not built upon a mathematical framework that allows one to generalize a broad class of DNNs to Grassmann manifolds. Using a gyrovector space approach, Nguyen & Yang (2023) have shown that some concepts in Euclidean spaces can be naturally extended to Grassmann manifolds.
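For comparison, here is a similarly minimal sketch of GrNet-style layers, where a point on the Grassmann manifold Gr(p, n) is represented by an n×p matrix with orthonormal columns. The FRMap/ReOrth/ProjMap names follow Huang et al. (2018); the random full-rank weight and the dimensions below are hypothetical stand-ins, not the paper's learned parameters.

```python
import numpy as np

def frmap(X, W):
    # FRMap layer: full-rank linear mapping of the subspace representative.
    return W @ X

def reorth(Y):
    # ReOrth layer: QR re-orthonormalization so the columns are orthonormal
    # again, i.e., the output is a valid Grassmann representative.
    Q, _ = np.linalg.qr(Y)
    return Q

def projmap(X):
    # ProjMap layer: embed the subspace as the projection matrix X X^T,
    # after which ordinary Euclidean layers (pooling, FC) can be applied.
    return X @ X.T

rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((10, 3)))  # point on Gr(3, 10)
W = rng.standard_normal((6, 10))                   # hypothetical full-rank weight
P = projmap(reorth(frmap(X, W)))                   # 6x6 projection embedding
print(P.shape)
```

As with SPDNet's LogEig, the ProjMap embedding is what hands the representation off to Euclidean layers, which is the design pattern the gyrovector-space approach seeks to replace with operations defined intrinsically on the manifold.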


Authors:

(1) Xuan Son Nguyen, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France ([email protected]);

(2) Shuo Yang, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France ([email protected]);

(3) Aymeric Histace, ETIS, UMR 8051, CY Cergy Paris University, ENSEA, CNRS, France ([email protected]).


This paper is available on arXiv under the CC BY 4.0 DEED (Attribution 4.0 International) license.