Revisiting Euclidean Normalization: A Second Look

Written by batching | Published 2025/02/25
Tech Story Tags: batching | euclidean-normalization | network-training | euclidean-dnn | batch-normalization | batch-variance | internal-covariate | what-is-batch-normalization

TL;DR: In Euclidean DNNs, normalization is a pivotal technique for accelerating network training by mitigating internal covariate shift.

Table of Links

Abstract and 1 Introduction

2 Preliminaries

3. Revisiting Normalization

3.1 Revisiting Euclidean Normalization

3.2 Revisiting Existing RBN

4 Riemannian Normalization on Lie Groups

5 LieBN on the Lie Groups of SPD Manifolds and 5.1 Deformed Lie Groups of SPD Manifolds

5.2 LieBN on SPD Manifolds

6 Experiments

6.1 Experimental Results

7 Conclusions, Acknowledgments, and References

APPENDIX CONTENTS

A Notations

B Basic layers in SPDnet and TSMNet

C Statistical Results of Scaling in the LieBN

D LieBN as a Natural Generalization of Euclidean BN

E Domain-specific Momentum LieBN for EEG Classification

F Backpropagation of Matrix Functions

G Additional Details and Experiments of LieBN on SPD manifolds

H Preliminary Experiments on Rotation Matrices

I Proofs of the Lemmas and Theorems in the Main Paper

3.1 REVISITING EUCLIDEAN NORMALIZATION

In Euclidean DNNs, normalization is a pivotal technique for accelerating network training by mitigating internal covariate shift (Ioffe & Szegedy, 2015). While various normalization methods have been introduced (Ioffe & Szegedy, 2015; Ba et al., 2016; Ulyanov et al., 2016; Wu & He, 2018), they all share the same fundamental idea: regulating the first and second statistical moments. In this paper, we focus exclusively on batch normalization.
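As a concrete illustration (not taken from the paper), the shared recipe behind these methods — estimate the first and second moments from the batch, normalize, then apply a learnable affine transform — can be sketched in NumPy. The function name and signature below are illustrative, not the authors' implementation:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """A minimal sketch of Euclidean batch normalization
    (Ioffe & Szegedy, 2015), training-mode statistics only.

    x: array of shape (batch, features)
    gamma, beta: learnable scale and shift, shape (features,)
    """
    # First moment: per-feature mean over the batch.
    mean = x.mean(axis=0)
    # Second moment: per-feature (biased) variance over the batch.
    var = x.var(axis=0)
    # Center and rescale to zero mean and unit variance;
    # eps guards against division by zero.
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable affine transform restores representational capacity.
    return gamma * x_hat + beta
```

With gamma = 1 and beta = 0, each output feature has (approximately) zero mean and unit variance over the batch — exactly the moment regulation described above. Generalizing this control of mean and variance to curved spaces is the challenge the later LieBN sections address.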

This paper is available on arxiv under CC BY-NC-SA 4.0 DEED license.

Authors:

(1) Ziheng Chen, University of Trento;

(2) Yue Song, University of Trento (corresponding author);

(3) Yunmei Liu, University of Louisville;

(4) Nicu Sebe, University of Trento.


Published by HackerNoon on 2025/02/25