Authors:

(1) Bobby He, Department of Computer Science, ETH Zurich (Correspondence to: bobby.he@inf.ethz.ch);

(2) Thomas Hofmann, Department of Computer Science, ETH Zurich.

Table of Links

Abstract and Introduction
Related Work
Preliminaries
Simplifying Transformer Blocks
Further Experimental Analysis
Discussion, Reproducibility Statement, Acknowledgements and References
A Duality Between Downweighted Residual and Restricting Updates In Linear Layers
B Block Layouts
C Additional Experiments
D Implementation Details

B BLOCK LAYOUTS

In Fig. 9 and Fig. 10 we show the layouts of our SAS block (Sec. 4.2) and parallel SAS-P block (Sec. 4.3). These are the equivalent plots to the layouts in Fig. 1. Mathematically, our SAS attention sub-block computes (in the notation of Eq. (2)):

This paper is available on arXiv under a CC 4.0 license.
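The equation itself is not reproduced above. As a hedged sketch only (the symbols $\alpha, \beta, \gamma$, $\mathbf{C}$, and $\mathbf{M}$ are assumptions, not copied from Eq. (2)), the SAS attention sub-block combines shaped attention with value and projection matrices fixed to the identity, so each head computes roughly:

$$
\mathrm{Attn}_{\mathrm{SAS}}(\mathbf{X}) \;=\; \big(\alpha\,\mathbf{I}_T \;+\; \beta\,A(\mathbf{X}) \;-\; \gamma\,\mathbf{C}\big)\,\mathbf{X},
\qquad
A(\mathbf{X}) \;=\; \mathrm{Softmax}\!\Big(\tfrac{1}{\sqrt{d_k}}\,\mathbf{M} \odot \mathbf{X}\mathbf{W}^{Q}{\mathbf{W}^{K}}^{\!\top}\mathbf{X}^{\top}\Big),
$$

where, under these assumptions, $\alpha, \beta, \gamma$ are learnable per-head scalars, $\mathbf{M}$ is the causal mask, and $\mathbf{C}$ is the attention matrix obtained at initialisation with query-key dot products set to zero (i.e., uniform attention over the unmasked positions). The defining simplification of the SAS block (Sec. 4.2) is that the value and projection weights $\mathbf{W}^{V}, \mathbf{W}^{P}$ are removed (set to identity); consult Eq. (2) and Fig. 9 in the paper for the exact notation.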