Authors:
(1) Vladislav Trifonov, Skoltech ([email protected]);
(2) Alexander Rudikov, AIRI, Skoltech;
(3) Oleg Iliev, Fraunhofer ITWM;
(4) Ivan Oseledets, AIRI, Skoltech;
(5) Ekaterina Muravleva, Skoltech.
5.3 Loss function
As stated in Section 2, one should focus on approximating the low-frequency components. Table 2 shows that the proposed loss does indeed reduce the distance between the extreme eigenvalues compared to IC(0). Moreover, the gap between the extreme eigenvalues is closed mainly by an increase in the minimum eigenvalue, which supports the hypothesis of low-frequency cancellation. The maximum eigenvalue also grows, but by far fewer orders of magnitude.
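As an illustrative sketch (not the authors' code), this kind of measurement can be reproduced on a model problem: assemble a 2D Laplacian, compute its IC(0) factor with a textbook zero-fill incomplete Cholesky, and compare the extreme eigenvalues of A and of the symmetrically preconditioned matrix L⁻¹AL⁻ᵀ. The dense loops and the grid size below are illustrative choices, not the paper's setup.

```python
import numpy as np

def ic0(A):
    """Incomplete Cholesky with zero fill-in (IC(0)): a standard
    Cholesky sweep that discards any update falling outside the
    lower-triangular sparsity pattern of A."""
    n = A.shape[0]
    L = np.tril(A).astype(float)
    pattern = L != 0.0  # freeze A's original lower-triangular pattern
    for k in range(n):
        L[k, k] = np.sqrt(L[k, k])
        for i in range(k + 1, n):
            if pattern[i, k]:
                L[i, k] /= L[k, k]
        for j in range(k + 1, n):
            for i in range(j, n):
                if pattern[i, j]:
                    L[i, j] -= L[i, k] * L[j, k]
    return L

# Model SPD matrix: 5-point 2D Laplacian on a 10x10 grid (illustrative size).
n = 10
T = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A = np.kron(np.eye(n), T) + np.kron(T, np.eye(n))

L = ic0(A)
X = np.linalg.solve(L, A)       # L^{-1} A
M = np.linalg.solve(L, X.T).T   # L^{-1} A L^{-T}
M = 0.5 * (M + M.T)             # symmetrize against round-off

w_A = np.linalg.eigvalsh(A)
w_M = np.linalg.eigvalsh(M)
print(f"A:     min={w_A[0]:.3f}  max={w_A[-1]:.3f}  cond={w_A[-1] / w_A[0]:.1f}")
print(f"IC(0): min={w_M[0]:.3f}  max={w_M[-1]:.3f}  cond={w_M[-1] / w_M[0]:.1f}")
```

On this model problem the IC(0)-preconditioned spectrum is noticeably tighter than that of A, with the minimum eigenvalue pushed up toward the cluster near 1; a learned correction of the kind studied here aims to push that minimum further still.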