Extending GNN Learning: 11 Additional Framework Applications by @computational



Too Long; Didn't Read

Beyond initial dataset labelling, the proposed framework has several further applications in GNN learning paradigms, including fair k-shot learning, where a model classifies new examples from only a small number of labelled instances, and fairness constraints, which encourage equitable predictive performance across structural groups by penalizing the distortion between structural and embedding distances.

Authors:

(1) Junwei Su, Department of Computer Science, the University of Hong Kong ([email protected]);

(2) Chuan Wu, Department of Computer Science, the University of Hong Kong ([email protected]).

Abstract and 1 Introduction

2 Related Work

3 Framework

4 Main Results

5 A Case Study on Shortest-Path Distance

6 Conclusion and Discussion, and References

7 Proof of Theorem 1

8 Proof of Theorem 2

9 Procedure for Solving Eq. (6)

10 Additional Experiments Details and Results

11 Other Potential Applications

11 Other Potential Applications

Beyond the initial dataset labelling problem discussed above, the proposed framework has potential applications in other GNN learning paradigms. Some examples include:


1. Fair k-shot learning: k-shot learning is a type of machine learning in which a model is trained to classify new examples from only a small number of labelled examples in the training set. It is particularly useful when labelled data are scarce. In addition to the active learning setting discussed earlier, k-shot learning can also be applied to choosing a representative experience replay buffer in incremental learning and transfer learning; a minimal selection sketch is given after the figures below.


Fig. 10. Graph distance vs. embedding distance. Additional Results on CoraFull


Fig. 11. Graph distance vs. embedding distance. Additional Results on Ogbn-arxiv
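As a concrete illustration of how structural distances could drive such a selection, the following is a minimal sketch, not taken from the paper, of a greedy k-center pick over pairwise shortest-path distances; a routine like this could seed a small labelled support set or a representative replay buffer. The NetworkX-based setup, the function name greedy_k_center, and the karate-club example graph are all illustrative assumptions.

```python
import networkx as nx


def greedy_k_center(graph, k, seed=None):
    """Greedily select k nodes that cover the graph under shortest-path distance.

    Each step adds the node that is currently farthest from the selected set,
    so the selection spreads across the graph's structural groups.
    """
    # All-pairs shortest-path lengths (fine for small graphs; use landmarks
    # or sampling for large ones).
    dist = dict(nx.all_pairs_shortest_path_length(graph))

    nodes = list(graph.nodes())
    selected = [seed if seed is not None else nodes[0]]

    # Distance from each node to its closest selected node (inf if unreachable).
    cover = {v: dist[selected[0]].get(v, float("inf")) for v in nodes}

    while len(selected) < min(k, len(nodes)):
        # Pick the node that is worst covered by the current selection.
        candidates = [v for v in nodes if v not in selected]
        farthest = max(candidates, key=lambda v: cover[v])
        selected.append(farthest)
        for v in nodes:
            cover[v] = min(cover[v], dist[farthest].get(v, float("inf")))

    return selected


if __name__ == "__main__":
    g = nx.karate_club_graph()
    support = greedy_k_center(g, k=5)
    print("Selected support / replay-buffer nodes:", support)
```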


2. Fairness constraint: the derived results can be applied to ensure fair predictive performance of GNNs for specified structural groups. As shown in the paper, the key factor behind unfair predictive performance across different structural groups is the distortion between structural distance and embedding distance. We can therefore steer learning toward parameters with low distortion by penalizing that distortion, which amounts to the following learning objective:
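As a hedged sketch, one plausible general form of such a distortion-penalized objective, with L_task denoting the task loss, D(θ) the distortion between structural and embedding distances induced by parameters θ, and λ ≥ 0 a weighting coefficient (notation introduced here for illustration, not taken from the paper), is:

θ* = argmin_θ { L_task(θ) + λ · D(θ) },

so that training trades off task accuracy against keeping the embedding distances faithful to the structural distances, and thereby against unequal performance across structural groups.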



The proposed framework shows promising potential for studying and addressing the problems mentioned above. Investigating its practical performance in these applications would be interesting future work.


This paper is available on arXiv under a CC BY 4.0 DEED license.