
Conclusion, Acknowledgments and Disclosure of Funding, and References


Table of Links

Abstract and 1. Introduction

  2. Related work

  3. HypNF Model

    3.1 HypNF Model

    3.2 The S1/H2 model

    3.3 Assigning labels to nodes

  4. HypNF benchmarking framework

  5. Experiments

    5.1 Parameter Space

    5.2 Machine learning models

  6. Results

  7. Conclusion, Acknowledgments and Disclosure of Funding, and References


A. Empirical validation of HypNF

B. Degree distribution and clustering control in HypNF

C. Hyperparameters of the machine learning models

D. Fluctuations in the performance of machine learning models

E. Homophily in the synthetic networks

F. Exploring the parameters’ space

7 Conclusion

In this study, we introduced HypNF, a benchmarking framework for graph machine learning and, in particular, for graph neural networks. The framework, which leverages the S1/H2 and bipartite-S1/H2 models, generates synthetic networks with controllable properties such as degree distributions, average degrees, clustering coefficients, and homophily levels. Our findings underscore the significance of topology-feature correlations for GNN performance. In particular, models embedded in hyperbolic space, with its intrinsic hierarchical structure, outperformed all other models in the link prediction (LP) task. This research contributes to the broader understanding of graph machine learning by providing insights into the suitability of various models under different network conditions. Furthermore, our benchmarking framework serves as a valuable tool for the community, enabling fair and standardized comparisons of new GNN architectures against established models. A promising direction for future work is to extend the bipartite-S1/H2 model to incorporate weighted features, which would enable HypNF to generate networks with non-binary features and allow for more nuanced and flexible representations [4]. The framework can also be extended to accommodate community structures by using a non-homogeneous distribution of nodes within the S1/H2 similarity space, as done in [41] and [13].
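To make the idea of controllable network generation concrete, the following is a minimal, illustrative sketch of an S1-model generator. This is not the authors' HypNF code; the function name and defaults are hypothetical. In the S1 model [31], each node gets an angular (similarity) coordinate and a hidden degree drawn from a power law with exponent gamma, and pairs connect with a probability in which beta tunes the clustering level and the hidden degrees fix the degree distribution and average degree:

```python
import numpy as np

def generate_s1_network(n, avg_deg=10.0, gamma=2.7, beta=1.5, seed=0):
    """Illustrative S1-model network generator (not the HypNF implementation).

    gamma  : exponent of the power-law hidden-degree distribution (> 2)
    beta   : clustering parameter (> 1 for the mu expression used here)
    """
    rng = np.random.default_rng(seed)
    # Angular coordinates on a circle of radius R, so node density is 1.
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    R = n / (2.0 * np.pi)
    # Hidden degrees from a Pareto distribution with mean avg_deg.
    kappa0 = avg_deg * (gamma - 2.0) / (gamma - 1.0)
    kappa = kappa0 * (rng.pareto(gamma - 1.0, n) + 1.0)
    # mu calibrates the expected average degree (valid for beta > 1).
    mu = beta * np.sin(np.pi / beta) / (2.0 * np.pi * avg_deg)
    edges = []
    for i in range(n - 1):
        # Shortest angular distance to all nodes j > i, scaled to arc length.
        dtheta = np.pi - np.abs(np.pi - np.abs(theta[i] - theta[i + 1:]))
        d = R * dtheta
        # Connection probability: p = 1 / (1 + (d / (mu * k_i * k_j))^beta).
        p = 1.0 / (1.0 + (d / (mu * kappa[i] * kappa[i + 1:])) ** beta)
        for j in np.nonzero(rng.random(n - i - 1) < p)[0]:
            edges.append((i, i + 1 + int(j)))
    return theta, kappa, edges
```

Raising beta sharpens the distance dependence of the connection probability and thus increases clustering, while gamma controls the heaviness of the degree-distribution tail; these are the same knobs the benchmarking framework exposes, alongside homophily and feature controls handled by the bipartite-S1/H2 component.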

Acknowledgments and Disclosure of Funding

We acknowledge support from: Grant TED2021-129791B-I00 funded by MCIN/AEI/10.13039/501100011033 and the "European Union NextGenerationEU/PRTR"; Grant PID2022-137505NB-C22 funded by MCIN/AEI/10.13039/501100011033; Generalitat de Catalunya grant number 2021SGR00856. R. J. acknowledges support from the FI-SDUR fellowship funded by the Generalitat de Catalunya. M. B. acknowledges the ICREA Academia award, funded by the Generalitat de Catalunya.

References

[1] M. A. Abdullah, N. Fountoulakis, and M. Bode. 2017. Typical distances in a geometric model for complex networks. Internet Math. 1 (2017). https://doi.org/10.24166/im.13.2017


[2] Sarwan Ali, Muhammad Haroon Shakeel, Imdadullah Khan, Safiullah Faizullah, and Muhammad Asad Khan. 2021. Predicting attributes of nodes using network structure. ACM Transactions on Intelligent Systems and Technology (TIST) 12, 2 (2021), 1–23.


[3] Roya Aliakbarisani, M. Ángeles Serrano, and Marián Boguñá. 2023. Feature-enriched hyperbolic network geometry. arXiv:2307.14198 [physics.soc-ph]


[4] Antoine Allard, M. Ángeles Serrano, Guillermo García-Pérez, and Marián Boguñá. 2017. The geometric nature of weights in real complex networks. Nature Communications 8, 1 (2017), 14103. https://doi.org/10.1038/ncomms14103


[5] Christopher Bishop. 2006. Pattern recognition and machine learning. Springer, New York, NY.


[6] M. Boguñá, I. Bonamassa, M. De Domenico, S. Havlin, D. Krioukov, and M. Á. Serrano. 2021. Network geometry. Nat. Rev. Phys. 3 (2021), 114–135. https://doi.org/10.1038/s42254-020-00264-4


[7] M. Boguñá, D. Krioukov, P. Almagro, and M. Á. Serrano. 2020. Small worlds and clustering in spatial networks. Phys. Rev. Res. 2 (Apr 2020), 023040. Issue 2. https://doi.org/10.1103/PhysRevResearch.2.023040


[8] E. Candellero and N. Fountoulakis. 2016. Clustering and the Hyperbolic Geometry of Complex Networks. Internet Math. 12, 1-2 (2016), 2–53. https://doi.org/10.1080/15427951.2015.1067848


[9] I. Chami, Z. Ying, Ch. Ré, and J. Leskovec. 2019. Hyperbolic Graph Convolutional Neural Networks. In Advances in Neural Information Processing Systems, H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett (Eds.), Vol. 32. Curran Associates, Inc. https://proceedings.neurips.cc/paper_files/paper/2019/file/0415740eaa4d9decbc8da001d3fd805f-Paper.pdf


[10] N. Fountoulakis, P. van der Hoorn, T. Müller, and M. Schepers. 2021. Clustering in a hyperbolic model of complex networks. Electron. J. Probab. 26 (2021), 1–132. https://doi.org/10.1214/21-EJP583


[11] T. Friedrich and A. Krohmer. 2018. On the Diameter of Hyperbolic Random Graphs. SIAM J. Discrete Math. 32, 2 (2018), 1314–1334. https://doi.org/10.1137/17M1123961


[12] O. Ganea, G. Bécigneul, and T. Hofmann. 2018. Hyperbolic neural networks. In Advances in Neural Information Processing Systems, S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, and R. Garnett (Eds.), Vol. 31. Curran Associates, Inc.


[13] Guillermo García-Pérez, M. Ángeles Serrano, and Marián Boguñá. 2018. Soft Communities in Similarity Space. Journal of Statistical Physics 173, 3 (2018), 775–782. https://doi.org/10.1007/s10955-018-2084-z


[14] L. Gugelmann, K. Panagiotou, and U. Peter. 2012. Random Hyperbolic Graphs: Degree Sequence and Clustering. In Autom Lang Program (ICALP 2012, Part II), LNCS 7392. https://doi.org/10.1007/978-3-642-31585-5_51


[15] Weihua Hu, Matthias Fey, Marinka Zitnik, Yuxiao Dong, Hongyu Ren, Bowen Liu, Michele Catasta, and Jure Leskovec. 2020. Open graph benchmark: Datasets for machine learning on graphs. Advances in neural information processing systems 33 (2020), 22118–22133.


[16] Robert Jankowski, Antoine Allard, Marián Boguñá, and M Ángeles Serrano. 2023. The DMercator method for the multidimensional hyperbolic embedding of real networks. Nature Communications 14, 1 (2023), 7585.


[17] Robert Jankowski, Pegah Hozhabrierdi, Marián Boguñá, and M. Á. Serrano. 2024. Feature-aware ultra-low dimensional reduction of real networks. arXiv preprint arXiv:2401.09368 (2024).


[18] T. N. Kipf and M. Welling. 2017. Semi-supervised Classification With Graph Convolutional Networks. In International Conference on Learning Representations.


[19] D. Krioukov, F. Papadopoulos, M. Kitsak, A. Vahdat, and M. Boguñá. 2010. Hyperbolic geometry of complex networks. Phys. Rev. E 82, 3 (2010), 036106. https://doi.org/10.1103/PhysRevE.82.036106


[20] A. Lancichinetti, S. Fortunato, and F. Radicchi. 2008. Benchmark graphs for testing community detection algorithms. Physical Review E 78 (Oct 2008), 046110. Issue 4. https://doi.org/10.1103/PhysRevE.78.046110


[21] Derek Lim, Xiuyu Li, Felix Hohne, and Ser-Nam Lim. 2021. New benchmarks for learning on non-homophilous graphs. arXiv preprint arXiv:2104.01404 (2021).


[22] Yao Ma, Xiaorui Liu, Neil Shah, and Jiliang Tang. 2021. Is homophily a necessity for graph neural networks? arXiv preprint arXiv:2106.06134 (2021).


[23] Seiji Maekawa, Koki Noda, Yuya Sasaki, et al. 2022. Beyond real-world benchmark datasets: An empirical study of node classification with GNNs. Advances in Neural Information Processing Systems 35 (2022), 5562–5574.


[24] Seiji Maekawa, Yuya Sasaki, George Fletcher, and Makoto Onizuka. 2023. GenCAT: Generating attributed graphs with controlled relationships between classes, attributes, and topology. Information Systems 115 (2023), 102195. https://doi.org/10.1016/j.is.2023.102195


[25] Francisco Melo. 2013. Area under the ROC Curve. Springer, New York, NY, 38–39. https://doi.org/10.1007/978-1-4419-9863-7_209


[26] T. Müller and M. Staps. 2019. The diameter of KPKVB random graphs. Adv. Appl. Probab. 51, 2 (2019), 358–377. https://doi.org/10.1017/apr.2019.23


[27] J. Palowitch, A. Tsitsulin, B. Mayer, and B. Perozzi. 2022. GraphWorld: Fake Graphs Bring Real Insights for GNNs. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (Washington DC, USA) (KDD ’22). Association for Computing Machinery, New York, NY, USA, 3691–3701. https://doi.org/10.1145/3534678.3539203


[28] H. Pei, B. Wei, K. C. Chang, Y. Lei, and B. Yang. 2020. Geom-GCN: Geometric Graph Convolutional Networks. In International Conference on Learning Representations.


[29] Franco Scarselli, Marco Gori, Ah Chung Tsoi, Markus Hagenbuchner, and Gabriele Monfardini. 2008. The graph neural network model. IEEE transactions on neural networks 20, 1 (2008), 61–80.


[30] M. Á. Serrano and M. Boguñá. 2022. The Shortest Path to Network Geometry: A Practical Guide to Basic Models and Applications. Cambridge University Press. https://doi.org/10.1017/9781108865791


[31] M. Á. Serrano, D. Krioukov, and M. Boguñá. 2008. Self-Similarity of Complex Networks and Hidden Metric Spaces. Phys. Rev. Lett. 100, 7 (2008), 078701. https://doi.org/10.1103/PhysRevLett.100.078701


[32] Neil Shah. 2020. Scale-Free, Attributed and Class-Assortative Graph Generation to Facilitate Introspection of Graph Neural Networks. In KDD Mining and Learning with Graphs.


[33] P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio. 2018. Graph Attention Networks. In International Conference on Learning Representations.


[34] Chaokun Wang, Binbin Wang, Bingyang Huang, Shaoxu Song, and Zai Li. 2021. FastSGG: Efficient Social Graph Generation Using a Degree Distribution Generation Model. In 2021 IEEE 37th International Conference on Data Engineering (ICDE). 564–575. https://doi.org/10.1109/ICDE51399.2021.00055


[35] Junfu Wang, Yuanfang Guo, Liang Yang, and Yunhong Wang. 2024. Understanding Heterophily for Graph Neural Networks. arXiv preprint arXiv:2401.09125 (2024).


[36] Lingfei Wu, Peng Cui, Jian Pei, Liang Zhao, and Xiaojie Guo. 2022. Graph Neural Networks: Foundation, Frontiers and Applications. In Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (Washington DC, USA) (KDD '22). Association for Computing Machinery, New York, NY, USA, 4840–4841. https://doi.org/10.1145/3534678.3542609


[37] Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, and Philip S. Yu. 2021. A comprehensive survey on graph neural networks. IEEE Transactions on Neural Networks and Learning Systems 32, 1 (2021), 4–24. https://doi.org/10.1109/TNNLS.2020.2978386


[38] M. Yasir, J. Palowitch, A. Tsitsulin, L. Tran-Thanh, and B. Perozzi. 2023. Examining the Effects of Degree Distribution and Homophily in Graph Learning Models. arXiv:2307.08881


[39] Jie Zhou, Ganqu Cui, Shengding Hu, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Lifeng Wang, Changcheng Li, and Maosong Sun. 2020. Graph neural networks: A review of methods and applications. AI Open 1 (2020), 57–81. https://doi.org/10.1016/j.aiopen.2021.01.001


[40] Jiong Zhu, Yujun Yan, Lingxiao Zhao, Mark Heimann, Leman Akoglu, and Danai Koutra. 2020. Beyond homophily in graph neural networks: Current limitations and effective designs. Advances in neural information processing systems 33 (2020), 7793–7804.


[41] Konstantin Zuev, Marián Boguñá, Ginestra Bianconi, and Dmitri Krioukov. 2015. Emergence of Soft Communities from Geometric Preferential Attachment. Scientific Reports 5, 1 (2015), 9421. https://doi.org/10.1038/srep09421


Authors:

(1) Roya Aliakbarisani (contributed equally), Universitat de Barcelona & UBICS ([email protected]);

(2) Robert Jankowski (contributed equally), Universitat de Barcelona & UBICS ([email protected]);

(3) M. Ángeles Serrano, Universitat de Barcelona, UBICS & ICREA ([email protected]);

(4) Marián Boguñá, Universitat de Barcelona & UBICS ([email protected]).


This paper is available on arXiv under a CC BY 4.0 Deed (Attribution 4.0 International) license.

