Leveraging Natural Supervision: Learning Semantic Knowledge from Wikipedia by @textmodels


Too Long; Didn't Read

In this study, researchers exploit rich, naturally occurring structures on Wikipedia, such as hyperlinks, article layout, and category hierarchies, to provide free supervision for various NLP tasks.

Author:

(1) Mingda Chen.

CHAPTER 4 - LEARNING SEMANTIC KNOWLEDGE FROM WIKIPEDIA

In this chapter, we describe our contributions to exploiting rich, naturally occurring structures on Wikipedia for various NLP tasks. In Section 4.1, we use hyperlinks to learn entity representations; unlike most prior work, the resulting models represent entities with contextualized representations rather than a fixed set of vectors. In Section 4.2, we use article structures (e.g., paragraph positions and section titles) to make sentence representations aware of the broader context in which they are situated, leading to improvements across various discourse-related tasks. In Section 4.3, we use article category hierarchies to learn concept hierarchies that improve model performance on textual entailment tasks.
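To make the idea of hyperlinks as natural supervision concrete, here is a minimal, hypothetical sketch (not the authors' actual pipeline) of how wikilink markup pairs a surface mention with its target entity, yielding (entity, mention) training pairs plus a plain-text context for learning contextualized entity representations:

```python
import re

# Wikilinks look like [[Target Article|anchor text]] or [[Target Article]].
# Group 1 is the linked entity; group 2 (optional) is the surface mention.
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|([^\]]+))?\]\]")

def extract_entity_supervision(wikitext):
    """Return (entity, mention) pairs and the de-linked context string.

    Each link gives free supervision: the anchor text is a mention of the
    target entity, and the surrounding sentence is its context.
    """
    pairs = []
    for m in WIKILINK.finditer(wikitext):
        entity = m.group(1).strip()
        mention = (m.group(2) or m.group(1)).strip()
        pairs.append((entity, mention))
    # Replace each link with its surface form to recover plain text.
    context = WIKILINK.sub(lambda m: (m.group(2) or m.group(1)).strip(), wikitext)
    return pairs, context

pairs, context = extract_entity_supervision(
    "[[Barack Obama|Obama]] was born in [[Honolulu]]."
)
# pairs   -> [('Barack Obama', 'Obama'), ('Honolulu', 'Honolulu')]
# context -> 'Obama was born in Honolulu.'
```

A model trained on such pairs can then encode the mention in its context rather than look up a fixed per-entity vector, which is the contrast with prior work drawn above.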


The material in this chapter is adapted from Chen et al. (2019a), Chen et al. (2019b), and Chen et al. (2020a).



This paper is available on arxiv under CC 4.0 license.