
Leveraging Natural Supervision for Language Representation Learning and Generation: Acknowledgements


Too Long; Didn't Read

In this study, researchers describe three lines of work that seek to improve the training and evaluation of neural models using naturally occurring supervision.

Author:

(1) Mingda Chen.

ACKNOWLEDGEMENTS

Like all great travellers, I have seen more than I remember, and remember more than I have seen.


– Benjamin Disraeli


The Ph.D. journey is an adventure mixed with daunting challenges, unanticipated bafflements, and instant delights. Many people have guided me through the challenges, clarified my confusion, and shared my happiness. I am enormously grateful for their help along the journey.


First, I want to thank my advisor Kevin Gimpel for sharing his technical insight and research philosophy throughout these years. He was always knowledgeable about everything we worked on, meticulous about every word we wrote in our papers, and patient with my mistakes. The work in this thesis would not have been possible without his positive, steady influence.


I thank the rest of my thesis committee: Karen Livescu, Sam Wiseman, and Luke Zettlemoyer, for being generous with their time and insight. I also thank Karl Stratos for his guidance.


I thank my fellow students at Toyota Technological Institute at Chicago and the University of Chicago, especially Qingming Tang for the technical (and nontechnical) conversations, Bumeng Zhuo for the fun activities during weekends, and Zewei Chu for the bike rides at the lakefront in Chicago. I am also grateful to my fellow interns and mentors at Google and Facebook.


Lastly, I would like to thank my family for inspiring my interest in learning, encouraging me to apply to graduate school, and being supportive and interested in listening to my research ramblings.


This paper is available on arXiv under a CC 4.0 license.