A Beginner's Guide to Incorporating Tabular Data via HuggingFace Transformers

Written by codekgu | Published 2020/11/11
Tech Story Tags: machine-learning | natural-language-processing | transformers | pytorch | python-machine-learning | huggingface | data-science | artificial-intelligence

TLDR Transformer-based models are a game-changer for working with unstructured text data. The top-performing models on the General Language Understanding Evaluation (GLUE) benchmark are all BERT-based. Transformer models can learn long-range dependencies in text and, unlike recurrent sequence-to-sequence models, can be trained in parallel, which means they can be pre-trained on large amounts of data. We set out to explore how text and tabular data could be used together to provide stronger signals in our projects.
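
The article explores this idea in depth; as a quick illustration, here is a minimal sketch of the simplest fusion approach: concatenating a BERT [CLS] embedding with numeric tabular features before a classification head. The class name, layer sizes, and feature values below are hypothetical, chosen only to show the pattern, and are not taken from the article's own code.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class TextTabularClassifier(nn.Module):
    """Hypothetical example: fuse a BERT [CLS] embedding with tabular features."""

    def __init__(self, num_tabular_feats, num_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.text_encoder = AutoModel.from_pretrained(model_name)
        hidden = self.text_encoder.config.hidden_size  # 768 for bert-base
        self.classifier = nn.Sequential(
            nn.Linear(hidden + num_tabular_feats, 256),
            nn.ReLU(),
            nn.Linear(256, num_labels),
        )

    def forward(self, input_ids, attention_mask, tabular_feats):
        out = self.text_encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_emb = out.last_hidden_state[:, 0]  # embedding of the [CLS] token
        combined = torch.cat([cls_emb, tabular_feats], dim=1)  # text + tabular
        return self.classifier(combined)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TextTabularClassifier(num_tabular_feats=4, num_labels=2)
enc = tokenizer(["great product, fast shipping"], return_tensors="pt",
                padding=True, truncation=True)
# Made-up numeric features, e.g. price, rating, review count, in_stock flag
feats = torch.tensor([[19.99, 4.5, 120.0, 1.0]])
logits = model(enc["input_ids"], enc["attention_mask"], feats)
```

Concatenation is only the most basic combination strategy; richer fusion methods (e.g. attention over the tabular features) are what the rest of the article builds toward.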


Written by codekgu | Passionate about NLP and Graph Deep Learning Research. Georgian. UCLA