
A Brief Intro to the GPT-3 Algorithm

by albertchristopher, June 19th, 2021

Too Long; Didn't Read

Generative Pre-trained Transformer 3 (GPT-3) adopts and scales up the GPT-2 model architecture, including its pre-normalization, modified initialization, and reversible tokenization. It exhibits strong performance on many Natural Language Processing (NLP) tasks. It is a massive artificial neural network that uses deep learning to generate human-like text and is trained on text datasets containing hundreds of billions of words. The model holds 175 billion weights in total, all of which are used to process every query.
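
The pre-normalization change carried over from GPT-2 is easy to illustrate in code. Below is a minimal sketch of a single pre-norm decoder block in PyTorch (not OpenAI's released code); the hidden size and head count are small illustrative values, far below GPT-3's actual dimensions.

```python
# A minimal sketch (assumed, not OpenAI's code) of a GPT-style decoder block
# with pre-normalization: LayerNorm is applied *before* the attention and
# feed-forward sub-layers, rather than after as in the original Transformer.
import torch
import torch.nn as nn

class PreNormDecoderBlock(nn.Module):
    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Causal mask: each position may only attend to earlier positions.
        seq_len = x.size(1)
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=x.device),
            diagonal=1,
        )
        h = self.ln1(x)                        # pre-norm before attention
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + attn_out                       # residual connection
        x = x + self.mlp(self.ln2(x))          # pre-norm before feed-forward
        return x

# Quick usage check with toy dimensions:
block = PreNormDecoderBlock()
tokens = torch.randn(2, 10, 768)   # (batch, sequence, hidden)
out = block(tokens)                # same shape as the input
```

In the largest GPT-3 model, a block like this is stacked 96 layers deep with a hidden size of 12,288, which is where the 175 billion parameters come from.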
