Galactica is an AI Model Trained on 120 Billion Parameters

by Louis Bouchard (@whatsai), 6 min read, November 26th, 2022
Too Long; Didn't Read

MetaAI and Papers with Code announced the release of Galactica, a game-changing open-source large language model with 120 billion parameters, trained on scientific knowledge. The model can write whitepapers, reviews, Wikipedia pages, and code, and it knows how to cite sources and write equations. That is a big deal for AI and for science. On November 17th, the public demo was shut down because the model often misunderstood the task at hand and was wrong in many cases. Still, the model is available to researchers, and I believe it is important to keep it open-sourced.