
Galactica is an AI Model Trained on 120 Billion Parameters

November 26th, 2022 · 6 min read
by Louis Bouchard (@whatsai)

Too Long; Didn't Read

MetaAI and Papers with Code announced the release of Galactica, a game-changing open-source large language model with 120 billion parameters, trained on scientific knowledge. The model can write whitepapers, reviews, Wikipedia pages, and code, and it knows how to cite sources and write equations. It's kind of a big deal for AI and science. On November 17th, the public demo was taken down because the model often misunderstood the task at hand and produced confidently wrong answers. Still, the model remains available to researchers, and I believe it is important to keep it open-sourced.
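Since the checkpoints remain public, here is a minimal sketch of how one might query a Galactica checkpoint through Hugging Face Transformers. The smaller facebook/galactica-1.3b checkpoint, the [START_REF] citation prompt, and the generation settings are illustrative assumptions for this sketch, not something prescribed by the article.

```python
# Minimal sketch: prompting a Galactica checkpoint via Hugging Face Transformers.
# Assumption: the "facebook/galactica-1.3b" checkpoint from the public model
# card; substitute "facebook/galactica-120b" for the full 120B-parameter model
# if you have the hardware for it.
from transformers import AutoTokenizer, OPTForCausalLM

tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b")

# Galactica was trained with special tokens for scientific tasks;
# [START_REF] asks the model to complete a citation.
prompt = "The Transformer architecture was introduced in [START_REF]"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

outputs = model.generate(input_ids, max_new_tokens=60)
print(tokenizer.decode(outputs[0]))
```

The smaller checkpoint keeps the example runnable on a single machine; the citation prompt is also a good way to see where the model goes confidently wrong, which is what led to the demo being taken down.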
