Galactica is an AI Model Trained on 120 Billion Parameters

by Louis Bouchard (@whatsai)

Too Long; Didn't Read

Meta AI and Papers with Code announced the release of Galactica, a game-changing, open-source large language model with 120 billion parameters, trained on scientific knowledge. The model can write whitepapers, reviews, Wikipedia pages, and code; it knows how to cite sources and how to write equations. It's kind of a big deal for AI and science. On November 17th, the public demo was taken down because the model often misunderstood the task at hand and produced incorrect results in many cases. Still, the model remains available to researchers, and I believe it is important to keep it open-sourced.
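
For readers who want to try it themselves, here is a minimal sketch, not the author's code: it assumes the smaller facebook/galactica-1.3b checkpoint that Meta published on Hugging Face, loads it with the transformers library, and uses Galactica's special [START_REF] token, which prompts the model to suggest a citation for the preceding statement.

    # Minimal sketch: load a small Galactica checkpoint from Hugging Face
    # and ask it to suggest a reference via the [START_REF] token.
    from transformers import AutoTokenizer, OPTForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
    model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b")

    # [START_REF] is one of Galactica's special tokens; placed after a
    # statement, it cues the model to generate a citation for it.
    prompt = "The Transformer architecture [START_REF]"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids

    outputs = model.generate(input_ids, max_new_tokens=60)
    print(tokenizer.decode(outputs[0]))

The larger 30B and 120B checkpoints expose the same interface but require substantially more memory to run.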