Papers Read on AI

June 10, 2022  

GPT-NeoX-20B: An Open-Source Autoregressive Language Model


We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights are made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model with publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B’s architecture and training, and evaluate its performance. We open-source the training and evaluation code, as well as the model weights.

2022: Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach

Ranked #7 on Multi-task Language Understanding on MMLU

https://arxiv.org/pdf/2204.06745v1.pdf