Cerebras open-sources seven GPT-3 models ranging from 111 million to 13 billion parameters. Trained using the Chinchilla formula, these models set new benchmarks for accuracy and compute efficiency.
Source: Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models – Cerebras
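For context, the Chinchilla formula prescribes roughly 20 training tokens per model parameter. A minimal sketch of what that heuristic implies for the seven released model sizes follows; the 20x ratio is the published rule of thumb, and the resulting token counts are back-of-the-envelope estimates, not Cerebras' exact training figures:

```python
# Chinchilla rule of thumb: ~20 training tokens per model parameter.
# Model sizes are the seven Cerebras-GPT variants; the token counts
# printed below are rule-of-thumb estimates, not Cerebras' exact figures.
CHINCHILLA_TOKENS_PER_PARAM = 20

model_params = {
    "Cerebras-GPT-111M": 111e6,
    "Cerebras-GPT-256M": 256e6,
    "Cerebras-GPT-590M": 590e6,
    "Cerebras-GPT-1.3B": 1.3e9,
    "Cerebras-GPT-2.7B": 2.7e9,
    "Cerebras-GPT-6.7B": 6.7e9,
    "Cerebras-GPT-13B": 13e9,
}

for name, n_params in model_params.items():
    tokens = n_params * CHINCHILLA_TOKENS_PER_PARAM
    print(f"{name}: {n_params / 1e9:.3g}B params -> ~{tokens / 1e9:.0f}B tokens")
```

At the top end, 13 billion parameters works out to roughly 260 billion training tokens under this rule.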