FLOPs: 1.92e+22
Notes: 6 * 1.6B * 2T = 1.92e+22 (the standard 6ND approximation for dense-transformer training compute)
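The arithmetic above is the common 6 * N * D rule of thumb (N = parameters, D = training tokens). A minimal sketch reproducing the figure in Python; the function name is illustrative:

```python
# Sketch of the 6*N*D training-compute estimate used in this record.
# N and D are taken from the fields above; the 6x factor is the standard
# approximation for the FLOPs of one forward+backward pass per token.

def training_flops(params: float, tokens: float) -> float:
    """Approximate training compute as 6 * N * D."""
    return 6 * params * tokens

if __name__ == "__main__":
    n = 1.6e9   # ~1.6B parameters (exact count given below: 1,644,417,024)
    d = 2e12    # 2 trillion training tokens
    print(f"{training_flops(n, d):.2e}")  # -> 1.92e+22
```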
Training Code Accessibility: non-commercial (license: https://huggingface.co/stabilityai/stablelm-2-1_6b/blob/main/LICENSE)
Hardware: NVIDIA A100 SXM4 40 GB
Hardware Quantity: 512
Size Notes: "model pre-trained on 2 trillion tokens of diverse multilingual and code datasets for two epochs."
Parameters: 1,644,417,024
Notes: the table under "Model Architecture" in the model card gives the exact parameter count.
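One way to double-check the exact count is to sum the parameter tensors of the published checkpoint. A minimal sketch, assuming the `transformers` library and network access to the Hugging Face Hub (the model id comes from the license URL above):

```python
# Sketch: verify the exact parameter count against the released checkpoint.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/stablelm-2-1_6b",
    trust_remote_code=True,  # older checkpoints may ship custom model code
)
# Sum the element counts of every parameter tensor.
print(sum(p.numel() for p in model.parameters()))  # expected: 1644417024
```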