Yi-1.5 is an upgraded version of Yi. It is continuously pre-trained on Yi with a high-quality corpus of 500B tokens and fine-tuned on 3M diverse fine-tuning samples. Compared with Yi, Yi-1.5 delivers stronger performance in coding, math, reasoning, and instruction following, while maintaining excellent capabilities in language understanding, commonsense reasoning, and reading comprehension. Yi-1.5 comes in three model sizes: 34B, 9B, and 6B. For model details and benchmarks, see the Model Card.
FLOPs: 7.34e+23
Notes: 6 FLOP / parameter / token * 34*10^9 parameters * 3.6*10^12 tokens = 7.344e+23 FLOP
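The estimate above can be reproduced with the standard 6·N·D approximation for dense-transformer training compute (6 FLOP per parameter per token, covering forward and backward passes). A minimal sketch, using the parameter and token counts from this entry:

```python
# 6*N*D approximation for training compute (assumption: dense transformer,
# forward + backward pass, no recomputation).
FLOP_PER_PARAM_PER_TOKEN = 6
params = 34e9    # Yi-1.5-34B parameter count
tokens = 3.6e12  # total pre-training tokens per the size notes

training_flop = FLOP_PER_PARAM_PER_TOKEN * params * tokens
print(f"{training_flop:.3e}")  # 7.344e+23
```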
Training Code Accessibility: no training code. The model is available at https://huggingface.co/01-ai/Yi-1.5-34B under Apache 2.0, with the license note: "If you create derivative works based on this model, please include the following attribution in your derivative works:"
Size Notes: 3.6T total pre-trained tokens. "Yi-1.5 is an upgraded version of Yi. It is continuously pre-trained on Yi with a high-quality corpus of 500B tokens and fine-tuned on 3M diverse fine-tuning samples."
Parameters: 34,000,000,000
Notes: 34B