Baichuan-13B is an open-source, commercially usable large language model developed by Baichuan Intelligent Technology as the successor to Baichuan-7B, containing 13 billion parameters. It achieves the best results among models of its size on authoritative Chinese and English benchmarks. The release includes two versions: a pre-trained model (Baichuan-13B-Base) and an aligned model (Baichuan-13B-Chat).
FLOPs: 9.36e+22
Notes: 13B parameters × 1.2T tokens × 6 FLOP/parameter/token = 9.36e22 FLOP
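The figure follows the standard C ≈ 6ND approximation for dense transformer training compute. A minimal sketch of the arithmetic, using the parameter and token counts from the note above:

```python
# C ~= 6 * N * D estimate for dense transformer training compute.
params = 13e9            # N: parameter count, rounded to 13B as in the note
tokens = 1.2e12          # D: training tokens, figure used in the note above
flop_per_param_token = 6

total_flop = params * tokens * flop_per_param_token
print(f"{total_flop:.2e}")  # 9.36e+22
```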
Training Code Accessibility: Community License for Baichuan-13B Model (usage restrictions; commercial use requires applying for a license). https://huggingface.co/baichuan-inc/Baichuan-13B-Base — "Open source, free and available for commercial use: Baichuan-13B is not only fully open to academic research, but developers can also use it commercially for free, just by applying for and obtaining an official commercial license via email." The GitHub repository itself is licensed under Apache 2.0: https://github.com/baichuan-inc/Baichuan-13B/tree/main
Parameters: 13,264,901,120
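The exact count can be reconstructed from the model's published configuration; a sketch under the assumption that the config.json values are hidden_size=5120, 40 layers, intermediate_size=13696, vocab_size=64000, with bias-free projections and an untied output head:

```python
# Rough reconstruction of Baichuan-13B's parameter count from its
# published configuration (assumed values; see lead-in above).
hidden, layers, ffn, vocab = 5120, 40, 13696, 64000

embed = vocab * hidden        # input token embedding
lm_head = vocab * hidden      # output projection (untied from the embedding)
attn = 4 * hidden * hidden    # Q, K, V, O projections, no bias
mlp = 3 * hidden * ffn        # gate, up, down projections, no bias
norms = 2 * hidden            # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms

total = embed + lm_head + layers * per_layer + hidden  # + final RMSNorm
print(total)  # 13264901120
```

The sum matches the reported figure of 13,264,901,120 parameters exactly.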