Japanese StableLM is a 7-billion-parameter general-purpose language model. According to a benchmark suite comparing it against four other sets of Japanese LMs, it is the top-performing publicly available Japanese language model. Japanese StableLM Base Alpha 7B will be released under the Apache License 2.0, which permits commercial use. Japanese StableLM Instruct Alpha 7B was created for research purposes and is released exclusively for research use. For details, please refer to the Hugging Face Hub page.
Notes: 7B parameters, 750B training tokens; estimated training compute 6 × 7e9 × 750e9 ≈ 3.15e22 FLOP
Size Notes: 750B tokens
Notes: 7B parameters
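The compute figure in the notes follows the standard C ≈ 6·N·D approximation (6 FLOP per parameter per training token). A minimal sketch of that arithmetic, using the parameter and token counts stated above:

```python
# Training-compute estimate via the common C ~ 6 * N * D rule of thumb,
# where N = parameter count and D = training tokens (values from the notes).
N = 7e9    # 7B parameters
D = 750e9  # 750B training tokens
C = 6 * N * D
print(f"{C:.2e} FLOP")  # 3.15e+22 FLOP
```

This is a rough estimate only; it ignores architecture details, activation recomputation, and any compute spent on rejected or repeated data.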