Mistral Saba is a 24B-parameter model trained on meticulously curated datasets from across the Middle East and South Asia. It provides more accurate and relevant responses than models over five times its size, while being significantly faster and lower cost. It can also serve as a strong base for training highly specific regional adaptations. Mistral Saba is available as an API, and, importantly, can also be deployed locally within customers' own secure premises. Like the recently released Mistral Small 3, the model is lightweight and can be deployed on single-GPU systems, responding at speeds of over 150 tokens per second.
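As a minimal sketch of what calling the hosted API might look like, the snippet below builds a chat-completion request using only the Python standard library. The endpoint follows Mistral's public chat completions API; the model identifier `mistral-saba-latest` is an assumption here, so check the official documentation for the exact name.

```python
# Sketch: building a chat-completion request for Mistral Saba.
# The model name "mistral-saba-latest" is an assumption, not confirmed by this post.
import json
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Construct a single-turn chat request; send it with urllib.request.urlopen."""
    payload = {
        "model": "mistral-saba-latest",  # assumed identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_request("Summarize this article in Arabic.", "YOUR_API_KEY")
```

Passing the resulting request to `urllib.request.urlopen` (with a valid key) would return the model's JSON response; the same payload shape applies to a locally deployed instance behind a compatible endpoint.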