Built on a hybrid SSM-Transformer architecture, the Jamba 1.6 family of models outperforms other open, instruction-following foundation models on quality, speed, and long-context performance, and rivals leading closed models on quality. As open models, Jamba Mini 1.6 (12B active/52B total parameters) and Jamba Large 1.6 (94B active/398B total parameters) are available for private deployment, whether in a VPC or on-premises, and demonstrate superior performance on the long-context tasks that matter most to enterprises, such as RAG workflows and grounded question answering across lengthy documents.
Training Code Accessibility: Jamba Open Model License Agreement ($50M annual revenue cap for commercial use): https://huggingface.co/ai21labs/AI21-Jamba-Mini-1.6
Parameters: 52,000,000,000 (52B)
Notes: 12B active/52B total
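The active/total split above reflects the mixture-of-experts design: only a fraction of the total parameters is activated for each token. A minimal sketch, using only the parameter counts stated in this card, of what that fraction works out to:

```python
# Active vs. total parameter counts, as stated in the model card.
mini_active, mini_total = 12e9, 52e9     # Jamba Mini 1.6
large_active, large_total = 94e9, 398e9  # Jamba Large 1.6

# Fraction of parameters activated per token for each model.
mini_ratio = mini_active / mini_total
large_ratio = large_active / large_total

print(f"Jamba Mini 1.6:  {mini_ratio:.0%} of parameters active per token")
print(f"Jamba Large 1.6: {large_ratio:.0%} of parameters active per token")
```

Both models activate roughly a quarter of their total weights per token, which is what lets a 52B- or 398B-parameter model run with the per-token compute cost of a much smaller dense model.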