Zamba2
This model was released on 2024-11-22 and added to Hugging Face Transformers on 2025-01-27.
Zamba2 is a large language model (LLM) trained by Zyphra, and made available under an Apache 2.0 license. Please see the Zyphra Hugging Face repository for model weights.
This model was contributed by pglo.
Model details
Zamba2-1.2B, Zamba2-2.7B and Zamba2-7B are hybrid models that combine state-space model blocks (specifically Mamba2) with transformer blocks, trained using next-token prediction. Zamba2 applies a shared transformer layer after every 6 Mamba2 blocks and uses the Mistral v0.1 tokenizer. We arrived at this architecture after a series of ablations at small scale. Zamba2-1.2B, Zamba2-2.7B and Zamba2-7B were pre-trained on between 2T and 3T tokens of text.
<img src="https://github.com/user-attachments/assets/c2cff209-b901-483c-87aa-774b82a0769f" alt="Zamba2 architecture diagram" width="30%" height="40%" />
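The interleaving described above can be pictured with a short sketch. The snippet below is illustrative pseudocode of the block ordering only, not the transformers implementation, and the depth of 12 Mamba2 blocks is an arbitrary number chosen for display:

```python
# Illustrative sketch of Zamba2's layer interleaving, NOT the
# transformers implementation: a single shared transformer block is
# re-applied (with the same weights) after every 6 Mamba2 blocks.
def layer_pattern(num_mamba_blocks: int, period: int = 6) -> list[str]:
    """Return the order in which block types are applied."""
    pattern = []
    for i in range(1, num_mamba_blocks + 1):
        pattern.append("mamba2")
        if i % period == 0:
            pattern.append("shared_transformer")  # same weights each time
    return pattern

# 12 Mamba2 blocks is an arbitrary depth chosen for display.
print(layer_pattern(12))
```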
Quick start
Prerequisites
Zamba2 requires transformers version 4.48.0 or higher:

```bash
pip install "transformers>=4.48.0"
```
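To verify the installed version at runtime, a quick check like the following works; this is a convenience sketch (using the packaging library that transformers already depends on), not part of the official quick start:

```python
# Optional sanity check that transformers is new enough for Zamba2.
import transformers
from packaging.version import Version

assert Version(transformers.__version__) >= Version("4.48.0"), (
    f"transformers {transformers.__version__} is too old; Zamba2 needs >= 4.48.0"
)
```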
Inference

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Zyphra/Zamba2-7B")
model = AutoModelForCausalLM.from_pretrained("Zyphra/Zamba2-7B", device_map="auto", dtype=torch.bfloat16)

input_text = "What factors contributed to the fall of the Roman Empire?"
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

outputs = model.generate(**input_ids, max_new_tokens=100)
print(tokenizer.decode(outputs[0]))
```
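Greedy decoding (the default above) can be repetitive on open-ended prompts. A variant using transformers' standard sampling arguments is sketched below; the sampling hyperparameters are illustrative defaults, not values tuned for Zamba2:

```python
# Sampled decoding variant of the example above; temperature and top_p
# are illustrative defaults, not Zamba2-specific recommendations.
outputs = model.generate(
    **input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```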
Model card

The model cards can be found at:

- Zamba2-1.2B: https://huggingface.co/Zyphra/Zamba2-1.2B
- Zamba2-2.7B: https://huggingface.co/Zyphra/Zamba2-2.7B
- Zamba2-7B: https://huggingface.co/Zyphra/Zamba2-7B
Issues
For issues with model output, or for community discussion, please use the Hugging Face community forum.
License
The model weights are open-sourced under an Apache 2.0 license.
Zamba2Config
[[autodoc]] Zamba2Config
Zamba2Model
[[autodoc]] Zamba2Model
    - forward
Zamba2ForCausalLM
[[autodoc]] Zamba2ForCausalLM
    - forward
Zamba2ForSequenceClassification
[[autodoc]] transformers.Zamba2ForSequenceClassification
    - forward
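As a minimal sketch of using the sequence-classification head, assuming Zamba2 is registered with AutoModelForSequenceClassification: loading a Zamba2 checkpoint this way attaches a freshly initialized classification head on top of the backbone, so the head must be fine-tuned before its scores are meaningful. The num_labels value below is an arbitrary choice for illustration.

```python
# Minimal sketch, assuming Zamba2 is registered with
# AutoModelForSequenceClassification: the classification head is newly
# initialized and needs fine-tuning before its outputs mean anything.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Zyphra/Zamba2-1.2B")
model = AutoModelForSequenceClassification.from_pretrained(
    "Zyphra/Zamba2-1.2B", num_labels=2  # num_labels is illustrative
)

inputs = tokenizer("Zamba2 is a hybrid Mamba2/transformer model.", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, num_labels)
```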