* initial commit for PR
* rename dynamic cache
* add more unit tests
* add integration test
* add integration test
* Add modular bamba file
* Remove trainer changes from unrelated PR
* Modify modular and config to get model running
* Fix some CI errors and beam search
* Fix a plethora of bugs from CI/docs/etc
* Add bamba to models with special caches
* Update to newer mamba PR for mamba sublayer
* fix test_left_padding_compatibility
* fix style
* fix remaining tests
* missed this test
* ran make style
* move slow tag to integration obj
* make style
* address comments
* fix modular
* left out one part of modular
* change model
* Make Rotary modular as well
* Update bamba.md: added overview, updated model inference card, and added config
* Update bamba.md
* Update bamba.md
* Update bamba.md: minor fixes
* Add docs for config and model back
* Add warning when using fast kernels
* replaced generate example
* Address comments from PR
* Propagate attention fixes
* Fix attention interfaces to the new API
* Fix API for decoder layer
* Remove extra weights

---------

Signed-off-by: Yu Chin Fabian Lim <[email protected]>
Signed-off-by: Antoni Viros i Martin <[email protected]>
Co-authored-by: Gabe Goodhart <[email protected]>
Co-authored-by: Antoni Viros i Martin <[email protected]>
Co-authored-by: divya-kumari32 <[email protected]>
Co-authored-by: Antoni Viros <[email protected]>
1 parent 9a94dfe · commit 9613933
Showing 19 changed files with 4,138 additions and 0 deletions.
@@ -0,0 +1,64 @@
<!--Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->
# Bamba
## Overview

Bamba-9B is a decoder-only language model based on the [Mamba-2](https://github.com/state-spaces/mamba) architecture and is designed to handle a wide range of text generation tasks. It is trained from scratch using a two-stage training approach. In the first stage, the model is trained on 2 trillion tokens from the Dolma v1.7 dataset. In the second stage, it undergoes additional training on 200 billion tokens, leveraging a carefully curated blend of high-quality data to further refine its performance and enhance output quality.

Check out all Bamba-9B model checkpoints [here](https://github.com/foundation-model-stack/bamba).
## BambaConfig

| Model | Params     | # Layers | Hidden Dim. | Attention Heads | GQA | KV Heads | Context Length | Tied Embeddings |
|-------|------------|----------|-------------|-----------------|-----|----------|----------------|-----------------|
| Bamba | 9B (9.78B) | 32       | 4096        | 32              | Yes | 8        | 4096           | True            |

[[autodoc]] BambaConfig
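For reference, the table above can be expressed as configuration keyword arguments. This is an illustrative sketch only: the parameter names here follow the usual transformers conventions (`hidden_size`, `num_hidden_layers`, ...) and are assumptions, not verified against the actual `BambaConfig` signature.

```python
# Hypothetical mapping of the Bamba-9B table onto assumed config kwargs.
bamba_9b = {
    "num_hidden_layers": 32,
    "hidden_size": 4096,
    "num_attention_heads": 32,
    "num_key_value_heads": 8,         # GQA: 8 KV heads shared by 32 query heads
    "max_position_embeddings": 4096,  # context length
    "tie_word_embeddings": True,
}

# With transformers installed, this would (assuming the names above) build a config:
# from transformers import BambaConfig
# config = BambaConfig(**bamba_9b)

# GQA requires the query-head count to be divisible by the KV-head count.
assert bamba_9b["num_attention_heads"] % bamba_9b["num_key_value_heads"] == 0
```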
<!---
## Usage Tips

Tips:
- The architecture is based on Mamba-2 models.

## BambaModel
[[autodoc]] BambaModel
    - forward
-->
## BambaForCausalLM

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("ibm-fms/Bamba-9B")
tokenizer = AutoTokenizer.from_pretrained("ibm-fms/Bamba-9B")

message = ["Mamba is a snake with following properties "]
inputs = tokenizer(message, return_tensors='pt', return_token_type_ids=False)
response = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.batch_decode(response, skip_special_tokens=True)[0])
```

[[autodoc]] BambaForCausalLM
    - forward

This HF implementation was contributed by [ani300](https://github.com/ani300) and [fabianlim](https://github.com/fabianlim).
@@ -20,6 +20,7 @@
     audio_spectrogram_transformer,
     auto,
     autoformer,
+    bamba,
     bark,
     bart,
     barthez,
@@ -0,0 +1,28 @@
# Copyright 2024 IBM and the HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import TYPE_CHECKING

from ...utils import _LazyModule
from ...utils.import_utils import define_import_structure


if TYPE_CHECKING:
    from .configuration_bamba import *
    from .modeling_bamba import *
    from .processing_bamba import *
else:
    import sys

    _file = globals()["__file__"]
    sys.modules[__name__] = _LazyModule(__name__, _file, define_import_structure(_file), module_spec=__spec__)
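The `_LazyModule` replacement above defers the real submodule imports until an attribute is first accessed, which keeps `import transformers` fast. A minimal self-contained sketch of that idea follows; this is not the actual `_LazyModule` implementation, and the class name and loader registry here are made up for illustration.

```python
import types


class LazyModule(types.ModuleType):
    """Toy sketch of lazy loading: attribute access triggers a loader
    the first time, then the result is cached on the module object."""

    def __init__(self, name, loaders):
        super().__init__(name)
        self._loaders = loaders  # attr name -> zero-arg callable producing the value

    def __getattr__(self, attr):
        # Only called when normal attribute lookup fails, i.e. before first load.
        if attr in self._loaders:
            value = self._loaders[attr]()
            setattr(self, attr, value)  # cache so __getattr__ is not hit again
            return value
        raise AttributeError(f"module {self.__name__!r} has no attribute {attr!r}")


# Demo with a stand-in loader instead of a real import:
calls = []
mod = LazyModule("bamba_sketch", {"BambaConfig": lambda: calls.append(1) or "config-class"})
assert calls == []                      # nothing loaded at construction time
assert mod.BambaConfig == "config-class"
assert calls == [1]                     # loader ran exactly once
_ = mod.BambaConfig
assert calls == [1]                     # second access hits the cache
```

The real `_LazyModule` additionally derives the attribute-to-submodule map from the source file (via `define_import_structure`) and installs itself with `sys.modules[__name__] = ...`, as shown in the diff above.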