Add support for AdapterPlus #746

Merged · 35 commits · Nov 25, 2024

Commits
abeca7e
initial commit/added new scaling option channel
julian-fong Oct 18, 2024
20f8eee
added the houlsby initialization
julian-fong Oct 19, 2024
f42a855
add drop_path to adapterplus config
julian-fong Oct 19, 2024
54551ab
added drop_path implementation
julian-fong Oct 22, 2024
8f741f6
fixed typo inside AdapterPlusConfig and added DropPath inside __init_…
julian-fong Oct 22, 2024
3845c76
reverted pre-commit changes in adapter_config.py
julian-fong Oct 22, 2024
d31f9ad
Update adapter_config.py
julian-fong Oct 22, 2024
c92b390
Update adapter_config.py
julian-fong Oct 22, 2024
b6a43d7
revert pre-commit updates to modeling.py
julian-fong Oct 22, 2024
cc77959
Update modeling.py
julian-fong Oct 22, 2024
9437d5b
added config to init file
julian-fong Oct 23, 2024
009b2fe
update adapter_config.py
julian-fong Oct 23, 2024
0660776
fixed StochasticDepth
julian-fong Oct 23, 2024
ca9905b
update Adapter class
julian-fong Oct 23, 2024
25b1488
made docstring consistent
julian-fong Oct 26, 2024
9c25050
fixed bug with DropPath in forward function
julian-fong Oct 26, 2024
79dd694
removed vision.py and added torchvision implementation of stochastic …
julian-fong Oct 28, 2024
9166624
updated reduction_factor to 96 to that we get a rank of 8 with ViT mo…
julian-fong Oct 28, 2024
169b303
updated __init__ file
julian-fong Oct 28, 2024
0f54dae
updated documentation
julian-fong Oct 28, 2024
b484947
Merge branch 'adapter-hub:main' into adapterplus
julian-fong Oct 30, 2024
49bb668
Merge branch 'adapterplus' of github.com:julian-fong/adapters into ad…
julian-fong Oct 31, 2024
d899b50
added torchvision as an optional dependency, and added torchvision to…
julian-fong Nov 3, 2024
2452306
update
julian-fong Nov 3, 2024
dea8830
updates
julian-fong Nov 3, 2024
e6d6fa2
Merge branch 'main' into adapterplus
julian-fong Nov 3, 2024
a7f7705
updates
julian-fong Nov 3, 2024
e99304b
Merge branch 'adapterplus' of github.com:julian-fong/adapters into ad…
julian-fong Nov 3, 2024
89ff2b4
updates
julian-fong Nov 3, 2024
66093ec
re-added new optional dependency torchvision
julian-fong Nov 3, 2024
21f8aad
fixed typo
julian-fong Nov 3, 2024
b770279
added notebook
julian-fong Nov 23, 2024
ef7a7dd
updated readme
julian-fong Nov 23, 2024
1abcdac
fixed code formatting on modeling.py
julian-fong Nov 24, 2024
e896c52
Merge branch 'adapter-hub:main' into adapterplus
julian-fong Nov 24, 2024
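Several commits above (the `drop_path` additions, the `StochasticDepth` fix, and the replacement of a local `vision.py` with torchvision's implementation) revolve around stochastic depth, the drop-path regularization added to the AdapterPlus config. As a rough sketch of the mechanism only — the wiring inside the PR's actual adapter module is not shown here — `torchvision.ops.StochasticDepth` randomly zeroes a residual branch per sample during training:

```python
import torch
from torchvision.ops import StochasticDepth

# In "row" mode, each sample in the batch independently keeps or drops
# the branch during training; at eval time the module is the identity.
drop_path = StochasticDepth(p=0.1, mode="row")

hidden = torch.randn(4, 197, 768)       # (batch, tokens, dim), ViT-sized
adapter_out = torch.randn(4, 197, 768)  # stand-in for an adapter's output

# Residual connection with the adapter branch stochastically dropped.
output = hidden + drop_path(adapter_out)
```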
6 changes: 3 additions & 3 deletions .github/workflows/tests_torch.yml

@@ -63,7 +63,7 @@ jobs:
       - name: Install
         run: |
           pip install torch==2.3
-          pip install .[sklearn,testing,sentencepiece]
+          pip install .[sklearn,testing,sentencepiece,torchvision]
       - name: Test
         run: |
           make test-adapter-methods
@@ -86,7 +86,7 @@ jobs:
       - name: Install
         run: |
           pip install torch==2.3
-          pip install .[sklearn,testing,sentencepiece]
+          pip install .[sklearn,testing,sentencepiece,torchvision]
       - name: Test
         run: |
           make test-adapter-models
@@ -109,7 +109,7 @@ jobs:
       - name: Install
         run: |
           pip install torch==2.3
-          pip install .[sklearn,testing,sentencepiece]
+          pip install .[sklearn,testing,sentencepiece,torchvision]
           pip install conllu seqeval
       - name: Test Examples
         run: |
3 changes: 3 additions & 0 deletions docs/classes/adapter_config.rst

@@ -34,6 +34,9 @@ Single (bottleneck) adapters
 .. autoclass:: adapters.CompacterPlusPlusConfig
    :members:

+.. autoclass:: adapters.AdapterPlusConfig
+   :members:
+
 Prefix Tuning
 ~~~~~~~~~~~~~~~~~~~~~~~
3 changes: 2 additions & 1 deletion docs/methods.md

@@ -42,7 +42,7 @@ A visualization of further configuration options related to the adapter structure
 - [`DoubleSeqBnConfig`](adapters.DoubleSeqBnConfig), as proposed by [Houlsby et al. (2019)](https://arxiv.org/pdf/1902.00751.pdf) places adapter layers after both the multi-head attention and feed-forward block in each Transformer layer.
 - [`SeqBnConfig`](adapters.SeqBnConfig), as proposed by [Pfeiffer et al. (2020)](https://arxiv.org/pdf/2005.00052.pdf) places an adapter layer only after the feed-forward block in each Transformer layer.
 - [`ParBnConfig`](adapters.ParBnConfig), as proposed by [He et al. (2021)](https://arxiv.org/pdf/2110.04366.pdf) places adapter layers in parallel to the original Transformer layers.
-
+- [`AdapterPlusConfig`](adapters.AdapterPlusConfig), as proposed by [Steitz and Roth (2024)](https://arxiv.org/pdf/2406.06820) places adapter layers after the multi-head attention block and uses channel-wise scaling and Houlsby weight initialization.
 _Example_:
 ```python
 from adapters import BnConfig
@@ -56,6 +56,7 @@ _Papers:_
 * [Parameter-Efficient Transfer Learning for NLP](https://arxiv.org/pdf/1902.00751.pdf) (Houlsby et al., 2019)
 * [Simple, Scalable Adaptation for Neural Machine Translation](https://arxiv.org/pdf/1909.08478.pdf) (Bapna and Firat, 2019)
 * [AdapterFusion: Non-Destructive Task Composition for Transfer Learning](https://aclanthology.org/2021.eacl-main.39.pdf) (Pfeiffer et al., 2021)
+* [Adapters Strike Back](https://arxiv.org/pdf/2406.06820) (Steitz and Roth, 2024)
 * [AdapterHub: A Framework for Adapting Transformers](https://arxiv.org/pdf/2007.07779.pdf) (Pfeiffer et al., 2020)

 ## Language Adapters - Invertible Adapters
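The new `AdapterPlusConfig` is used like any other bottleneck adapter config. A minimal sketch of attaching it to a ViT model, assuming the standard `adapters` workflow — the checkpoint and adapter name here are illustrative, not taken from the PR:

```python
import adapters
from adapters import AdapterPlusConfig
from transformers import ViTForImageClassification

# Illustrative checkpoint; any ViT classification model should work.
model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")

# Make the plain transformers model adapter-compatible.
adapters.init(model)

# Add an AdapterPlus bottleneck adapter and activate it for training;
# train_adapter() also freezes the pretrained backbone weights.
model.add_adapter("adapterplus", config=AdapterPlusConfig())
model.train_adapter("adapterplus")
```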
1 change: 1 addition & 0 deletions notebooks/README.md

@@ -35,3 +35,4 @@ As adapters is fully compatible with HuggingFace's Transformers, you can also use
 | [NER on Wikiann](https://github.com/Adapter-Hub/adapters/blob/main/notebooks/08_NER_Wikiann.ipynb) | Evaluating adapters on NER on the wikiann dataset | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/08_NER_Wikiann.ipynb) |
 | [Finetuning Whisper with Adapters](https://github.com/Adapter-Hub/adapters/blob/main/notebooks/Adapter_Whisper_Audio_FineTuning.ipynb) | Fine Tuning Whisper using LoRA | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/Adapter_Whisper_Audio_FineTuning.ipynb) |
 | [Adapter Training with ReFT](https://github.com/Adapter-Hub/adapters/blob/main/notebooks/ReFT_Adapters_Finetuning.ipynb) | Fine Tuning using ReFT Adapters | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/ReFT_Adapters_Finetuning.ipynb) |
+| [ViT Fine-Tuning with AdapterPlus](https://github.com/Adapter-Hub/adapters/blob/main/notebooks/ViT_AdapterPlus_FineTuning.ipynb) | ViT Fine-Tuning with AdapterPlus | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/ViT_AdapterPlus_FineTuning.ipynb) |