
Commit

Add new adapters library paper
calpt committed Mar 31, 2024
1 parent b5ecda2 commit a6df7ee
Showing 3 changed files with 51 additions and 2 deletions.
README.md (15 changes: 14 additions & 1 deletion)
@@ -1,6 +1,6 @@
# Awesome Adapter Resources

- ![](https://img.shields.io/badge/Resources-63-blue)
+ ![](https://img.shields.io/badge/Resources-64-blue)

This repository collects important tools and papers related to adapter methods for recent large pre-trained neural networks.

@@ -52,6 +52,19 @@ Using adapters provides multiple benefits. They are ...

[[Paper PDF]](https://arxiv.org/pdf/2007.07779)  [[Code]](https://github.com/adapter-hub/adapter-transformers)  [[Website]](https://adapterhub.ml)  [[Semantic Scholar]](https://www.semanticscholar.org/paper/063f8b1ecf2394ca776ac61869734de9c1953808)

- **Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning**  ![GitHub Repo stars](https://img.shields.io/github/stars/adapter-hub/adapters?color=yellow&logo=github)

Conference on Empirical Methods in Natural Language Processing

_Clifton A. Poth, Hannah Sterz, Indraneil Paul, Sukannya Purkayastha, Leon Arne Engländer, Timo Imhof, Ivan Vulić, Sebastian Ruder, Iryna Gurevych, Jonas Pfeiffer_ (2023)

<details>
<summary>TLDR</summary>
Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models and allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups, is introduced.
</details>

[[Paper PDF]](https://arxiv.org/pdf/2311.11077.pdf)&nbsp; [[Code]](https://github.com/adapter-hub/adapters)&nbsp; [[Semantic Scholar]](https://www.semanticscholar.org/paper/e1f4b94479bfcb735a1a0add178a2337def07c9b)
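
  To give a flavor of what the library looks like in practice, here is a minimal sketch of the adapter workflow (based on the `adapters` package's documented interface; the model checkpoint and adapter names are illustrative, not taken from the paper):

  ```python
  # Minimal sketch of the Adapters workflow (illustrative names, not from the paper).
  from transformers import AutoModelForSequenceClassification

  import adapters
  from adapters.composition import Stack

  # Start from any Hugging Face Transformers checkpoint and retrofit adapter support.
  model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)
  adapters.init(model)

  # Add a bottleneck adapter and freeze the backbone; only adapter weights are trained.
  model.add_adapter("task_adapter", config="seq_bn")
  model.train_adapter("task_adapter")

  # Composition blocks (e.g. Stack) wire multiple adapters into one setup,
  # such as stacking a language adapter under a task adapter.
  model.add_adapter("lang_adapter", config="seq_bn")
  model.active_adapters = Stack("lang_adapter", "task_adapter")
  ```

  Composition blocks such as `Stack` are what the TLDR above means by "complex adapter setups".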

- **OpenDelta**&nbsp; ![GitHub Repo stars](https://img.shields.io/github/stars/thunlp/OpenDelta?color=yellow&logo=github)


data/tools.yaml (7 changes: 6 additions & 1 deletion)
@@ -6,8 +6,13 @@ items:
      acl_id: "2020.emnlp-demos.7"
      arxiv_id: "2007.07779"
      pdf: https://arxiv.org/pdf/2007.07779
    code: https://github.com/adapter-hub/adapter-transformers
    website: https://adapterhub.ml
  - title: "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning"
    paper:
      acl_id: "2023.emnlp-demo.13"
      semantic_scholar_id: e1f4b94479bfcb735a1a0add178a2337def07c9b
      pdf: https://arxiv.org/pdf/2311.11077.pdf
    code: https://github.com/adapter-hub/adapters
  - title: "OpenDelta"
    code: https://github.com/thunlp/OpenDelta
    website: https://opendelta.readthedocs.io/
scripts/build.cache.json (31 changes: 31 additions & 0 deletions)
@@ -1736,5 +1736,36 @@
"MoV",
"MoLoRA"
]
},
"e1f4b94479bfcb735a1a0add178a2337def07c9b": {
"title": "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning",
"paper": {
"acl_id": "2023.emnlp-demo.13",
"semantic_scholar_id": "e1f4b94479bfcb735a1a0add178a2337def07c9b",
"pdf": "https://arxiv.org/pdf/2311.11077.pdf",
"paperId": "e1f4b94479bfcb735a1a0add178a2337def07c9b",
"url": "https://www.semanticscholar.org/paper/e1f4b94479bfcb735a1a0add178a2337def07c9b",
"title": "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning",
"abstract": "We introduce Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models. By integrating 10 diverse adapter methods into a unified interface, Adapters offers ease of use and flexible configuration. Our library allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups. We demonstrate the library's efficacy by evaluating its performance against full fine-tuning on various NLP tasks. Adapters provides a powerful tool for addressing the challenges of conventional fine-tuning paradigms and promoting more efficient and modular transfer learning. The library is available via https://adapterhub.ml/adapters.",
"venue": "Conference on Empirical Methods in Natural Language Processing",
"year": 2023,
"tldr": {
"model": "[email protected]",
"text": "Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models and allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups, is introduced."
},
"authors": [
"Clifton A. Poth",
"Hannah Sterz",
"Indraneil Paul",
"Sukannya Purkayastha",
"Leon Arne Engl\u00e4nder",
"Timo Imhof",
"Ivan Vuli'c",
"Sebastian Ruder",
"Iryna Gurevych",
"Jonas Pfeiffer"
]
},
"code": "https://github.com/adapter-hub/adapters"
}
}
