adapter-transformers


Name: adapter-transformers
Version: 3.2.1.post0
Home page: https://github.com/adapter-hub/adapter-transformers
Summary: A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
Upload time: 2023-12-16 14:14:07
Author: Jonas Pfeiffer, Andreas Rücklé, Clifton Poth, Hannah Sterz, Leon Engländer, based on work by the HuggingFace team and community
Requires Python: >=3.8.0
License: Apache
Keywords: nlp, deep learning, transformer, pytorch, bert, adapters
            <!---
Copyright 2020 The AdapterHub Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

## IMPORTANT NOTE

This is the legacy `adapter-transformers` library, which has been replaced by the new **Adapters** library, found at https://github.com/adapter-hub/adapters.

Install the new library via pip: `pip install adapters`.

This repository is kept for archival purposes, and will not be updated in the future.
Please use the new library for all active projects.

The documentation of this library can be found at https://docs-legacy.adapterhub.ml.
The documentation of the new _Adapters_ library can be found at https://docs.adapterhub.ml.
For transitioning, please read: https://docs.adapterhub.ml/transitioning.html.
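
To illustrate the main difference between the two libraries, here is a minimal sketch (based on the `AutoAdapterModel` and `adapters.init()` entry points documented for the respective libraries; the model and adapter names are only examples, and the two halves assume different installed packages):

```
# Legacy adapter-transformers: adapter support is built into the
# fork's own `transformers` package.
from transformers import AutoAdapterModel  # provided by the fork

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter")

# New Adapters library: a standalone `adapters` package that attaches
# adapter support to an unmodified HuggingFace Transformers model.
import adapters
from transformers import AutoModel

base = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(base)  # adds add_adapter(), train_adapter(), etc.
base.add_adapter("my_adapter")
```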

---

<p align="center">
<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapter-transformers/master/adapter_docs/logo.png" />
</p>
<h1 align="center">
<span>adapter-transformers</span>
</h1>

<h3 align="center">
A friendly fork of HuggingFace's <i>Transformers</i>, adding Adapters to PyTorch language models
</h3>

![Tests](https://github.com/Adapter-Hub/adapter-transformers/workflows/Tests/badge.svg)
[![GitHub](https://img.shields.io/github/license/adapter-hub/adapter-transformers.svg?color=blue)](https://github.com/adapter-hub/adapter-transformers/blob/master/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/adapter-transformers)](https://pypi.org/project/adapter-transformers/)

`adapter-transformers` is an extension of [HuggingFace's Transformers](https://github.com/huggingface/transformers) library, integrating adapters into state-of-the-art language models by incorporating **[AdapterHub](https://adapterhub.ml)**, a central repository for pre-trained adapter modules.

_💡 Important: This library can be used as a drop-in replacement for HuggingFace Transformers and regularly synchronizes new upstream changes.
Thus, most files in this repository are direct copies from the HuggingFace Transformers source, modified only with changes required for the adapter implementations._

## Installation

`adapter-transformers` currently supports **Python 3.8+** and **PyTorch 1.12.1+**.
After [installing PyTorch](https://pytorch.org/get-started/locally/), you can install `adapter-transformers` from PyPI ...

```
pip install -U adapter-transformers
```

... or from source by cloning the repository:

```
git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .
```
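
As a quick sanity check of the installation (a hedged one-liner; in this fork, adapter support lives under the `transformers.adapters` submodule):

```
python -c "import transformers.adapters; print(transformers.__version__)"
```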

## Getting Started

HuggingFace's great documentation on getting started with _Transformers_ can be found [here](https://huggingface.co/transformers/index.html). `adapter-transformers` is fully compatible with _Transformers_.

To get started with adapters, refer to these locations:

- **[Colab notebook tutorials](https://github.com/Adapter-Hub/adapter-transformers/tree/master/notebooks)**, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
- **https://docs-legacy.adapterhub.ml**, our documentation on training and using adapters with _adapter-transformers_
- **https://adapterhub.ml** to explore available pre-trained adapter modules and share your own adapters
- **[Examples folder](https://github.com/Adapter-Hub/adapter-transformers/tree/master/examples/pytorch)** of this repository containing HuggingFace's example training scripts, many adapted for training adapters
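
For orientation, here is a minimal training-setup sketch using the legacy API (`AutoAdapterModel`, `add_adapter`, `add_classification_head`, and `train_adapter`, as covered in the legacy docs; the adapter name and label count are illustrative):

```
from transformers import AutoAdapterModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Add a new bottleneck adapter plus a matching classification head.
model.add_adapter("sentiment")
model.add_classification_head("sentiment", num_labels=2)

# Freeze the pre-trained weights and mark only the adapter as
# trainable; this also activates the adapter for forward passes.
model.train_adapter("sentiment")

# After training, the adapter can be saved and shared on its own.
model.save_adapter("./sentiment_adapter", "sentiment")
```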

## Implemented Methods

Currently, `adapter-transformers` integrates the adapter methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | [Houlsby et al. (2019)](https://arxiv.org/pdf/1902.00751.pdf)<br> [Bapna and Firat (2019)](https://arxiv.org/pdf/1909.08478.pdf) | [Quickstart](https://docs.adapterhub.ml/quickstart.html), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/01_Adapter_Training.ipynb) |
| AdapterFusion | [Pfeiffer et al. (2021)](https://aclanthology.org/2021.eacl-main.39.pdf) | [Docs: Training](https://docs.adapterhub.ml/training.html#train-adapterfusion), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/03_Adapter_Fusion.ipynb) |
| MAD-X,<br> Invertible adapters | [Pfeiffer et al. (2020)](https://aclanthology.org/2020.emnlp-main.617/) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/04_Cross_Lingual_Transfer.ipynb) |
| AdapterDrop | [Rücklé et al. (2021)](https://arxiv.org/pdf/2010.11918.pdf) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/05_Adapter_Drop_Training.ipynb) |
| MAD-X 2.0,<br> Embedding training | [Pfeiffer et al. (2021)](https://arxiv.org/pdf/2012.15562.pdf) | [Docs: Embeddings](https://docs.adapterhub.ml/embeddings.html), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapter-transformers/blob/master/notebooks/08_NER_Wikiann.ipynb) |
| Prefix Tuning | [Li and Liang (2021)](https://arxiv.org/pdf/2101.00190.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#prefix-tuning) |
| Parallel adapters,<br> Mix-and-Match adapters | [He et al. (2021)](https://arxiv.org/pdf/2110.04366.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#mix-and-match-adapters) |
| Compacter | [Mahabadi et al. (2021)](https://arxiv.org/pdf/2106.04647.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#compacter) |
| LoRA | [Hu et al. (2021)](https://arxiv.org/pdf/2106.09685.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#lora) |
| (IA)^3 | [Liu et al. (2022)](https://arxiv.org/pdf/2205.05638.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#ia-3) |
| UniPELT | [Mao et al. (2022)](https://arxiv.org/pdf/2110.07577.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#unipelt) |
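
Which method `add_adapter()` uses is determined by the config object passed to it. A minimal sketch with a few of the config classes exposed under `transformers.adapters` in this fork (the hyperparameter values are illustrative only):

```
from transformers import AutoAdapterModel
from transformers.adapters import HoulsbyConfig, LoRAConfig, PrefixTuningConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# Bottleneck adapter (Houlsby et al., 2019)
model.add_adapter("bottleneck", config=HoulsbyConfig())

# LoRA (Hu et al., 2021), with an example rank and scaling factor
model.add_adapter("lora", config=LoRAConfig(r=8, alpha=16))

# Prefix tuning (Li and Liang, 2021)
model.add_adapter("prefix", config=PrefixTuningConfig(prefix_length=30))
```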

## Supported Models

We currently support the PyTorch versions of all models listed on the **[Model Overview](https://docs.adapterhub.ml/model_overview.html) page** in our documentation.

## Citation

If you use this library for your work, please consider citing our paper [AdapterHub: A Framework for Adapting Transformers](https://arxiv.org/abs/2007.07779):

```
@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}
```



            
