<!---
Copyright 2020-2024 The AdapterHub Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
## ⚠️ IMPORTANT NOTE ⚠️
This is the legacy `adapter-transformers` library, which has been replaced by the new **_Adapters_ library, found here: https://github.com/adapter-hub/adapters**.
- **⚠️ Beginning with version 4.0.0, the `adapter-transformers` package automatically installs the latest `adapters` package instead. ⚠️**
  From this version on, the `adapter-transformers` package no longer contains any functionality of its own.
- Older versions of `adapter-transformers` are kept for archival purposes only and should not be used for active projects.
Install the new library directly via: `pip install adapters`.
The documentation for the legacy `adapter-transformers` library can be found at https://docs-legacy.adapterhub.ml.
The documentation for the new _Adapters_ library can be found at https://docs.adapterhub.ml.
For help with transitioning, please read: https://docs.adapterhub.ml/transitioning.html.
---
<p align="center">
<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapters/main/docs/img/adapter-bert.png" width="80" />
</p>
<h1 align="center">
<span><i>Adapters</i></span>
</h1>
<h3 align="center">
A Unified Library for Parameter-Efficient and Modular Transfer Learning
</h3>
<h3 align="center">
<a href="https://adapterhub.ml">🌍 Website</a>
•
<a href="https://github.com/Adapter-Hub/adapters">💻 GitHub</a>
•
<a href="https://docs.adapterhub.ml">📚 Docs</a>
•
<a href="https://arxiv.org/abs/2311.11077">📜 Paper</a>
•
<a href="https://github.com/Adapter-Hub/adapters/tree/main/notebooks">🧪 Tutorials</a>
</h3>
_Adapters_ is an add-on library to [HuggingFace's Transformers](https://github.com/huggingface/transformers), integrating [various adapter methods](https://docs.adapterhub.ml/overview.html) into [state-of-the-art pre-trained language models](https://docs.adapterhub.ml/model_overview.html) with minimal coding overhead for training and inference.
```bash
pip install adapters
```