curated-transformers

Name: curated-transformers
Version: 1.3.1 (PyPI)
Home page: https://github.com/explosion/curated-transformers
Summary: A PyTorch library of transformer models and components
Upload time: 2024-02-12 18:48:41
Author: Explosion
Requires Python: >=3.8
License: MIT
Requirements: catalogue (>=2.0.4,<2.1.0), curated-tokenizers (>=0.9.1,<1.0.0), huggingface-hub (>=0.14), tokenizers (>=0.13.3), torch (>=1.12.0), mypy (>=1.5.0,<1.6.0), pytest
            <img src="docs/source/logo.png" width="100" align="right"/>

# Curated Transformers

[![Documentation Status](https://readthedocs.org/projects/curated-transformers/badge/?version=latest)](https://curated-transformers.readthedocs.io/en/latest/?badge=latest)
[![pypi Version](https://img.shields.io/pypi/v/curated-transformers.svg?style=flat-square&logo=pypi&logoColor=white)](https://pypi.org/project/curated-transformers/)

**State-of-the-art transformers, brick by brick**

Curated Transformers is a transformer library for PyTorch. It provides
state-of-the-art models composed from a set of reusable
components. The stand-out features of Curated Transformers are:

- ⚡️ Supports state-of-the-art transformer models, including LLMs such
  as Falcon, Llama, and Dolly v2.
- 👩‍🎨 Each model is composed from a set of reusable building blocks,
  providing many benefits:
  - Implementing a feature or bugfix benefits all models. For example,
    all models support 4/8-bit inference through the
    [`bitsandbytes`](https://github.com/TimDettmers/bitsandbytes) library
    and each model can use the PyTorch `meta` device to avoid unnecessary
    allocations and initialization.
  - Adding new models to the library is low-effort.
  - Want to try a new transformer architecture, such as a BERT encoder
    with rotary embeddings? You can assemble it from the existing
    building blocks in a pinch.
- 💎 Consistent type annotations of all public APIs:
  - Get great coding support from your IDE.
  - Integrates well with your existing type-checked code.
- 🎓 Great for education, because the building blocks are easy to study.
- 📦 Minimal dependencies.
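
The `meta` device mentioned above is a standard PyTorch feature, not something specific to this library. As a minimal illustration in plain PyTorch, a module created on `meta` carries shapes and dtypes but allocates no storage, which is what lets a large model be constructed first and materialized later:

```python
import torch

# A layer on the `meta` device records metadata (shape, dtype) only;
# no weight memory is actually allocated.
layer = torch.nn.Linear(4096, 4096, device="meta")

print(layer.weight.is_meta)  # True: no storage was allocated
print(layer.weight.shape)    # torch.Size([4096, 4096])
```

The weights can later be materialized on a real device (e.g. when loading a checkpoint), skipping the wasteful random initialization that would otherwise happen first on CPU.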

Curated Transformers has been production-tested by [Explosion](http://explosion.ai/)
and will be used as the default transformer implementation in spaCy 3.7.

## 🧰 Supported Model Architectures

Supported encoder-only models:

- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa

Supported decoder-only models:

- Falcon
- GPT-NeoX
- Llama 1/2
- MPT

Generator wrappers:

- Dolly v2
- Falcon
- Llama 1/2
- MPT

All model types can be loaded from the Hugging Face Hub.

spaCy integration for Curated Transformers is provided by the
[`spacy-curated-transformers`](https://github.com/explosion/spacy-curated-transformers)
package.

## ⏳ Install

```bash
pip install curated-transformers
```

### CUDA support

The default Linux build of PyTorch is built with CUDA 11.7 support. You should
explicitly install a CUDA build in the following cases:

- If you want to use Curated Transformers on Windows.
- If you want to use Curated Transformers on Linux with Ada-generation GPUs.
  The standard PyTorch build supports Ada GPUs, but you can get considerable
  performance improvements by installing PyTorch with CUDA 11.8 support.

In both cases, you can install PyTorch with:

```bash
pip install torch --index-url https://download.pytorch.org/whl/cu118
```
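
To check which CUDA version your installed PyTorch build targets, you can inspect it from Python (plain PyTorch, no Curated Transformers required):

```python
import torch

# The CUDA toolkit version the installed wheel was built against,
# e.g. "11.8", or None for a CPU-only build.
print(torch.version.cuda)

# Whether a usable CUDA device is actually visible at runtime.
print(torch.cuda.is_available())
```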

## 🏃‍♀️ Usage Example

```python-console
>>> import torch
>>> from curated_transformers.generation import AutoGenerator, GreedyGeneratorConfig
>>> generator = AutoGenerator.from_hf_hub(name="tiiuae/falcon-7b-instruct", device=torch.device("cuda"))
>>> generator(["What is Python in one sentence?", "What is Rust in one sentence?"], GreedyGeneratorConfig())
['Python is a high-level programming language that is easy to learn and widely used for web development, data analysis, and automation.',
 'Rust is a programming language that is designed to be a safe, concurrent, and efficient replacement for C++.']
```

You can find more [usage examples](https://curated-transformers.readthedocs.io/en/latest/usage.html)
in the documentation. You can also find example programs that use Curated Transformers in the
[`examples`](examples/) directory.

## 📚 Documentation

You can read more about how to use Curated Transformers here:

- [Overview](https://curated-transformers.readthedocs.io/en/v1.3.x/) ([Development](https://curated-transformers.readthedocs.io/en/latest/))
- [Usage](https://curated-transformers.readthedocs.io/en/v1.3.x/usage.html) ([Development](https://curated-transformers.readthedocs.io/en/latest/usage.html))
- [API](https://curated-transformers.readthedocs.io/en/v1.3.x/api.html) ([Development](https://curated-transformers.readthedocs.io/en/latest/api.html))

## 🗜️ Quantization

`curated-transformers` supports dynamic 8-bit and 4-bit quantization of models by leveraging the [`bitsandbytes` library](https://github.com/TimDettmers/bitsandbytes).

Use the `quantization` extra to automatically install the necessary dependencies:

```bash
pip install curated-transformers[quantization]
```
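
As a sketch, loading a model with dynamic 8-bit quantization might look like the following. The `quantization_config` parameter and the `BitsAndBytesConfig` helper are based on the library's documented quantization support, but treat the exact import path and method names as assumptions to verify against the documentation for your version; running it also requires a CUDA device and the `quantization` extra installed.

```python
import torch
from curated_transformers.generation import AutoGenerator
from curated_transformers.quantization import BitsAndBytesConfig

# Load Falcon with dynamic 8-bit quantization via bitsandbytes.
generator = AutoGenerator.from_hf_hub(
    name="tiiuae/falcon-7b-instruct",
    device=torch.device("cuda"),
    quantization_config=BitsAndBytesConfig.for_8bit(),
)
```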

            
