transformer-builder

Name: transformer-builder
Version: 0.1.0
Home page: https://github.com/MrKekovich/transformer-builder
Summary: This package provides a builder-like API for creating flexible transformers using PyTorch.
Upload time: 2023-10-30 08:15:46
Author: MrKekovich
Requires Python: >=3.8,<3.12
License: BSD-3-Clause
Keywords: pytorch, transformer, ai
Requirements: No requirements were recorded.
# transformer-builder

[![License](https://img.shields.io/badge/license-BSD-blue.svg)](https://github.com/MrKekovich/transformer-builder/blob/master/LICENSE)
[![Python Version](https://img.shields.io/badge/python-3.9%2B-blue.svg)](https://www.python.org/downloads/)
[![Project Status](https://img.shields.io/badge/Project%20Status-pre--alpha-blue)](https://github.com/MrKekovich/transformer-builder/)

---

> Make your own transformers with ease.

Transformers have become a popular choice for a wide range of Natural Language Processing (NLP) and deep learning tasks.
The transformer-builder package allows you to create custom transformer models with ease, providing flexibility and
modularity for your deep learning projects.

---

## Features

- Build custom transformer models with a user-friendly and flexible interface (see the minimal sketch after this list).
- Configurable encoder and decoder blocks with support for custom self-attention mechanisms.
- Encapsulated self-attention blocks that adapt to your specific use case.
- Open-source and customizable to fit your project's requirements.
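
As a first taste of that interface, here is a minimal sketch relying only on defaults; it assumes the default-construction behavior noted in the comments of the usage example below (every projection defaults to `nn.Identity`, and a bare `SelfAttention()` is valid):

```python
import torch

from transformer_builder.attention import SelfAttention

# With no arguments, SelfAttention computes scaled dot-product
# attention of the input with itself, using identity Q/K/V projections.
attention = SelfAttention()
output = attention(torch.randn(8, 512))  # shape preserved: (8, 512)
```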

---

## Installation

You can install transformer-builder using pip:

```bash
pip install transformer-builder
```

---

## Usage

Here's an example of how to use transformer-builder to create a custom model:

```python
import torch
from torch import nn

from transformer_builder.attention import SelfAttention, MultiHeadAttention
from transformer_builder.layers import ResidualConnection

vocab_size = 16_000
embedding_dim = 512
num_heads = 4
num_blocks = 3
d_head = embedding_dim // num_heads

blocks = [
    MultiHeadAttention(
        layer_before=nn.Linear(embedding_dim, embedding_dim),
        self_attention_heads=[
            SelfAttention(
                q_architecture=nn.Linear(embedding_dim, d_head),  # Default: nn.Identity
                k_architecture=nn.Linear(embedding_dim, d_head),
                v_architecture=nn.Linear(embedding_dim, d_head),
            ),
            SelfAttention(
                # Computes scaled dot-product attention over the original input
                # and passes the result to the linear layer.
                layer_after=nn.Linear(embedding_dim, d_head),
            ),
            SelfAttention(
                layer_after=nn.Linear(embedding_dim, d_head),
            ),
            SelfAttention(
                # A more exotic attention architecture:
                layer_before=SelfAttention(),
                # self_attention_heads defaults to a single default head.
                layer_after=MultiHeadAttention(
                    layer_after=nn.Linear(embedding_dim, d_head),
                ),
            ),
        ],
    )
    for _ in range(num_blocks)
]

gpt = nn.Sequential(
    # nn.Embedding(vocab_size, embedding_dim),  # For simplicity, we use random embeddings instead.
    # ResidualConnection adds the original input to the module's output and applies normalization.
    *[
        ResidualConnection(
            module=multi_head_attention,
            normalization=nn.LayerNorm(embedding_dim),
        )
        for multi_head_attention in blocks
    ],
)

gpt(torch.randn(8, embedding_dim))

```
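
The final call runs a forward pass on a batch of 8 random 512-dimensional embeddings. Because every block is wrapped in a `ResidualConnection` with `nn.LayerNorm(embedding_dim)`, the output should keep the input's `(8, 512)` shape.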

---

## Customization

With transformer-builder, you can customize each aspect of your blocks individually,
allowing for fine-grained control over your model's architecture.
The example above demonstrates how to configure the self-attention layer,
layer normalization, and linear layers.
You can even nest an encoder inside a decoder inside a self-attention block!
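
For instance, here is a sketch of attention nested inside attention. It uses only the `SelfAttention` and `MultiHeadAttention` arguments shown in the usage example above, and it assumes that any `nn.Module` can serve as a Q/K/V projection (as the nesting in that example suggests) and that a multi-head block concatenates its heads' outputs:

```python
import torch
from torch import nn

from transformer_builder.attention import MultiHeadAttention, SelfAttention

embedding_dim = 512
d_head = embedding_dim // 2

# A self-attention block whose query projection is itself a
# two-head attention module: attention nested inside attention.
nested = SelfAttention(
    q_architecture=MultiHeadAttention(
        # Two heads of size d_head; concatenated, they restore embedding_dim.
        self_attention_heads=[
            SelfAttention(layer_after=nn.Linear(embedding_dim, d_head)),
            SelfAttention(layer_after=nn.Linear(embedding_dim, d_head)),
        ],
    ),
    k_architecture=nn.Linear(embedding_dim, embedding_dim),
    v_architecture=nn.Linear(embedding_dim, embedding_dim),
)

nested(torch.randn(8, embedding_dim))
```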

---

## Contributing

If you would like to contribute to this project, please follow our
[contribution guidelines](https://github.com/MrKekovich/transformer-builder/blob/master/CONTRIBUTING.md).

---

## Support and Feedback

If you have questions, encounter issues, or have feedback, please open an issue on our
[GitHub repository](https://github.com/MrKekovich/transformer-builder).

---

## Acknowledgments

This project was inspired by the need for a flexible and customizable API for creating
decoder blocks in deep learning models.

---

## Author

[MrKekovich](https://github.com/MrKekovich)

---

## License

This project is licensed under the [BSD-3-Clause](https://opensource.org/license/bsd-3-clause/) License.
See the [LICENSE](https://github.com/MrKekovich/transformer-builder/blob/master/LICENSE) file for details.

            

Raw data

{
    "_id": null,
    "home_page": "https://github.com/MrKekovich/transformer-builder",
    "name": "transformer-builder",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.8,<3.12",
    "maintainer_email": "",
    "keywords": "pytorch,transformer,ai",
    "author": "MrKekovich",
    "author_email": "mrkekovich.official@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/0d/53/541ad62506616a6ffdf6122d5428c756bb7fd77c7d88ba1d14b92b984ae9/transformer_builder-0.1.0.tar.gz",
    "platform": null,
    "description": "# transformer-builder\n\n[![License](https://img.shields.io/badge/license-BSD-blue.svg)](https://github.com/MrKekovich/transformer-builder/blob/master/LICENSE)\n[![Python Version](https://img.shields.io/badge/python-3.9%2B-blue.svg)](https://www.python.org/downloads/)\n[![Project Status](https://img.shields.io/badge/Project%20Status-pre--alpha-blue)](https://github.com/MrKekovich/transformer-builder/)\n\n---\n\n> Make your own transformers with ease.\n\nTransformers have become a popular choice for a wide range of Natural Language Processing (NLP) and deep learning tasks.\nThe transformer-builder package allows you to create custom transformer models with ease, providing flexibility and\nmodularity for your deep learning projects.\n\n---\n\n## Features\n\n- Build custom transformer models with a user-friendly and flexible interface.\n- Configurable encoder and decoder blocks with support for custom self-attention mechanisms.\n- Encapsulated self-attention blocks that adapt to your specific use case.\n- Open-source and customizable to fit your project's requirements.\n\n---\n\n## Installation\n\nYou can install transformer-builder using pip:\n\n```bash\npip install transformer-builder\n```\n\n---\n\n## Usage\n\nHere's an example of how to use Transformer Builder to create a custom model:\n\n```python\nimport torch\nfrom torch import nn\n\nfrom transformer_builder.attention import SelfAttention, MultiHeadAttention\nfrom transformer_builder.layers import ResidualConnection\n\nvocab_size = 16_000\nembedding_dim = 512\nnum_heads = 8\nd_head = embedding_dim // num_heads\n\nvocab_size = 16_000\nembedding_dim = 512\nnum_heads = 4\nnum_blocks = 3\nd_head = embedding_dim // num_heads\n\nblocks = [MultiHeadAttention(\n    layer_before=nn.Linear(embedding_dim, embedding_dim),\n    self_attention_heads=[\n        SelfAttention(\n            q_architecture=nn.Linear(embedding_dim, d_head),  # Default: nn.Identity\n            k_architecture=nn.Linear(embedding_dim, d_head),\n            v_architecture=nn.Linear(embedding_dim, d_head),\n        ),\n        SelfAttention(\n            # This will calculate scaled dot product attention of original inputs\n            # And pass the result to the linear layer\n            layer_after=nn.Linear(embedding_dim, d_head),\n        ),\n        SelfAttention(\n            layer_after=nn.Linear(embedding_dim, d_head),\n        ),\n        SelfAttention(\n            # Now some exotic attention architecture\n            layer_before=SelfAttention(),\n            # The default value for self_attention_heads is single default head\n            layer_after=MultiHeadAttention(\n                layer_after=nn.Linear(embedding_dim, d_head),\n            )\n        )\n    ]\n)\n    for _ in range(num_blocks)]\n\ngpt = nn.Sequential(\n    # nn.Embedding(vocab_size, embedding_dim), for simplicity, we will use random embeddings\n    # ResidualConnection will add original input to the output of the module and apply normalization\n    *[ResidualConnection(\n        module=multi_head_attention,\n        normalization=nn.LayerNorm(embedding_dim)\n    ) for multi_head_attention in blocks],\n)\n\ngpt(torch.randn(8, embedding_dim))\n\n```\n\n---\n\n## Customization\n\nWith transformer-builder, you can customize each aspect of your blocks individually,\nallowing for fine-grained control over your model's architecture.\nThe example above demonstrates how to configure the self-attention layer,\nlayer normalization, and linear layers.\nYou can go crazy and create 
encoder inside decoder inside self-attention!\n\n---\n\n## Contributing\n\nIf you would like to contribute to this project, please follow our\n[contribution guidelines](https://github.com/MrKekovich/transformer-builder/blob/master/CONTRIBUTING.md).\n\n---\n\n## Support and Feedback\n\nIf you have questions, encounter issues, or have feedback, please open an issue on our\n[GitHub repository](https://github.com/MrKekovich/transformer-builder).\n\n---\n\n## Acknowledgments\n\nThis project was inspired by the need for a flexible and customizable API for creating\ndecoder blocks in deep learning models.\n\n---\n\n## Author\n\n[MrKekovich](https://github.com/MrKekovich)\n\n---\n\n## License\n\nThis project is licensed under the [BSD-3-Clause](https://opensource.org/license/bsd-3-clause/) License.\nSee the [LICENSE](https://github.com/MrKekovich/transformer-builder/blob/master/LICENSE) file for details.\n",
    "bugtrack_url": null,
    "license": "BSD-3-Clause",
    "summary": "This package provides builder-like API to create really flexible transformers using PyTorch",
    "version": "0.1.0",
    "project_urls": {
        "Bug Tracker": "https://github.com/MrKekovich/transformer-builder/issues",
        "Homepage": "https://github.com/MrKekovich/transformer-builder",
        "Repository": "https://github.com/MrKekovich/transformer-builder"
    },
    "split_keywords": [
        "pytorch",
        "transformer",
        "ai"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c6da45b437a0a074cfeea9c0660c452df415c80f9e3a7db667d033ad02e0aeb9",
                "md5": "28b718d6ef7601504af3383c0a2e3617",
                "sha256": "0c8a1d9ecd41ced92bb1b6704ee659a5a93511bf535a643ca0cbe3debcab8117"
            },
            "downloads": -1,
            "filename": "transformer_builder-0.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "28b718d6ef7601504af3383c0a2e3617",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8,<3.12",
            "size": 12114,
            "upload_time": "2023-10-30T08:15:44",
            "upload_time_iso_8601": "2023-10-30T08:15:44.801709Z",
            "url": "https://files.pythonhosted.org/packages/c6/da/45b437a0a074cfeea9c0660c452df415c80f9e3a7db667d033ad02e0aeb9/transformer_builder-0.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "0d53541ad62506616a6ffdf6122d5428c756bb7fd77c7d88ba1d14b92b984ae9",
                "md5": "d5ac61ab53aeadb8f254602d01b8cfcc",
                "sha256": "9f815face01898f2a03ec8cc829f612c6d13b1a12e768adfaa429da778fd154d"
            },
            "downloads": -1,
            "filename": "transformer_builder-0.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "d5ac61ab53aeadb8f254602d01b8cfcc",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8,<3.12",
            "size": 8009,
            "upload_time": "2023-10-30T08:15:46",
            "upload_time_iso_8601": "2023-10-30T08:15:46.048730Z",
            "url": "https://files.pythonhosted.org/packages/0d/53/541ad62506616a6ffdf6122d5428c756bb7fd77c7d88ba1d14b92b984ae9/transformer_builder-0.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-10-30 08:15:46",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "MrKekovich",
    "github_project": "transformer-builder",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "transformer-builder"
}
        