switch-transformers

Name: switch-transformers
Version: 0.0.4
Home page: https://github.com/kyegomez/SwitchTransformers
Summary: SwitchTransformers - Pytorch
Upload time: 2024-01-24 21:05:04
Author: Kye Gomez
Requires Python: >=3.6,<4.0
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
Requirements: none recorded

[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# Switch Transformers

![Switch Transformer](st.png)

Implementation of Switch Transformers, from the paper "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity", in PyTorch, Einops, and Zeta. [PAPER LINK](https://arxiv.org/abs/2101.03961)
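
The paper's central idea is switch routing: a learned gate sends each token to exactly one expert feed-forward network (top-1 routing), and that expert's output is scaled by the gate probability. Below is a minimal, self-contained sketch of that mechanism in plain PyTorch; it is illustrative only and is not the routing code used inside this package.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Top1Router(nn.Module):
    """Illustrative switch-style (top-1) MoE layer: each token is dispatched
    to a single expert, and the expert output is scaled by the gate prob."""

    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        probs = F.softmax(self.gate(x), dim=-1)    # (batch, seq_len, num_experts)
        gate_vals, expert_idx = probs.max(dim=-1)  # top-1 expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i                 # tokens routed to expert i
            if mask.any():
                out[mask] = expert(x[mask])
        return out * gate_vals.unsqueeze(-1)       # scale by gate probability
```

For example, `Top1Router(dim=512, num_experts=8)(torch.randn(1, 10, 512))` returns a tensor of the same shape as its input. Because each token passes through only one expert, compute per token stays roughly constant as the number of experts (and thus total parameters) grows, which is what lets the approach scale to very large parameter counts.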

## Installation

```bash
pip install switch-transformers
```

## Usage
```python
import torch
from switch_transformers import SwitchTransformer

# Random batch of token ids with shape (batch=1, seq_len=10),
# drawn from a vocabulary of 100 tokens (ids 0-99)
x = torch.randint(0, 100, (1, 10))

# Create an instance of the SwitchTransformer model
# num_tokens: vocabulary size (number of distinct token ids)
# dim: the model's embedding dimension
# heads: the number of attention heads
# dim_head: the dimension of each attention head
model = SwitchTransformer(
    num_tokens=100, dim=512, heads=8, dim_head=64
)

# Pass the input tensor through the model
out = model(x)

# Print the shape of the output tensor
print(out.shape)
```
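
Assuming the forward pass returns per-token logits over the vocabulary (shape `(batch, seq_len, num_tokens)`), a standard next-token training step would look like the sketch below. This is illustrative only; the exact return signature is defined by the package, so check the repository if it differs.

```python
import torch
import torch.nn.functional as F
from switch_transformers import SwitchTransformer

model = SwitchTransformer(num_tokens=100, dim=512, heads=8, dim_head=64)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

# Toy batch of token ids; shift by one position for next-token prediction
tokens = torch.randint(0, 100, (1, 11))
inputs, targets = tokens[:, :-1], tokens[:, 1:]

logits = model(inputs)  # assumed shape: (1, 10, 100)
loss = F.cross_entropy(
    logits.reshape(-1, logits.size(-1)),  # (batch * seq_len, num_tokens)
    targets.reshape(-1),                  # (batch * seq_len,)
)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```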



## Citation
```bibtex
@misc{fedus2022switch,
    title={Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity},
    author={William Fedus and Barret Zoph and Noam Shazeer},
    year={2022},
    eprint={2101.03961},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

## License
MIT

            
