# gated-slot-attention

- **Version:** 0.0.1
- **Summary:** Gated Slot Attention - PyTorch
- **Homepage:** https://github.com/kyegomez/Gated-Slot-Attention
- **Author:** Kye Gomez
- **License:** MIT
- **Requires Python:** >=3.10, <4.0
- **Requirements:** torch, zetascale, swarms
- **Keywords:** artificial intelligence, deep learning, optimizers, prompt engineering
- **Uploaded:** 2024-09-13 18:36:06

# Gated Slot Attention

[![Join our Discord](https://img.shields.io/badge/Discord-Join%20our%20server-5865F2?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/agora-999382051935506503) [![Subscribe on YouTube](https://img.shields.io/badge/YouTube-Subscribe-red?style=for-the-badge&logo=youtube&logoColor=white)](https://www.youtube.com/@kyegomez3242) [![Connect on LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue?style=for-the-badge&logo=linkedin&logoColor=white)](https://www.linkedin.com/in/kye-g-38759a207/) [![Follow on X.com](https://img.shields.io/badge/X.com-Follow-1DA1F2?style=for-the-badge&logo=x&logoColor=white)](https://x.com/kyegomezb)
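
## Installation

Requires Python >=3.10 and <4.0; installing pulls in `torch`, `zetascale`, and `swarms`. Install from PyPI:

`pip install gated-slot-attention`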





## Usage

```python
import torch
from gated_slot_attention.model import GSATransformer

model = GSATransformer(
    dim=512,           # embedding / model dimension
    heads=8,           # number of attention heads
    m=64,              # number of memory slots
    tau=0.1,           # temperature hyperparameter
    depth=1,           # number of transformer layers
    vocab_size=10000,  # token vocabulary size
    max_seq_len=1024,  # maximum sequence length
)

# A batch of random token ids with shape (batch, seq_len)
x = torch.randint(0, 10000, (1, 1024))
out = model(x)
print(out.shape)
```
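
The forward pass presumably returns per-token logits over the vocabulary, in which case the printed shape would be `torch.Size([1, 1024, 10000])` (batch, sequence length, vocab size); the package does not document this, so treat it as an assumption. Under that assumption, a minimal next-token training step looks like:

```python
import torch.nn.functional as F

# Assumes `out` from the snippet above holds logits of shape
# (batch, seq_len, vocab_size). Shift by one position so each
# token predicts the token that follows it.
logits = out[:, :-1, :]  # (1, 1023, 10000)
targets = x[:, 1:]       # (1, 1023)
loss = F.cross_entropy(
    logits.reshape(-1, logits.size(-1)),  # flatten to (1023, 10000)
    targets.reshape(-1),                  # flatten to (1023,)
)
loss.backward()  # gradients flow back through GSATransformer
```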

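For intuition about `m` and `tau`, the toy layer below shows one way attention over a bank of learned slots can work: each token scores against `m` slot vectors, the scores pass through a softmax sharpened by the temperature `tau`, and a sigmoid gate controls how much of the slot read-out is added back to each token. This is an illustrative sketch under those assumptions — the class `ToySlotAttention` is invented here and is not the implementation used by this package or the GSA paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToySlotAttention(nn.Module):
    """Toy gated read-out from m learned memory slots (illustrative only)."""

    def __init__(self, dim: int, m: int, tau: float):
        super().__init__()
        self.slots = nn.Parameter(torch.randn(m, dim) * 0.02)  # m slot vectors
        self.to_q = nn.Linear(dim, dim, bias=False)  # token -> query
        self.to_gate = nn.Linear(dim, dim)           # token -> per-feature gate
        self.tau = tau                               # softmax temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        q = self.to_q(x)
        scores = torch.einsum("btd,md->btm", q, self.slots)   # token-to-slot scores
        attn = F.softmax(scores / self.tau, dim=-1)           # small tau -> sharper
        read = torch.einsum("btm,md->btd", attn, self.slots)  # weighted slot read-out
        gate = torch.sigmoid(self.to_gate(x))                 # values in (0, 1)
        return x + gate * read                                # gated residual update


layer = ToySlotAttention(dim=512, m=64, tau=0.1)
h = torch.randn(1, 16, 512)
print(layer(h).shape)  # torch.Size([1, 16, 512])
```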
## License

MIT

            

## Release files

| File | Type | Size | Uploaded (UTC) | SHA256 |
|---|---|---|---|---|
| [gated_slot_attention-0.0.1-py3-none-any.whl](https://files.pythonhosted.org/packages/b8/c2/57db15f9525977620c41875673aeb7a302638f5b3844ac39185f39023e30/gated_slot_attention-0.0.1-py3-none-any.whl) | bdist_wheel (py3) | 5,597 bytes | 2024-09-13T18:36:05 | 5afce2322c21ceba30ccf73ee08c144fb7e3336f475881f098378a3bd37324c8 |
| [gated_slot_attention-0.0.1.tar.gz](https://files.pythonhosted.org/packages/6d/22/04728f8c974bf56e6c56dc7cf58368a6a7613f2e25e3d2605e1b57ca816e/gated_slot_attention-0.0.1.tar.gz) | sdist (source) | 5,145 bytes | 2024-09-13T18:36:06 | be2758fd001a81b1ff1f3ababc62e09231d826fea387d43bbd3a7eecc9f206b6 |

**Project URLs:** Documentation, Homepage, Repository — all at https://github.com/kyegomez/Gated-Slot-Attention

**Author email:** kye@apac.ai