# mamba-former

- Name: mamba-former
- Version: 0.0.3
- Home page: https://github.com/kyegomez/MambaFormer
- Summary: Paper - Pytorch
- Upload time: 2024-04-05 00:54:46
- Maintainer: None
- Docs URL: None
- Author: Kye Gomez
- Requires Python: <4.0,>=3.9
- License: MIT
- Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
- Requirements: none recorded
- Travis CI: none
- Coveralls test coverage: none
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# MambaFormer
Implementation of MambaFormer in PyTorch with Zeta, from the paper: "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"

## Install
`pip3 install mamba-former`

## Usage
```python
import torch
from mamba_former.main import MambaFormer

# Example input: a batch of one sequence of 100 token IDs,
# sampled uniformly from [1, 1000)
x = torch.randint(1, 1000, (1, 100))

# Model
model = MambaFormer(
    dim=512,  # Dimension of the model
    num_tokens=1000,  # Number of unique tokens in the input data
    depth=6,  # Number of MambaFormer layers
    d_state=512,  # State dimension of the Mamba (SSM) blocks
    d_conv=128,  # Convolution width of the Mamba blocks
    heads=8,  # Number of attention heads
    dim_head=64,  # Dimension of each attention head
    return_tokens=True,  # Whether to return the tokens in the output
)

# Forward pass
out = model(x)  # Perform a forward pass through the model

# If training
# out = model(x, return_loss=True)  # Perform a forward pass and calculate the loss

# Print the output
print(out)  # Print the output tensor
print(out.shape)  # Print the shape of the output tensor

```
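The commented-out `return_loss=True` call above suggests the model can compute its own training objective. Below is a minimal single-step training sketch under that assumption; the optimizer and learning rate are illustrative, and the scalar-loss return is inferred from the comment rather than from documented API:

```python
import torch
from mamba_former.main import MambaFormer

model = MambaFormer(
    dim=512,
    num_tokens=1000,
    depth=6,
    d_state=512,
    d_conv=128,
    heads=8,
    dim_head=64,
    return_tokens=True,
)

# Hypothetical training step: assumes model(x, return_loss=True)
# returns a scalar autoregressive loss, as the README's comment implies.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

x = torch.randint(1, 1000, (1, 100))  # dummy batch of token IDs
loss = model(x, return_loss=True)
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(loss.item())
```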


# License
MIT

            

## Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/kyegomez/MambaFormer",
    "name": "mamba-former",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.9",
    "maintainer_email": null,
    "keywords": "artificial intelligence, deep learning, optimizers, Prompt Engineering",
    "author": "Kye Gomez",
    "author_email": "kye@apac.ai",
    "download_url": "https://files.pythonhosted.org/packages/c0/0a/5fcc07d8485c4f2c03af84a68c00b5c1c0f208cc713f0f19bef389f53538/mamba_former-0.0.3.tar.gz",
    "platform": null,
    "description": "[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)\n\n# MambaFormer\nImplementation of MambaFormer in Pytorch ++ Zeta from the paper: \"Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks\"\n\n## install\n`pip3 install mamba-former`\n\n## usage\n```python\nimport torch\nfrom mamba_former.main import MambaFormer\n\n# Forward pass example\nx = torch.randint(1, 1000, (1, 100))  # Token\n# Tokens are integers representing input data\n\n# Model\nmodel = MambaFormer(\n    dim=512,  # Dimension of the model\n    num_tokens=1000,  # Number of unique tokens in the input data\n    depth=6,  # Number of transformer layers\n    d_state=512,  # Dimension of the transformer state\n    d_conv=128,  # Dimension of the convolutional layer\n    heads=8,  # Number of attention heads\n    dim_head=64,  # Dimension of each attention head\n    return_tokens=True,  # Whether to return the tokens in the output\n)\n\n# Forward pass\nout = model(x)  # Perform a forward pass through the model\n\n# If training\n# out = model(x, return_loss=True)  # Perform a forward pass and calculate the loss\n\n# Print the output\nprint(out)  # Print the output tensor\nprint(out.shape)  # Print the shape of the output tensor\n\n```\n\n\n# License\nMIT\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Paper - Pytorch",
    "version": "0.0.3",
    "project_urls": {
        "Documentation": "https://github.com/kyegomez/MambaFormer",
        "Homepage": "https://github.com/kyegomez/MambaFormer",
        "Repository": "https://github.com/kyegomez/MambaFormer"
    },
    "split_keywords": [
        "artificial intelligence",
        " deep learning",
        " optimizers",
        " prompt engineering"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "3713bfc8e5b02af8bd22677d085771d85e1773dca56ddce87e5879eb39127be4",
                "md5": "01b32ef4a1577295d34c701b8aa8c6a3",
                "sha256": "b83622fd9dcbf6977a39d2056f1769e13a69c8f2569a05563774c37d00bc2a7d"
            },
            "downloads": -1,
            "filename": "mamba_former-0.0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "01b32ef4a1577295d34c701b8aa8c6a3",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.9",
            "size": 3788,
            "upload_time": "2024-04-05T00:54:45",
            "upload_time_iso_8601": "2024-04-05T00:54:45.505045Z",
            "url": "https://files.pythonhosted.org/packages/37/13/bfc8e5b02af8bd22677d085771d85e1773dca56ddce87e5879eb39127be4/mamba_former-0.0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c00a5fcc07d8485c4f2c03af84a68c00b5c1c0f208cc713f0f19bef389f53538",
                "md5": "8dcd9882368c3b79f0e125855a437330",
                "sha256": "0b54d02e36848c3a9ade42c397d19f5b98e34df77664fb29213d77d775137bbd"
            },
            "downloads": -1,
            "filename": "mamba_former-0.0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "8dcd9882368c3b79f0e125855a437330",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.9",
            "size": 3905,
            "upload_time": "2024-04-05T00:54:46",
            "upload_time_iso_8601": "2024-04-05T00:54:46.529112Z",
            "url": "https://files.pythonhosted.org/packages/c0/0a/5fcc07d8485c4f2c03af84a68c00b5c1c0f208cc713f0f19bef389f53538/mamba_former-0.0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-05 00:54:46",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "kyegomez",
    "github_project": "MambaFormer",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [],
    "lcname": "mamba-former"
}
```
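The digests in the raw data allow integrity-checking a downloaded artifact. A minimal sketch, using only the standard library, that fetches the sdist and compares its sha256 against the value recorded above (requires network access):

```python
import hashlib
import urllib.request

# sdist URL and expected sha256, taken from the raw metadata above
url = "https://files.pythonhosted.org/packages/c0/0a/5fcc07d8485c4f2c03af84a68c00b5c1c0f208cc713f0f19bef389f53538/mamba_former-0.0.3.tar.gz"
expected = "0b54d02e36848c3a9ade42c397d19f5b98e34df77664fb29213d77d775137bbd"

# Download the archive and hash its raw bytes
with urllib.request.urlopen(url) as resp:
    data = resp.read()

actual = hashlib.sha256(data).hexdigest()
print("sha256 matches:" if actual == expected else "sha256 MISMATCH:", actual)
```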