simple-mamba

Name: simple-mamba
Version: 0.0.4
Home page: https://github.com/kyegomez/SimpleMamba
Summary: Simple Mamba - PyTorch
Upload time: 2023-12-16 04:31:58
Author: Kye Gomez
Requires Python: >=3.6,<4.0
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
Requirements: none recorded
            [![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# Simple Mamba

## Install
`pip install simple-mamba`


## Usage
```python
import torch
from simple_mamba import MambaBlock

# Block hyperparameters
dim = 512          # model (embedding) dimension
hidden_dim = 128   # hidden dimension of the block
heads = 8          # number of heads
in_channels = 3    # convolution channels and kernel size, per the parameter names
out_channels = 3
kernel_size = 3

# Create an instance of MambaBlock
mamba_block = MambaBlock(
    dim, hidden_dim, heads, in_channels, out_channels, kernel_size
)

# Sample input of shape (batch, seq_len, dim); here seq_len == dim == 512
x = torch.randn(1, dim, dim)

# Pass the tensor through the MambaBlock
output = mamba_block(x)
print("Output shape:", output.shape)
```
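
Under the hood, a block like this typically pairs a gated projection and a short causal convolution with a linear state-space scan over the sequence. Below is a minimal, hypothetical sketch of that pattern, written from scratch for illustration; `TinyMambaStyleBlock` and all of its parameters are assumptions, not the package's actual `MambaBlock` implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyMambaStyleBlock(nn.Module):
    """Illustrative only: gated branch + causal depthwise conv + diagonal
    linear state-space scan. Not the package's actual MambaBlock."""

    def __init__(self, dim: int, state_dim: int = 16, kernel_size: int = 3):
        super().__init__()
        self.in_proj = nn.Linear(dim, 2 * dim)  # main path + gate
        self.conv = nn.Conv1d(
            dim, dim, kernel_size, padding=kernel_size - 1, groups=dim
        )
        # Fixed (non-selective) diagonal SSM parameters, for simplicity
        self.log_decay = nn.Parameter(-torch.rand(dim, state_dim))  # in (-1, 0)
        self.B = nn.Parameter(0.1 * torch.randn(dim, state_dim))
        self.C = nn.Parameter(0.1 * torch.randn(dim, state_dim))
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, seq_len, dim)
        b, l, d = x.shape
        u, gate = self.in_proj(x).chunk(2, dim=-1)
        # Depthwise conv over time; trim right padding to keep causality
        u = self.conv(u.transpose(1, 2))[..., :l].transpose(1, 2)
        u = F.silu(u)

        # Sequential scan: h_t = decay * h_{t-1} + B * u_t;  y_t = <C, h_t>
        decay = torch.exp(self.log_decay)  # per-channel decay in (0, 1)
        h = u.new_zeros(b, d, self.B.shape[1])
        ys = []
        for t in range(l):
            h = decay * h + self.B * u[:, t].unsqueeze(-1)
            ys.append((h * self.C).sum(-1))
        y = torch.stack(ys, dim=1)  # (batch, seq_len, dim)
        return self.out_proj(y * F.silu(gate))


block = TinyMambaStyleBlock(dim=64)
print(block(torch.randn(2, 32, 64)).shape)  # torch.Size([2, 32, 64])
```

A real selective SSM additionally makes `B`, `C`, and the step size functions of the input and replaces the Python loop with a hardware-efficient parallel scan (see the Gu & Dao paper cited below).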

### `SSM`
```python
import torch
from simple_mamba import SSM

# Example hyperparameters
vocab_size = 10000  # vocabulary size
embed_dim = 256     # embedding dimension
state_dim = 512     # state dimension
num_layers = 2      # number of state-space layers

model = SSM(vocab_size, embed_dim, state_dim, num_layers)

# Example input: a batch of 32 sequences of 10 word indices each
input_seq = torch.randint(0, vocab_size, (32, 10))

# Forward pass
logits = model(input_seq)
print(logits.shape)  # [32, 10, vocab_size]
```
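
At their core, the state-space layers stacked inside a model like `SSM` compute the classic linear recurrence `h_t = A h_{t-1} + B x_t`, `y_t = C h_t + D x_t`. A toy single-input version of that recurrence, with illustrative matrices rather than the package's learned parameters:

```python
import torch

# Toy values; A, B, C, D here are illustrative, not learned parameters
state_dim, seq_len = 4, 5
A = 0.9 * torch.eye(state_dim)         # stable state transition
B = torch.ones(state_dim, 1)           # input -> state
C = torch.ones(1, state_dim)           # state -> output
D = torch.zeros(1, 1)                  # skip connection

x = torch.randn(seq_len, 1)            # scalar input sequence
h = torch.zeros(state_dim, 1)
ys = []
for t in range(seq_len):
    h = A @ h + B @ x[t:t + 1]         # h_t = A h_{t-1} + B x_t
    ys.append(C @ h + D @ x[t:t + 1])  # y_t = C h_t + D x_t
y = torch.cat(ys)
print(y.shape)  # torch.Size([5, 1]): one output per timestep
```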


# License
MIT


# Citation
```bibtex
@misc{gu2023mamba,
    title={Mamba: Linear-Time Sequence Modeling with Selective State Spaces}, 
    author={Albert Gu and Tri Dao},
    year={2023},
    eprint={2312.00752},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

```
            

Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/kyegomez/SimpleMamba ",
    "name": "simple-mamba",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.6,<4.0",
    "maintainer_email": "",
    "keywords": "artificial intelligence,deep learning,optimizers,Prompt Engineering",
    "author": "Kye Gomez",
    "author_email": "kye@apac.ai",
    "download_url": "https://files.pythonhosted.org/packages/a3/01/9616137e14481932a59bac640618d88385b2dab63434066cd333ee05439a/simple_mamba-0.0.4.tar.gz",
    "platform": null,
    "description": "[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)\n\n# Simple Mamba\n\n## Install\n`pip install simple-mamba`\n\n\n## Usage\n```python\nimport torch\nfrom simple_mamba import MambaBlock\n\n\n# Define block parameters\ndim = 512\nhidden_dim = 128\nheads = 8\nin_channels = 3\nout_channels = 3\nkernel_size = 3\n\n# Create an instance of MambaBlock\nmamba_block = MambaBlock(\n    dim, hidden_dim, heads, in_channels, out_channels, kernel_size\n)\n\n# Create a sample input tensor\nx = torch.randn(1, dim, dim)\n\n# Pass the tensor through the MambaBlock\noutput = mamba_block(x)\nprint(\"Output shape:\", output.shape)\n\n\n```\n\n### `SSM`\n```python\nimport torch \nfrom simple_mamba import SSM\n\n\n# # Example usage\nvocab_size = 10000  # Example vocabulary size\nembed_dim = 256  # Example embedding dimension\nstate_dim = 512  # State dimension\nnum_layers = 2  # Number of state-space layers\n\nmodel = SSM(vocab_size, embed_dim, state_dim, num_layers)\n\n# Example input (sequence of word indices)\ninput_seq = torch.randint(\n     0, vocab_size, (32, 10)\n )  # Batch size of 32, sequence length of 10\n\n # Forward pass\nlogits = model(input_seq)\nprint(logits.shape)  # Should be [32, 10, vocab_size]\n\n```\n\n\n# License\nMIT\n\n\n# Citation\n```bibtex\n@misc{gu2023mamba,\n    title={Mamba: Linear-Time Sequence Modeling with Selective State Spaces}, \n    author={Albert Gu and Tri Dao},\n    year={2023},\n    eprint={2312.00752},\n    archivePrefix={arXiv},\n    primaryClass={cs.LG}\n}\n\n```",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Simple Mambda - Pytorch",
    "version": "0.0.4",
    "project_urls": {
        "Documentation": "https://github.com/kyegomez/SimpleMamba",
        "Homepage": "https://github.com/kyegomez/SimpleMamba ",
        "Repository": "https://github.com/kyegomez/SimpleMamba"
    },
    "split_keywords": [
        "artificial intelligence",
        "deep learning",
        "optimizers",
        "prompt engineering"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "81ceb6f9b3e62ab44881df43f90580715867ada4ad22832d77a41b671d928e1c",
                "md5": "a7cf2f917efeb8c9887ce768c5284c51",
                "sha256": "bbe1ce017b7580452e47757121da810c472796ce9f088da6f3683609c7528e2b"
            },
            "downloads": -1,
            "filename": "simple_mamba-0.0.4-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a7cf2f917efeb8c9887ce768c5284c51",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.6,<4.0",
            "size": 6064,
            "upload_time": "2023-12-16T04:31:56",
            "upload_time_iso_8601": "2023-12-16T04:31:56.982872Z",
            "url": "https://files.pythonhosted.org/packages/81/ce/b6f9b3e62ab44881df43f90580715867ada4ad22832d77a41b671d928e1c/simple_mamba-0.0.4-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a3019616137e14481932a59bac640618d88385b2dab63434066cd333ee05439a",
                "md5": "12c7bd2f7848e7e4c8c6f4222048cb3b",
                "sha256": "d5e02b1cb4d44a1209917b48afb5d33e6a73d2e0489753bca050cbe0e7a8d1c3"
            },
            "downloads": -1,
            "filename": "simple_mamba-0.0.4.tar.gz",
            "has_sig": false,
            "md5_digest": "12c7bd2f7848e7e4c8c6f4222048cb3b",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.6,<4.0",
            "size": 5415,
            "upload_time": "2023-12-16T04:31:58",
            "upload_time_iso_8601": "2023-12-16T04:31:58.554960Z",
            "url": "https://files.pythonhosted.org/packages/a3/01/9616137e14481932a59bac640618d88385b2dab63434066cd333ee05439a/simple_mamba-0.0.4.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-12-16 04:31:58",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "kyegomez",
    "github_project": "SimpleMamba ",
    "github_not_found": true,
    "lcname": "simple-mamba"
}
```
        