# moe-mamba

- Name: moe-mamba
- Version: 0.0.3
- Home page: https://github.com/kyegomez/MoE-Mamba
- Summary: Paper - Pytorch
- Upload time: 2024-01-22 01:53:11
- Author: Kye Gomez
- Requires Python: >=3.6,<4.0
- License: MIT
- Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
- Requirements: torch, zetascale, swarms
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# MoE Mamba
Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta.

[PAPER LINK](https://arxiv.org/abs/2401.04081)
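
At its core, the paper interleaves Mamba (selective state-space) blocks with sparse Mixture-of-Experts feed-forward layers: a learned router dispatches each token to a single expert MLP (Switch-style top-1 routing), so parameter count scales with the number of experts while per-token compute stays roughly constant. Below is a minimal, self-contained PyTorch sketch of such a routed layer; `SwitchMoE`, `hidden_mult`, and the expert MLPs are illustrative stand-ins, not this package's actual classes.

```python
import torch
import torch.nn as nn


class SwitchMoE(nn.Module):
    """Top-1 routed mixture of expert MLPs (illustrative stand-in)."""

    def __init__(self, dim: int, num_experts: int, hidden_mult: int = 4):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(dim, dim * hidden_mult),
                    nn.GELU(),
                    nn.Linear(dim * hidden_mult, dim),
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim); route every token to its single best expert
        gate, idx = self.router(x).softmax(dim=-1).max(dim=-1)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e  # tokens assigned to expert e
            if mask.any():
                out[mask] = gate[mask].unsqueeze(-1) * expert(x[mask])
        return out


# Shape check: the MoE layer is shape-preserving, like a dense FFN
moe = SwitchMoE(dim=512, num_experts=4)
print(moe(torch.randn(1, 10, 512)).shape)  # torch.Size([1, 10, 512])
```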


## Install

```bash
pip install moe-mamba
```

## Usage

### `MoEMambaBlock` 
```python
import torch
from moe_mamba import MoEMambaBlock

# Random input: batch of 1, sequence length 10, model dimension 512
x = torch.randn(1, 10, 512)

model = MoEMambaBlock(
    dim=512,        # model (embedding) dimension
    depth=6,        # number of stacked layers
    d_state=128,    # SSM state dimension
    expand=4,       # inner-dimension expansion factor
    num_experts=4,  # experts per MoE layer
)

out = model(x)
print(out)
```
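
Assuming `MoEMambaBlock` preserves the `(batch, seq_len, dim)` shape of its input, it can be dropped into a larger model like any sequence-to-sequence layer. A hypothetical sketch of a toy language model built around the block (`ToyLM` and its hyperparameters are illustrative, not part of the package):

```python
import torch
import torch.nn as nn
from moe_mamba import MoEMambaBlock


class ToyLM(nn.Module):
    """Hypothetical wrapper: embed tokens, run the block, project to vocab."""

    def __init__(self, vocab_size: int = 1000, dim: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.block = MoEMambaBlock(
            dim=dim, depth=6, d_state=128, expand=4, num_experts=4
        )
        self.lm_head = nn.Linear(dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) -> logits: (batch, seq_len, vocab_size),
        # assuming the block maps (b, s, dim) -> (b, s, dim)
        return self.lm_head(self.block(self.embed(tokens)))


model = ToyLM()
logits = model(torch.randint(0, 1000, (1, 10)))
print(logits.shape)  # expected: torch.Size([1, 10, 1000])
```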



## Code Quality 🧹

- `make style` to format the code
- `make check_code_quality` to check code quality (essentially PEP 8)
- or run the tools directly: `black .` and `ruff . --fix`


## Citation
```bibtex
@misc{pióro2024moemamba,
    title={MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts}, 
    author={Maciej Pióro and Kamil Ciebiera and Krystian Król and Jan Ludziejewski and Sebastian Jaszczur},
    year={2024},
    eprint={2401.04081},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```


## License
MIT

            
