# Simba
![graph](graph.png)
A simple PyTorch + Zeta implementation of the paper "SiMBA: Simplified Mamba-based Architecture for Vision and Multivariate Time series".
## Install
`$ pip install simba-torch`
## Usage
```python
import torch
from simba_torch.main import Simba
# Forward pass with images
img = torch.randn(1, 3, 224, 224)
# Create model
model = Simba(
    dim=4,              # Embedding dimension of the model
    dropout=0.1,        # Dropout rate for regularization
    d_state=64,         # Dimension of the SSM (Mamba) hidden state
    d_conv=64,          # Kernel size of the Mamba convolution
    num_classes=64,     # Number of output classes
    depth=8,            # Number of Simba blocks
    patch_size=16,      # Side length of each square image patch
    image_size=224,     # Side length of the (square) input image
    channels=3,         # Number of input image channels
    # use_pos_emb=True, # Optionally enable positional embeddings
)
# Forward pass
out = model(img)
print(out.shape)
```
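As a sanity check of the patching hyperparameters above (names here mirror the constructor arguments, not internals of the library), `image_size` must be divisible by `patch_size`, and the patch count follows directly:

```python
# Sanity check of the patching hyperparameters used in the example above.
image_size = 224
patch_size = 16
channels = 3

# The image must divide evenly into square patches.
assert image_size % patch_size == 0

patches_per_side = image_size // patch_size     # 224 / 16 = 14
num_patches = patches_per_side ** 2             # 14 * 14 = 196 patches
patch_dim = channels * patch_size * patch_size  # 3 * 16 * 16 = 768 values per flattened patch

print(num_patches, patch_dim)  # 196 768
```

If you change `image_size` or `patch_size`, keep the divisibility constraint or the patch embedding will fail.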
# License
MIT
# Todo
- [ ] Add paper link
- [ ] Add citation BibTeX
- [ ] Cleanup