brave-torch

Name: brave-torch
Version: 4.7.9
Home page: https://github.com/kyegomez/BRAVE-ViT-Swarm
Documentation: https://swarms.apac.ai
Summary: Swarms - Pytorch
Author: Kye Gomez (kye@apac.ai)
Maintainer: None
Requires Python: <4.0,>=3.9
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering, swarms, agents
Upload time: 2024-04-13 03:31:07
Requirements: No requirements were recorded.
            [![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# BRAVE, or Swarms of Vision Transformers
Implementation of the paper "BRAVE: Broadening the visual encoding of vision-language models". BRAVE achieves state-of-the-art performance on a broad range of captioning and VQA benchmarks and significantly reduces known shortcomings of VLMs, while requiring fewer trainable parameters than existing methods and using a more compressed representation.
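For intuition, the "swarm" idea amounts to encoding the same image with several vision encoders in parallel and fusing their features into a single visual representation. The toy sketch below only illustrates that pattern; it is not brave_torch's actual implementation, and every module name in it is hypothetical.

```python
import torch
import torch.nn as nn


class ToyEncoder(nn.Module):
    """Stand-in for a single ViT branch (hypothetical, not a real ViT)."""

    def __init__(self, dim: int = 512):
        super().__init__()
        # Patchify with a strided conv, then average-pool to one vector per image.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=32, stride=32)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pool(self.patch_embed(x)).flatten(1)  # (B, dim)


class ToySwarm(nn.Module):
    """Encode the same image with N branches and fuse their features."""

    def __init__(self, num_encoders: int = 4, dim: int = 512):
        super().__init__()
        self.encoders = nn.ModuleList(ToyEncoder(dim) for _ in range(num_encoders))
        self.fuse = nn.Linear(num_encoders * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([enc(x) for enc in self.encoders], dim=-1)  # (B, N*dim)
        return self.fuse(feats)  # (B, dim) fused visual embedding


images = torch.randn(2, 3, 224, 224)
print(ToySwarm()(images).shape)  # torch.Size([2, 512])
```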

## install
`pip3 install brave-torch`


## usage

### SwarmOfViTs
```python
import torch
from brave_torch.main import SwarmOfViTs

# Dummy image tensor: (batch, channels, height, width)
x = torch.randn(1, 3, 224, 224)

# Instantiate the swarm of ViT encoders
model = SwarmOfViTs(
    image_size=224,
    patch_size=32,
    encoder_dim=512,
    encoder_depth=6,
    encoder_heads=8,
    num_of_vits=4
)

# Forward pass
out = model(x)
print(out)
```
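The constructor arguments above are the only interface the README shows. Assuming `SwarmOfViTs` behaves like a standard `torch.nn.Module`, a batched inference call might look like the sketch below; the exact shape and meaning of the output are not documented, so it is printed rather than assumed.

```python
import torch
from brave_torch.main import SwarmOfViTs

model = SwarmOfViTs(
    image_size=224,
    patch_size=32,
    encoder_dim=512,
    encoder_depth=6,
    encoder_heads=8,
    num_of_vits=4,
)
model.eval()  # disable dropout etc. for inference

with torch.no_grad():  # no gradients needed at inference time
    batch = torch.randn(8, 3, 224, 224)  # batch of 8 RGB images
    features = model(batch)

print(features.shape)
```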

# Citations

## Todo
- [ ] Citation link
- [ ] Citation BibTeX
- [ ] Diagram photo
- [ ] Implement Andromeda Base LLM architecture
- [ ] Provide multi-modal tokenizer
- [ ] Train and release the model 
            
