jamba


Name: jamba
Version: 0.0.2
Home page: https://github.com/kyegomez/jamba
Summary: jamba - PyTorch
Upload time: 2024-04-01 18:28:50
Maintainer: None
Docs URL: None
Author: Kye Gomez
Requires Python: <4.0,>=3.6
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering
Requirements: torch, zetascale, swarms
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# Jamba
A PyTorch implementation of Jamba, from the paper "Jamba: A Hybrid Transformer-Mamba Language Model"


## Install
`$ pip install jamba`

## Usage

```python
# Import the torch library, which provides tools for machine learning
import torch

# Import the Jamba model from the jamba.model module
from jamba.model import Jamba

# Create a (1, 100) tensor of random token IDs drawn from [0, 100)
# This stands in for a batch of 100 tokens to pass through the model
x = torch.randint(0, 100, (1, 100))

# Initialize the Jamba model with the specified parameters
# dim: model (embedding) dimension
# depth: number of layers in the model
# num_tokens: vocabulary size (number of unique token IDs)
# d_state: state dimension of the Mamba (SSM) blocks
# d_conv: convolution dimension used in the Mamba blocks
# heads: number of attention heads in the transformer blocks
# num_experts: number of expert networks in the MoE layers
# num_experts_per_token: number of experts routed to for each token
model = Jamba(
    dim=512,
    depth=6,
    num_tokens=100,
    d_state=256,
    d_conv=128,
    heads=8,
    num_experts=8,
    num_experts_per_token=2,
)

# Run a forward pass through the model with the input tokens
# This returns the model's per-position predictions over the vocabulary
output = model(x)

# Print the model's predictions
print(output)

```
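
For context, here is a minimal next-token training sketch built on the same call. It assumes the forward pass returns per-position logits of shape `(batch, seq_len, num_tokens)`; that shape is an assumption rather than something documented above, so verify it against the actual output before relying on it.

```python
import torch
import torch.nn.functional as F

from jamba.model import Jamba

model = Jamba(
    dim=512,
    depth=6,
    num_tokens=100,
    d_state=256,
    d_conv=128,
    heads=8,
    num_experts=8,
    num_experts_per_token=2,
)

# Random token IDs in [0, 100) standing in for a real batch
tokens = torch.randint(0, 100, (1, 100))

# Next-token setup: predict token t+1 from the tokens up to t
inputs = tokens[:, :-1]   # (1, 99)
targets = tokens[:, 1:]   # (1, 99)

# Assumed output shape: (1, 99, 100) = (batch, seq_len, num_tokens)
logits = model(inputs)

# Cross-entropy over the vocabulary dimension
loss = F.cross_entropy(
    logits.reshape(-1, logits.size(-1)),  # (batch * seq_len, num_tokens)
    targets.reshape(-1),                  # (batch * seq_len,)
)
loss.backward()
print(loss.item())
```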

# License
MIT

            
