| Field | Value |
|---|---|
| Name | isoai |
| Version | 0.0.3 |
| home_page | https://github.com/iso-ai/isosdk |
| Summary | Compress, Trace and Deploy Your Models with Iso! |
| upload_time | 2024-07-30 07:54:29 |
| maintainer | None |
| docs_url | None |
| author | Jazmia Henry |
| requires_python | >=3.6 |
| license | None |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

# Iso AI
**Unlock the Value of GenAI with Iso!**
# isoai Package Documentation
## Table of Contents
1. [Overview](#overview)
2. [Installation](#installation)
3. [Modules](#modules)
- [isobits](#isobits)
- [GPT_bitnet](#gpt_bitnet)
- [torch_bitnet](#torch_bitnet)
- [replace_linear_layers](#replace_linear_layers)
- [LLama_bitnet](#llama_bitnet)
- [MoE_bitnet](#moe_bitnet)
- [isodeploy](#isodeploy)
- [containerize](#containerize)
- [oscontainerization](#oscontainerization)
- [isotrace](#isotrace)
4. [Usage Examples](#usage-examples)
- [Replacing Linear Layers with TorchBitNet](#replacing-linear-layers-with-torchbitnet)
- [Using BitNetLLAMA](#using-bitnetllama)
- [Using MoE Transformer](#using-moe-transformer)
- [Containerizing Your Model](#containerizing-your-model)
- [Containerizing with OS Detection](#containerizing-with-os-detection)
- [Tracing Variables](#tracing-variables)
5. [Contributing](#contributing)
6. [Support](#support)
7. [License](#license)
8. [Acknowledgments](#acknowledgments)
## Overview
The `isoai` package is designed to facilitate the integration, transformation, deployment, and tracing of AI models. It provides a suite of tools that make it easy to work with different types of neural network architectures and deploy them in a containerized environment.
## Installation
To install the `isoai` package, use the following command:
```bash
pip install isoai
```
## Modules
### isobits
#### GPT_bitnet
The `GPT_bitnet` module contains implementations of transformers using the BitNet architecture.
- **Transformer**: Main transformer class.
- **TransformerBlock**: Individual transformer block class.
#### torch_bitnet
The `torch_bitnet` module provides implementations for RMSNorm and TorchBitNet layers.
- **RMSNorm**: Root Mean Square Layer Normalization.
- **TorchBitNet**: Implementation of BitNet layers using PyTorch.
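For reference, RMSNorm scales each activation vector by the reciprocal of its root mean square instead of subtracting a mean and dividing by a standard deviation. The snippet below is a generic PyTorch reference implementation of that formula, shown for orientation only; the constructor and argument names of isoai's own `RMSNorm` may differ.
```python
import torch
import torch.nn as nn

class ReferenceRMSNorm(nn.Module):
    """Generic RMSNorm: y = x / sqrt(mean(x**2) + eps) * weight (illustrative, not isoai's class)."""
    def __init__(self, dim: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the root mean square over the last dimension, then rescale.
        rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return x * rms * self.weight
```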
#### replace_linear_layers
The `replace_linear_layers` module provides functionality to replace traditional linear layers in a model with TorchBitNet layers.
- **replace_linears_with_torchbitnet**: Function to replace linear layers with TorchBitNet layers.
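The usage example later in this document shows how to call `replace_linears_with_torchbitnet`. For readers curious about the mechanics, the sketch below shows the usual PyTorch pattern for swapping layers in place: walk the module tree and reassign matching children. The `swap_linears` helper and its `make_layer` factory are hypothetical illustrations, not part of the isoai API.
```python
import torch.nn as nn

def swap_linears(module: nn.Module, make_layer) -> None:
    """Recursively replace every nn.Linear with make_layer(in_features, out_features)."""
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            # Reassign the attribute so the parent module now holds the new layer.
            setattr(module, name, make_layer(child.in_features, child.out_features))
        else:
            swap_linears(child, make_layer)  # descend into nested submodules
```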
#### LLama_bitnet
The `LLama_bitnet` module includes the BitNetLLAMA model, which leverages the BitNet architecture.
- **BitNetLLAMA**: Implementation of the LLAMA model using BitNet.
#### MoE_bitnet
The `MoE_bitnet` module implements the Mixture of Experts (MoE) architecture using BitNet.
- **MoETransformer**: Transformer model with MoE.
- **MoETransformerBlock**: Individual MoE transformer block.
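As background, a Mixture of Experts block routes each token through a small subset of expert feed-forward networks selected by a learned gate. The toy gate below illustrates the standard top-k routing idea in plain PyTorch; it is not isoai's internal implementation, and the class name is hypothetical.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyTopKGate(nn.Module):
    """Pick k experts per token and return their normalized mixing weights."""
    def __init__(self, dim: int, num_experts: int, k: int = 2):
        super().__init__()
        self.proj = nn.Linear(dim, num_experts)
        self.k = k

    def forward(self, x: torch.Tensor):
        logits = self.proj(x)                              # (batch, seq, num_experts)
        weights, expert_idx = logits.topk(self.k, dim=-1)  # scores and indices of the chosen experts
        return F.softmax(weights, dim=-1), expert_idx
```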
### isodeploy
#### containerize
The `containerize` module packages models into Docker containers.
- **Containerize**: Main class for containerizing models.
#### oscontainerization
The `oscontainerization` module extends containerization functionality with automatic OS detection.
- **OSContainerize**: Class for containerizing models with OS detection.
### isotrace
The `isotrace` module provides tools for tracing and logging variables within the code.
- **Autotrace**: Class for automatic tracing and logging of variables.
## Usage Examples
### Replacing Linear Layers with TorchBitNet
```python
import torch
from isoai.isobits.GPT_bitnet import Transformer
from isoai.isobits.replace_linear_layers import replace_linears_with_torchbitnet
class ModelArgs:
    def __init__(self):
        self.vocab_size = 30522
        self.dim = 768
        self.n_heads = 12
        self.n_kv_heads = 12
        self.max_seq_len = 512
        self.norm_eps = 1e-5
        self.multiple_of = 64
        self.ffn_dim_multiplier = 4
        self.n_layers = 12
        self.max_batch_size = 32

args = ModelArgs()
tokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))
transformer = Transformer(args)
output = transformer(tokens, start_pos=0)
print("Original Transformer Description: ", transformer)
replace_linears_with_torchbitnet(transformer, norm_dim=10)
print("Bitnet Transformer Description: ", transformer)
```
### Using BitNetLLAMA
```python
import torch
from isoai.isobits.LLama_bitnet import BitNetLLAMA
class ModelArgs:
    def __init__(self):
        self.vocab_size = 30522
        self.dim = 768
        self.n_heads = 12
        self.n_kv_heads = 12
        self.max_seq_len = 512
        self.norm_eps = 1e-5
        self.multiple_of = 64
        self.ffn_dim_multiplier = 4
        self.max_batch_size = 32
        self.n_layers = 12

args = ModelArgs()
tokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))
bitnet_llama = BitNetLLAMA(args)
output = bitnet_llama(tokens, start_pos=0)
print(output.shape)
```
### Using MoE Transformer
```python
import torch
from isoai.isobits.MoE_bitnet import MoETransformer
class ModelArgs:
    def __init__(self):
        self.vocab_size = 30522
        self.dim = 768
        self.n_heads = 12
        self.n_kv_heads = 12
        self.max_seq_len = 512
        self.norm_eps = 1e-5
        self.multiple_of = 64
        self.ffn_dim_multiplier = 4
        self.max_batch_size = 32
        self.n_layers = 12
        self.num_experts = 4  # Number of experts in MoE layers

args = ModelArgs()
tokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))
moe_transformer = MoETransformer(args)
output = moe_transformer(tokens, start_pos=0)
print(output.shape)
```
### Containerizing Your Model
```python
from isoai.isodeploy.containerize import Containerize
model_path = "isoai"
output_path = "Dockerfile"
containerize = Containerize(model_path)
containerize.run(output_path)
```
### Containerizing with OS Detection
```python
from isoai.isodeploy.oscontainerization import OSContainerize
model_path = "isoai"
output_path = "Dockerfile"
containerize = OSContainerize(model_path)
containerize.run(output_path)
```
### Tracing Variables
```python
from isoai.isotrace.autotrace import Autotrace
search_path = "isoai"
output_file = "output.json"
autotrace = Autotrace(search_path)
autotrace.run(output_file)
```
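The tracing example above writes its results to `output.json`, which can then be inspected with Python's standard library. The exact schema of the file is not documented here, so the snippet below only loads it and peeks at the top-level structure.
```python
import json

with open("output.json") as f:
    trace = json.load(f)

# The schema isn't documented here, so check the top-level type before relying on any keys.
print(type(trace))
```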
## Raw data

```json
{
"_id": null,
"home_page": "https://github.com/iso-ai/isosdk",
"name": "isoai",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.6",
"maintainer_email": null,
"keywords": null,
"author": "Jazmia Henry",
"author_email": "isojaz@isoai.co",
"download_url": "https://files.pythonhosted.org/packages/fd/4f/6a452c7cb7f5a5473f604fbe0ed6137ebfc9b95d2a6625a35f24884dff94/isoai-0.0.3.tar.gz",
"platform": null,
"description": "# Iso AI\n**Unlock the Value of GenAI with Iso!** \n# isoai Package Documentation\n\n## Table of Contents\n1. [Overview](#overview)\n2. [Installation](#installation)\n3. [Modules](#modules)\n - [isobits](#isobits)\n - [GPT_bitnet](#gpt_bitnet)\n - [torch_bitnet](#torch_bitnet)\n - [replace_linear_layers](#replace_linear_layers)\n - [LLama_bitnet](#llama_bitnet)\n - [MoE_bitnet](#moe_bitnet)\n - [isodeploy](#isodeploy)\n - [containerize](#containerize)\n - [oscontainerization](#oscontainerization)\n - [isotrace](#isotrace)\n4. [Usage Examples](#usage-examples)\n - [Replacing Linear Layers with TorchBitNet](#replacing-linear-layers-with-torchbitnet)\n - [Using BitNetLLAMA](#using-bitnetllama)\n - [Using MoE Transformer](#using-moe-transformer)\n - [Containerizing Your Model](#containerizing-your-model)\n - [Containerizing with OS Detection](#containerizing-with-os-detection)\n - [Tracing Variables](#tracing-variables)\n5. [Contributing](#contributing)\n6. [Support](#support)\n7. [License](#license)\n8. [Acknowledgments](#acknowledgments)\n\n## Overview\nThe `isoai` package is designed to facilitate the integration, transformation, deployment, and tracing of AI models. It provides a suite of tools that make it easy to work with different types of neural network architectures and deploy them in a containerized environment.\n\n## Installation\nTo install the `isoai` package, use the following command:\n```bash\npip install isoai\n```\n\n## Modules\n\n### isobits\n\n#### GPT_bitnet\nThe `GPT_bitnet` module contains implementations of transformers using the BitNet architecture.\n\n- **Transformer**: Main transformer class.\n- **TransformerBlock**: Individual transformer block class.\n\n#### torch_bitnet\nThe `torch_bitnet` module provides implementations for RMSNorm and TorchBitNet layers.\n\n- **RMSNorm**: Root Mean Square Layer Normalization.\n- **TorchBitNet**: Implementation of BitNet layers using PyTorch.\n\n#### replace_linear_layers\nThe `replace_linear_layers` module provides functionality to replace traditional linear layers in a model with TorchBitNet layers.\n\n- **replace_linears_with_torchbitnet**: Function to replace linear layers with TorchBitNet layers.\n\n#### LLama_bitnet\nThe `LLama_bitnet` module includes the BitNetLLAMA model, which leverages the BitNet architecture.\n\n- **BitNetLLAMA**: Implementation of the LLAMA model using BitNet.\n\n#### MoE_bitnet\nThe `MoE_bitnet` module implements the Mixture of Experts (MoE) architecture using BitNet.\n\n- **MoETransformer**: Transformer model with MoE.\n- **MoETransformerBlock**: Individual MoE transformer block.\n\n### isodeploy\n\n#### containerize\nThe `containerize` module helps in containerizing models into Docker containers.\n\n- **Containerize**: Main class for containerizing models.\n\n#### oscontainerization\nThe `oscontainerization` module extends containerization functionality with automatic OS detection.\n\n- **OSContainerize**: Class for containerizing models with OS detection.\n\n### isotrace\n\nThe `isotrace` module provides tools for tracing and logging variables within the code.\n\n- **Autotrace**: Class for automatic tracing and logging of variables.\n\n## Usage Examples\n\n### Replacing Linear Layers with TorchBitNet\n```python\nimport torch\nfrom isoai.isobits.GPT_bitnet import Transformer\nfrom isoai.isobits.replace_linear_layers import replace_linears_with_torchbitnet\n\nclass ModelArgs:\n def __init__(self):\n self.vocab_size = 30522\n self.dim = 768\n self.n_heads = 12\n self.n_kv_heads = 
12\n self.max_seq_len = 512\n self.norm_eps = 1e-5\n self.multiple_of = 64\n self.ffn_dim_multiplier = 4\n self.n_layers = 12\n self.max_batch_size = 32\n\nargs = ModelArgs()\ntokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))\ntransformer = Transformer(args)\noutput = transformer(tokens, start_pos=0)\nprint(\"Original Transformer Description: \", transformer)\nreplace_linears_with_torchbitnet(transformer, norm_dim=10)\nprint(\"Bitnet Transformer Description: \", transformer)\n```\n\n### Using BitNetLLAMA\n```python\nimport torch\nfrom isoai.isobits.LLama_bitnet import BitNetLLAMA\n\nclass ModelArgs:\n def __init__(self):\n self.vocab_size = 30522\n self.dim = 768\n self.n_heads = 12\n self.n_kv_heads = 12\n self.max_seq_len = 512\n self.norm_eps = 1e-5\n self.multiple_of = 64\n self.ffn_dim_multiplier = 4\n self.max_batch_size = 32\n self.n_layers = 12\n\nargs = ModelArgs()\ntokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))\nbitnet_llama = BitNetLLAMA(args)\noutput = bitnet_llama(tokens, start_pos=0)\nprint(output.shape)\n```\n\n### Using MoE Transformer\n```python\nimport torch\nfrom isoai.isobits.MoE_bitnet import MoETransformer\n\nclass ModelArgs:\n def __init__(self):\n self.vocab_size = 30522\n self.dim = 768\n self.n_heads = 12\n self.n_kv_heads = 12\n self.max_seq_len = 512\n self.norm_eps = 1e-5\n self.multiple_of = 64\n self.ffn_dim_multiplier = 4\n self.max_batch_size = 32\n self.n_layers = 12\n self.num_experts = 4 # Number of experts in MoE layers\n\nargs = ModelArgs()\ntokens = torch.randint(0, args.vocab_size, (2, args.max_seq_len))\nmoe_transformer = MoETransformer(args)\noutput = moe_transformer(tokens, start_pos=0)\nprint(output.shape)\n```\n\n### Containerizing Your Model\n```python\nfrom isoai.isodeploy.containerize import Containerize\n\nmodel_path = \"isoai\"\noutput_path = \"Dockerfile\"\n\ncontainerize = Containerize(model_path)\ncontainerize.run(output_path)\n```\n\n### Containerizing with OS Detection\n```python\nfrom isoai.isodeploy.oscontainerization import OSContainerize\n\nmodel_path = \"isoai\"\noutput_path = \"Dockerfile\"\n\ncontainerize = OSContainerize(model_path)\ncontainerize.run(output_path)\n```\n\n### Tracing Variables\n```python\nfrom isoai.isotrace.autotrace import Autotrace\n\nsearch_path = \"isoai\"\noutput_file = \"output.json\"\n\nautotrace = Autotrace(search_path)\nautotrace.run(output_file)\n```\n\n\n\n\n\n",
"bugtrack_url": null,
"license": null,
"summary": "Compress, Trace and Deploy Your Models with Iso!",
"version": "0.0.3",
"project_urls": {
"Homepage": "https://github.com/iso-ai/isosdk"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "d3f561c9080af7dfc3f515449a90dfe562cdb9410286b137afd7d07253ea449e",
"md5": "6d4592ffb770508f86bbe4f09a6fb611",
"sha256": "6468acd8399199fbd0e8a864f5b6d82a126b4da757f26bd7bcb50476cc7ab568"
},
"downloads": -1,
"filename": "isoai-0.0.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "6d4592ffb770508f86bbe4f09a6fb611",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.6",
"size": 8346,
"upload_time": "2024-07-30T07:54:28",
"upload_time_iso_8601": "2024-07-30T07:54:28.064287Z",
"url": "https://files.pythonhosted.org/packages/d3/f5/61c9080af7dfc3f515449a90dfe562cdb9410286b137afd7d07253ea449e/isoai-0.0.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "fd4f6a452c7cb7f5a5473f604fbe0ed6137ebfc9b95d2a6625a35f24884dff94",
"md5": "54f2203d90600b2ede14b7119c25b0d2",
"sha256": "b77b11ed4638f14ed4930e4146992b46e8a44c4c32ffb1b5438a44f6705000e7"
},
"downloads": -1,
"filename": "isoai-0.0.3.tar.gz",
"has_sig": false,
"md5_digest": "54f2203d90600b2ede14b7119c25b0d2",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.6",
"size": 16265,
"upload_time": "2024-07-30T07:54:29",
"upload_time_iso_8601": "2024-07-30T07:54:29.412757Z",
"url": "https://files.pythonhosted.org/packages/fd/4f/6a452c7cb7f5a5473f604fbe0ed6137ebfc9b95d2a6625a35f24884dff94/isoai-0.0.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-07-30 07:54:29",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "iso-ai",
"github_project": "isosdk",
"github_not_found": true,
"lcname": "isoai"
}
```