# lora-pytorch

A simple but robust implementation of [LoRA (Low-Rank Adaptation)](https://arxiv.org/pdf/2106.09685.pdf) for PyTorch, which depends only on PyTorch itself! No dependency on `transformers` or other packages.
* Compatible with LLMs, CNNs, MLPs, and other model types ✔️
* Strongly typed ✔️
* Fully tested ✔️
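
For context, here is the core idea from the linked paper (a paraphrase of the paper, not a description of this package's internals): LoRA freezes a pretrained weight matrix and trains only a low-rank additive update, which is what the `rank` argument in the examples below controls.

```latex
% LoRA formulation from the paper: freeze W, train only A and B.
% (The paper also scales the update by a constant alpha / r, omitted here.)
h = W x + B A x, \qquad
B \in \mathbb{R}^{d \times r}, \quad
A \in \mathbb{R}^{r \times k}, \quad
r \ll \min(d, k)
```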


## Install

PyPI:
```bash
pip install lora-pytorch
```

From source:
```bash
pip install "lora-pytorch @ git+ssh://git@github.com/fkodom/lora-pytorch.git"
```

For contributors:
```bash
# Clone repository
gh repo clone fkodom/lora-pytorch
# Install all dev dependencies (tests etc.)
cd lora-pytorch
pip install -e ".[all]"
# Setup pre-commit hooks
pre-commit install
```


## Usage

```python
import torch
from lora_pytorch import LoRA
from torchvision.models import resnet18, ResNet

# Wrap your model with LoRA
model = resnet18()
lora_model = LoRA.from_module(model, rank=5)

print(lora_model)
# LoRA(
#   (module): ResNet(
#     (conv1): LoRA(
#       (module): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
#       (lora_module): Conv2dLoRAModule(
#         (in_conv): Conv2d(3, 5, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
#         (out_conv): Conv2d(5, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
#         (dropout): Dropout(p=0.0, inplace=False)
#       )
#     )
#     (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
#     (relu): ReLU(inplace=True)
# ...

# Train or predict as usual.
x = torch.randn(1, 3, 224, 224)
y = lora_model(x)
# compute loss, backprop, etc...

# Merge LoRA weights into the original model.
new_model = lora_model.merge_lora(inplace=False)  # default: inplace=False

# NOTE: new_model has the same type as the original model!  Inference is just as
# fast as in the original model.
assert isinstance(new_model, ResNet)
```
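
A minimal fine-tuning sketch, assuming a standard PyTorch training loop (the optimizer, loss, and batch below are illustrative, not part of this package). Since the original weights do not receive gradients, the optimizer can be restricted to the trainable LoRA parameters:

```python
import torch

# Optimizer over the LoRA parameters only -- the originals are frozen.
optimizer = torch.optim.AdamW(
    (p for p in lora_model.parameters() if p.requires_grad),
    lr=1e-3,
)
criterion = torch.nn.CrossEntropyLoss()

batch = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 1000, (8,))  # resnet18 defaults to 1000 classes

optimizer.zero_grad()
loss = criterion(lora_model(batch), labels)
loss.backward()
optimizer.step()
```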

### Advanced Usage

Enable or disable `LoRA` as needed (e.g., to run the original model without adapters).

**NOTE**: `LoRA` does *not* track gradients for the original model's parameters, so with LoRA disabled the output does not require gradients.
```python
# Disable
lora_model.disable_lora()
y = lora_model(x)
print(y.requires_grad)
# False

# Re-enable
lora_model.enable_lora()
y = lora_model(x)
print(y.requires_grad)
# True
```

Remove `LoRA` from the model.

**NOTE**: The original model weights will be unchanged.
```python
# Remove
original_model = lora_model.remove_lora(inplace=False)  # default: inplace=False
assert isinstance(original_model, ResNet)
```
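
To make the contrast with `merge_lora` concrete, here is a small sketch (reusing `x` from above; the tolerance is illustrative): the merged model should reproduce the LoRA-enabled forward pass, while the removed model matches the wrapper with LoRA disabled.

```python
merged = lora_model.merge_lora(inplace=False)
original = lora_model.remove_lora(inplace=False)
for m in (lora_model, merged, original):
    m.eval()  # fix BatchNorm/dropout so outputs are comparable

with torch.no_grad():
    lora_model.enable_lora()
    assert torch.allclose(merged(x), lora_model(x), atol=1e-4)
    lora_model.disable_lora()
    assert torch.allclose(original(x), lora_model(x), atol=1e-4)
```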


## Supported Layers

Layer | Supported
--- | ---
`nn.Linear` | ✅
`nn.Embedding` | ✅
`nn.MultiheadAttention` | ✅
`nn.TransformerEncoder` | ✅
`nn.TransformerEncoderLayer` | ✅
`nn.TransformerDecoder` | ✅
`nn.TransformerDecoderLayer` | ✅
`nn.Transformer` | ✅
`nn.Conv1d` | ✅
`nn.Conv2d` | ✅
`nn.Conv3d` | ✅
`nn.ConvTranspose1d` | ❌
`nn.ConvTranspose2d` | ❌
`nn.ConvTranspose3d` | ❌

**NOTE**: Activation, normalization, dropout, and similar layers are not affected by `LoRA`. They are not listed here, but you shouldn't have any problems using them.
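
As a quick sketch of what that means in practice (the toy MLP below is illustrative), wrapping a model adapts only the supported layers; the `ReLU` and `LayerNorm` pass through untouched, and only the adapter weights should require gradients:

```python
import torch.nn as nn
from lora_pytorch import LoRA

mlp = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),         # unaffected by LoRA
    nn.LayerNorm(32),  # unaffected by LoRA
    nn.Linear(32, 4),
)
lora_mlp = LoRA.from_module(mlp, rank=4)

# Only the low-rank adapter weights are trainable; originals stay frozen.
trainable = sum(p.numel() for p in lora_mlp.parameters() if p.requires_grad)
total = sum(p.numel() for p in lora_mlp.parameters())
print(f"trainable parameters: {trainable} / {total}")
```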

## TODO

* Add support for `ConvTranspose` layers.
* Experiments with large, pretrained models
    * Specifically, models that are not covered by LoRA in [huggingface/transformers](https://github.com/huggingface/transformers).
    * Lots of CV examples: ResNet, ViT, DETR, UNET, DeepLab, etc.

            
