Name | peft-singlora |
Version | 0.2.0 |
home_page | None |
Summary | SingLoRA: Single Low-Rank Adaptation for PEFT |
upload_time | 2025-07-23 01:38:42 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.8 |
license | BSD 2-Clause License
Copyright (c) 2025, bagheera
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
keywords | peft, lora, singlora, fine-tuning, transformers, pytorch |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# PEFT-SingLoRA
[PyPI version](https://badge.fury.io/py/peft-singlora)
[License: BSD 2-Clause](https://opensource.org/licenses/BSD-2-Clause)
[Python 3.8+](https://www.python.org/downloads/)
SingLoRA (Single Low-Rank Adaptation) is an efficient alternative to traditional LoRA that uses a single low-rank matrix instead of two, reducing parameters while maintaining performance. This package provides a PEFT-compatible implementation of SingLoRA based on [kyegomez's implementation](https://github.com/kyegomez/SingLoRA).
## Key Features
- 🚀 **50% fewer parameters** than standard LoRA
- 🔧 **Fully compatible with PEFT** ecosystem
- 📊 **Mathematically equivalent** to standard LoRA
- 🎯 **Easy integration** with existing PEFT workflows
## Installation
```bash
pip install peft-singlora
```
## Quick Start
Here is a deliberately simplified training loop that shows when to update the global step:
```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model
from peft_singlora import setup_singlora, update_singlora_global_step
# Load your model
model = AutoModelForCausalLM.from_pretrained("your-model-name")
# Setup SingLoRA (this registers it with PEFT)
setup_singlora()
# Configure LoRA as usual - it will use SingLoRA under the hood
config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.1,
)

# Create PEFT model - will automatically use SingLoRA for linear layers
peft_model = get_peft_model(model, config)

# Training loop with proper step tracking
# (num_epochs, dataloader, and optimizer are assumed to be defined elsewhere)
gradient_accumulation_steps = 4
global_step = 0

for epoch in range(num_epochs):
    for batch_idx, batch in enumerate(dataloader):
        # Forward pass
        outputs = peft_model(**batch)
        loss = outputs.loss / gradient_accumulation_steps

        # Backward pass
        loss.backward()

        # Update weights and global step every N batches
        if (batch_idx + 1) % gradient_accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()

            # Update SingLoRA step counter after optimizer step
            update_singlora_global_step(peft_model, global_step)
            global_step += 1
```
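Note that the counter only needs to advance once per optimizer step rather than once per micro-batch, which is why `update_singlora_global_step` is called inside the gradient-accumulation branch, right after `optimizer.step()`.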
## How It Works
Traditional LoRA uses two matrices (A and B) for the low-rank decomposition:
```
W = W_0 + BA
```
SingLoRA uses a single matrix A with a symmetric decomposition:
```
W = W_0 + α/r * A @ A^T
```
This reduces trainable parameters from `2 * d * r` to `d * r` while maintaining the same expressive power.
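The following standalone PyTorch sketch (an illustration of the formulas above, not this package's internals; `W0`, `A`, `d`, `r`, and `alpha` are names chosen just for this example) makes the symmetric update and the parameter saving concrete for a square `d x d` weight:

```python
import torch

d, r, alpha = 1024, 8, 32

W0 = torch.randn(d, d)        # frozen pretrained weight
A = torch.randn(d, r) * 0.01  # the single trainable low-rank factor

standard_lora_params = 2 * d * r  # B (d x r) plus A (r x d) -> 16,384
singlora_params = d * r           # A alone                  ->  8,192

delta_W = (alpha / r) * (A @ A.T)  # symmetric low-rank update, shape (d, d)
W = W0 + delta_W                   # effective weight seen by the forward pass

print(standard_lora_params, singlora_params, W.shape)
```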
## Advanced Usage
### Custom Configuration
```python
from peft_singlora import SingLoRAConfig
config = SingLoRAConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj", "k_proj"],
    lora_dropout=0.1,
    ramp_up_steps=1000,  # Gradually increase adaptation strength
)
```
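Based on the inline comment above, `ramp_up_steps` appears to control how many global steps it takes for the adaptation strength to reach full scale, which is why the Quick Start keeps `update_singlora_global_step` in sync with the optimizer; check the package source for the exact schedule.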
### Manual Integration
```python
import torch.nn as nn
from peft_singlora import Linear as SingLoRALinear

# Register custom module mapping
# (`config` is the LoraConfig / SingLoRAConfig instance created above)
custom_module_mapping = {nn.Linear: SingLoRALinear}
config._register_custom_module(custom_module_mapping)
```
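Registering the mapping uses PEFT's custom-module dispatch, so that when `get_peft_model` builds the adapter it wraps the targeted `nn.Linear` layers with `SingLoRALinear` instead of the stock LoRA layer.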
## Examples
Check out the [examples/](https://github.com/bghira/PEFT-SingLoRA/tree/main/examples) directory for:
- Basic usage with different model architectures
- Fine-tuning examples with real datasets
- Performance comparisons with standard LoRA
## Citation
If you use SingLoRA in your research, please cite:
```bibtex
@misc{bensaïd2025singloralowrankadaptation,
  title={SingLoRA: Low Rank Adaptation Using a Single Matrix},
  author={David Bensaïd and Noam Rotstein and Roy Velich and Daniel Bensaïd and Ron Kimmel},
  year={2025},
  eprint={2507.05566},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2507.05566},
}
```
## Contributing
We welcome contributions! Please see our [Contributing Guidelines](CONTRIBUTING.md) for details.
## License
This project is licensed under the BSD 2-Clause License - see the [LICENSE](LICENSE) file for details.
Raw data
{
"_id": null,
"home_page": null,
"name": "peft-singlora",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "bghira <bghira@users.github.com>",
"keywords": "peft, lora, singlora, fine-tuning, transformers, pytorch",
"author": null,
"author_email": "bghira <bghira@users.github.com>",
"download_url": "https://files.pythonhosted.org/packages/7c/ee/6d916a01bf2ce1f8f306d0133c4165170efced887da386c22266a81da683/peft_singlora-0.2.0.tar.gz",
"platform": null,
"description": "# PEFT-SingLoRA\n\n[](https://badge.fury.io/py/peft-singlora)\n[](https://opensource.org/licenses/BSD-2-Clause)\n[](https://www.python.org/downloads/)\n\nSingLoRA (Single Low-Rank Adaptation) is an efficient alternative to traditional LoRA that uses a single low-rank matrix instead of two, reducing parameters while maintaining performance. This package provides a PEFT-compatible implementation of SingLoRA based on [kyegomez's implementation](https://github.com/kyegomez/SingLoRA).\n\n## Key Features\n\n- \ud83d\ude80 **50% fewer parameters** than standard LoRA\n- \ud83d\udd27 **Fully compatible with PEFT** ecosystem\n- \ud83d\udcca **Mathematically equivalent** to standard LoRA\n- \ud83c\udfaf **Easy integration** with existing PEFT workflows\n\n## Installation\n\n```bash\npip install peft-singlora\n```\n\n## Quick Start\n\nHere is an extremely simplified training loop example to help understand when to update the global step:\n\n```python\nfrom transformers import AutoModelForCausalLM\nfrom peft import LoraConfig, get_peft_model\nfrom peft_singlora import setup_singlora, update_singlora_global_step\n\n# Load your model\nmodel = AutoModelForCausalLM.from_pretrained(\"your-model-name\")\n\n# Setup SingLoRA (this registers it with PEFT)\nsetup_singlora()\n\n# Configure LoRA as usual - it will use SingLoRA under the hood\nconfig = LoraConfig(\n r=8,\n lora_alpha=32,\n target_modules=[\"q_proj\", \"v_proj\"],\n lora_dropout=0.1,\n)\n\n# Create PEFT model - will automatically use SingLoRA for linear layers\npeft_model = get_peft_model(model, config)\n\n# Training loop with proper step tracking\ngradient_accumulation_steps = 4\nglobal_step = 0\n\nfor epoch in range(num_epochs):\n for batch_idx, batch in enumerate(dataloader):\n # Forward pass\n outputs = peft_model(**batch)\n loss = outputs.loss / gradient_accumulation_steps\n \n # Backward pass\n loss.backward()\n \n # Update weights and global step every N batches\n if (batch_idx + 1) % gradient_accumulation_steps == 0:\n optimizer.step()\n optimizer.zero_grad()\n \n # Update SingLoRA step counter after optimizer step\n update_singlora_global_step(peft_model, global_step)\n global_step += 1\n```\n\n## How It Works\n\nTraditional LoRA uses two matrices (A and B) for the low-rank decomposition:\n```\nW = W_0 + BA\n```\n\nSingLoRA uses a single matrix A with a symmetric decomposition:\n```\nW = W_0 + \u03b1/r * A @ A^T\n```\n\nThis reduces trainable parameters from `2 * d * r` to `d * r` while maintaining the same expressive power.\n\n## Advanced Usage\n\n### Custom Configuration\n\n```python\nfrom peft_singlora import SingLoRAConfig\n\nconfig = SingLoRAConfig(\n r=16,\n lora_alpha=32,\n target_modules=[\"q_proj\", \"v_proj\", \"k_proj\"],\n lora_dropout=0.1,\n ramp_up_steps=1000, # Gradually increase adaptation strength\n)\n```\n\n### Manual Integration\n\n```python\nimport torch.nn as nn\nfrom peft_singlora import Linear as SingLoRALinear\n\n# Register custom module mapping\ncustom_module_mapping = {nn.Linear: SingLoRALinear}\nconfig._register_custom_module(custom_module_mapping)\n```\n\n## Examples\n\nCheck out the [examples/](https://github.com/bghira/PEFT-SingLoRA/tree/main/examples) directory for:\n- Basic usage with different model architectures\n- Fine-tuning examples with real datasets\n- Performance comparisons with standard LoRA\n\n## Citation\n\nIf you use SingLoRA in your research, please cite:\n\n```bibtex\n@misc{bensa\u00efd2025singloralowrankadaptation,\n title={SingLoRA: Low Rank Adaptation Using a Single Matrix}, 
\n author={David Bensa\u00efd and Noam Rotstein and Roy Velich and Daniel Bensa\u00efd and Ron Kimmel},\n year={2025},\n eprint={2507.05566},\n archivePrefix={arXiv},\n primaryClass={cs.AI},\n url={https://arxiv.org/abs/2507.05566}, \n}\n```\n\n## Contributing\n\nWe welcome contributions! Please see our [Contributing Guidelines](CONTRIBUTING.md) for details.\n\n## License\n\nThis project is licensed under the BSD 2-Clause License - see the [LICENSE](LICENSE) file for details.\n",
"bugtrack_url": null,
"license": "BSD 2-Clause License\n \n Copyright (c) 2025, bagheera\n \n Redistribution and use in source and binary forms, with or without\n modification, are permitted provided that the following conditions are met:\n \n 1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n \n 2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n \n THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n ",
"summary": "SingLoRA: Single Low-Rank Adaptation for PEFT",
"version": "0.2.0",
"project_urls": {
"Documentation": "https://github.com/bghira/PEFT-SingLoRA#readme",
"Homepage": "https://github.com/bghira/PEFT-SingLoRA",
"Issues": "https://github.com/bghira/PEFT-SingLoRA/issues",
"Repository": "https://github.com/bghira/PEFT-SingLoRA"
},
"split_keywords": [
"peft",
" lora",
" singlora",
" fine-tuning",
" transformers",
" pytorch"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "65705450aea2b71700ed110c36b54ed3be8b0fbe639a1663f4ad961c8d47a865",
"md5": "c300494d58614d692e12b3515f041a1e",
"sha256": "1ddc546eb18f9d6f4e0d80a56f224a0c852de5b72234408a58e1ca3669f9b703"
},
"downloads": -1,
"filename": "peft_singlora-0.2.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "c300494d58614d692e12b3515f041a1e",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 9583,
"upload_time": "2025-07-23T01:38:41",
"upload_time_iso_8601": "2025-07-23T01:38:41.527319Z",
"url": "https://files.pythonhosted.org/packages/65/70/5450aea2b71700ed110c36b54ed3be8b0fbe639a1663f4ad961c8d47a865/peft_singlora-0.2.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "7cee6d916a01bf2ce1f8f306d0133c4165170efced887da386c22266a81da683",
"md5": "c74726cc51c815ba8f4b1119b3891d52",
"sha256": "f90eda649ff354ae9aa6777e6761e98c8eb56a7b265ba1e734203a318c2716b7"
},
"downloads": -1,
"filename": "peft_singlora-0.2.0.tar.gz",
"has_sig": false,
"md5_digest": "c74726cc51c815ba8f4b1119b3891d52",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 12271,
"upload_time": "2025-07-23T01:38:42",
"upload_time_iso_8601": "2025-07-23T01:38:42.646747Z",
"url": "https://files.pythonhosted.org/packages/7c/ee/6d916a01bf2ce1f8f306d0133c4165170efced887da386c22266a81da683/peft_singlora-0.2.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-23 01:38:42",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "bghira",
"github_project": "PEFT-SingLoRA#readme",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "peft-singlora"
}