| Name | l0-python |
| Version | 0.1.0 |
| Summary | L0 regularization for sparse neural networks and intelligent sampling |
| upload_time | 2025-08-18 09:17:06 |
| maintainer | None |
| docs_url | None |
| home_page | None |
| author | None |
| requires_python | >=3.13 |
| license | MIT License (full text follows) |
Copyright (c) 2025 PolicyEngine
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
| keywords | l0, regularization, sparsity, neural networks, calibration |
| requirements | No requirements were recorded. |
# L0 Regularization
A PyTorch implementation of L0 regularization for neural network sparsification and intelligent sampling, based on [Louizos, Welling, & Kingma (2017)](https://arxiv.org/abs/1712.01312).
## Features
- **Hard Concrete Distribution**: Differentiable approximation of L0 norm
- **Sparse Neural Network Layers**: L0Linear, L0Conv2d with automatic pruning
- **Intelligent Sampling**: Sample/feature selection gates for calibration
- **L0L2 Combined Penalty**: Recommended approach to prevent overfitting
- **Temperature Scheduling**: Annealing for improved convergence
- **TDD Development**: Comprehensive test coverage
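The Hard Concrete distribution listed above is the core trick: gates sampled from a stretched, rectified sigmoid take exact zero (and one) values while remaining differentiable in their parameters. A minimal sketch, following the notation of Louizos et al. (2017) — the constant names and function signatures here are illustrative, not this package's API:

```python
import torch

# Stretch interval from the paper; values outside [0, 1] are clamped,
# which is what produces exact zeros (and ones) in the gates.
GAMMA, ZETA = -0.1, 1.1

def sample_hard_concrete(log_alpha, beta=0.66):
    """Draw gates z in [0, 1] with the reparameterization trick."""
    u = torch.rand_like(log_alpha).clamp(1e-6, 1 - 1e-6)
    s = torch.sigmoid((u.log() - (1 - u).log() + log_alpha) / beta)
    s_bar = s * (ZETA - GAMMA) + GAMMA    # stretch to (gamma, zeta)
    return s_bar.clamp(0.0, 1.0)          # hard rectification

def expected_l0(log_alpha, beta=0.66):
    """Differentiable expected number of non-zero gates (the L0 penalty)."""
    return torch.sigmoid(log_alpha - beta * torch.log(torch.tensor(-GAMMA / ZETA))).sum()

log_alpha = torch.zeros(1000, requires_grad=True)
z = sample_hard_concrete(log_alpha)       # exact zeros appear despite differentiability
penalty = expected_l0(log_alpha)
penalty.backward()                        # gradients reach log_alpha
```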
## Installation
```bash
pip install l0-python
```
For development:
```bash
git clone https://github.com/PolicyEngine/L0.git
cd L0
pip install -e ".[dev]"
```
## Quick Start
### Neural Network Sparsification
```python
import torch
from l0 import L0Linear, compute_l0l2_penalty, TemperatureScheduler, update_temperatures

# Create a sparse model
class SparseModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = L0Linear(784, 256, init_sparsity=0.5)
        self.fc2 = L0Linear(256, 10, init_sparsity=0.7)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = SparseModel()
optimizer = torch.optim.Adam(model.parameters())
scheduler = TemperatureScheduler(initial_temp=1.0, final_temp=0.1)

# Training loop (assumes input_data, target, and criterion are defined)
for epoch in range(100):
    # Update temperature
    temp = scheduler.get_temperature(epoch)
    update_temperatures(model, temp)

    # Forward pass
    output = model(input_data)
    ce_loss = criterion(output, target)

    # Add L0L2 penalty
    penalty = compute_l0l2_penalty(model, l0_lambda=1e-3, l2_lambda=1e-4)
    loss = ce_loss + penalty

    # Backward pass
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```
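For intuition, an L0+L2 combined penalty of the kind used above typically sums the expected number of active weights with an L2 term weighted by the gate-open probabilities, so only weights expected to survive pruning are decayed. This is a hedged sketch of the math, not the package's actual `compute_l0l2_penalty` implementation:

```python
import torch

def l0l2_penalty_sketch(gate_probs, weights, l0_lambda=1e-3, l2_lambda=1e-4):
    """Illustrative L0 + L2 penalty for a single layer.

    gate_probs: per-weight probability that the gate is open.
    weights: the layer's weight tensor.
    """
    l0_term = gate_probs.sum()                       # expected count of active weights
    l2_term = (gate_probs * weights.pow(2)).sum()    # decay only weights likely to survive
    return l0_lambda * l0_term + l2_lambda * l2_term

penalty = l0l2_penalty_sketch(torch.full((256,), 0.5), torch.randn(256))
```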
### Intelligent Sample Selection
```python
from l0 import SampleGate, HybridGate
# Pure L0 selection
gate = SampleGate(n_samples=10000, target_samples=1000)
selected_data, indices = gate.select_samples(data)
# Hybrid selection (25% L0, 75% random)
hybrid = HybridGate(
    n_items=10000,
    l0_fraction=0.25,
    random_fraction=0.75,
    target_items=1000,
)
selected, indices, types = hybrid.select(data)
```
### Feature Selection
```python
from l0 import FeatureGate
# Select top features
gate = FeatureGate(n_features=1000, max_features=50)
selected_data, feature_indices = gate.select_features(data)
# Get feature importance
importance = gate.get_feature_importance()
```
## Integration with PolicyEngine
This package is designed to work with PolicyEngine's calibration system:
```python
# In policyengine-us-data or similar
from l0 import HardConcrete
# Use for household selection in CPS calibration
gates = HardConcrete(
    len(household_weights),
    temperature=0.25,
    init_mean=0.999,  # Start with most households
)
# Apply gates during reweighting
masked_weights = weights * gates()
```
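To make the reweighting step concrete, here is a hedged, self-contained sketch of gated calibration in plain PyTorch, with soft sigmoid gates standing in for `HardConcrete`. The contribution matrix `M`, the targets, and the penalty weight are invented for illustration and are not part of this package's API:

```python
import torch

torch.manual_seed(0)
n_households, n_targets = 1000, 20
M = torch.rand(n_targets, n_households)       # hypothetical household -> target contributions
weights = torch.rand(n_households) * 100.0    # original survey weights
targets = M @ weights                         # pretend these are known administrative totals

logits = torch.zeros(n_households, requires_grad=True)  # gate parameters
opt = torch.optim.Adam([logits], lr=0.1)

for _ in range(200):
    gates = torch.sigmoid(logits)             # soft stand-in for HardConcrete gates
    masked_weights = weights * gates
    fit = ((M @ masked_weights - targets) ** 2).mean()
    loss = fit + 1e-2 * gates.sum()           # sparsity pressure on the gate count
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The fit term pulls gated totals toward the targets while the gate-count term pushes gates toward zero, which is the trade-off the calibration system tunes.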
## Documentation
Full documentation available at: https://policyengine.github.io/L0/
## Testing
Run tests with:
```bash
pytest tests/ -v --cov=l0
```
## Acknowledgments
This implementation is inspired by and builds upon the [original L0 regularization code](https://github.com/AMLab-Amsterdam/L0_regularization) by AMLab Amsterdam, which accompanied the paper by Louizos et al. (2017).
## Citation
If you use this package, please cite:
```bibtex
@article{louizos2017learning,
title={Learning Sparse Neural Networks through L0 Regularization},
author={Louizos, Christos and Welling, Max and Kingma, Diederik P},
journal={arXiv preprint arXiv:1712.01312},
year={2017}
}
```
## License
MIT License - see [LICENSE](LICENSE) file for details.
Raw data
{
"_id": null,
"home_page": null,
"name": "l0-python",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.13",
"maintainer_email": null,
"keywords": "l0, regularization, sparsity, neural networks, calibration",
"author": null,
"author_email": "PolicyEngine <hello@policyengine.org>",
"download_url": "https://files.pythonhosted.org/packages/0e/83/75d3f065770b983b2932cba851822a9e7838a140f5851b707097a1c691c3/l0_python-0.1.0.tar.gz",
"platform": null,
"bugtrack_url": null,
"summary": "L0 regularization for sparse neural networks and intelligent sampling",
"version": "0.1.0",
"project_urls": {
"Documentation": "https://github.com/PolicyEngine/L0#readme",
"Homepage": "https://github.com/PolicyEngine/L0",
"Issues": "https://github.com/PolicyEngine/L0/issues",
"Repository": "https://github.com/PolicyEngine/L0"
},
"split_keywords": [
"l0",
" regularization",
" sparsity",
" neural networks",
" calibration"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "43f786fae3394793c8bfdf5cc04d747391184d86615c8d503cc396ecdbedeaf0",
"md5": "a668a1a2ffdf5ebc310244f701251bdd",
"sha256": "6939b41a0d364e56504d3840b8426b387a2c047a299b7d6ab412db7169e768c1"
},
"downloads": -1,
"filename": "l0_python-0.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "a668a1a2ffdf5ebc310244f701251bdd",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.13",
"size": 15637,
"upload_time": "2025-08-18T09:17:04",
"upload_time_iso_8601": "2025-08-18T09:17:04.853720Z",
"url": "https://files.pythonhosted.org/packages/43/f7/86fae3394793c8bfdf5cc04d747391184d86615c8d503cc396ecdbedeaf0/l0_python-0.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "0e8375d3f065770b983b2932cba851822a9e7838a140f5851b707097a1c691c3",
"md5": "fc39125af3dc0c3dc6dce6b300975190",
"sha256": "35dc08f0f3609661aa89b617cecddafbdaa738095c89e0435ea4ed21b9bca2ac"
},
"downloads": -1,
"filename": "l0_python-0.1.0.tar.gz",
"has_sig": false,
"md5_digest": "fc39125af3dc0c3dc6dce6b300975190",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.13",
"size": 22507,
"upload_time": "2025-08-18T09:17:06",
"upload_time_iso_8601": "2025-08-18T09:17:06.007508Z",
"url": "https://files.pythonhosted.org/packages/0e/83/75d3f065770b983b2932cba851822a9e7838a140f5851b707097a1c691c3/l0_python-0.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-18 09:17:06",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "PolicyEngine",
"github_project": "L0#readme",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "l0-python"
}