# ml-experiment-utils
**Opinionated utilities for quick ML and deep learning experiments**
A personal collection of reusable utilities designed to streamline machine learning and deep learning experimentation workflows. This package provides ready-to-use components for common ML tasks, with built-in best practices and sensible defaults.
## Features
### Training Loops
- **`simple_cls_train_v1`**: Complete training loop for classification tasks (see the sketch after this list) with:
  - Automatic device detection (CUDA/CPU)
  - Cosine annealing learning rate scheduling
  - Built-in Weights & Biases integration for experiment tracking
  - Periodic evaluation and logging
  - Progress tracking with tqdm
  - Exponentially weighted moving averages for metrics
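
A loop of this shape typically wires together device selection, an optimizer, a cosine schedule, and per-step logging. The sketch below only illustrates that shape; the optimizer choice (AdamW) and the exact W&B/scheduler setup are assumptions, not the package's actual implementation.

```python
# Illustrative sketch of what a simple_cls_train_v1-style loop wires together.
# AdamW and the exact wandb/scheduler setup are assumptions, not the real code.
import torch
import wandb
from torch.optim.lr_scheduler import CosineAnnealingLR
from tqdm import tqdm


def sketch_cls_train(model, train_loader, epochs=10, lr=5e-4, name="sketch"):
    device = "cuda" if torch.cuda.is_available() else "cpu"  # automatic device detection
    model = model.to(device)
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    scheduler = CosineAnnealingLR(optimizer, T_max=epochs * len(train_loader))
    loss_fn = torch.nn.CrossEntropyLoss()
    wandb.init(project=name)  # experiment tracking
    for _ in range(epochs):
        for x, y in tqdm(train_loader):  # progress tracking
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
            scheduler.step()  # cosine annealing advances every optimizer step
            wandb.log({"train/loss": loss.item(), "lr": scheduler.get_last_lr()[0]})
```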
### Data Utilities
- **`CIFAR_for_torch`**: Pre-configured CIFAR-10 dataset (transform pipeline sketched after this list) with:
  - Standard normalization (channel-wise mean/std)
  - Data augmentation for training (ColorJitter, horizontal flip, rotation)
  - Ready-to-use PyTorch DataLoaders
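
For reference, a training pipeline along those lines might look like the following; the augmentation parameters and normalization statistics here are commonly used CIFAR-10 values, not necessarily the ones `CIFAR_for_torch` ships with.

```python
# Illustrative transform stack in the spirit of CIFAR_for_torch's training pipeline.
# Exact augmentation parameters and stats are assumptions, not the package's values.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

train_tf = transforms.Compose([
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ToTensor(),
    # Commonly cited CIFAR-10 channel-wise mean/std
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2470, 0.2435, 0.2616)),
])

train_set = datasets.CIFAR10(root="./data", train=True, download=True, transform=train_tf)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=2)
```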
### Helper Classes
- **`ApproximateSlidingAverage`**: Efficient exponentially weighted moving average for metric tracking (idea sketched below)
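
The idea behind such a tracker is a one-line update; the snippet below is a stand-alone illustration of the math, not the actual `ApproximateSlidingAverage` API (the class and parameter names are made up).

```python
# Stand-in illustration of an exponentially weighted moving average tracker;
# the real ApproximateSlidingAverage interface may differ.
class EwmaTracker:
    def __init__(self, alpha: float = 0.05):
        self.alpha = alpha  # weight given to the newest observation
        self.value = None   # current smoothed estimate

    def update(self, x: float) -> float:
        # Seed with the first observation, then blend new values in with weight alpha.
        self.value = x if self.value is None else (1 - self.alpha) * self.value + self.alpha * x
        return self.value


# Usage: smooth a noisy per-step loss for display or logging.
loss_avg = EwmaTracker(alpha=0.1)
for step_loss in [2.3, 2.1, 2.4, 1.9]:
    smoothed = loss_avg.update(step_loss)
```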
## Installation
```bash
pip install ml-experiment-utils
```
Or with uv:
```bash
uv add ml-experiment-utils
```
## Quick Start
### Training a Model on CIFAR-10
```python
import torch.nn as nn
from ml_experiment_utils.experiments.data import CIFAR_for_torch
from ml_experiment_utils.experiments.train_loops import simple_cls_train_v1

# Load data
train_loader, test_loader = CIFAR_for_torch(batch_size=128)

# Define your model
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3*32*32, 512),
    nn.ReLU(),
    nn.Linear(512, 10)
)

# Train with built-in logging and evaluation
simple_cls_train_v1(
    model=model,
    epochs=10,
    eval_steps=100,
    train_loader=train_loader,
    test_loader=test_loader,
    lr=5e-4,
    name="my-experiment"
)
```
### Using the Data Loaders
```python
from ml_experiment_utils.experiments.data import CIFAR_for_torch

# Get pre-configured CIFAR-10 loaders
train_loader, test_loader = CIFAR_for_torch(
    batch_size=64,
    root="./my_data"
)

# Start training immediately
for images, labels in train_loader:
    # Your training code here
    pass
```
## Requirements
- Python 3.11+
- PyTorch (with torchvision)
- Weights & Biases (`wandb`)
- tqdm
## Design Philosophy
This package is designed with the following principles:
1. **Opinionated but flexible**: Sensible defaults that work well for most cases, but configurable when needed
2. **Batteries included**: Everything you need to start experimenting quickly
3. **Experiment tracking first**: Built-in W&B integration for reproducibility
4. **Minimal boilerplate**: Focus on model architecture, not training infrastructure
## Use Cases
This package is ideal for:
- Quick prototyping of ML models
- Educational projects and learning
- Baseline implementations for research
- Rapid iteration on model architectures
## Contributing
This is primarily a personal utility package, but suggestions and improvements are welcome! Feel free to open an issue or submit a pull request.
## License
MIT License - see LICENSE file for details.