zencfg

- Name: zencfg
- Version: 0.4.0
- Summary: The Zen way to configure your Python projects, deep learning and others
- Author: Jean Kossaifi
- Requires Python: >=3.9
- License: MIT
- Upload time: 2025-07-30 18:48:40
# ZenCFG

![ZenCFG Logo](docs/_static/ZenCFG_image.png)

A Zen way to configure your Python packages while keeping your sanity.

## ⚡ Quick Start

First install the library:

```bash
pip install zencfg
```

```python
from zencfg import ConfigBase, make_config_from_cli

# Define base categories
class ModelConfig(ConfigBase):
    pass

class OptimizerConfig(ConfigBase):
    pass

# Define model architectures
class TransformerConfig(ModelConfig):
    layers: int = 12
    n_heads: int = 8

class CNNConfig(ModelConfig):
    channels: list[int] = [64, 128, 256]
    kernel_size: int = 3

# Define optimizers  
class AdamConfig(OptimizerConfig):
    lr: float = 1e-4
    weight_decay: float = 0.01

class SGDConfig(OptimizerConfig):
    lr: float = 1e-3
    momentum: float = 0.9

# Compose your experiment
class ExperimentConfig(ConfigBase):
    model: ModelConfig = TransformerConfig()
    optimizer: OptimizerConfig = AdamConfig()
    batch_size: int = 32

# Get config with command-line overrides
config = make_config_from_cli(ExperimentConfig)
```

Switch between architectures and tune their specific parameters:
```bash
# Switch to CNN with specific CNN parameters
python train.py --model cnnconfig --model.channels "[32,64,128]" --model.kernel_size 5

# Try SGD with momentum
python train.py --optimizer sgdconfig --optimizer.momentum 0.95 --batch_size 128

# Mix and match everything
python train.py --model transformerconfig --model.n_heads 16 --optimizer adamconfig --optimizer.weight_decay 0.001
```

## Why ZenCFG

ZenCFG (short for *Zen ConFiGuration*) is the result of many iterations of trying pretty much every existing approach to Python configuration management and being satisfied by none of them.

The key advantages of ZenCFG are:

### **1. Native Python Tooling**
Work with configs like any other Python code—inheritance, composition, and type hints provide familiar development patterns.
This also means full IDE support with autocomplete, refactoring safety, and type checking.

```python
class ModelConfig(ConfigBase):
    layers: int = 12  # IDE autocomplete and type checking
    learning_rate: float = 1e-4  # Runtime validation through type hints
```

### **2. Reduced Debugging Time** 
Catch configuration errors at startup with type safety and runtime validation, not hours into expensive training runs.

```python
config = ModelConfig(layers="invalid")  # ValidationError immediately!
# No more failed experiments due to config typos
```

### **3. Quick and Flexible Experimentation**
Override any nested parameter through intuitive command-line syntax without file editing. Switch between model architectures, optimizers, and their specific parameters in a single command.

```bash
# Switch architectures and tune their specific parameters
python train.py --model ditconfig --model.n_heads 16 --optimizer sgdconfig --optimizer.momentum 0.9
```

### **4. Zero Boilerplate**
Pure Python classes with no frameworks, no special syntax, and no additional dependencies. If you know Python, you know ZenCFG.

```python
from zencfg import make_config_from_cli
config = make_config_from_cli(MyConfig)  # That's it!
```

It was originally built to configure and manage scripts for Deep Learning experiments, but you can use it for any Python project.
The examples used throughout are Deep Learning inspired.

## Install

Alternatively, clone the repository and install it from source, here in editable mode:

```bash
git clone https://github.com/JeanKossaifi/zencfg
cd zencfg
python -m pip install -e .
```

## Defining configurations

There are two main types of configurations: core configuration categories and subcategories.

### Core configuration categories

Core categories are defined by inheriting **directly** from `ConfigBase`:

```python
# We define a Model core config
class ModelConfig(ConfigBase):
    version: str = "0.1.0"

# Another base class: optimizer configurations
class OptimizerConfig(ConfigBase):
    lr: float = 0.001
```

### Subcategories

Once you have core categories, you can optionally specialize them into subcategories:
for instance, the different types of models you support, optimizers, etc.

To do this, simply inherit from your core category:
```python
from typing import List, Union

class DiT(ModelConfig):
    layers: Union[int, List[int]] = 16

class Unet(ModelConfig):
    conv: str = "DISCO"

# Nested config.
class CompositeModel(ModelConfig):
    submodel: ModelConfig
    num_heads: int = 4

class AdamW(OptimizerConfig):
    weight_decay: float = 0.01
```

### Composing categories
You can have configuration objects as parameters in your config: 
for instance, our main configuration will contain a model and an optimizer.

```python
# Our main config is also a core category, and encapsulates a model and an optimizer
class Config(ConfigBase):
    model: ModelConfig
    opt: OptimizerConfig = OptimizerConfig(_config_name='adamw')
```

Note the `_config_name='adamw'`: it tells ZenCFG that the default should be the `AdamW` subclass. 
In general, you can select a subcategory by passing the (lowercased) class name of the subclass you want 
through `_config_name`. 

The above is equivalent to explicitly creating an `AdamW` optimizer:

```python
class Config(ConfigBase):
    model: ModelConfig
    opt: OptimizerConfig = AdamW(_config_name='adamw')
```
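Selecting a subclass by its lowercased class name suggests a registry keyed on subclass names. Here is a minimal sketch of how such a lookup *could* work; `Base`, `_registry` and `from_name` are hypothetical names for illustration, not zencfg's actual internals:

```python
# Illustrative sketch: a lowercase-class-name registry like the one
# `_config_name` lookup implies. All names here are hypothetical.

class Base:
    _registry = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Register every subclass under its lowercased class name.
        Base._registry[cls.__name__.lower()] = cls

    @classmethod
    def from_name(cls, name, **params):
        # Look up the subclass by name and instantiate it.
        return cls._registry[name.lower()](**params)

class OptimizerConfig(Base):
    def __init__(self, lr=1e-3):
        self.lr = lr

class AdamW(OptimizerConfig):
    def __init__(self, lr=1e-4, weight_decay=0.01):
        super().__init__(lr)
        self.weight_decay = weight_decay

opt = Base.from_name("adamw", weight_decay=0.1)  # an AdamW instance
```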

### Instantiating configurations

Your configurations are Python objects: you can instantiate them directly:

```python
config = Config(model = ModelConfig(_config_name='dit', layers=24))
```

## Instantiating a configuration with optional values from the command line

The library also lets you override parameters from the configuration through the command line, 
using `make_config_from_cli`.

For instance, you can create a script `main.py` containing:
```python
from zencfg import make_config_from_cli

from YOUR_CONFIG_FILE import Config

config = make_config_from_cli(Config, strict=True)
```

You can then call your script from the command line, using `.` to indicate nesting.

For instance, to instantiate the same config as above, you could simply do:
```bash
python main.py --model dit --model.layers 24
```

Or, equivalently but more verbosely (and perhaps more explicitly):
```bash
python main.py --model._config_name dit --model.layers 24
```

You can switch between different config types and override their specific parameters:
```bash
# Switch optimizers with their specific parameters
python main.py --opt adamw --opt.weight_decay 0.001
python main.py --opt sgd --opt.momentum 0.9

# Mix model and optimizer changes
python main.py --model unet --model.conv "new_conv" --opt adamw --opt.weight_decay 0.01
```
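The dotted-flag syntax can be pictured as folding flags into a nested dict of overrides. A plain-Python sketch, assuming arguments come in `--flag value` pairs; `parse_overrides` is illustrative, not zencfg's actual parser:

```python
# Hypothetical sketch of folding dotted CLI flags into nested override
# dicts; not zencfg's actual implementation.

def parse_overrides(argv):
    overrides = {}
    for flag, value in zip(argv[::2], argv[1::2]):
        keys = flag.lstrip("-").split(".")
        node = overrides
        for key in keys[:-1]:
            child = node.setdefault(key, {})
            if not isinstance(child, dict):
                # A bare `--opt adamw` seen earlier becomes the subclass name.
                child = node[key] = {"_config_name": child}
            node = child
        leaf = keys[-1]
        if isinstance(node.get(leaf), dict):
            # A bare flag on an already-nested key sets the subclass name.
            node[leaf]["_config_name"] = value
        else:
            node[leaf] = value
    return overrides

parse_overrides(["--opt", "adamw", "--opt.weight_decay", "0.001"])
# → {'opt': {'_config_name': 'adamw', 'weight_decay': '0.001'}}
```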

## Export your configuration to dictionary

While you can directly use the configuration, you can also get a Python dictionary from a configuration instance, by simply using the `to_dict` method:

```python
config_dict = config.to_dict()

model_cfg = config_dict['model']

# You can access values as attributes too
optimizer_cfg = config_dict.opt
```
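The attribute access shown above suggests `to_dict` returns a dict-like object that also supports attribute lookup. A minimal sketch of such a structure; `AttrDict` is a hypothetical stand-in, not zencfg's actual return type:

```python
# Minimal sketch of a dict that also allows attribute access,
# mirroring the behavior shown above. `AttrDict` is hypothetical.

class AttrDict(dict):
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name) from None

cfg = AttrDict(opt=AttrDict(lr=1e-4), batch_size=32)
assert cfg["batch_size"] == 32   # regular dict access
assert cfg.opt.lr == 1e-4        # attribute access into nested configs
```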

## Examples

For concrete examples, check the [`examples`](examples/) folder.
You can try running the [`test_config`](examples/test_config.py) script.

## Gotchas

Note that parameters whose type is a `ConfigBase` subclass are handled specially. Consider the following scenario:
```python
from typing import Tuple

class ModelConfig(ConfigBase):
    in_channels: int = 3
    out_channels: int = 1

class UNet(ModelConfig):
    layers: int = 10
    kernel: Tuple[int, int] = (3, 3)

class DiT(ModelConfig):
    layers: int = 10
    n_heads: int = 12

class Config(ConfigBase):
    some_param: str = 'whatever'
    model: ModelConfig = DiT(layers=4)
```

Now, if a user wants to override the number of layers to 6 through the command line, they'd write:
```bash
python script.py --model.layers 6
```

We allow this, and it will give you a DiT model with 6 layers. 

This is where the gotcha comes from: if we naively instantiated the declared type with `layers=6`, 
we would be creating a `ModelConfig`, **not** a `DiT` (which would also raise an error, since `ModelConfig` has no `layers` attribute).

To fix this, we treat `ConfigBase` parameters differently: we start from the default value (here, `DiT(layers=4)`). 
If the user passes a new `_config_name` (e.g. `'unet'`), we discard those defaults and build the new subclass from its own defaults plus the user's overrides.

Otherwise, if the user does **not** pass a `_config_name` (i.e. they want the default), we keep 
the same defaults (`DiT(layers=4)`), turn them into a dict, `{'_config_name': 'dit', 'layers': 4}`, and update it 
with the values passed by the user. 

This causes the least surprise in general, but it is worth being aware of.
Back to our example: the user gets a config that matches what they'd expect: 
 `{'_config_name': 'dit', 'layers': 6}`
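The merge rule described above can be sketched in plain Python; `merge_config` is illustrative only, not zencfg's implementation:

```python
# Plain-Python sketch of the override-merge rule for ConfigBase
# parameters described above. `merge_config` is a hypothetical name.

def merge_config(default, overrides):
    new_name = overrides.get("_config_name")
    if new_name is not None and new_name != default.get("_config_name"):
        # The user switched subclasses: drop the old defaults entirely
        # (the new subclass's own class defaults apply instead).
        return dict(overrides)
    # Same subclass: start from the defaults, update with user values.
    merged = dict(default)
    merged.update(overrides)
    return merged

merge_config({"_config_name": "dit", "layers": 4}, {"layers": 6})
# → {'_config_name': 'dit', 'layers': 6}
```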

## Questions or issues
This is very much a project in development: I wrote it for myself and decided to share it so others can easily reuse it across projects, knowing it is tested and actively developed!

If you have any questions or find any bugs, please open an issue, or better yet, a pull-request!

            
