# [CHANfiG](https://chanfig.danling.org)

[![Codacy Badge](https://app.codacy.com/project/badge/Grade/75f2ee4ba5a64458afb488615e36adcb)](https://app.codacy.com/gh/ZhiyuanChen/CHANfiG/dashboard?utm_source=gh&utm_medium=referral&utm_content=&utm_campaign=Badge_grade)
[![Codacy Badge](https://app.codacy.com/project/badge/Coverage/75f2ee4ba5a64458afb488615e36adcb)](https://app.codacy.com/gh/ZhiyuanChen/CHANfiG/dashboard?utm_source=gh&utm_medium=referral&utm_content=&utm_campaign=Badge_grade)
[![CodeCov](https://codecov.io/gh/ZhiyuanChen/CHANfiG/graph/badge.svg?token=G9WGWCOFQE)](https://codecov.io/gh/ZhiyuanChen/CHANfiG)

[![PyPI - Version](https://img.shields.io/pypi/v/chanfig)](https://pypi.org/project/chanfig/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/chanfig)](https://pypi.org/project/chanfig/)
[![Downloads](https://static.pepy.tech/badge/chanfig/month)](https://chanfig.danling.org)

[![License: Unlicense](https://img.shields.io/badge/license-Unlicense-blue.svg)](http://unlicense.org/)
[![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](https://www.gnu.org/licenses/agpl-3.0)

## Introduction

CHANfiG aims to make your configuration easier.

There are tons of configurable parameters in training a Machine Learning model.
To configure all these parameters, researchers usually need to write gigantic config files, sometimes even thousands of lines.
Most of the configs are just replicates of the default arguments of certain functions, resulting in many unnecessary declarations.
It is also very hard to alter the configurations.
One needs to navigate and open the right configuration file, make changes, save and exit.
This wastes an uncountable[^uncountable] amount of precious time ~~and is no doubt a crime~~.
Using `argparse` relieves the burden to some extent.
However, it takes a lot of work to make it compatible with existing config files, and its lack of nesting support limits its potential.

CHANfiG would like to make a change.

You just type the alterations in the command line and leave everything else to CHANfiG.

CHANfiG is highly inspired by [YACS](https://github.com/rbgirshick/yacs).
Unlike the paradigm of YACS
(`your code + a YACS config for experiment E (+ external dependencies + hardware + other nuisance terms ...) = reproducible experiment E`),
the paradigm of CHANfiG is:

`your code + command line arguments (+ optional CHANfiG config + external dependencies + hardware + other nuisance terms ...) = reproducible experiment E (+ optional CHANfiG config for experiment E)`

## Components

A Config is basically a nested dict structure.

However, the default Python dict is hard to manipulate.

The only way to access a dict member is through `dict['name']`, which is unnecessarily verbose.
Even worse, if the dict is nested like a config, member access could be something like `dict['parent']['children']['name']`.

Enough is enough, it is time to make a change.

We need attribute-style access, and we need it now.
`dict.name` and `dict.parent.children.name` are all you need.

Other works have implemented similar attribute-style access to dict members.
However, their Config objects either use a separate dict to store values set via attribute-style access (EasyDict), which may lead to inconsistency between attribute-style and dict-style access,
or reuse the existing `__dict__` and redirect dict-style access to it (ml_collections), which may result in conflicts between attributes and members of the Config.

To overcome the aforementioned limitations, we inherit the Python built-in `dict` to create `FlatDict`, `DefaultDict`, `NestedDict`, `Config`, and `Registry`.
We also introduce `Variable` to allow sharing a value across multiple places, and `ConfigParser` to parse command line arguments.

### FlatDict

`FlatDict` improves the default `dict` in three aspects.

#### Dict Operations

`FlatDict` incorporates a `merge` method that allows you to merge a `Mapping`, an `Iterable`, or a path to the `FlatDict`.
Unlike the built-in `update`, `merge` assigns values instead of replacing them, which makes it work better with `DefaultDict`.

`dict` in Python has preserved insertion order since Python 3.7, but there is no built-in method to sort a `dict`. `FlatDict` supports `sort` to help you manage your dict.

Moreover, `FlatDict` comes with `difference` and `intersect`, which make it easy to compare a `FlatDict` with another `Mapping`, an `Iterable`, or a path.
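
Below is a minimal sketch of these operations; exact return values (for instance, whether `sort` returns a new dict or sorts in place) may vary between versions.

```python
from chanfig import FlatDict

d = FlatDict(b=2, a=1)
d.merge({'c': 3})            # assigns 'c' without replacing the whole dict
d.sort()                     # order the keys; see the docs for in-place vs copy semantics
other = {'a': 1, 'c': 4}
print(d.difference(other))   # entries that differ from `other`
print(d.intersect(other))    # entries shared with `other`
```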

#### ML Operations

`FlatDict` supports a `to` method similar to that of PyTorch tensors.
You can convert all member values of a `FlatDict` to a certain type or move them to a device in the same way.

`FlatDict` also integrates `cpu`, `gpu` (`cuda`), and `tpu` (`xla`) methods for easier access.
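
A hedged sketch (assuming PyTorch is installed; whether these methods return a new dict or modify in place may depend on the version):

```python
import torch
from chanfig import FlatDict

d = FlatDict(weight=torch.randn(2, 2), bias=torch.zeros(2))
d = d.to(torch.float16)      # convert all tensor values to half precision
d = d.cpu()                  # move all tensor values to the CPU
if torch.cuda.is_available():
    d = d.gpu()              # or move them to a GPU (alias of `cuda`)
```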

#### IO Operations

`FlatDict` provides `json`, `jsons`, `yaml` and `yamls` methods to dump `FlatDict` to a file or string.
It also provides `from_json`, `from_jsons`, `from_yaml` and `from_yamls` methods to build a `FlatDict` from a string or file.

`FlatDict` also includes `dump` and `load` methods, which determine the file type by its extension and dump/load a `FlatDict` to/from a file.
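
For example, a quick round trip through files and strings might look like this (method names follow the list above):

```python
from chanfig import FlatDict

d = FlatDict(name='CHANfiG', seed=1013)
d.yaml('config.yaml')                     # dump to a YAML file
d.json('config.json')                     # dump to a JSON file
print(d.yamls())                          # dump to a YAML string
d2 = FlatDict.from_yaml('config.yaml')    # rebuild from a YAML file
d3 = FlatDict.load('config.json')         # file type inferred from the extension
```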

### DefaultDict

To facilitate the need for default values, we incorporate `DefaultDict`, which accepts a `default_factory` and works just like `collections.defaultdict`.
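
A minimal sketch, assuming the factory is passed positionally as with `collections.defaultdict`:

```python
from chanfig import DefaultDict

d = DefaultDict(list)          # missing keys get a fresh list
d['layers'].append('linear')
print(d['layers'])             # ['linear']
```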

### NestedDict

Since most Configs are in a nested structure, we further propose a `NestedDict`.

Based on `DefaultDict`, `NestedDict` provides `all_keys`, `all_values`, and `all_items` methods to allow iterating over the whole nested structure at once.

`NestedDict` also comes with `apply` and `apply_` methods, which make it easier to manipulate the nested structure.
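
For instance, dotted keys and the `all_*` iterators might be used as follows (a sketch; `apply` and `apply_` traverse the structure in a similar fashion):

```python
from chanfig import NestedDict

d = NestedDict()
d['model.encoder.num_layers'] = 6    # dotted keys create intermediate levels
d['model.dropout'] = 0.1
print(list(d.all_keys()))            # ['model.encoder.num_layers', 'model.dropout']
for key, value in d.all_items():     # iterate the whole nested structure at once
    print(key, value)
```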

### Config

`Config` extends the functionality by supporting `freeze` and `defrost`, and by adding a built-in `ConfigParser` to parse command line arguments.

Note that `Config` also has `default_factory=Config()` by default for convenience.
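
A small sketch of freezing and defrosting (assuming that modifying a frozen `Config` raises an error):

```python
from chanfig import Config

config = Config()
config.model.dropout = 0.1       # intermediate levels are created on the fly
config.freeze()
try:
    config.model.dropout = 0.2   # should fail while frozen
except Exception as err:
    print(err)
config.defrost()
config.model.dropout = 0.2       # allowed again after defrost
```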

### Registry

`Registry` extends the `NestedDict` and supports `register`, `lookup`, and `build` to help you register constructors and build objects from a `Config`.
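
A hypothetical sketch of registering and building (the exact `register`/`build` signatures, e.g. how the constructor name is resolved, may differ; check the documentation):

```python
from chanfig import Config, Registry

MODELS = Registry()

@MODELS.register
class MLP:
    def __init__(self, hidden_size=64):
        self.hidden_size = hidden_size

config = Config(name='MLP', hidden_size=128)
model = MODELS.build(config)   # assumed: look up 'MLP' and call it with the remaining entries
print(type(model).__name__, model.hidden_size)
```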

### Variable

Have one value for multiple names at multiple places? We got you covered.

Just wrap the value with `Variable`, and one alteration will be reflected everywhere.

`Variable` also supports `type`, `choices`, `validator`, and `required` to ensure the correctness of the value.

To make it even easier, `Variable` also supports `help` to provide a description when using `ConfigParser`.
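
A sketch of sharing one value across several places (the mutation API is assumed here to be a settable `value` attribute; check the documentation for the exact interface):

```python
from chanfig import Config, Variable

dropout = Variable(0.1)
config = Config()
config.model.encoder.dropout = dropout
config.model.decoder.dropout = dropout
dropout.value = 0.2   # one alteration is reflected everywhere (assumed API)
print(config.model.encoder.dropout, config.model.decoder.dropout)
```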

### ConfigParser

`ConfigParser` extends `ArgumentParser` and provides `parse` and `parse_config` to parse command line arguments.

## Usage

CHANfiG is highly backward compatible with your existing configs.

Whether your old config is in JSON or YAML, you can read it directly.

And if you are using YACS, just replace `CfgNode` with `Config` and enjoy all the additional benefits that CHANfiG provides.

Moreover, if a name in the config is too long for the command line, you can simply call `self.add_argument` with a proper `dest` to use a shorter name on the command line, just as you do with `argparse`.

```python
from chanfig import Config, Variable
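# Encoder, Dropout, Activation, Optimizer, Scheduler, Dataset, and Dataloader are
# placeholders for components defined elsewhere in your own codebase.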


class Model:
    def __init__(self, encoder, dropout=0.1, activation='ReLU'):
        self.encoder = Encoder(**encoder)
        self.dropout = Dropout(dropout)
        self.activation = getattr(Activation, activation)

def main(config):
    model = Model(**config.model)
    optimizer = Optimizer(**config.optimizer)
    scheduler = Scheduler(**config.scheduler)
    dataset = Dataset(**config.dataset)
    dataloader = Dataloader(**config.dataloader)


class TestConfig(Config):
    def __init__(self):
        super().__init__()
        dropout = Variable(0.1)
        self.name = "CHANfiG"
        self.seed = 1013
        self.data.batch_size = 64
        self.model.encoder.num_layers = 6
        self.model.decoder.num_layers = 6
        self.model.dropout = dropout
        self.model.encoder.dropout = dropout
        self.model.decoder.dropout = dropout
        self.activation = "GELU"
        self.optim.lr = 1e-3
        self.add_argument("--batch_size", dest="data.batch_size")
        self.add_argument("--lr", dest="optim.lr")

    def post(self):
        self.id = f"{self.name}_{self.seed}"


if __name__ == '__main__':
    # config = Config.load('config.yaml')  # in case you want to read from a yaml
    # config = Config.load('config.json')  # in case you want to read from a json
    # existing_configs = {'data.batch_size': 64, 'model.encoder.num_layers': 8}
    # config = Config(**existing_configs)  # in case you have some config in dict to load
    config = TestConfig()
    config = config.parse()
    # config.merge('dataset.yaml')  # in case you want to merge a yaml
    # config.merge('dataset.json')  # in case you want to merge a json
    # note that the value of merge will override current values
    config.model.decoder.num_layers = 8
    config.freeze()
    print(config)
    # main(config)
    # config.yaml('config.yaml')  # in case you want to save a yaml
    # config.json('config.json')  # in case you want to save a json
```

All you need to do is run one line:

```shell
python main.py --model.encoder.num_layers 8 --model.dropout=0.2 --lr 5e-3
```

You can also load a default config file and make changes based on it.

Note that you must specify `config.parse(default_config='config')` to correctly load the default config:

```shell
python main.py --config meow.yaml --model.encoder.num_layers 8 --model.dropout=0.2 --lr 5e-3
```

If you dump the current configuration, the written file should look like this:

```yaml
activation: GELU
data:
  batch_size: 64
id: CHANfiG_1013
model:
  decoder:
    dropout: 0.1
    num_layers: 6
  dropout: 0.1
  encoder:
    dropout: 0.1
    num_layers: 6
name: CHANfiG
optim:
  lr: 0.005
seed: 1013
```
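
Or, equivalently, in JSON: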

```json
{
  "name": "CHANfiG",
  "seed": 1013,
  "data": {
    "batch_size": 64
  },
  "model": {
    "encoder": {
      "num_layers": 6,
      "dropout": 0.1
    },
    "decoder": {
      "num_layers": 6,
      "dropout": 0.1
    },
    "dropout": 0.1
  },
  "activation": "GELU",
  "optim": {
    "lr": 0.005
  },
  "id": "CHANfiG_1013"
}
```

Define the default arguments in the function, put the alterations in the CLI, and leave the rest to CHANfiG.

## Installation

Install the most recent stable version from PyPI:

```shell
pip install chanfig
```

Install the latest version from source:

```shell
pip install git+https://github.com/ZhiyuanChen/CHANfiG
```

It works the way it should have worked.

## License

CHANfiG is multi-licensed under the following licenses:

- The Unlicense
- GNU Affero General Public License v3.0 or later
- GNU General Public License v2.0 or later
- BSD 4-Clause "Original" or "Old" License
- MIT License
- Apache License 2.0

You can choose any (one or more) of these licenses if you use this work.

`SPDX-License-Identifier: Unlicense OR AGPL-3.0-or-later OR GPL-2.0-or-later OR BSD-4-Clause OR MIT OR Apache-2.0`

[^uncountable]: fun fact: time is always uncountable.