kan-o-nas

Name: kan-o-nas
Version: 0.1.0
Summary: Evolutionary full NAS for fair comparison of CNN and KAN architectures (KAN'o'NAS).
Authors: Vladimir Latypov, Alexander Hvatov
Homepage: https://github.com/ITMO-NSS-team/kan-o-nas
Requires Python: >=3.9
Keywords: nas, evolutionary, kan, cnn, neural-architecture-search
Uploaded: 2025-09-10 01:32:50
Requirements: typing~=3.7.4.3, fedot==0.7.0, thegolem==0.3.1, scikit-learn~=1.2.2, opencv-python, numpy~=1.23.4, pandas~=1.4.3, matplotlib~=3.5.3, psutil~=5.9.3, setuptools~=67.8.0, tabulate~=0.9.0, datasets, torch
# KAN'o'NAS: Neural Architecture Search for fair architecture comparison

KAN'o'NAS is an architecture-agnostic, evolutionary full NAS framework for fair comparison of convolutional networks (CNN) and Kolmogorov–Arnold networks (KAN), including hybrids.
It represents networks as DAGs, jointly optimizes topology and per-node hyperparameters, and selects models on a Pareto frontier of quality vs. complexity.
The framework is based on [GOLEM](https://github.com/aimclub/GOLEM) (evolutionary graph optimization) and uses PyTorch as the training backend.

---

## Install

```bash
git clone https://github.com/ITMO-NSS-team/kan-o-nas.git
cd kan-o-nas
python -m venv .venv
# Linux/Mac
source .venv/bin/activate
# Windows: .venv\Scripts\activate
pip install --upgrade pip
pip install -r requirements.txt
```

Also, download the corresponding datasets: MNIST, FashionMNIST, EuroSAT, or CIFAR-10 for image classification.

## Quick start

The repository provides two runnable examples. They demonstrate how to define a search space, run NAS, and post-train selected finalists.

### Image classification

```bash
python cases/image_classification.py
```


### Spatial time series forecasting

```bash
python cases/ts_forecasting.py
```


Both scripts save NAS artifacts and final metrics to the output directory chosen inside the script.

## Method Overview

- Representation. Candidate models are encoded as DAGs. Nodes are layers (Conv2D, KANConv2D, Linear, KANLinear, Pool, Flatten). Edges define data flow and allow short-cuts (1–2 inputs per node).
- Search. An evolutionary algorithm with subtree crossover and layer/edge mutations explores structures and per-node hyperparameters.
- Objectives. Multi-objective selection by task quality (e.g., accuracy or L1) and complexity (parameters by default; FLOPs or wall time are also supported); a small selection sketch follows this list.
- Validation. Graph-level rules ensure acyclicity, shape consistency, feasible connectivity, and complexity bounds.
- Evaluation. Finalists are retrained to estimate mean performance; the Pareto set is reported.
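
The quality-vs-complexity selection can be illustrated with a small, self-contained sketch. It is illustrative only, not the framework's internal code: with both objectives minimized (e.g., error and parameter count), only non-dominated candidates are kept.

```python
# Illustrative only: a tiny two-objective Pareto filter. Both objectives are
# minimized, e.g. (error, parameter count).
def pareto_front(candidates):
    """Return the non-dominated subset of (error, complexity) pairs."""
    front = []
    for i, (err_i, size_i) in enumerate(candidates):
        dominated = any(
            (err_j <= err_i and size_j <= size_i) and (err_j < err_i or size_j < size_i)
            for j, (err_j, size_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((err_i, size_i))
    return front

# Example: three models; the middle one is dominated by the first.
print(pareto_front([(0.05, 1.2e6), (0.06, 1.5e6), (0.09, 0.4e6)]))
# -> [(0.05, 1200000.0), (0.09, 400000.0)]
```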


## Configure the framework

1) Task definition
- Choose task and shapes: `Task`, `TaskTypesEnum`, `ModelRequirements(input_shape=..., output_shape=... or num_of_classes=...)`.
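
A minimal sketch of this step, assuming FEDOT's `Task`/`TaskTypesEnum` (FEDOT is a pinned dependency); the `ModelRequirements` import path and the concrete shapes are placeholders to adapt.

```python
from fedot.core.repository.tasks import Task, TaskTypesEnum

# Hypothetical import path; use wherever kan-o-nas exposes ModelRequirements.
from nas.composer.requirements import ModelRequirements

task = Task(TaskTypesEnum.classification)

# Shapes are illustrative (e.g. 28x28 grayscale images, 10 classes).
model_requirements = ModelRequirements(
    input_shape=[28, 28, 1],
    num_of_classes=10,
)
```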

2) Search space
- Layer families: `LayersPoolEnum` (e.g., `conv2d`, `kan_conv2d`, `linear`, `kan_linear`).
- Model scaffold: `ModelRequirements` fields `primary`, `secondary`, `min_num_of_conv_layers`, `max_num_of_conv_layers`, `min_nn_depth`, `max_nn_depth`.
- Per-node ranges: `ConvRequirements`, `KANConvRequirements`, `BaseLayerRequirements`, `KANLinearRequirements`.
- Initial graphs: `ConvGraphMaker(requirements=..., rules=...)`, `BaseGraphBuilder().set_builder(...).build(pop_size)`.
- Graph types (when needed): `NasGraph`, `NasNode`.
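
A sketch of assembling the search space; only the class and field names come from the list above, while the import paths, the chosen layer families, and the depth bounds are assumptions.

```python
# Hypothetical import paths; adjust to the actual package layout.
from nas.repository.layer_types_enum import LayersPoolEnum
from nas.composer.requirements import ModelRequirements
from nas.graph.builder.cnn_builder import ConvGraphMaker
from nas.graph.builder.base_graph_builder import BaseGraphBuilder

# Layer families the optimizer may use, plus depth bounds; values are illustrative.
model_requirements = ModelRequirements(
    primary=[LayersPoolEnum.conv2d, LayersPoolEnum.kan_conv2d],
    secondary=[LayersPoolEnum.linear, LayersPoolEnum.kan_linear],
    min_num_of_conv_layers=2,
    max_num_of_conv_layers=6,
    min_nn_depth=1,
    max_nn_depth=3,
)

# Initial population: a constrained graph maker wrapped into a builder
# that produces the requested number of starting graphs.
graph_maker = ConvGraphMaker(requirements=model_requirements, rules=[])
initial_graphs = BaseGraphBuilder().set_builder(graph_maker).build(10)  # pop_size
```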

3) Validation rules
- DAG soundness: `has_no_cycle`, `has_no_self_cycled_nodes`.
- Classification constraints: `model_has_no_conv_layers`, `model_has_several_starts`, `model_has_several_roots`, `model_has_wrong_number_of_flatten_layers`, `no_linear_layers_before_flatten`, `filter_size_changes_monotonically(increases=True)`.
- Forecasting constraints: `only_conv_layers`, `no_transposed_layers_before_conv`, `filter_size_changes_monotonically(increases=False)`, `right_output_size`, `output_node_has_channels(...)`.
- Shape/complexity checks: `model_has_dim_mismatch(...)`, `has_too_much_parameters(...)` (optionally `has_too_much_flops(...)`, `has_too_much_time(...)`).
- Attach via `GraphGenerationParams(..., rules_for_constraint=[...])`.
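
A sketch of collecting the rules; the generic DAG rules come from GOLEM, while the import path for the task-specific rules is an assumption.

```python
from golem.core.dag.verification_rules import has_no_cycle, has_no_self_cycled_nodes

# Hypothetical import path for the task-specific rules named above.
from nas.operations.validation_rules.cnn_val_rules import (
    model_has_no_conv_layers,
    model_has_wrong_number_of_flatten_layers,
    no_linear_layers_before_flatten,
)

validation_rules = [
    has_no_cycle,
    has_no_self_cycled_nodes,
    model_has_no_conv_layers,
    model_has_wrong_number_of_flatten_layers,
    no_linear_layers_before_flatten,
]

# Attached later via GraphGenerationParams(..., rules_for_constraint=validation_rules);
# see step 5.
```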

4) Objectives and metrics
- Quality: for classification use `MetricsRepository().metric_by_id(ClassificationMetricsEnum.accuracy)`; for forecasting train with `L1Loss` and report `L1` and `ssim`.
- Complexity: `compute_total_graph_parameters(...)`, optionally `get_flops_from_graph(...)`, `get_time_from_graph(...)`.
- Provide to composer: `.with_metrics([quality_metric, complexity_metric_fn])`.
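
A sketch of the two objectives for classification, using FEDOT's metrics repository as written above; the import path for the parameter counter is an assumption, and its exact call signature may differ.

```python
from fedot.core.repository.quality_metrics_repository import (
    ClassificationMetricsEnum, MetricsRepository,
)

# Hypothetical import path for the complexity metric named above.
from nas.utils.utils import compute_total_graph_parameters

quality_metric = MetricsRepository().metric_by_id(ClassificationMetricsEnum.accuracy)

# The complexity objective is a callable over a candidate graph; bind any extra
# arguments (e.g. input shape) beforehand if the real signature requires them.
complexity_metric_fn = compute_total_graph_parameters

# Handed to the composer builder in step 6:
# builder.with_metrics([quality_metric, complexity_metric_fn])
```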

5) Search parameters
- Genetic setup: `GPAlgorithmParameters(genetic_scheme_type=GeneticSchemeTypesEnum.steady_state, mutation_types=[MutationTypesEnum.*], crossover_types=[CrossoverTypesEnum.subtree], pop_size=..., max_pop_size=..., regularization_type=RegularizationTypesEnum.none, multi_objective=True)`.
- Custom operators (optional): define `combined_mutation` with `register_native`.
- Graph generation: `DirectAdapter(...)`, `NNNodeFactory(..., DefaultChangeAdvisor())`, `GraphGenerationParams(adapter=..., rules_for_constraint=..., node_factory=...)`.
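
A sketch of the optimizer setup; the GOLEM import paths follow its usual layout, while the kan-o-nas-specific imports (`NasGraph`, `NasNode`, `NNNodeFactory`), the first argument to `NNNodeFactory`, the mutation list, and the population sizes are assumptions.

```python
from golem.core.adapter import DirectAdapter
from golem.core.optimisers.advisor import DefaultChangeAdvisor
from golem.core.optimisers.genetic.gp_params import GPAlgorithmParameters
from golem.core.optimisers.genetic.operators.crossover import CrossoverTypesEnum
from golem.core.optimisers.genetic.operators.inheritance import GeneticSchemeTypesEnum
from golem.core.optimisers.genetic.operators.mutation import MutationTypesEnum
from golem.core.optimisers.genetic.operators.regularization import RegularizationTypesEnum
from golem.core.optimisers.optimizer import GraphGenerationParams

# Hypothetical import paths for the package-side classes.
from nas.graph.base_graph import NasGraph
from nas.graph.node.nas_graph_node import NasNode
from nas.graph.node.node_factory import NNNodeFactory

optimizer_parameters = GPAlgorithmParameters(
    genetic_scheme_type=GeneticSchemeTypesEnum.steady_state,
    mutation_types=[MutationTypesEnum.single_add, MutationTypesEnum.single_drop,
                    MutationTypesEnum.single_edge, MutationTypesEnum.single_change],
    crossover_types=[CrossoverTypesEnum.subtree],
    pop_size=10,
    max_pop_size=20,
    regularization_type=RegularizationTypesEnum.none,
    multi_objective=True,
)

graph_generation_parameters = GraphGenerationParams(
    adapter=DirectAdapter(base_graph_class=NasGraph, base_node_class=NasNode),
    rules_for_constraint=validation_rules,                               # from step 3
    node_factory=NNNodeFactory(model_requirements, DefaultChangeAdvisor()),
)
```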

6) Training setup
- Trainer: `ModelConstructor(model_class=NASTorchModel, trainer=NeuralSearchModel, device=..., loss_function=..., optimizer=AdamW, metrics=...)`.
    - Classification losses: `CrossEntropyLoss` or `FocalLoss`.
    - Forecasting loss: `L1Loss`.
- Composer pipeline:  
  `ComposerBuilder(task).with_composer(NNComposer).with_optimizer(NNGraphOptimiser).with_requirements(NNComposerRequirements(...)).with_metrics([...]).with_optimizer_params(GPAlgorithmParameters(...)).with_initial_pipelines(initial_pipelines).with_graph_generation_param(GraphGenerationParams(...))` → `composer = builder.build()` → `composer.set_trainer(model_trainer)` → `composer.compose_pipeline(train_data, valid_or_test_data)`.
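
The same composer pipeline written out over multiple lines; the torch pieces (`AdamW`, `CrossEntropyLoss`) and FEDOT's `ComposerBuilder` are standard, while the nas-side import paths, the `NNComposerRequirements` keyword, the device string, and the placeholder values are assumptions.

```python
from fedot.core.composer.composer_builder import ComposerBuilder
from torch.nn import CrossEntropyLoss
from torch.optim import AdamW

# Hypothetical import paths for the package-side classes named above.
from nas.composer.nn_composer import NNComposer
from nas.composer.requirements import NNComposerRequirements
from nas.model.constructor import ModelConstructor
from nas.model.model_interface import NeuralSearchModel
from nas.model.pytorch.base_model import NASTorchModel
from nas.optimizer.nn_graph_optimiser import NNGraphOptimiser

model_trainer = ModelConstructor(
    model_class=NASTorchModel,
    trainer=NeuralSearchModel,
    device='cuda',                          # or 'cpu'
    loss_function=CrossEntropyLoss(),       # L1Loss() for forecasting
    optimizer=AdamW,
    metrics=[quality_metric],               # task-specific metrics
)

builder = (
    ComposerBuilder(task)
    .with_composer(NNComposer)
    .with_optimizer(NNGraphOptimiser)
    .with_requirements(NNComposerRequirements(model_requirements=model_requirements))  # kwarg name is a guess
    .with_metrics([quality_metric, complexity_metric_fn])
    .with_optimizer_params(optimizer_parameters)
    .with_initial_pipelines(initial_graphs)
    .with_graph_generation_param(graph_generation_parameters)
)

composer = builder.build()
composer.set_trainer(model_trainer)
optimized_network = composer.compose_pipeline(train_data, test_data)
```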

7) Outputs
- Persist and reuse: `composer.save(path)`, access `composer.history.final_choices`, restore with `DirectAdapter.restore(...)`, reload runs via `OptHistory.load(path)`.
- Summaries: write metrics to JSON (e.g., `final_results.json`).
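
A sketch of persisting and reloading results; `composer.save`, `history.final_choices`, and `OptHistory.load` are named above, while the paths, file names, and metric values are illustrative.

```python
import json
from pathlib import Path

from golem.core.optimisers.opt_history_objects.opt_history import OptHistory

output_dir = Path('outputs/run_1')              # illustrative path
composer.save(str(output_dir))                  # NAS artifacts and history

# Pareto finalists chosen during the search.
finalists = composer.history.final_choices

# Reload a previous run for analysis or post-training only.
history = OptHistory.load(str(output_dir / 'history.json'))   # file name is an assumption

# Metrics summary, e.g. collected after retraining the finalists.
final_results = {'accuracy': 0.97, 'parameters': 1_200_000}   # illustrative values
with open(output_dir / 'final_results.json', 'w') as f:
    json.dump(final_results, f, indent=2)
```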

## Outputs

- NAS history for reuse or for post-training-only runs.
- Finalist graphs and trained weights if enabled.
- Metrics summary per finalist (e.g., accuracy for classification; L1 and SSIM for forecasting).
- Optional qualitative images for forecasting.

## Roadmap

- Richer KAN variants and kernel function libraries.
- Larger and more diverse datasets (e.g., ImageNet).
- Experimentation with the optimizer, including surrogate models and indirect encodings for search efficiency on large models.

## Citation

TBD

## License

The code is published under the MIT License.