wf2wf

- **Name**: wf2wf
- **Version**: 1.1.0
- **Summary**: Universal workflow-format converter built around a loss-preserving intermediate representation
- **Upload time**: 2025-07-15 19:08:23
- **Requires Python**: >=3.9
- **License**: MIT
- **Keywords**: workflow, cwl, snakemake, htcondor, dagman, nextflow, wdl, bioinformatics
# wf2wf – Universal Workflow-Format Converter

[![Python](https://img.shields.io/badge/python-3.9%2B-blue.svg)](https://www.python.org/downloads/)
[![CI](https://github.com/csmcal/wf2wf/actions/workflows/ci.yml/badge.svg)](https://github.com/csmcal/wf2wf/actions/workflows/ci.yml)
[![Docs](https://github.com/csmcal/wf2wf/actions/workflows/docs.yml/badge.svg)](https://csmcal.github.io/wf2wf/)
[![PyPI](https://img.shields.io/pypi/v/wf2wf.svg)](https://pypi.org/project/wf2wf)
[![codecov](https://codecov.io/gh/csmcal/wf2wf/branch/main/graph/badge.svg)](https://codecov.io/gh/csmcal/wf2wf)
[![License](https://img.shields.io/badge/license-MIT-green.svg)](LICENSE)
[![Workflow Schema](https://img.shields.io/badge/schema-wf.json-blue)](wf2wf/schemas/v0.1/wf.json)
[![Loss Schema](https://img.shields.io/badge/schema-loss.json-blue)](wf2wf/schemas/v0.1/loss.json)

`wf2wf` is a **format-agnostic converter**: any supported engine → **Intermediate Representation (IR)** → any other engine. The core library handles:

• [Snakemake](https://snakemake.readthedocs.io/) • [Nextflow](https://www.nextflow.io/) • [CWL](https://www.commonwl.org/) • [HTCondor](https://htcondor.org/)/[DAGMan](https://htcondor.readthedocs.io/en/latest/automated-workflows/dagman-introduction.html) • [WDL](https://openwdl.org/) • [Galaxy](https://galaxyproject.org/) • bundled [BioCompute Objects](https://www.biocomputeobject.org/)

```mermaid
graph TD;
  A[Snakemake] -->|import| IR((IR));
  B[DAGMan]   --> IR;
  C[CWL]      --> IR;
  D[Nextflow] --> IR;
  E[WDL]      --> IR;
  F[Galaxy]   --> IR;
  IR -->|export| GA2[Galaxy];
  IR --> SMK2[Snakemake];
  IR --> CWL2[CWL];
  IR --> DAG2[DAGMan];
  IR --> NF2[Nextflow];
  IR --> WDL2[WDL];
```

`wf2wf` is called from the command line as:
```bash
# Convert Snakemake → DAGMan and auto-generate Markdown report
wf2wf convert -i pipeline.smk -o pipeline.dag --auto-env build --interactive --report-md
```

---
## 📋 Table of Contents

- [Features](#-features)
- [Installation](#-installation)
- [Quick CLI Tour](#-quick-cli-tour)
- [Commands](#-commands-overview)
- [Examples](#-examples)
- [Workflow Conversion Differences](#-workflow-conversion-differences)
- [Contributing](#-contributing)
- [Support](#-support)
- [License](#-license)
- [Acknowledgments](#-acknowledgments)

---

## ✨ Features

- **🔄 Universal Conversion** – Any supported engine → IR → any other engine with a single command.
- **🧬 Loss-Mapping & Round-Trip Fidelity** – Structured loss reports (`*.loss.json`) and automatic reinjection guarantee nothing disappears silently.
- **🐳 Automated Environment Builds** – Optional Conda-to-OCI pipeline (micromamba → conda-pack → buildx/buildah) with digest-pinned image references, SBOMs and Apptainer conversion.
- **⚖ Regulatory & Provenance Support** – BioCompute Object generation, schema validation and side-car provenance for FDA submissions.
- **🧪 Aiming for Quality** – High test coverage, semantic versioning, and graceful degradation when optional external tools are missing.
- **🔧 Smart Configuration Analysis** – Automatic detection and warnings for missing resource requirements, containers, error handling, and file transfer modes when converting between shared filesystem and distributed computing workflows.
- **💬 Interactive Mode** – Guided prompts to help users address configuration gaps and optimize workflows for target execution environments.

### Information-loss workflow

`wf2wf` records every field the target engine cannot express:

```console
⚠ Conversion losses: 2 (lost), 1 (lost again), 7 (reapplied)
```

* `lost` – field dropped in this conversion
* `lost again` – it was already lost by a previous exporter
* `reapplied` – successfully restored from a side-car when converting back to a richer format

Use `--fail-on-loss` to abort if any *lost/lost again* entries remain.
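
Conceptually, a loss side-car is a small JSON document validated against the `loss.json` schema. The fragment below is an illustrative sketch only; the authoritative field names live in `wf2wf/schemas/v0.1/loss.json` and may differ:

```json
{
  "entries": [
    {
      "field": "retry",
      "task": "align_reads",
      "status": "lost",
      "reason": "target engine has no per-task retry setting"
    }
  ]
}
```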

### Configuration Analysis

When converting between different workflow execution environments, `wf2wf` automatically detects potential issues:

```console
## Configuration Analysis

### Potential Issues for Distributed Computing

* **Memory**: 2 tasks without explicit memory requirements
* **Containers**: 3 tasks without container/conda specifications
* **Error Handling**: 3 tasks without retry specifications
* **File Transfer**: 6 files with auto-detected transfer modes

**Recommendations:**
* Add explicit resource requirements for all tasks
* Specify container images or conda environments for environment isolation
* Configure retry policies for fault tolerance
* Review file transfer modes for distributed execution
```

Use `--interactive` to get guided prompts for addressing these issues automatically.

---

## 📦 Installation

```bash
# PyPI (recommended)
pip install wf2wf

# or conda-forge (once feedstock is merged)
conda install -c conda-forge wf2wf
```

Development install:

```bash
git clone https://github.com/csmcal/wf2wf.git && cd wf2wf
pip install -e .[dev]
pre-commit install
pytest -q
```

---

## 🚀 Quick CLI Tour

```bash
# Convert Snakemake → DAGMan and build digest-pinned images
wf2wf convert -i Snakefile -o pipeline.dag --auto-env build --push-registry ghcr.io/myorg --report-md --interactive

# Convert CWL → Nextflow, abort on any information loss
wf2wf convert -i analysis.cwl -o main.nf --out-format nextflow --fail-on-loss

# Validate a workflow and its loss side-car
wf2wf validate pipeline.dag
```

Interactive prompts (`--interactive`) use `y/n/always/quit`; loss prompts appear only for *warn/error* severities.
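
The prompt semantics can be pictured as follows (an illustrative sketch, not wf2wf's actual implementation): `y`/`n` answer one question, `a` (always) accepts everything remaining, and `q` aborts the conversion.

```python
# Illustrative sketch of y/n/always/quit prompt semantics.
# Question names and this function are hypothetical, not part of wf2wf's API.
def resolve_prompts(questions, answers):
    """Walk through prompts, applying one answer per question.

    'a' (always) accepts the current and all remaining questions;
    'q' (quit) aborts immediately.
    """
    decisions = {}
    always = False
    answer_iter = iter(answers)
    for question in questions:
        if always:
            decisions[question] = True
            continue
        ans = next(answer_iter)
        if ans == "y":
            decisions[question] = True
        elif ans == "n":
            decisions[question] = False
        elif ans == "a":
            decisions[question] = True
            always = True  # accept everything from here on
        elif ans == "q":
            raise SystemExit("conversion aborted by user")
    return decisions
```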

---

## 🛠 Commands Overview

| Command | Purpose |
|---------|---------|
| `wf2wf convert`  | Convert workflows between formats (all conversions go via the IR) |
| `wf2wf validate` | Validate a workflow file or a `.loss.json` side-car |
| `wf2wf info`     | Pretty-print summary statistics of a workflow |
| `wf2wf bco sign` | Sign a BioCompute Object and generate provenance attestation |
| `wf2wf bco package` | Bundle a BCO and its artefacts (e.g. eSTAR ZIP) |

Each command accepts `--help` for full usage details.

### Auto-detection matrix

| Extension | Format |
|-----------|--------|
| `.cwl`          | CWL |
| `.dag`          | DAGMan |
| `.ga`           | Galaxy |
| `.json`         | IR (JSON) |
| `.nf`           | Nextflow |
| `.smk`          | Snakemake |
| `.wdl`          | WDL |
| `.yaml`, `.yml` | IR (YAML) |
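
The matrix above amounts to a simple extension lookup. A minimal sketch (illustrative only; wf2wf's real detection logic is internal and may be more involved):

```python
from pathlib import Path

# Illustrative extension → format lookup mirroring the auto-detection matrix.
EXT_TO_FORMAT = {
    ".cwl": "cwl",
    ".dag": "dagman",
    ".ga": "galaxy",
    ".json": "ir-json",
    ".nf": "nextflow",
    ".smk": "snakemake",
    ".wdl": "wdl",
    ".yaml": "ir-yaml",
    ".yml": "ir-yaml",
}

def detect_format(path: str) -> str:
    """Guess a workflow format from the file extension (hypothetical helper)."""
    ext = Path(path).suffix.lower()
    try:
        return EXT_TO_FORMAT[ext]
    except KeyError:
        raise ValueError(
            f"cannot auto-detect format for {path!r}; pass --out-format explicitly"
        )
```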

---

## 🔬 Examples

### Example 1 – Snakemake → DAGMan with automatic environment build

```bash
wf2wf convert -i Snakefile \
              -o pipeline.dag \
              --out-format dagman \
              --auto-env build --push-registry ghcr.io/myorg
```

### Example 2 – CWL → Nextflow round-trip

```bash
# CWL → IR → Nextflow
wf2wf convert -i workflow.cwl -o main.nf --out-format nextflow

# … do some edits …
# Nextflow → IR → CWL (should restore metadata)
wf2wf convert -i main.nf -o roundtrip.cwl --out-format cwl
```

### Example 3 – WDL → CWL with loss checking

```bash
wf2wf convert -i assembly.wdl -o assembly.cwl --out-format cwl --fail-on-loss
```

More recipes live in the `examples/` directory.

---

## 🔄 Workflow Conversion Differences

When converting between different workflow execution environments, several key differences need to be addressed:

### Shared Filesystem vs Distributed Computing

**Shared Filesystem Workflows** (Snakemake, CWL, Nextflow):
- Assume all files are accessible on a shared filesystem
- Often have minimal resource specifications
- Rely on system-wide software or conda environments
- Basic error handling and retry mechanisms

**Distributed Computing Workflows** (HTCondor/DAGMan):
- Require explicit file transfer specifications
- Need explicit resource allocation (CPU, memory, disk)
- Require container specifications for environment isolation
- Benefit from sophisticated retry policies and error handling

### Key Conversion Challenges

| Challenge | Shared → Distributed | Distributed → Shared |
|-----------|---------------------|---------------------|
| **File Transfer** | Add `transfer_input_files`/`transfer_output_files` | Remove transfer specifications |
| **Resources** | Add `request_cpus`, `request_memory`, `request_disk` | Convert to engine-specific resource formats |
| **Containers** | Specify Docker/Singularity images | Map to conda environments or system packages |
| **Error Handling** | Add retry policies and error strategies | Convert to engine-specific error handling |
| **Scatter/Gather** | Expand to explicit job definitions | Map to wildcards or engine-specific parallelization |
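
For the shared → distributed direction, the table's attributes map onto standard HTCondor submit-description keywords. A minimal sketch (file names, image reference, and values are placeholders, not output of wf2wf):

```
# job.sub – illustrative HTCondor submit description for one task
executable              = run_task.sh
request_cpus            = 2
request_memory          = 4GB
request_disk            = 10GB
container_image         = docker://ghcr.io/myorg/align:latest
transfer_input_files    = reads.fastq, reference.fa
transfer_output_files   = aligned.bam
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
```

Retry policies live in the DAGMan file rather than the submit description, e.g. `RETRY align 3` to re-run the `align` node up to three times.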

### Interactive Configuration Assistance

Use `--interactive` mode to get guided assistance:

```bash
# Interactive conversion with configuration prompts
wf2wf convert -i Snakefile -o workflow.dag --interactive

# Example prompts you'll see:
# Found 3 tasks without explicit resource requirements. 
# Distributed systems require explicit resource allocation. 
# Add default resource specifications? (y)es/(n)o/(a)lways/(q)uit: y
```

### Automatic Configuration Analysis

The conversion report embeds the same detailed analysis shown under [Configuration Analysis](#configuration-analysis) above, rendered as Markdown.

See [File Transfer Handling](docs/user_guide/file_transfers.md) for detailed information about file transfer modes and best practices.

---
## 🤝 Contributing

1. Fork the repository and create a feature branch (`git checkout -b feature/amazing-feature`)
2. Add tests for new functionality
3. Ensure all tests pass (`pytest -q`)
4. Open a Pull Request – GitHub Actions will run the test matrix automatically

Please read `CONTRIBUTING.md` for the full guidelines.

### 🧪 Testing

Run the comprehensive test suite:
```bash
# Run all tests
python -m pytest tests/ -v

# Run specific test categories
python -m pytest tests/test_conversions.py::TestConversions::test_linear_workflow_conversion -v

# Run with coverage
python -m pytest tests/ --cov=wf2wf --cov-report=html
```
---

## 📞 Support

- 📖 **Documentation** – The `docs/` folder contains a growing knowledge-base, rendered on ReadTheDocs.
- 🐛 **Issues** – Found a bug or missing feature? [Open an issue](../../issues).
- 💬 **Discussions** – General questions and ideas live in [GitHub Discussions](../../discussions).

---

## 📄 License

`wf2wf` is licensed under the MIT License – see the [LICENSE](LICENSE) file for details.

---

## 🙏 Acknowledgments

- [CHTC](https://chtc.cs.wisc.edu/) - The Center for High Throughput Computing for testing and feedback
- [OpenAI](https://openai.com/)- and [Anthropic](https://www.anthropic.com/)-powered coding assistants, whose suggestions accelerated feature implementation
- [Cursor](https://cursor.sh/) - Interactive IDE used for pair-programming and AI-assisted development

---

*Bridging workflow ecosystems.*

            
