aind-codeocean-pipeline-monitor

Name: aind-codeocean-pipeline-monitor
Version: 0.3.0
Summary: Package to define and run a Code Ocean Pipeline Monitor Job
Author: Allen Institute for Neural Dynamics
License: MIT
Requires-Python: >=3.9
Upload time: 2024-10-17 00:31:03
# aind-codeocean-pipeline-monitor

[![License](https://img.shields.io/badge/license-MIT-brightgreen)](LICENSE)
![Code Style](https://img.shields.io/badge/code%20style-black-black)
[![semantic-release: angular](https://img.shields.io/badge/semantic--release-angular-e10079?logo=semantic-release)](https://github.com/semantic-release/semantic-release)
![Interrogate](https://img.shields.io/badge/interrogate-100.0%25-brightgreen)
![Coverage](https://img.shields.io/badge/coverage-100%25-brightgreen?logo=codecov)
![Python](https://img.shields.io/badge/python->=3.9-blue?logo=python)

Package for starting a pipeline, waiting for it to finish, and optionally capturing the results as a data asset.

## Usage
- Define job settings with the `PipelineMonitorSettings` class.
- Create a `CodeOcean` client.
- Construct a `PipelineMonitorJob` with the settings and the client.
- Run the job with the `run_job` method.

```python
import os

from aind_codeocean_pipeline_monitor.job import PipelineMonitorJob
from aind_codeocean_pipeline_monitor.models import (
    CaptureSettings,
    PipelineMonitorSettings,
)
from codeocean import CodeOcean
from codeocean.computation import (
    DataAssetsRunParam,
    RunParams,
)

domain = os.getenv("CODEOCEAN_DOMAIN")
token = os.getenv("CODEOCEAN_TOKEN")

codeocean = CodeOcean(domain=domain, token=token)

# Please consult the Code Ocean docs for info about RunParams and DataAssetsRunParam
settings = PipelineMonitorSettings(
    run_params=RunParams(
        capsule_id="<your capsule id>",
        data_assets=[
            DataAssetsRunParam(
                id="<your input data asset id>",
                mount="<your input data mount>",
            )
        ],
    ),
    capture_settings=CaptureSettings(
        tags=["derived"]
    ),  # 'tags' is the only required field
)

job = PipelineMonitorJob(settings=settings, client=codeocean)
job.run_job()
```
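The example reads the Code Ocean domain and API token from the environment. One way to provide them before running the script (shell commands for illustration; the domain value and script file name are placeholders):

```bash
# Placeholders only: substitute your deployment's domain, a real API token,
# and the name of the file you saved the example as.
export CODEOCEAN_DOMAIN="https://<your-org>.codeocean.com"
export CODEOCEAN_TOKEN="<your API token>"
python run_pipeline_monitor.py
```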


## Installation
To use the software, run the following from the root directory:
```bash
pip install -e .
```
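The package is also published to PyPI, so it can likely be installed directly without cloning the repository:
```bash
pip install aind-codeocean-pipeline-monitor
```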

To develop the code, run
```bash
pip install -e ".[dev]"
```

## Contributing

### Linters and testing

Several libraries are used to run linters, check documentation, and run tests.

- Please test your changes using the **coverage** library, which runs the tests and generates a coverage report:

```bash
coverage run -m unittest discover && coverage report
```

- Use **interrogate** to check that modules, methods, etc. have been documented thoroughly:

```bash
interrogate .
```

- Use **flake8** to check that code is up to standards (no unused imports, etc.):
```bash
flake8 .
```

- Use **black** to automatically format the code to PEP 8 standards:
```bash
black .
```

- Use **isort** to automatically sort import statements:
```bash
isort .
```
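To run the full set of checks in one pass before opening a pull request, something like the following works (a convenience sketch, not an official project script; black and isort are invoked in check-only mode here rather than auto-formatting):

```bash
coverage run -m unittest discover && coverage report \
  && interrogate . \
  && flake8 . \
  && black --check . \
  && isort --check-only .
```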

### Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use [Angular](https://github.com/angular/angular/blob/main/CONTRIBUTING.md#commit) style for commit messages. Roughly, they should follow the pattern:
```text
<type>(<scope>): <short summary>
```

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of the following (an example commit message is shown after the list):

- **build**: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
- **ci**: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
- **docs**: Documentation only changes
- **feat**: A new feature
- **fix**: A bugfix
- **perf**: A code change that improves performance
- **refactor**: A code change that neither fixes a bug nor adds a feature
- **test**: Adding missing tests or correcting existing tests
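
For example (the scopes and summaries below are hypothetical):

```text
feat(models): add a new optional capture setting
fix(job): handle a failed computation state
```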

### Semantic Release

The table below, from [semantic release](https://github.com/semantic-release/semantic-release), shows which commit message gets you which release type when `semantic-release` runs (using the default configuration):

| Commit message                                                                                                                                                                                   | Release type                                                                                                    |
| ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------------------- |
| `fix(pencil): stop graphite breaking when too much pressure applied`                                                                                                                             | ~~Patch~~ Fix Release, Default release                                                                          |
| `feat(pencil): add 'graphiteWidth' option`                                                                                                                                                       | ~~Minor~~ Feature Release                                                                                       |
| `perf(pencil): remove graphiteWidth option`<br><br>`BREAKING CHANGE: The graphiteWidth option has been removed.`<br>`The default graphite width of 10mm is always used for performance reasons.` | ~~Major~~ Breaking Release <br /> (Note that the `BREAKING CHANGE: ` token must be in the footer of the commit) |

### Documentation
To generate the .rst source files for the documentation, run
```bash
sphinx-apidoc -o docs/source/ src
```
Then to create the documentation HTML files, run
```bash
sphinx-build -b html docs/source/ docs/build/html
```
More info on Sphinx installation can be found [here](https://www.sphinx-doc.org/en/master/usage/installation.html).
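
The two steps above can also be chained into a single command, e.g.:
```bash
sphinx-apidoc -o docs/source/ src && sphinx-build -b html docs/source/ docs/build/html
```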

            
