# WEBKNOSSOS cuber (wkcuber)
[![PyPI version](https://img.shields.io/pypi/v/wkcuber)](https://pypi.python.org/pypi/wkcuber)
[![Supported Python Versions](https://img.shields.io/pypi/pyversions/wkcuber.svg)](https://pypi.python.org/pypi/wkcuber)
[![Build Status](https://img.shields.io/github/actions/workflow/status/scalableminds/webknossos-libs/.github/workflows/ci.yml?branch=master)](https://github.com/scalableminds/webknossos-libs/actions?query=workflow%3A%22CI%22)
[![Documentation](https://img.shields.io/badge/docs-passing-brightgreen.svg)](https://docs.webknossos.org/wkcuber/index.html)
[![Code Style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

Python library for creating and working with [WEBKNOSSOS](https://webknossos.org) [WKW](https://github.com/scalableminds/webknossos-wrap) datasets. WKW is a container format for efficiently storing large-scale 3D image data, as found in (electron) microscopy.

The tools are modular components, designed for easy integration into existing pipelines and workflows.
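
As a minimal sketch of such a pipeline (assuming the example layer name, dataset name, and voxel scale used throughout this README), the individual modules can be chained instead of running the all-in-one `wkcuber` command:

```bash
# Cube the raw image stack into a WKW layer (see Usage below for all options)
python -m wkcuber.cubing --layer_name color data/source/color data/target

# Compress the freshly cubed data in place
python -m wkcuber.compress --layer_name color data/target

# Create downsampled magnifications (new magnifications are compressed by default)
python -m wkcuber.downsampling --layer_name color data/target

# Generate the dataset metadata
python -m wkcuber.metadata --name great_dataset --scale 11.24,11.24,25 data/target
```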

## Features

* `wkcuber`: Convert supported input files to fully ready WKW datasets (includes type detection, downsampling, compression, and metadata generation)
* `wkcuber.convert_image_stack_to_wkw`: Convert image stacks to fully ready WKW datasets (includes downsampling, compression, and metadata generation)
* `wkcuber.export_wkw_as_tiff`: Convert WKW datasets to a tiff stack (writing as tiles to a `z/y/x.tiff` folder structure is also supported)
* `wkcuber.cubing`: Convert image stacks (e.g., `tiff`, `jpg`, `png`, `bmp`, `dm3`, `dm4`) to WKW cubes
* `wkcuber.tile_cubing`: Convert tiled image stacks (e.g. in `z/y/x.ext` folder structure) to WKW cubes
* `wkcuber.convert_knossos`: Convert KNOSSOS cubes to WKW cubes
* `wkcuber.convert_nifti`: Convert NIFTI files to WKW files (currently without applying transformations)
* `wkcuber.convert_raw`: Convert RAW binary data (.raw, .vol) files to WKW datasets
* `wkcuber.downsampling`: Create downsampled magnifications (with `median`, `mode` and linear interpolation modes). Downsampling compresses the new magnifications by default (disable via `--no_compress`).
* `wkcuber.compress`: Compress WKW cubes for efficient file storage (especially useful for segmentation data)
* `wkcuber.metadata`: Create (or refresh) metadata (with guessing of most parameters)
* `wkcuber.recubing`: Read existing WKW cubes and write them out again with a specified WKW file length. Useful when a dataset was written with, e.g., file length 1.
* `wkcuber.check_equality`: Compare two WKW datasets to check whether they are equal (e.g., after compressing a dataset, this task can be useful to double-check that the compressed dataset contains the same data).
* Most modules support multiprocessing

## Supported input formats

* Standard image formats, e.g. `tiff`, `jpg`, `png`, `bmp`
* Proprietary image formats, e.g. `dm3`
* Tiled image stacks (as used by CATMAID)
* KNOSSOS cubes
* NIFTI files
* Raw binary files

## Installation
### Python 3 with pip from PyPI
- `wkcuber` requires Python 3.8, 3.9, or 3.10

```bash
# Make sure to have lz4 installed:
# Mac: brew install lz4
# Ubuntu/Debian: apt-get install liblz4-1
# CentOS/RHEL: yum install lz4

pip install wkcuber
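
# Optional sanity check after installing (assumes the module prints its usage with --help)
python -m wkcuber --help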
```

### Docker
Use the CI-built image: [scalableminds/webknossos-cuber](https://hub.docker.com/r/scalableminds/webknossos-cuber/). Example usage: `docker run -v <host path>:/data --rm scalableminds/webknossos-cuber wkcuber --layer_name color --scale 11.24,11.24,25 --name great_dataset /data/source/color /data/target`


## Usage

```bash
# Convert arbitrary, supported input files into wkw datasets. This sets reasonable defaults, but see other commands for customization.
python -m wkcuber \
  --scale 11.24,11.24,25 \
  data/source data/target

# Convert image stacks into wkw datasets
python -m wkcuber.convert_image_stack_to_wkw \
  --layer_name color \
  --scale 11.24,11.24,25 \
  --name great_dataset \
  data/source/color data/target

# Convert image files to wkw cubes
python -m wkcuber.cubing --layer_name color data/source/color data/target
python -m wkcuber.cubing --layer_name segmentation data/source/segmentation data/target

# Convert tiled image files to wkw cubes
python -m wkcuber.tile_cubing --layer_name color data/source data/target

# Convert KNOSSOS cubes to wkw cubes
python -m wkcuber.convert_knossos --layer_name color data/source/mag1 data/target

# Convert NIFTI file to wkw file
python -m wkcuber.convert_nifti --layer_name color --scale 10,10,30 data/source/nifti_file data/target

# Convert folder with NIFTI files to wkw files
python -m wkcuber.convert_nifti --color_file one_nifti_file --segmentation_file another_nifti --scale 10,10,30 data/source/ data/target

# Convert RAW file to wkw file
python -m wkcuber.convert_raw --layer_name color --scale 10,10,30 --input_dtype uint8 --shape 2048,2048,1024 data/source/raw_file.raw data/target

# Create downsampled magnifications
python -m wkcuber.downsampling --layer_name color data/target
python -m wkcuber.downsampling --layer_name segmentation --interpolation_mode mode data/target
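
# Create downsampled magnifications without compressing them (compression is on by default, see --no_compress)
python -m wkcuber.downsampling --layer_name color --no_compress data/target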

# Compress data in-place (mostly useful for segmentation)
python -m wkcuber.compress --layer_name segmentation data/target

# Compress data copy (mostly useful for segmentation)
python -m wkcuber.compress --layer_name segmentation data/target data/target_compress

# Create metadata
python -m wkcuber.metadata --name great_dataset --scale 11.24,11.24,25 data/target

# Refresh metadata so that new layers and/or magnifications are picked up
python -m wkcuber.metadata --refresh data/target

# Recube an existing dataset
python -m wkcuber.recubing --layer_name color --dtype uint8 /data/source/wkw /data/target

# Check two datasets for equality
python -m wkcuber.check_equality /data/source /data/target
```

### Parallelization

Most tasks can be configured to run in parallel. Via `--distribution_strategy` you can pass `multiprocessing`, `slurm`, or `kubernetes`. The first can be further configured with `--jobs`, the latter two via `--job_resources='{"mem": "10M"}'`. Use `--help` to get more information.
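
For example, downsampling might be parallelized like this (a sketch using the flags named above; each module's `--help` lists the exact options it accepts):

```bash
# Downsample using 8 local worker processes
python -m wkcuber.downsampling --layer_name color \
  --distribution_strategy multiprocessing --jobs 8 data/target

# Downsample via Slurm, requesting memory per job
python -m wkcuber.downsampling --layer_name color \
  --distribution_strategy slurm --job_resources='{"mem": "10M"}' data/target
```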

### Zarr support

Most conversion commands can be configured with `--data_format zarr`. This will produce a Zarr-based dataset instead of WKW. Zarr-based datasets can also be stored on remote storage (e.g. S3, GCS, HTTP). For that, storage-specific credentials and configurations need to be passed in as environment variables.
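
For local output, this is simply the default invocation from the Usage section plus the extra flag (a sketch):

```bash
python -m wkcuber \
  --scale 11.24,11.24,25 \
  --data_format zarr \
  data/source data/target
```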

#### Example S3

```bash
export AWS_SECRET_ACCESS_KEY="..."
export AWS_ACCESS_KEY_ID="..."
export AWS_REGION="..."

python -m wkcuber \
  --scale 11.24,11.24,25 \
  --data_format zarr \
  data/source s3://bucket/data/target
```

#### Example HTTPS

```bash
export HTTP_BASIC_USER="..."
export HTTP_BASIC_PASSWORD="..."

python -m wkcuber \
  --scale 11.24,11.24,25 \
  --data_format zarr \
  data/source https://example.org/data/target
```

Replace `https://` with `webdav+https://` to use WebDAV.
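
For example (the HTTPS sketch above with only the scheme changed):

```bash
python -m wkcuber \
  --scale 11.24,11.24,25 \
  --data_format zarr \
  data/source webdav+https://example.org/data/target
```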


## Development
Make sure to install all the required dependencies using Poetry:
```bash
pip install poetry
poetry install
```

Please format, lint, and unit-test your code changes before merging them.
```bash
poetry run black .
poetry run pylint -j4 wkcuber
poetry run pytest tests
```

Please also run the extended test suite:
```bash
tests/scripts/all_tests.sh
```

PyPI releases are pushed automatically when a new Git tag/GitHub release is created.

## API documentation
Check out the [latest version of the API documentation](https://docs.webknossos.org/api/wkcuber.html).

### Generate the API documentation
Run `docs/generate.sh` to start a server displaying the API docs. `docs/generate.sh --persist` persists the HTML to `docs/api`.

## Test Data Credits
Excerpts for testing purposes have been sampled from:

* Dow Jacobo Hossain Siletti Hudspeth (2018). **Connectomics of the zebrafish's lateral-line neuromast reveals wiring and miswiring in a simple microcircuit.** eLife. [DOI:10.7554/eLife.33988](https://elifesciences.org/articles/33988)
* Zheng Lauritzen Perlman Robinson Nichols Milkie Torrens Price Fisher Sharifi Calle-Schuler Kmecova Ali Karsh Trautman Bogovic Hanslovsky Jefferis Kazhdan Khairy Saalfeld Fetter Bock (2018). **A Complete Electron Microscopy Volume of the Brain of Adult Drosophila melanogaster.** Cell. [DOI:10.1016/j.cell.2018.06.019](https://www.cell.com/cell/fulltext/S0092-8674(18)30787-6). License: [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/)

## License
AGPLv3
Copyright scalable minds

            
