# odc.apps.dc_tools
Command-line utilities for working with a Datacube index.
## Installation
``` bash
pip install odc-apps-dc-tools
```
## Usage
### dc-sync-products
The `dc-sync-products` tool keeps a Datacube instance's list of products up to date
from a CSV of product names and definition URLs.
Basic usage is:
``` bash
dc-sync-products <path-to-csv> --update-if-exists
```
The optional `--update-if-exists` flag updates a product, including making unsafe changes, if it already exists.
The CSV format is as follows (note that one file can define multiple products, and a single row can list several product names separated by `;`):
```
product,definition
dem_srtm,https://raw.githubusercontent.com/digitalearthafrica/config/master/products/dem_srtm.odc-product.yaml
ls5_c2l2_sr;ls7_c2l2_sr;ls8_c2l2_sr;ls9_c2l2_sr,https://raw.githubusercontent.com/opendatacube/datacube-dataset-config/main/products/lsX_c2l2_sr.odc-product.yaml
```
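As a rough illustration of how such a row expands, the sketch below (not the tool's actual implementation; the URLs are placeholders) pairs each `;`-separated product name with its definition URL:

``` python
import csv
import io

# Hypothetical sketch: each CSV row pairs one or more ';'-separated
# product names with a single product-definition URL.
CSV_TEXT = """\
product,definition
dem_srtm,https://example.com/dem_srtm.odc-product.yaml
ls5_c2l2_sr;ls7_c2l2_sr,https://example.com/lsX_c2l2_sr.odc-product.yaml
"""

def parse_product_rows(text):
    """Yield (product_name, definition_url), expanding ';'-separated names."""
    for row in csv.DictReader(io.StringIO(text)):
        for name in row["product"].split(";"):
            yield name.strip(), row["definition"]

pairs = list(parse_product_rows(CSV_TEXT))
```

Here the second row yields two products, both pointing at the same definition document.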
### dc-index-export-md
A metadata transformer.
Simple usage:
``` bash
TODO:
```
Extended usage:
``` bash
TODO:
```
### dc-index-from-tar
Indexes ODC metadata contained in a `.tar` file.
Simple usage:
``` bash
dc-index-from-tar 'path/to/file.tar'
```
Extended usage:
``` bash
TODO:
```
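Conceptually, the tool walks the archive and pulls out each ODC metadata YAML document. The sketch below is a minimal illustration of that iteration using the standard library, not the tool's real internals:

``` python
import io
import tarfile

# Hypothetical sketch: walk a tar archive and yield each YAML metadata
# document it contains.
def iter_yaml_members(tar_bytes):
    with tarfile.open(fileobj=io.BytesIO(tar_bytes)) as tar:
        for member in tar.getmembers():
            if member.isfile() and member.name.endswith(".yaml"):
                yield member.name, tar.extractfile(member).read()

# Build a tiny in-memory archive to exercise the helper.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    data = b"id: 0000-0000\n"
    info = tarfile.TarInfo("scene/odc-metadata.yaml")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

docs = dict(iter_yaml_members(buf.getvalue()))
```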
### sqs-to-dc
A tool to index datasets from messages on an SQS queue.
Simple usage:
``` bash
sqs-to-dc example-queue-name 'product-name-a product-name-b'
```
Extended usage:
``` text
Usage: sqs-to-dc [OPTIONS] QUEUE_NAME PRODUCT

  Iterate through messages on an SQS queue and add them to datacube

Options:
  --skip-lineage                 Default is not to skip lineage. Set to skip
                                 lineage altogether.
  --fail-on-missing-lineage / --auto-add-lineage
                                 Default is to fail if lineage documents not
                                 present in the database. Set auto add to try
                                 to index lineage documents.
  --verify-lineage               Default is no verification. Set to verify
                                 parent dataset definitions.
  --stac                         Expect STAC 1.0 metadata and attempt to
                                 transform to ODC EO3 metadata
  --odc-metadata-link TEXT       Expect metadata doc with ODC EO3 metadata
                                 link. Either provide '/' separated path to
                                 find metadata link in a provided metadata
                                 doc e.g. 'foo/bar/link', or if metadata doc
                                 is STAC, provide 'rel' value of the 'links'
                                 object having metadata link. e.g. 'STAC-
                                 LINKS-REL:odc_yaml'
  --limit INTEGER                Stop indexing after n datasets have been
                                 indexed.
  --update                       If set, update instead of add datasets
  --update-if-exists             If the dataset already exists, update it
                                 instead of skipping it.
  --archive                      If set, archive datasets
  --allow-unsafe                 Allow unsafe changes to a dataset. Take
                                 care!
  --record-path TEXT             Filtering option for s3 path, i.e.
                                 'L2/sentinel-2-nrt/S2MSIARD/*/*/ARD-
                                 METADATA.yaml'
  --region-code-list-uri TEXT    A path to a list (one item per line, in txt
                                 or gzip format) of valid region_codes to
                                 include
  --absolute                     Use absolute paths when converting from stac
  --archive-less-mature          Find less mature versions of the dataset and
                                 archive them
  --publish-action SNS ARN       Publish indexing action to SNS topic
  --help                         Show this message and exit.
```
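The two forms of `--odc-metadata-link` described above can be illustrated with a small sketch. The helper names below (`find_metadata_link`, `find_stac_link`) are hypothetical, not the tool's real internals:

``` python
# Hypothetical sketch of the two '--odc-metadata-link' forms: a
# '/'-separated key path into a metadata doc, or a 'rel' value looked up
# in a STAC item's 'links' array.
def find_metadata_link(doc, path):
    """Follow a '/'-separated key path (e.g. 'foo/bar/link') through a doc."""
    node = doc
    for key in path.split("/"):
        node = node[key]
    return node

def find_stac_link(item, rel):
    """Return the href of the 'links' entry whose 'rel' matches."""
    for link in item.get("links", []):
        if link.get("rel") == rel:
            return link.get("href")
    return None

doc = {"foo": {"bar": {"link": "https://example.com/odc-metadata.yaml"}}}
item = {"links": [{"rel": "odc_yaml",
                   "href": "https://example.com/odc-metadata.yaml"}]}
```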
### s3-to-dc
A tool for indexing from S3.
Simple usage:
``` bash
s3-to-dc 's3://bucket/path/**/*.yaml' 'product-name-a product-name-b'
```
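The URI accepts glob-style wildcards. The sketch below is a rough illustration of their semantics, assuming `*` stays within a single path segment while `**` spans segments; `wildcard_to_regex` is an illustrative helper, not the tool's code:

``` python
import re

# Hypothetical sketch of the URI wildcard semantics:
# '*' matches within one path segment, '**' matches across segments.
def wildcard_to_regex(pattern):
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            out.append(".*")
            i += 2
        elif pattern[i] == "*":
            out.append("[^/]*")
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(out) + r"\Z")

rx = wildcard_to_regex("s3://bucket/path/**/*.yaml")
```

Under this reading, `s3://bucket/path/**/*.yaml` selects every `.yaml` key below `s3://bucket/path/`, however deeply nested.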
Extended usage:
Note that combining `--update` with `--allow-unsafe` updates existing datasets instead of adding them and permits unsafe changes. Be careful!
``` text
Usage: s3-to-dc [OPTIONS] URI PRODUCT

  Iterate through files in an S3 bucket and add them to datacube

Options:
  --skip-lineage            Default is not to skip lineage. Set to skip
                            lineage altogether.
  --fail-on-missing-lineage / --auto-add-lineage
                            Default is to fail if lineage documents not
                            present in the database. Set auto add to try
                            to index lineage documents.
  --verify-lineage          Default is no verification. Set to verify
                            parent dataset definitions.
  --stac                    Expect STAC 1.0 metadata and attempt to
                            transform to ODC EO3 metadata
  --update                  If set, update instead of add datasets
  --update-if-exists        If the dataset already exists, update it
                            instead of skipping it.
  --allow-unsafe            Allow unsafe changes to a dataset. Take care!
  --skip-check              Assume file exists when listing exact file
                            rather than wildcard.
  --no-sign-request         Do not sign AWS S3 requests
  --request-payer           Needed when accessing requester pays public
                            buckets
  --archive-less-mature     Find less mature versions of the dataset and
                            archive them
  --publish-action SNS ARN  Publish indexing action to SNS topic
  --help                    Show this message and exit.
```
### thredds-to-dc
Indexes from a THREDDS server.
Simple usage:
``` bash
TODO:
```
Extended usage:
``` bash
TODO:
```
### esri-lc-to-dc
This tool has been removed; use the `stac-to-dc` tool instead. For example:
``` bash
stac-to-dc \
  --catalog-href=https://planetarycomputer.microsoft.com/api/stac/v1/ \
  --collections='io-lulc'
```