| Field | Value |
| --- | --- |
| Name | digdaglog2sql |
| Version | 0.1.0 |
| Summary | Extract SQLs from digdag log |
| Author | Aki Ariga |
| Maintainer | Aki Ariga |
| Requires Python | <4.0.0,>=3.8.1 |
| License | Apache-2.0 |
| Upload time | 2024-04-18 00:00:35 |
# digdaglog2sql
## Install
```sh
pip install --user digdaglog2sql
```
or,
```sh
$ python -m venv .venv
$ source .venv/bin/activate
(.venv)$ pip install digdaglog2sql
```
## Usage
```sh
$ digdaglog2sql --help
Usage: digdaglog2sql [OPTIONS]
Input log by file:
--input FILENAME Input file name of a workflow log. Use - for
STDIN.
Download log by Session ID:
--session-id INTEGER Session ID of the target workflow.
--site [us|jp|eu01|ap02|ap03] Treasure Workflow site name. [default: us]
--endpoint TEXT Digdag server endpoint.
--http Enforce to use http schema.
Output:
--output FILENAME Output file name. Use - for STDOUT. [required]
Other options:
--drop-cdp-db If true, drop cdp_audience_xxx DB name.
--help Show this message and exit.
Constraints:
{--input, --session-id} exactly 1 required
{--site, --endpoint} exactly 1 required if --session-id is set
{--site, --http} mutually exclusive
```
You can use a log file from your local environment:
```sh
digdaglog2sql --input workflow-log.txt --output output.sql
```
Or, you can use the Session ID of a Treasure Workflow:
```sh
digdaglog2sql --session-id 12345 --site us --output output.sql
```
Make sure the `TD_API_KEY` environment variable is set.
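
For example, you could export the key in your shell before running the tool. This is a minimal sketch; the key value and session ID below are placeholders, not real credentials:

```sh
# Placeholder API key -- substitute your real Treasure Data API key.
export TD_API_KEY="1/abcdef0123456789"

# Confirm the variable is set before invoking the tool.
test -n "$TD_API_KEY" && echo "TD_API_KEY is set"

# digdaglog2sql --session-id 12345 --site us --output output.sql
```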