| Field | Value |
|:--|:--|
| Name | meltanolabs-target-snowflake |
| Version | 0.13.0 |
| Summary | Singer target for Snowflake, built with the Meltano SDK for Singer Targets. |
| Author | Ken Payne |
| Maintainer | Meltano Team and Community |
| Requires Python | >=3.9 |
| License | Apache-2.0 |
| Keywords | elt, snowflake |
| Upload time | 2024-11-11 22:46:41 |
| Home page | None |
| Docs URL | None |
# `target-snowflake`
Target for Snowflake.
Built with the [Meltano Singer SDK](https://sdk.meltano.com).
## Capabilities
* `about`
* `stream-maps`
* `schema-flattening`
* `target-schema`
## Settings
| Setting | Required | Default | Description |
|:---------------------------|:---------|:------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| user | True | None | The login name for your Snowflake user. |
| password | False | None | The password for your Snowflake user. |
| private_key | False | None | The private key contents. For KeyPair authentication either private_key or private_key_path must be provided. |
| private_key_path | False | None | Path to file containing private key. For KeyPair authentication either private_key or private_key_path must be provided. |
| private_key_passphrase | False | None | Passphrase to decrypt private key if encrypted. |
| account | True | None | Your account identifier. See [Account Identifiers](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html). |
| database | True | None | The initial database for the Snowflake session. |
| schema | False | None | The initial schema for the Snowflake session. |
| warehouse | False | None | The initial warehouse for the session. |
| role | False | None | The initial role for the session. |
| add_record_metadata | False | 1 | Whether to add metadata columns. |
| clean_up_batch_files | False | 1 | Whether to remove batch files after processing. |
| default_target_schema | False | None | The default target database schema name to use for all streams. |
| hard_delete | False | 0 | Hard delete records. |
| load_method                | False    | TargetLoadMethods.APPEND_ONLY | The method to use when loading data into the destination. `append-only` will always write all input records, whether or not a record already exists. `upsert` will update existing records and insert new records. `overwrite` will delete all existing records and insert all input records. |
| batch_size_rows | False | None | Maximum number of rows in each batch. |
| validate_records | False | 1 | Whether to validate the schema of the incoming streams. |
| stream_maps | False | None | Config object for stream maps capability. For more information check out [Stream Maps](https://sdk.meltano.com/en/latest/stream_maps.html). |
| stream_map_config | False | None | User-defined config values to be used within map expressions. |
| faker_config               | False    | None                          | Config for the [`Faker`](https://faker.readthedocs.io/en/master/) instance variable `fake` used within map expressions. Only applicable if the plugin specifies `faker` as an additional dependency (through the `singer-sdk` `faker` extra or directly). |
| faker_config.seed | False | None | Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator |
| faker_config.locale | False | None | One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization |
| flattening_enabled | False | None | 'True' to enable schema flattening and automatically expand nested properties. |
| flattening_max_depth | False | None | The max depth to flatten schemas. |
| use_browser_authentication | False    | False                         | Whether authentication should be done using SSO (via external browser). See [SSO browser authentication](https://docs.snowflake.com/en/developer-guide/node-js/nodejs-driver-authenticate#using-single-sign-on-sso-through-a-web-browser). |
A full list of supported settings and capabilities is available by running: `target-snowflake --about`
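As a reference, here is a minimal sketch of a JSON config file using password-based authentication. All values are illustrative placeholders (the account identifier, database, warehouse, role, and schema names are assumptions); for key-pair authentication you would supply `private_key` or `private_key_path` instead of `password`.
```json
{
  "user": "LOADER_USER",
  "password": "********",
  "account": "myorg-myaccount",
  "database": "ANALYTICS_DB",
  "warehouse": "LOADING_WH",
  "role": "LOADER_ROLE",
  "default_target_schema": "RAW",
  "load_method": "upsert",
  "add_record_metadata": true
}
```
Pass the file to the target with `--config /path/to/target-snowflake-config.json`, as shown in the Usage section below.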
### Initializing a Snowflake Account
This target has an interactive feature that will help you initialize a Snowflake account with everything needed to get started loading data:
- User
- Role
- Warehouse
- Database
- Proper grants
The CLI will ask you to provide information about the new user/role/etc. you want to create, but it will also need SYSADMIN credentials to execute the queries.
You should prepare the following inputs:
- Account
- A user that has SYSADMIN and SECURITYADMIN access. These come by default with the user that created the Snowflake account.
- The password for your SYSADMIN user.
Run the following command to get started with the interactive CLI.
Note - the CLI will print the SQL queries it is planning to run and confirm with you before it makes any changes.
```bash
poetry run target-snowflake --initialize
# Alternatively using Meltano CLI
meltano invoke target-snowflake --initialize
```
The CLI also has a "dry run" mode that will print the queries without executing them.
Check out the demo of this [on YouTube](https://youtu.be/9vEFxw-0nxI).
### Configure using environment variables
This Singer target will automatically import any environment variables within the working directory's
`.env` file if the `--config=ENV` flag is provided, so that config values are picked up when a matching
environment variable is set either in the terminal context or in the `.env` file.
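For example, assuming the Singer SDK's usual `<PLUGIN_NAME>_<SETTING>` upper-snake-case naming convention (an assumption here, not stated above), the settings could be supplied like this:
```bash
# Export in the shell (or place the same KEY=VALUE lines in .env, without `export`)
export TARGET_SNOWFLAKE_USER="LOADER_USER"
export TARGET_SNOWFLAKE_PASSWORD="********"
export TARGET_SNOWFLAKE_ACCOUNT="myorg-myaccount"
export TARGET_SNOWFLAKE_DATABASE="ANALYTICS_DB"

# Run the target with config taken from the environment
tap-carbon-intensity | target-snowflake --config=ENV
```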
## Usage
You can easily run `target-snowflake` by itself or in a pipeline using [Meltano](https://meltano.com/).
### Executing the Target Directly
```bash
target-snowflake --version
target-snowflake --help
# Test using the "Carbon Intensity" sample:
tap-carbon-intensity | target-snowflake --config /path/to/target-snowflake-config.json
```
## Developer Resources
### Initialize your Development Environment
```bash
pipx install poetry
poetry install
```
### Create and Run Tests
Create tests within the `target_snowflake/tests` subfolder and
then run:
```bash
poetry run pytest
```
You can also test the `target-snowflake` CLI interface directly using `poetry run`:
```bash
poetry run target-snowflake --help
```
### Testing with [Meltano](https://meltano.com/)
_**Note:** This target will work in any Singer environment and does not require Meltano.
Examples here are for convenience and to streamline end-to-end orchestration scenarios._
Your project comes with a custom `meltano.yml` project file already created.
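As a rough sketch (not the exact file shipped with the project), the loader entry in `meltano.yml` might look like the following; the config values are illustrative assumptions:
```yaml
plugins:
  loaders:
    - name: target-snowflake
      namespace: target_snowflake
      # Install the local checkout of this target
      pip_url: -e .
      config:
        account: myorg-myaccount
        user: LOADER_USER
        database: ANALYTICS_DB
        default_target_schema: RAW
```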
Next, install Meltano (if you haven't already) and any needed plugins:
```bash
# Install meltano
pipx install meltano
# Initialize meltano within this directory
cd target-snowflake
meltano install
```
Now you can test and orchestrate using Meltano:
```bash
# Test invocation:
meltano invoke target-snowflake --version
# OR run a test `elt` pipeline with the Carbon Intensity sample tap:
meltano run tap-carbon-intensity target-snowflake
```
### SDK Dev Guide
See the [dev guide](https://sdk.meltano.com/en/latest/dev_guide.html) for more instructions on how to use the Meltano SDK to
develop your own Singer taps and targets.