# macrometa-source-snowflake

A macrometa source that extracts data from a [Snowflake](https://www.snowflake.com/) database and produces JSON-formatted data following the [Singer spec](https://github.com/singer-io/getting-started/blob/master/docs/SPEC.md).

## How to use it

If you want to run this macrometa-source-snowflake connector independently, read on.

### Install and Run

First, make sure Python 3 is installed on your system or follow these
installation instructions for [Mac](http://docs.python-guide.org/en/latest/starting/install3/osx/) or
[Ubuntu](https://www.digitalocean.com/community/tutorials/how-to-install-python-3-and-set-up-a-local-programming-environment-on-ubuntu-16-04).
It's recommended to use a virtualenv:

```bash
make venv
```
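
`make venv` uses the project's Makefile to create a virtual environment. If you would rather install the published package by hand, a minimal manual setup (a sketch, not taken from the project's docs) looks like:

```bash
# Create and activate an isolated environment, then install the connector from PyPI.
python3 -m venv .venv
source .venv/bin/activate
pip install macrometa-source-snowflake
```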

### Configuration

1. Create a `config.json` file with the connection details to Snowflake; here is a [sample config file](./config_sample.json). A hypothetical minimal example is also sketched after the run command below.
   **Note**: `table` is a mandatory parameter, used to avoid a long-running catalog discovery process.
   Specify fully qualified table and view names, and only the ones you actually need to extract; otherwise
   discovery mode can run for a very long time. Discovery mode analyses table structures, and Snowflake does not
   cope well with selecting many rows from `INFORMATION_SCHEMA` or running `SHOW` commands that return many rows,
   so be as specific as possible.
2. Run the connector in discovery mode to generate a `properties.json`
3. Edit the `properties.json` and select the streams to replicate (see the example catalog entry in the Discovery mode section below)
4. Run the source connector like any other Singer-compatible tap:

```bash
macrometa-source-snowflake --config config.json --properties properties.json --state state.json
```
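
For orientation, a hypothetical minimal `config.json` is sketched below. The key names are assumptions inferred from the environment variables in the test section and the mandatory `table` parameter noted above; treat the linked [sample config file](./config_sample.json) as the authoritative reference.

```json
{
  "account": "your-snowflake-account",
  "dbname": "YOUR_DATABASE",
  "user": "YOUR_USER",
  "password": "YOUR_PASSWORD",
  "warehouse": "YOUR_WAREHOUSE",
  "table": "YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE"
}
```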

### Authentication Methods

You can use either basic user/password authentication or key pair authentication.

#### User / Password authentication

Populate `user` and `password` in the `config.json` file.

#### Key Pair authentication

To use key pair authentication, omit the `password` and instead provide the `private_key` (the unencrypted private key) and, optionally, the `private_key_passphrase`.
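
A hypothetical key pair variant of the config, with the same caveats as the sketch above; the test section's `MACROMETA_SOURCE_SNOWFLAKE_PRIVATE_KEY=<snowflake-pk-path>` suggests the value is a file path:

```json
{
  "account": "your-snowflake-account",
  "dbname": "YOUR_DATABASE",
  "user": "YOUR_USER",
  "private_key": "/path/to/rsa_key.p8",
  "private_key_passphrase": "optional-passphrase",
  "warehouse": "YOUR_WAREHOUSE",
  "table": "YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE"
}
```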

### Discovery mode

The macrometa-source-snowflake connector can be invoked in discovery mode to find the available tables and columns in the database:

```bash
$ macrometa-source-snowflake --config config.json --discover
```

A discovered catalog is output, with a JSON-schema description of each table. A
source table directly corresponds to a Singer stream.
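
As a rough illustration of what to expect (the structure follows the [Singer spec](https://github.com/singer-io/getting-started/blob/master/docs/SPEC.md); the table and column names here are invented, so rely on your actual discovery output), a catalog entry with a stream selected for replication might look like:

```json
{
  "streams": [
    {
      "tap_stream_id": "MYDB-MYSCHEMA-ORDERS",
      "table_name": "ORDERS",
      "schema": {
        "type": "object",
        "properties": {
          "ID": { "type": ["null", "number"] },
          "STATUS": { "type": ["null", "string"] }
        }
      },
      "metadata": [
        {
          "breadcrumb": [],
          "metadata": {
            "selected": true,
            "replication-method": "LOG_BASED"
          }
        }
      ]
    }
  ]
}
```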

## Replication methods

The two ways to replicate a given table are `FULL_TABLE` and `LOG_BASED`.

### Full Table

Full-table replication extracts all data from the source table each time the connector
is invoked.

### Log Based

The Macrometa Snowflake source connector can be used as a CDC (Change Data Capture) connector by setting the replication method to LOG_BASED, capturing changes made at the source and identifying which records were inserted, updated, or deleted. On the first run it extracts all records (as in FULL_TABLE); after that it continuously listens to a Stream created on the table and extracts only the changes.
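
The connector manages the Snowflake Stream itself; the following SQL is only a sketch, for intuition, of how such a stream behaves in Snowflake (the object names are made up):

```sql
-- Create a stream that records changes (inserts, updates, deletes) on a table.
CREATE OR REPLACE STREAM orders_stream ON TABLE mydb.myschema.orders;

-- Selecting from the stream returns the changed rows together with change
-- metadata columns such as METADATA$ACTION and METADATA$ISUPDATE.
SELECT * FROM orders_stream;
```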

### To run tests:

1. Define the environment variables required to run the tests:

```bash
export MACROMETA_SOURCE_SNOWFLAKE_ACCOUNT=<snowflake-account-name>
export MACROMETA_SOURCE_SNOWFLAKE_DBNAME=<snowflake-database-name>
export MACROMETA_SOURCE_SNOWFLAKE_USER=<snowflake-user>
export MACROMETA_SOURCE_SNOWFLAKE_PASSWORD=<snowflake-password>
export MACROMETA_SOURCE_SNOWFLAKE_PRIVATE_KEY=<snowflake-pk-path>
export MACROMETA_SOURCE_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE=<snowflake-passphrase>
export MACROMETA_SOURCE_SNOWFLAKE_WAREHOUSE=<snowflake-warehouse>
```

2. Install the Python dependencies:

```bash
make venv
```

3. To run the unit tests:
   **Note**: there are no unit tests at the time of writing this document.

```bash
make unit_test
```

4. To run the integration tests:

```bash
make integration_test
```

### To run formatting and linting:

```bash
make venv format pylint
```

## License

Apache License Version 2.0. See [LICENSE](LICENSE) for the full text.