| Field | Value |
| --- | --- |
| Name | airbyte-databricks-cache |
| Version | 0.1.11 |
| home_page | None |
| Summary | None |
| upload_time | 2025-01-29 02:26:07 |
| maintainer | None |
| docs_url | None |
| author | ossmht |
| requires_python | <3.13,>=3.10 |
| license | None |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# airbyte_databricks_cache
Databricks cache implementation for PyAirbyte
## Installation
```sh
pip install airbyte-databricks-cache
```
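To confirm the install picked everything up, a quick check can be run in the same environment (a sketch; it assumes PyAirbyte itself is installed alongside this package):

```py
# Importing airbyte_databricks_cache registers airbyte.caches.databricks (see Usage below),
# so both imports should succeed after `pip install airbyte-databricks-cache`.
import airbyte_databricks_cache  # noqa: F401
from airbyte.caches.databricks import DatabricksCache

print(DatabricksCache.__name__)  # expected: "DatabricksCache"
```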
## Usage
```py
import airbyte as ab

# Importing airbyte_databricks_cache injects the module into airbyte.caches.databricks.
import airbyte_databricks_cache  # noqa: F401

# The import above makes this possible:
from airbyte.caches.databricks import DatabricksCache  # pylint: disable=import-error

# Create an Airbyte source.
source = ab.get_source(...)

# Create the DatabricksCache.
cache_dbks = DatabricksCache(
    access_token=ab.get_secret("databricks_access_token"),
    server_hostname=ab.get_secret("databricks_server_hostname"),
    http_path=ab.get_secret("databricks_http_path"),
    catalog=ab.get_secret("databricks_catalog"),
    schema_name=ab.get_secret("databricks_target_schema"),
    staging_volume_w_location=ab.get_secret("databricks_staging_volume_w_location"),
)

destination = ab.get_destination("destination-databricks", cache_dbks)
destination.write(source, ...)
```
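If `DatabricksCache` follows the standard PyAirbyte cache interface, it should also work as the cache passed to `Source.read()`, without routing through a destination. A minimal sketch under that assumption, using `source-faker` purely as a stand-in source:

```py
import airbyte as ab
import airbyte_databricks_cache  # noqa: F401  (registers airbyte.caches.databricks)
from airbyte.caches.databricks import DatabricksCache  # pylint: disable=import-error

# Stand-in source for illustration; replace with your real source and config.
source = ab.get_source("source-faker", config={"count": 1000}, install_if_missing=True)
source.select_all_streams()

cache_dbks = DatabricksCache(
    access_token=ab.get_secret("databricks_access_token"),
    server_hostname=ab.get_secret("databricks_server_hostname"),
    http_path=ab.get_secret("databricks_http_path"),
    catalog=ab.get_secret("databricks_catalog"),
    schema_name=ab.get_secret("databricks_target_schema"),
    staging_volume_w_location=ab.get_secret("databricks_staging_volume_w_location"),
)

# Read records straight into the Databricks-backed cache.
result = source.read(cache=cache_dbks)
for name, dataset in result.streams.items():
    print(name, len(dataset.to_pandas()))
```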
## Build and deploy
Build and deployment happen via a GitHub workflow.
```sh
# cut a release
git fetch --tags origin
git describe --tags --abbrev=0   # show the latest existing tag
gh release create v0.1.8 --target feat-workflow --generate-notes
git fetch --tags origin          # pull the tag created by the release

## manual addons
# check that the long description renders correctly for PyPI
twine check dist/*
```
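As a local sanity check before cutting a release (a sketch, not part of the workflow; it assumes the wheel has already been built into `dist/`), the latest tag can be compared against the built wheel's version:

```py
# Compare the latest git tag against the version embedded in the built wheel's filename.
import glob
import subprocess

tag = subprocess.run(
    ["git", "describe", "--tags", "--abbrev=0"],
    capture_output=True, text=True, check=True,
).stdout.strip()  # e.g. "v0.1.11"

wheels = sorted(glob.glob("dist/*.whl"))
assert wheels, "no wheel found in dist/; build the package first"

# Wheel filenames look like: airbyte_databricks_cache-0.1.11-py3-none-any.whl
wheel_version = wheels[-1].split("-")[1]
assert tag.lstrip("v") == wheel_version, f"tag {tag} does not match wheel version {wheel_version}"
print("OK:", tag, wheels[-1])
```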
Raw data

```json
{
"_id": null,
"home_page": null,
"name": "airbyte-databricks-cache",
"maintainer": null,
"docs_url": null,
"requires_python": "<3.13,>=3.10",
"maintainer_email": null,
"keywords": null,
"author": "ossmht",
"author_email": "ossmht@gmail.com",
"download_url": null,
"platform": null,
"description": "# airbyte_databricks_cache\n\nDatabricks cache implementation for PyAirbyte\n\n## Installation\n\n```sh\npip install airbyte-databricks-cache\n\n```\n\n\n## Usage\n\n```py\n\nimport airbyte as ab\n#\n# Must import airbyte_databricks_cache to inject the module into airbyte.caches.databricks\nimport airbyte_databricks_cache \n# So this is possible now:\nfrom airbyte.caches.databricks import DatabricksCache # pylint: disable=E0401:import-error\n\n# create airbyte source\nsource = ab.get_source(...)\n\n# create DatabricksCache\ncache_dbks = DatabricksCache(\n access_token = ab.get_secret(\"databricks_access_token\"),\n server_hostname = ab.get_secret(\"databricks_server_hostname\"),\n http_path= ab.get_secret(\"databricks_http_path\"),\n catalog = ab.get_secret(\"databricks_catalog\"),\n schema_name = ab.get_secret(\"databricks_target_schema\"),\n staging_volume_w_location = ab.get_secret(\"databricks_staging_volume_w_location\")\n)\n\ndestination = ab.get_destination(\"destination-databricks\", cache_dbks)\n\n\ndestination.write(source, ...)\n```\n\n\n## Build and deploy\n\nHappens via github workflow.\n\n```sh\n\n# release\ngit fetch --tags origin\ngit describe --tags --abbrev=0\ngh release create v0.1.8 --target feat-workflow --generate-notes\ngit fetch --tags origin\n\n\n## manual addons\n# check long desc for pypi\ntwine check dist/*\n```\n\n",
"bugtrack_url": null,
"license": null,
"summary": null,
"version": "0.1.11",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "ed75615becd23fda1a46ac8ba1f2ef02bceeb23fafcea2392eb63e4da3373fd5",
"md5": "90b4c9a4ad19e7d4e9574972521b7825",
"sha256": "e7bae7fc0a1e82d0f30a1a02013713123c61ce724825a39a17e9f7d6696fdc42"
},
"downloads": -1,
"filename": "airbyte_databricks_cache-0.1.11-py3-none-any.whl",
"has_sig": false,
"md5_digest": "90b4c9a4ad19e7d4e9574972521b7825",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.13,>=3.10",
"size": 8076,
"upload_time": "2025-01-29T02:26:07",
"upload_time_iso_8601": "2025-01-29T02:26:07.489666Z",
"url": "https://files.pythonhosted.org/packages/ed/75/615becd23fda1a46ac8ba1f2ef02bceeb23fafcea2392eb63e4da3373fd5/airbyte_databricks_cache-0.1.11-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-01-29 02:26:07",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "airbyte-databricks-cache"
}
```
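The block above can be reproduced, and the wheel digest verified, through PyPI's public JSON API; a sketch using only the standard library:

```py
# Fetch release metadata from PyPI's JSON API and verify the wheel's sha256 digest.
import hashlib
import json
import urllib.request

meta_url = "https://pypi.org/pypi/airbyte-databricks-cache/0.1.11/json"
with urllib.request.urlopen(meta_url) as resp:
    meta = json.load(resp)

wheel = meta["urls"][0]                 # the py3-none-any wheel listed above
expected = wheel["digests"]["sha256"]

with urllib.request.urlopen(wheel["url"]) as resp:
    actual = hashlib.sha256(resp.read()).hexdigest()

print(wheel["filename"], "digest OK" if actual == expected else "digest MISMATCH")
```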