# Airflow ClickHouse Plugin
![PyPI - Downloads](https://img.shields.io/pypi/dm/airflow-clickhouse-plugin)
![GitHub Workflow Status](https://img.shields.io/github/actions/workflow/status/bryzgaloff/airflow-clickhouse-plugin/tests.yml?branch=master)
![GitHub contributors](https://img.shields.io/github/contributors/bryzgaloff/airflow-clickhouse-plugin?color=blue)
🔝 The most popular [Apache Airflow][airflow] plugin for ClickHouse, ranked in the top 1% of downloads [on PyPI](https://pypi.org/project/airflow-clickhouse-plugin/). Based on the awesome [mymarilyn/clickhouse-driver][ch-driver].
This plugin provides two families of operators: a richer one [based on `clickhouse_driver.Client.execute`](#clickhouse-driver-family) and a standardized one [compatible with Python DB API 2.0](#python-db-api-20-family).
Both operator families are fully supported and covered by tests across different versions of Airflow and Python.
## `clickhouse-driver` family
- `ClickHouseOperator`
- `ClickHouseHook`
- `ClickHouseSensor`
These operators are based on [mymarilyn/clickhouse-driver][ch-driver]'s `Client.execute` method and arguments. They expose the full functionality of `clickhouse-driver` and are recommended if you are starting fresh with ClickHouse in Airflow.
### Features
- **SQL Templating**: SQL queries and other parameters are templated.
- **Multiple SQL Queries**: run multiple SQL queries within a single `ClickHouseOperator`. The result of the last query is pushed to XCom (configurable via `do_xcom_push`).
- **Logging**: executed queries are logged in a visually pleasing format, making it easier to track and debug.
- **Efficient Native ClickHouse Protocol**: Utilizes efficient _native_ ClickHouse TCP protocol, thanks to [clickhouse-driver][ch-driver-docs]. **Does not support HTTP protocol.**
- **Custom Connection Parameters**: Supports additional ClickHouse [connection parameters][ch-driver-connection], such as various timeouts, `compression`, `secure`, through the Airflow [Connection.extra][airflow-conn-extra] property.
See reference and examples [below](#usage).
### Installation and dependencies
`pip install -U airflow-clickhouse-plugin`
Dependencies: only `apache-airflow` and `clickhouse-driver`.
## Python DB API 2.0 family
- Operators:
- `ClickHouseSQLExecuteQueryOperator`
- `ClickHouseSQLColumnCheckOperator`
- `ClickHouseSQLTableCheckOperator`
- `ClickHouseSQLCheckOperator`
- `ClickHouseSQLValueCheckOperator`
- `ClickHouseSQLIntervalCheckOperator`
- `ClickHouseSQLThresholdCheckOperator`
- `ClickHouseBranchSQLOperator`
- `ClickHouseDbApiHook`
- `ClickHouseSqlSensor`
These operators combine [`clickhouse_driver.dbapi`][ch-driver-db-api] with [apache-airflow-providers-common-sql]. While they have limited functionality compared to `Client.execute` (not all arguments are supported), they provide a standardized interface. This is useful when porting Airflow pipelines to ClickHouse from another SQL provider backed by the `common.sql` Airflow package, such as MySQL, Postgres, BigQuery, and others.
The feature set of this family is fully based on the `common.sql` Airflow provider: refer to its [reference][common-sql-reference] and [examples][common-sql-examples] for details.
An example is also available [below](#db-api-20-clickhousesqlsensor-and-clickhousesqlexecutequeryoperator-example).
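For a quick taste, here is a minimal, hedged sketch of querying ClickHouse through `ClickHouseDbApiHook`. The import path is an assumption (by analogy with the operator and sensor modules used in the example below), and the table name is hypothetical:
```python
from airflow_clickhouse_plugin.hooks.clickhouse_dbapi import ClickHouseDbApiHook

# ClickHouseDbApiHook follows the common.sql DbApiHook interface,
# so familiar helpers such as get_records are available.
hook = ClickHouseDbApiHook()  # uses the clickhouse_default connection
records = hook.get_records('SELECT count() FROM some_table')  # hypothetical table
```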
### Installation and dependencies
To enable the DB API 2.0 operators, install the plugin with the `common.sql` extra: `pip install -U airflow-clickhouse-plugin[common.sql]`.
Dependencies: `apache-airflow-providers-common-sql` (usually pre-packed with Airflow) in addition to `apache-airflow` and `clickhouse-driver`.
## Python and Airflow versions support
Different versions of the plugin support different combinations of Python and Airflow versions. We _primarily_ support **Airflow 2.0+ and Python 3.8+**. If you need to use the plugin with older Python-Airflow combinations, pick a suitable plugin version:
| airflow-clickhouse-plugin version | Airflow version | Python version |
|-----------------------------------|-------------------------|--------------------|
| 1.4.0 | \>=2.0.0,<2.11.0 | ~=3.8 |
| 1.3.0 | \>=2.0.0,<2.10.0 | ~=3.8 |
| 1.2.0 | \>=2.0.0,<2.9.0 | ~=3.8 |
| 1.1.0 | \>=2.0.0,<2.8.0 | ~=3.8 |
| 1.0.0 | \>=2.0.0,<2.7.0 | ~=3.8 |
| 0.11.0                            | ~=2.0,\>=2.2.0,<2.7.0   | ~=3.7              |
| 0.10.0,0.10.1                     | ~=2.0,\>=2.2.0,<2.6.0   | ~=3.7              |
| 0.9.0,0.9.1                       | ~=2.0,\>=2.2.0,<2.5.0   | ~=3.7              |
| 0.8.2 | \>=2.0.0,<2.4.0 | ~=3.7 |
| 0.8.0,0.8.1 | \>=2.0.0,<2.3.0 | ~=3.6 |
| 0.7.0 | \>=2.0.0,<2.2.0 | ~=3.6 |
| 0.6.0 | ~=2.0.1 | ~=3.6 |
| \>=0.5.4,<0.6.0                   | ~=1.10.6                | \>=2.7 or \>=3.5   |
| \>=0.5.0,<0.5.4                   | ==1.10.6                | \>=2.7 or \>=3.5   |
`~=` denotes a compatible release: for example, `~=3.8` is equivalent to `>=3.8,<4.0`. See [PEP 440][pep-440-compatible-releases] for an explanation.
[DB API 2.0 functionality](#python-db-api-20-family) requires `apache-airflow>2.9.3` (strictly greater since versions up to 2.9.3 had an MRO-related bug, see [#87](https://github.com/bryzgaloff/airflow-clickhouse-plugin/issues/87)) and `apache-airflow-providers-common-sql>=1.3`: earlier versions are not supported.
Previous versions of the plugin might require `pandas` extra: `pip install airflow-clickhouse-plugin[pandas]==0.11.0`. Check out earlier versions of `README.md` for details.
# Usage
To see examples [scroll down](#examples). To run them, [create an Airflow connection to ClickHouse](#how-to-create-an-airflow-connection-to-clickhouse).
## ClickHouseOperator reference
To import `ClickHouseOperator`, use `from airflow_clickhouse_plugin.operators.clickhouse import ClickHouseOperator`.
Supported arguments:
* `sql` (templated, required): a query (if the argument is a single `str`) or multiple queries (an iterable of `str`). Supports files with a `.sql` extension.
* `clickhouse_conn_id`: Airflow connection id. The connection schema is described [below](#clickhouse-connection-schema). Default connection id is `clickhouse_default`.
* Arguments of the [`clickhouse_driver.Client.execute` method][ch-driver-execute-summary]:
  * `parameters` (templated): passed as `params` to the `execute` method (renamed to avoid a name conflict with the Airflow tasks' `params` argument); see the sketch after this list.
    * `dict` for `SELECT` queries.
    * `list`/`tuple`/generator for `INSERT` queries.
    * If multiple queries are provided via `sql`, then the `parameters` are passed to _all_ of them.
  * `with_column_types` (not templated).
  * `external_tables` (templated).
  * `query_id` (templated).
  * `settings` (templated).
  * `types_check` (not templated).
  * `columnar` (not templated).
  * For the documentation of these arguments, refer to the [`clickhouse_driver.Client.execute` API reference][ch-driver-execute-reference].
* `database` (templated): if present, overrides `schema` of the Airflow connection.
* Other arguments (including the required `task_id`) are inherited from Airflow [BaseOperator][airflow-base-op].
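For instance, a minimal sketch of passing `parameters` to a templated `SELECT` (the table and column names are hypothetical); `clickhouse-driver` expects `%(name)s`-style placeholders:
```python
from airflow_clickhouse_plugin.operators.clickhouse import ClickHouseOperator

ClickHouseOperator(
    task_id='count_recent_events',
    # hypothetical table; %(since)s is substituted by clickhouse-driver
    sql='SELECT count() FROM events WHERE ts >= %(since)s',
    # parameters is a templated field, so Jinja values work inside it
    parameters={'since': '{{ ds }}'},
)
```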
The result of the _last_ query is pushed to XCom (disable with the `do_xcom_push=False` argument).
In other words, the operator simply wraps the [`ClickHouseHook.execute` method](#clickhousehook-reference).
See [example](#clickhouseoperator-example) below.
## ClickHouseHook reference
To import `ClickHouseHook`, use `from airflow_clickhouse_plugin.hooks.clickhouse import ClickHouseHook`.
Supported keyword arguments of the constructor (`__init__` method):
* `clickhouse_conn_id`: Airflow connection id. Connection schema is described [below](#clickhouse-connection-schema). Default connection id is `clickhouse_default`.
* `database`: if present, overrides `schema` of Airflow connection.
Defines the `ClickHouseHook.execute` method, which simply wraps [`clickhouse_driver.Client.execute`][ch-driver-execute-reference]. It accepts all the same arguments, except:
* `sql` (instead of `execute`'s `query`): a query (if the argument is a single `str`) or multiple queries (an iterable of `str`).
`ClickHouseHook.execute` returns the result of the _last_ query.
The hook also defines a `get_conn()` method which returns the underlying [`clickhouse_driver.Client`][ch-driver-client] instance.
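If you need driver features that the hook does not wrap, a minimal sketch of dropping down to the raw client (the query is illustrative):
```python
from airflow_clickhouse_plugin.hooks.clickhouse import ClickHouseHook

client = ClickHouseHook().get_conn()  # a clickhouse_driver.Client instance
# any clickhouse-driver API is available directly, e.g. column metadata:
rows, columns = client.execute('SELECT 1 AS x', with_column_types=True)
```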
See [example](#clickhousehook-example) below.
## ClickHouseSensor reference
To import `ClickHouseSensor`, use `from airflow_clickhouse_plugin.sensors.clickhouse import ClickHouseSensor`.
This class wraps [`ClickHouseHook.execute` method](#clickhousehook-reference) into an [Airflow sensor][airflow-sensor]. Supports all the arguments of [`ClickHouseOperator`](#clickhouseoperator-reference) and additionally:
* `is_success`: a callable which accepts a single argument — the return value of `ClickHouseHook.execute`. If the return value of `is_success` is truthy, the sensor succeeds. By default, the callable is `bool`: i.e. if the return value of `ClickHouseHook.execute` is truthy, the sensor succeeds. Usually, `execute` returns a list of records produced by the query: thus, by default, the result is falsy (and the sensor keeps poking) if no records are returned.
* `is_failure`: a callable which accepts a single argument — the return value of `ClickHouseHook.execute`. If the return value of `is_failure` is truthy, the sensor raises `AirflowException`. By default, `is_failure` is `None` and no failure check is performed.
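As an illustration, a hedged sketch of a sensor that succeeds once a hypothetical table receives records for the day and fails fast if any of them are marked as errors:
```python
from airflow_clickhouse_plugin.sensors.clickhouse import ClickHouseSensor

ClickHouseSensor(
    task_id='wait_for_events',
    # hypothetical query returning a single record: (total, errors)
    sql="SELECT count(), countIf(status = 'error') FROM events WHERE eventDate = '{{ ds }}'",
    is_success=lambda result: result[0][0] > 0,   # records have arrived
    is_failure=lambda result: result[0][1] > 0,   # raises AirflowException
)
```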
See [example](#clickhousesensor-example) below.
## How to create an Airflow connection to ClickHouse
As the `type` of a new connection, choose **SQLite** or any other SQL database. There is **no** dedicated ClickHouse connection type yet, so we use any SQL type as the closest match.
All the connection attributes are optional: the default host is `localhost`, and the other credentials [have defaults](#default-values) defined by `clickhouse-driver`. If you use non-default values, set them according to the [connection schema](#clickhouse-connection-schema).
If you use a secure connection to ClickHouse (this requires additional configuration on the ClickHouse side), set `extra` to `{"secure":true}`. All `extra` connection parameters are passed to [`clickhouse_driver.Client`][ch-driver-client] as-is.
### ClickHouse connection schema
[`clickhouse_driver.Client`][ch-driver-client] is initialized from the attributes of the Airflow [Connection][airflow-connection-howto]:
| Airflow Connection attribute | `Client.__init__` argument |
|------------------------------|----------------------------|
| `host` | `host` |
| `port` (`int`) | `port` |
| `schema` | `database` |
| `login` | `user` |
| `password` | `password` |
| `extra` | `**kwargs` |
The `database` argument of `ClickHouseOperator`, `ClickHouseHook`, `ClickHouseSensor`, and others overrides the `schema` attribute of the Airflow connection.
### Extra arguments
You may set non-standard arguments of [`clickhouse_driver.Client`][ch-driver-client], such as timeouts, `compression`, `secure`, etc. using Airflow's [`Connection.extra`][airflow-conn-extra] attribute. The attribute should contain a JSON object which will be [deserialized][airflow-conn-dejson] and all of its properties will be passed as-is to the `Client`.
For example, if Airflow connection contains `extra='{"secure": true}'` then the `Client.__init__` will receive `secure=True` keyword argument in addition to other connection attributes.
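To make the pass-through concrete, a sketch assuming a connection `clickhouse_secure` with `extra='{"secure": true, "connect_timeout": 20}'` (both the connection id and its extras are hypothetical):
```python
from airflow_clickhouse_plugin.hooks.clickhouse import ClickHouseHook

# extra is deserialized and every key is forwarded to Client.__init__:
client = ClickHouseHook(clickhouse_conn_id='clickhouse_secure').get_conn()
# equivalent to clickhouse_driver.Client(host=..., secure=True, connect_timeout=20)
```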
#### Compression
Compression requires additional packages to be installed. For example, for lz4:
```bash
pip3 install clickhouse-cityhash lz4
```
Then include the `compression` parameter in the Airflow connection's `extra`: `extra='{"compression":"lz4"}'`. See the [official documentation of clickhouse-driver][ch-driver-pypi-install] for additional information about extra options.
A connection URI with compression looks like `clickhouse://login:password@host:port/?compression=lz4`.
See [official documentation][airflow-connection-howto] to learn more about connections management in Airflow.
### Default values
If some Airflow connection attribute is not set, it is not passed to `clickhouse_driver.Client`. In such cases, the plugin uses a default value from the corresponding [`clickhouse_driver.Connection`][ch-driver-connection] argument. For instance, `user` defaults to `'default'`.
This means that the plugin itself does not define any default values for the ClickHouse connection. You may fully rely on default values of the [clickhouse-driver][ch-driver] version you use.
The only exception is `host`: if the Airflow connection attribute is not set, then `'localhost'` is used.
### Default connection
By default, the plugin uses Airflow connection with id `'clickhouse_default'`.
## Examples
### ClickHouseOperator example
```python
from airflow import DAG
from airflow_clickhouse_plugin.operators.clickhouse import ClickHouseOperator
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id='update_income_aggregate',
    start_date=days_ago(2),
) as dag:
    ClickHouseOperator(
        task_id='update_income_aggregate',
        database='default',
        sql=(
            '''
                INSERT INTO aggregate
                SELECT eventDt, sum(price * qty) AS income FROM sales
                WHERE eventDt = '{{ ds }}' GROUP BY eventDt
            ''', '''
                OPTIMIZE TABLE aggregate ON CLUSTER {{ var.value.cluster_name }}
                PARTITION toDate('{{ execution_date.strftime('%Y-%m-01') }}')
            ''', '''
                SELECT sum(income) FROM aggregate
                WHERE eventDt BETWEEN
                    '{{ execution_date.start_of('month').to_date_string() }}'
                    AND '{{ execution_date.end_of('month').to_date_string() }}'
            ''',
            # result of the last query is pushed to XCom
        ),
        # query_id is templated: it helps to quickly identify the query in ClickHouse logs
        query_id='{{ ti.dag_id }}-{{ ti.task_id }}-{{ ti.run_id }}-{{ ti.try_number }}',
        clickhouse_conn_id='clickhouse_test',
    ) >> PythonOperator(
        task_id='print_month_income',
        python_callable=lambda task_instance:
            # pulling the XCom value and printing it
            print(task_instance.xcom_pull(task_ids='update_income_aggregate')),
    )
```
### ClickHouseHook example
```python
from airflow import DAG
from airflow_clickhouse_plugin.hooks.clickhouse import ClickHouseHook
from airflow.providers.sqlite.hooks.sqlite import SqliteHook
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

def sqlite_to_clickhouse():
    sqlite_hook = SqliteHook()
    ch_hook = ClickHouseHook()
    records = sqlite_hook.get_records('SELECT * FROM some_sqlite_table')
    ch_hook.execute('INSERT INTO some_ch_table VALUES', records)


with DAG(
    dag_id='sqlite_to_clickhouse',
    start_date=days_ago(2),
) as dag:
    dag >> PythonOperator(
        task_id='sqlite_to_clickhouse',
        python_callable=sqlite_to_clickhouse,
    )
```
Important note: don't try to insert values using the literal form `ch_hook.execute('INSERT INTO some_ch_table VALUES (1)')`. [`clickhouse-driver` requires][ch-driver-insert] values for an `INSERT` query to be provided via `parameters` due to specifics of the native ClickHouse protocol.
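A minimal contrast, reusing the hypothetical table from the example above:
```python
# Wrong: literal VALUES are not supported over the native protocol
# ch_hook.execute('INSERT INTO some_ch_table VALUES (1)')

# Right: provide the rows via the second (parameters) argument
ch_hook.execute('INSERT INTO some_ch_table VALUES', [(1,)])
```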
### ClickHouseSensor example
```python
from airflow import DAG
from airflow_clickhouse_plugin.sensors.clickhouse import ClickHouseSensor
from airflow_clickhouse_plugin.operators.clickhouse import ClickHouseOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id='listen_warnings',
    start_date=days_ago(2),
) as dag:
    dag >> ClickHouseSensor(
        task_id='poke_events_count',
        database='monitor',
        sql="SELECT count() FROM warnings WHERE eventDate = '{{ ds }}'",
        # is_success receives the full result of ClickHouseHook.execute,
        # i.e. a list of records: unwrap the single counter value
        is_success=lambda result: result[0][0] > 10000,
    ) >> ClickHouseOperator(
        task_id='create_alert',
        database='alerts',
        sql='''
            INSERT INTO events SELECT eventDate, count()
            FROM monitor.warnings WHERE eventDate = '{{ ds }}'
        ''',
    )
```
### DB API 2.0: ClickHouseSqlSensor and ClickHouseSQLExecuteQueryOperator example
```python
from airflow import DAG
from airflow_clickhouse_plugin.sensors.clickhouse_dbapi import ClickHouseSqlSensor
from airflow_clickhouse_plugin.operators.clickhouse_dbapi import ClickHouseSQLExecuteQueryOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id='listen_warnings',
    start_date=days_ago(2),
) as dag:
    dag >> ClickHouseSqlSensor(
        task_id='poke_events_count',
        hook_params=dict(schema='monitor'),
        sql="SELECT count() FROM warnings WHERE eventDate = '{{ ds }}'",
        # common.sql SqlSensor passes the first cell of the first row to success
        success=lambda cnt: cnt > 10000,
        conn_id=None,  # required by common.sql SqlSensor; use None for the default
    ) >> ClickHouseSQLExecuteQueryOperator(
        task_id='create_alert',
        database='alerts',
        sql='''
            INSERT INTO events SELECT eventDate, count()
            FROM monitor.warnings WHERE eventDate = '{{ ds }}'
        ''',
    )
```
# How to run tests
Unit tests: `python3 -m unittest discover -t tests -s unit`
Integration tests require access to a ClickHouse server. Here is how to set up a local test environment using Docker:
* Run ClickHouse server in a local Docker container: `docker run -p 9000:9000 --ulimit nofile=262144:262144 -it clickhouse/clickhouse-server`
* Run tests with Airflow connection details set [via environment variable][airflow-conn-env]: `PYTHONPATH=src AIRFLOW_CONN_CLICKHOUSE_DEFAULT=clickhouse://localhost python3 -m unittest discover -t tests -s integration`
* Stop the container after running the tests to deallocate its resources.
Run all (unit and integration) tests with the ClickHouse connection defined: `PYTHONPATH=src AIRFLOW_CONN_CLICKHOUSE_DEFAULT=clickhouse://localhost python3 -m unittest discover -s tests`
## GitHub Actions
A [GitHub Actions workflow][github-action-src] is configured for this project.
## Run all tests inside Docker
Start a ClickHouse server inside Docker: `docker exec -it $(docker run --rm -d clickhouse/clickhouse-server) bash`
The above command opens `bash` inside the container.
Install the dependencies and run the tests (execute inside the container):
```bash
apt-get update
apt-get install -y python3 python3-pip git make
git clone https://github.com/bryzgaloff/airflow-clickhouse-plugin.git
cd airflow-clickhouse-plugin
python3 -m pip install -r requirements.txt
PYTHONPATH=src AIRFLOW_CONN_CLICKHOUSE_DEFAULT=clickhouse://localhost python3 -m unittest discover -s tests
```
Stop the container.
# Contributors
* Created by Anton Bryzgalov, [@bryzgaloff](https://github.com/bryzgaloff), originally at [Whisk, Samsung](https://github.com/whisklabs)
* Inspired by Viktor Taranenko, [@viktortnk](https://github.com/viktortnk) (Whisk, Samsung)
Community contributors:
* Danila Ganchar, [@d-ganchar](https://github.com/d-ganchar)
* Mikhail, [@glader](https://github.com/glader)
* Alexander Chashnikov, [@ne1r0n](https://github.com/ne1r0n)
* Simone Brundu, [@saimon46](https://github.com/saimon46)
* [@gkarg](https://github.com/gkarg)
* Stanislav Morozov, [@r3b-fish](https://github.com/r3b-fish)
* Sergey Bychkov, [@SergeyBychkov](https://github.com/SergeyBychkov)
* [@was-av](https://github.com/was-av)
* Maxim Tarasov, [@MaximTar](https://github.com/MaximTar)
* [@dvnrvn](https://github.com/dvnrvn)
* Giovanni Corsetti, [@CorsettiS](https://github.com/CorsettiS)
* Dmytro Zhyzniev, [@1ng4lipt](https://github.com/1ng4lipt)
* Anton Bezdenezhnykh, [@GaMeRaM](https://github.com/GaMeRaM)
* Andrey, [@bobelev](https://github.com/bobelev)
* Misha Epikhin, [@epikhinm](https://github.com/epikhinm)
* igor, [@cra](https://github.com/cra)
* Yuriy Natarov, [@Acuion](https://github.com/Acuion)
[airflow]: https://airflow.apache.org/
[ch-driver]: https://github.com/mymarilyn/clickhouse-driver
[ch-driver-docs]: https://clickhouse-driver.readthedocs.io/en/latest/
[ch-driver-execute-summary]: https://clickhouse-driver.readthedocs.io/en/latest/quickstart.html#selecting-data
[ch-driver-execute-reference]: https://clickhouse-driver.readthedocs.io/en/latest/api.html#clickhouse_driver.Client.execute
[airflow-base-op]: https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/baseoperator/index.html
[ch-driver-insert]: https://clickhouse-driver.readthedocs.io/en/latest/quickstart.html#inserting-data
[ch-driver-client]: https://clickhouse-driver.readthedocs.io/en/latest/api.html#client
[ch-driver-connection]: https://clickhouse-driver.readthedocs.io/en/latest/api.html#connection
[airflow-conn-extra]: https://airflow.apache.org/docs/2.1.0/_api/airflow/models/connection/index.html#airflow.models.connection.Connection.extra
[airflow-connection-howto]: https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html
[airflow-conn-dejson]: https://airflow.apache.org/docs/apache-airflow/2.1.0/_api/airflow/models/index.html?highlight=connection#airflow.models.Connection.extra_dejson
[airflow-conn-env]: https://airflow.apache.org/docs/apache-airflow/2.1.0/howto/connection.html#storing-a-connection-in-environment-variables
[github-action-src]: https://github.com/bryzgaloff/airflow-clickhouse-plugin/tree/master/.github/workflows
[pep-440-compatible-releases]: https://peps.python.org/pep-0440/#compatible-release
[apache-airflow-providers-common-sql]: https://airflow.apache.org/docs/apache-airflow-providers-common-sql/stable/index.html
[db-api-pep]: https://peps.python.org/pep-0249/
[airflow-sensor]: https://airflow.apache.org/docs/apache-airflow/stable/core-concepts/sensors.html
[ch-driver-pypi-install]: https://clickhouse-driver.readthedocs.io/en/latest/installation.html#installation-pypi
[common-sql-reference]: https://airflow.apache.org/docs/apache-airflow-providers-common-sql/stable/_api/airflow/providers/common/sql/index.html
[common-sql-examples]: https://airflow.apache.org/docs/apache-airflow-providers-common-sql/stable/operators.html
[ch-driver-db-api]: https://clickhouse-driver.readthedocs.io/en/latest/dbapi.html