bigeye-airflow

Name: bigeye-airflow
Version: 0.1.35
Home page: https://docs.bigeye.com/docs
Summary: The Bigeye Airflow Library supports Airflow versions >=2.2.0, <=2.9.3 and offers custom operators for interacting with your Bigeye workspace.
Upload time: 2024-11-01 14:51:33
Author: Bigeye
Requires Python: <4.0.0,>=3.9.0
License: Proprietary
# Bigeye Airflow Operators for Airflow Versions 2.x

## Operators
### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)

The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
It fills in reasonable defaults, such as thresholds, authenticates through an Airflow connection ID, and
can optionally run the metrics after they have been created. Review the link below for the structure of
the configurations; a usage sketch follows the parameter list.

[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)
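
Airflow resolves a connection ID from its connection store or from an `AIRFLOW_CONN_<CONN_ID>` environment variable. A minimal sketch of registering one for these operators follows; the connection name and the URI fields (user and API key as basic credentials against app.bigeye.com) are assumptions, not documented Bigeye requirements.

```python
import os

# A minimal sketch: register a connection named "bigeye_conn" via Airflow's
# standard AIRFLOW_CONN_<CONN_ID> environment-variable mechanism. The exact
# credential fields Bigeye expects are an assumption here.
os.environ["AIRFLOW_CONN_BIGEYE_CONN"] = "https://my-user:my-api-key@app.bigeye.com"
```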

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse ID to which the metric configurations will be deployed.
3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
    ```
    schema_name: str
    table_name: str
    column_name: str
    metric_template_id: uuid.UUID
    metric_name: str
    description: str
    notifications: List[str]
    thresholds: List[dict]
    filters: List[str]
    group_by: List[str]
    user_defined_metric_name: str
    metric_type: SimpleMetricCategory
    default_check_frequency_hours: int
    update_schedule: str
    delay_at_update: str
    timezone: str
    should_backfill: bool
    lookback_type: str
    lookback_days: int
    window_size: str
    _window_size_seconds
    ```
4. run_after_upsert: bool - If true, the metrics will be run after they are created. Defaults to False.
5. workspace_id: Optional[int] - The ID of the workspace where the metrics should be created.
If only one workspace is configured, it defaults to that workspace; otherwise this is required.
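
A minimal sketch of the operator in a DAG, assuming the constructor takes the parameters listed above plus the standard Airflow `task_id`; the connection ID, warehouse ID, and configuration values are placeholders:

```python
from datetime import datetime

from airflow import DAG
from bigeye_airflow.operators.create_metric_operator import CreateMetricOperator

with DAG(
    dag_id="bigeye_create_metrics",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    create_metrics = CreateMetricOperator(
        task_id="create_metrics",
        connection_id="bigeye_conn",       # Airflow connection holding the Bigeye credential
        warehouse_id=123,                  # placeholder source/warehouse ID
        configuration=[
            {
                "schema_name": "PROD_REPL",
                "table_name": "ORDERS",
                "column_name": "ORDER_ID",
                "metric_type": "COUNT_NULL",  # assumed SimpleMetricCategory value
                "notifications": ["#data-alerts"],
                "default_check_frequency_hours": 24,
            }
        ],
        run_after_upsert=True,             # run the new metrics immediately
    )
```

Fields left out of a configuration, such as thresholds, fall back to the operator's defaults described above.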

### Run Metrics Operator (bigeye_airflow.operators.run_metrics_operator)

The RunMetricsOperator will run metrics in Bigeye based on the following:

1. All metrics for a given table, by providing warehouse ID, schema name and table name.
2. All metrics for a given collection, by providing the collection ID.
3. Any and all metrics, given a list of metric IDs.  

Currently, if a list of metric IDs is provided, those metrics are run instead of any selected by
warehouse_id, schema_name, table_name, or collection_id. A usage sketch follows the parameter list.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse ID for which metrics will be run.
3. schema_name: str - The schema name for which metrics will be run.
4. table_name: str - The table name for which metrics will be run.
5. collection_id: int - The ID of the collection where the operator will run the metrics.
6. metric_ids: List[int] - The metric ids to run.
7. workspace_id: Optional[int] - The ID of the workspace where the metrics should be run.
If only one workspace is configured, it defaults to that workspace; otherwise this is required.
8. circuit_breaker_mode: bool - Whether the DAG should raise an exception if any metric results in an alerting state. Defaults to False.
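
A minimal sketch of both targeting modes inside an existing DAG, with placeholder IDs; whether warehouse_id may be omitted when metric_ids is given is not stated above, so it is passed in both tasks:

```python
from bigeye_airflow.operators.run_metrics_operator import RunMetricsOperator

# Run every metric defined on one table, and fail the task on alerts.
run_table_metrics = RunMetricsOperator(
    task_id="run_orders_metrics",
    connection_id="bigeye_conn",
    warehouse_id=123,                  # placeholder source/warehouse ID
    schema_name="PROD_REPL",
    table_name="ORDERS",
    circuit_breaker_mode=True,         # raise if any metric alerts
)

# Run an explicit list of metrics; per the note above, metric_ids takes
# precedence over the table and collection arguments.
run_selected_metrics = RunMetricsOperator(
    task_id="run_selected_metrics",
    connection_id="bigeye_conn",
    warehouse_id=123,
    metric_ids=[1001, 1002],           # placeholder metric IDs
)
```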

### Create Delta Operator (bigeye_airflow.operators.create_delta_operator)

The CreateDeltaOperator creates deltas from a list of delta configurations provided to the operator.
It fills in reasonable defaults, such as column mappings, authenticates through an Airflow connection ID,
and can optionally run the deltas after they have been created. The structure of the configurations is
shown under Parameters below; a usage sketch follows the parameter list.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse ID to which the delta configurations will be deployed.
3. configuration: List[dict] - A list of delta configurations conforming to the following schema.
    ```
    delta_name: str
    fq_source_table_name: str
    target_table_comparisons: dict
    - example: {"target_table_comparisons": [{"fq_target_table_name": "Snowflake.TOOY_DEMO_DB.PROD_REPL.ORDERS"}]
    tolerance: Optional[float]
    - default = 0.0
    cron_schedule: Optional[dict] 
    - default = None 
    - example: {"cron_schedule": {"name": "Midnight UTC", "cron": "0 0 * * *"}}
    notification_channels: Optional[List[dict]]
    - default = None
    - example: {"notification_channels: [{"slack": "#data-alerts"}]
    ```
4. run_after_upsert: bool - If true, the deltas will be run after they are created. Defaults to False.
5. workspace_id: Optional[int] - The ID of the workspace where the deltas should be created.
If only one workspace is configured, it defaults to that workspace; otherwise this is required.
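
A minimal sketch reusing the example values from the schema above; the table names, connection ID, and warehouse ID are placeholders, and the constructor is assumed to mirror the parameter list plus Airflow's `task_id`:

```python
from bigeye_airflow.operators.create_delta_operator import CreateDeltaOperator

create_deltas = CreateDeltaOperator(
    task_id="create_order_deltas",
    connection_id="bigeye_conn",
    warehouse_id=123,                  # placeholder source/warehouse ID
    configuration=[
        {
            "delta_name": "orders_prod_vs_replica",
            "fq_source_table_name": "Snowflake.TOOY_DEMO_DB.PROD.ORDERS",  # placeholder source table
            "target_table_comparisons": [
                {"fq_target_table_name": "Snowflake.TOOY_DEMO_DB.PROD_REPL.ORDERS"}
            ],
            "tolerance": 0.0,
            "cron_schedule": {"name": "Midnight UTC", "cron": "0 0 * * *"},
            "notification_channels": [{"slack": "#data-alerts"}],
        }
    ],
    run_after_upsert=True,             # run the new deltas immediately
)
```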

### Run Deltas Operator (bigeye_airflow.operators.run_deltas_operator)

The RunDeltasOperator will run deltas in Bigeye based on the following:

1. All deltas for a given table, by providing warehouse ID, schema name and table name.
2. Any and all deltas, given a list of delta IDs.  

Currently, if a list of delta IDs is provided, those deltas are run instead of any selected by
warehouse_id, schema_name, and table_name. A usage sketch follows the parameter list.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse ID for which deltas will be run.
3. schema_name: str - The schema name for which deltas will be run.
4. table_name: str - The table name for which deltas will be run.
5. delta_ids: List[int] - The delta ids to run.
6. workspace_id: Optional[int] - The ID of the workspace where the deltas should be run.
If only one workspace is configured, it defaults to that workspace; otherwise this is required.
7. circuit_breaker_mode: bool - Whether the DAG should raise an exception if any delta results in an alerting state. Defaults to False.
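
A minimal sketch inside an existing DAG, with placeholder names and IDs, assuming the constructor mirrors the parameter list above plus Airflow's `task_id`:

```python
from bigeye_airflow.operators.run_deltas_operator import RunDeltasOperator

# Run every delta defined on one table; delta_ids could be passed instead
# and would take precedence, per the note above.
run_deltas = RunDeltasOperator(
    task_id="run_order_deltas",
    connection_id="bigeye_conn",
    warehouse_id=123,                  # placeholder source/warehouse ID
    schema_name="PROD_REPL",
    table_name="ORDERS",
    circuit_breaker_mode=True,         # raise if any delta alerts
)
```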

            
