databricks-labs-pytester

* Name: databricks-labs-pytester
* Version: 0.2.4
* Summary: Python Testing for Databricks
* Upload time: 2024-09-24 10:14:32
* Requires Python: >=3.10
* Keywords: databricks, pytest
# Python Testing for Databricks

[![python](https://img.shields.io/badge/python-3.10,%203.11,%203.12-green)](https://github.com/databrickslabs/pytester/actions/workflows/push.yml)
[![lines of code](https://tokei.rs/b1/github/databrickslabs/pytester)](https://github.com/databrickslabs/pytester)


<!-- TOC -->
* [Python Testing for Databricks](#python-testing-for-databricks)
  * [Installation](#installation)
  * [Ecosystem](#ecosystem)
  * [PyTest Fixtures](#pytest-fixtures)
    * [Logging](#logging)
    * [`debug_env_name` fixture](#debug_env_name-fixture)
    * [`debug_env` fixture](#debug_env-fixture)
    * [`env_or_skip` fixture](#env_or_skip-fixture)
    * [`ws` fixture](#ws-fixture)
    * [`acc` fixture](#acc-fixture)
    * [`spark` fixture](#spark-fixture)
    * [`sql_backend` fixture](#sql_backend-fixture)
    * [`sql_exec` fixture](#sql_exec-fixture)
    * [`sql_fetch_all` fixture](#sql_fetch_all-fixture)
    * [`make_random` fixture](#make_random-fixture)
    * [`make_instance_pool` fixture](#make_instance_pool-fixture)
    * [`make_instance_pool_permissions` fixture](#make_instance_pool_permissions-fixture)
    * [`make_job` fixture](#make_job-fixture)
    * [`make_job_permissions` fixture](#make_job_permissions-fixture)
    * [`make_cluster` fixture](#make_cluster-fixture)
    * [`make_cluster_permissions` fixture](#make_cluster_permissions-fixture)
    * [`make_cluster_policy` fixture](#make_cluster_policy-fixture)
    * [`make_cluster_policy_permissions` fixture](#make_cluster_policy_permissions-fixture)
    * [`make_pipeline` fixture](#make_pipeline-fixture)
    * [`make_warehouse` fixture](#make_warehouse-fixture)
    * [`make_group` fixture](#make_group-fixture)
    * [`make_acc_group` fixture](#make_acc_group-fixture)
    * [`make_user` fixture](#make_user-fixture)
    * [`make_pipeline_permissions` fixture](#make_pipeline_permissions-fixture)
    * [`make_notebook` fixture](#make_notebook-fixture)
    * [`make_notebook_permissions` fixture](#make_notebook_permissions-fixture)
    * [`make_directory` fixture](#make_directory-fixture)
    * [`make_directory_permissions` fixture](#make_directory_permissions-fixture)
    * [`make_repo` fixture](#make_repo-fixture)
    * [`make_repo_permissions` fixture](#make_repo_permissions-fixture)
    * [`make_workspace_file_permissions` fixture](#make_workspace_file_permissions-fixture)
    * [`make_workspace_file_path_permissions` fixture](#make_workspace_file_path_permissions-fixture)
    * [`make_secret_scope` fixture](#make_secret_scope-fixture)
    * [`make_secret_scope_acl` fixture](#make_secret_scope_acl-fixture)
    * [`make_authorization_permissions` fixture](#make_authorization_permissions-fixture)
    * [`make_udf` fixture](#make_udf-fixture)
    * [`make_catalog` fixture](#make_catalog-fixture)
    * [`make_schema` fixture](#make_schema-fixture)
    * [`make_table` fixture](#make_table-fixture)
    * [`make_storage_credential` fixture](#make_storage_credential-fixture)
    * [`product_info` fixture](#product_info-fixture)
    * [`make_model` fixture](#make_model-fixture)
    * [`make_experiment` fixture](#make_experiment-fixture)
    * [`make_experiment_permissions` fixture](#make_experiment_permissions-fixture)
    * [`make_warehouse_permissions` fixture](#make_warehouse_permissions-fixture)
    * [`make_lakeview_dashboard_permissions` fixture](#make_lakeview_dashboard_permissions-fixture)
    * [`log_workspace_link` fixture](#log_workspace_link-fixture)
    * [`make_dashboard_permissions` fixture](#make_dashboard_permissions-fixture)
    * [`make_alert_permissions` fixture](#make_alert_permissions-fixture)
    * [`make_query` fixture](#make_query-fixture)
    * [`make_query_permissions` fixture](#make_query_permissions-fixture)
    * [`make_registered_model_permissions` fixture](#make_registered_model_permissions-fixture)
    * [`make_serving_endpoint` fixture](#make_serving_endpoint-fixture)
    * [`make_serving_endpoint_permissions` fixture](#make_serving_endpoint_permissions-fixture)
    * [`make_feature_table` fixture](#make_feature_table-fixture)
    * [`make_feature_table_permissions` fixture](#make_feature_table_permissions-fixture)
    * [`watchdog_remove_after` fixture](#watchdog_remove_after-fixture)
    * [`watchdog_purge_suffix` fixture](#watchdog_purge_suffix-fixture)
    * [`is_in_debug` fixture](#is_in_debug-fixture)
* [Project Support](#project-support)
<!-- TOC -->

## Installation

Add a `databricks-labs-pytester` dependency to your `pyproject.toml` file (or legacy `requirements.txt` file). You can
also install it directly from the command line:

```shell
pip install databricks-labs-pytester
```

If you use `hatch` as a build system, make sure to add `databricks-labs-pytester` as
a test-time dependency and not as a compile-time dependency; otherwise your wheels will
transitively depend on `pytest`, which you usually don't want.

```toml
[project]
name = "name-of-your-project"
# ...
dependencies = [
  "databricks-sdk~=0.30",
  # ... dependencies required for your code to execute
]

[tool.hatch.envs.default]
dependencies = [
  # ... dependencies required to test/validate/format your code:
  "black~=24.3.0",
  "coverage[toml]~=7.4.4",
  "mypy~=1.9.0",
  "pylint~=3.2.2",
  "pylint-pytest==2.0.0a0",
  "databricks-labs-pylint~=0.4.0",
  "databricks-labs-pytester~=0.2",  # <= this library
  "pytest~=8.3.3",
  "pytest-cov~=4.1.0",
  "pytest-mock~=3.14.0",
  "pytest-timeout~=2.3.1",
  "pytest-xdist~=3.5.0",
  "python-lsp-server>=1.9.0",
  "ruff~=0.3.4",
  "types-PyYAML~=6.0.12",
  "types-requests~=2.31.0",
]
```

[[back to top](#python-testing-for-databricks)]

## Ecosystem

Built on top of [Databricks SDK for Python](https://github.com/databricks/databricks-sdk-py), this library is part of the Databricks Labs Python ecosystem, which includes the following projects:
* [PyLint Plugin for Databricks](https://github.com/databrickslabs/pylint-plugin) for static code analysis and early bug detection.
* [Blueprint](https://github.com/databrickslabs/blueprint) for 
  [Python-native pathlib.Path-like interfaces](https://github.com/databrickslabs/blueprint#python-native-pathlibpath-like-interfaces),
  [Managing Python App installations within Databricks Workspaces](https://github.com/databrickslabs/blueprint#application-and-installation-state),
  [Application Migrations](https://github.com/databrickslabs/blueprint#application-state-migrations), and
  [Building Wheels](https://github.com/databrickslabs/blueprint#building-wheels).
* [LSQL](https://github.com/databrickslabs/lsql) for lightweight SQL handling and dashboards-as-code.
* [UCX](https://github.com/databrickslabs/ucx) for automated migrations into Unity Catalog, including an LSP plugin for static code analysis of UC compatibility.

See [this video](https://www.youtube.com/watch?v=CNypO79IATc) for a quick overview of the Databricks Labs Python ecosystem.

[[back to top](#python-testing-for-databricks)]

## PyTest Fixtures

[PyTest Fixtures](https://docs.pytest.org/en/latest/explanation/fixtures.html) are a powerful way to manage test setup and teardown in Python. This library provides 
a set of fixtures to help you write integration tests for Databricks. These fixtures were incubated
within the [Unity Catalog Automated Migrations project](https://github.com/databrickslabs/ucx/blame/df7f1d7647251fb8f0f23c56a548b99092484a7c/src/databricks/labs/ucx/mixins/fixtures.py) 
for more than a year and are now available for other projects to simplify integration testing with Databricks.

[[back to top](#python-testing-for-databricks)]

### Logging

This library is built on years of debugging integration tests for Databricks and its ecosystem. 

That's why it comes with a built-in logger that traces creation and deletion of dummy entities through links in 
the Databricks Workspace UI. If you run the following code:

```python
from databricks.sdk.service.workspace import ObjectType

def test_new_user(make_user, ws):
    new_user = make_user()
    home_dir = ws.workspace.get_status(f"/Users/{new_user.user_name}")
    assert home_dir.object_type == ObjectType.DIRECTORY
```

You will see the following output, where the first line is clickable and will take you to the user's profile in the Databricks Workspace UI:

```text
12:30:53  INFO [d.l.p.fixtures.baseline] Created dummy-xwuq-...@example.com: https://.....azuredatabricks.net/#settings/workspace/identity-and-access/users/735...
12:30:53 DEBUG [d.l.p.fixtures.baseline] added workspace user fixture: User(active=True, display_name='dummy-xwuq-...@example.com', ...)
12:30:58 DEBUG [d.l.p.fixtures.baseline] clearing 1 workspace user fixtures
12:30:58 DEBUG [d.l.p.fixtures.baseline] removing workspace user fixture: User(active=True, display_name='dummy-xwuq-...@example.com', ...)
```

You may need to add the following to your `conftest.py` file to enable this:

```python
import logging

from databricks.labs.blueprint.logger import install_logger

install_logger()

logging.getLogger('databricks.labs.pytester').setLevel(logging.DEBUG)
```

[[back to top](#python-testing-for-databricks)]

<!-- FIXTURES -->
### `debug_env_name` fixture
Specify the name of the debug environment. By default, it is set to `.env`,
which will try to find a [file named `.env`](https://www.dotenv.org/docs/security/env)
in any of the parent directories of the current working directory and load
the environment variables from it via the [`debug_env` fixture](#debug_env-fixture).

Alternatively, if you are concerned about the
[risk of `.env` files getting checked into version control](https://thehackernews.com/2024/08/attackers-exploit-public-env-files-to.html),
we recommend using the `~/.databricks/debug-env.json` file to store different sets of environment variables.
The file cannot be checked into version control by design, because it is stored in the user's home directory.

This file is used for local debugging and integration tests in IDEs like PyCharm, VSCode, and IntelliJ IDEA
while developing Databricks Platform Automation Stack, which includes Databricks SDKs for Python, Go, and Java,
as well as Databricks Terraform Provider and Databricks CLI. This file enables multi-environment and multi-cloud
testing with a single set of integration tests.

The file is typically structured as follows:

```shell
$ cat ~/.databricks/debug-env.json
{
   "ws": {
     "CLOUD_ENV": "azure",
     "DATABRICKS_HOST": "....azuredatabricks.net",
     "DATABRICKS_CLUSTER_ID": "0708-200540-...",
     "DATABRICKS_WAREHOUSE_ID": "33aef...",
        ...
   },
   "acc": {
     "CLOUD_ENV": "aws",
     "DATABRICKS_HOST": "accounts.cloud.databricks.net",
     "DATABRICKS_CLIENT_ID": "....",
     "DATABRICKS_CLIENT_SECRET": "....",
     ...
   }
}
```

And you can load it in your `conftest.py` file as follows:

```python
import pytest

@pytest.fixture
def debug_env_name():
    return "ws"
```

This will load the `ws` environment from the `~/.databricks/debug-env.json` file.

If any of the environment variables are not found, [`env_or_skip` fixture](#env_or_skip-fixture)
will gracefully skip the execution of tests.

See also [`debug_env`](#debug_env-fixture).


[[back to top](#python-testing-for-databricks)]

### `debug_env` fixture
Loads environment variables specified in [`debug_env_name` fixture](#debug_env_name-fixture) from a file
for local debugging in IDEs, otherwise allowing the tests to run with the default environment variables
specified in the CI/CD pipeline.
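
For example, a test can inspect the loaded variables directly. A minimal sketch, assuming the debug
environment defines `DATABRICKS_HOST` (as in the `ws` example under the [`debug_env_name` fixture](#debug_env_name-fixture)):

```python
def test_debug_env_is_loaded(debug_env):
    # `debug_env` behaves like a dict of environment variables; the
    # DATABRICKS_HOST key is an assumption based on the example above.
    assert "DATABRICKS_HOST" in debug_env
```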

See also [`acc`](#acc-fixture), [`env_or_skip`](#env_or_skip-fixture), [`ws`](#ws-fixture), [`debug_env_name`](#debug_env_name-fixture), [`is_in_debug`](#is_in_debug-fixture).


[[back to top](#python-testing-for-databricks)]

### `env_or_skip` fixture
Fixture to get environment variables or skip tests.

This is useful for skipping tests when the required environment variables are not set.

In the following example, `test_something` would only run if the environment variable
`SOME_EXTERNAL_SERVICE_TOKEN` is set:

```python
def test_something(env_or_skip):
    token = env_or_skip("SOME_EXTERNAL_SERVICE_TOKEN")
    assert token is not None
```

See also [`acc`](#acc-fixture), [`make_udf`](#make_udf-fixture), [`sql_backend`](#sql_backend-fixture), [`debug_env`](#debug_env-fixture), [`is_in_debug`](#is_in_debug-fixture).


[[back to top](#python-testing-for-databricks)]

### `ws` fixture
Create and provide a Databricks WorkspaceClient object.

This fixture initializes a Databricks WorkspaceClient object, which can be used
to interact with the Databricks workspace API. The created instance of WorkspaceClient
is shared across all test functions within the test session.

See [detailed documentation](https://databricks-sdk-py.readthedocs.io/en/latest/authentication.html) for the list
of environment variables that can be used to authenticate the WorkspaceClient.

In your test functions, include this fixture as an argument to use the WorkspaceClient:

```python
def test_workspace_operations(ws):
    clusters = list(ws.clusters.list())
    assert len(clusters) >= 0
```

See also [`log_workspace_link`](#log_workspace_link-fixture), [`make_alert_permissions`](#make_alert_permissions-fixture), [`make_authorization_permissions`](#make_authorization_permissions-fixture), [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_cluster_permissions`](#make_cluster_permissions-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_cluster_policy_permissions`](#make_cluster_policy_permissions-fixture), [`make_dashboard_permissions`](#make_dashboard_permissions-fixture), [`make_directory`](#make_directory-fixture), [`make_directory_permissions`](#make_directory_permissions-fixture), [`make_experiment`](#make_experiment-fixture), [`make_experiment_permissions`](#make_experiment_permissions-fixture), [`make_feature_table`](#make_feature_table-fixture), [`make_feature_table_permissions`](#make_feature_table_permissions-fixture), [`make_group`](#make_group-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_instance_pool_permissions`](#make_instance_pool_permissions-fixture), [`make_job`](#make_job-fixture), [`make_job_permissions`](#make_job_permissions-fixture), [`make_lakeview_dashboard_permissions`](#make_lakeview_dashboard_permissions-fixture), [`make_model`](#make_model-fixture), [`make_notebook`](#make_notebook-fixture), [`make_notebook_permissions`](#make_notebook_permissions-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_pipeline_permissions`](#make_pipeline_permissions-fixture), [`make_query`](#make_query-fixture), [`make_query_permissions`](#make_query_permissions-fixture), [`make_registered_model_permissions`](#make_registered_model_permissions-fixture), [`make_repo`](#make_repo-fixture), [`make_repo_permissions`](#make_repo_permissions-fixture), [`make_secret_scope`](#make_secret_scope-fixture), [`make_secret_scope_acl`](#make_secret_scope_acl-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_serving_endpoint_permissions`](#make_serving_endpoint_permissions-fixture), [`make_storage_credential`](#make_storage_credential-fixture), [`make_udf`](#make_udf-fixture), [`make_user`](#make_user-fixture), [`make_warehouse`](#make_warehouse-fixture), [`make_warehouse_permissions`](#make_warehouse_permissions-fixture), [`make_workspace_file_path_permissions`](#make_workspace_file_path_permissions-fixture), [`make_workspace_file_permissions`](#make_workspace_file_permissions-fixture), [`spark`](#spark-fixture), [`sql_backend`](#sql_backend-fixture), [`debug_env`](#debug_env-fixture), [`product_info`](#product_info-fixture).


[[back to top](#python-testing-for-databricks)]

### `acc` fixture
Create and provide a Databricks AccountClient object.

This fixture initializes a Databricks AccountClient object, which can be used
to interact with the Databricks account API. The created instance of AccountClient
is shared across all test functions within the test session.

Requires the `DATABRICKS_ACCOUNT_ID` environment variable to be set. If `DATABRICKS_HOST`
points to a workspace host, the fixture will automatically determine the account host
from it.

See [detailed documentation](https://databricks-sdk-py.readthedocs.io/en/latest/authentication.html) for the list
of environment variables that can be used to authenticate the AccountClient.

In your test functions, include this fixture as an argument to use the AccountClient:

```python
def test_listing_workspaces(acc):
    workspaces = list(acc.workspaces.list())
    assert len(workspaces) >= 1
```

See also [`make_acc_group`](#make_acc_group-fixture), [`debug_env`](#debug_env-fixture), [`product_info`](#product_info-fixture), [`env_or_skip`](#env_or_skip-fixture).


[[back to top](#python-testing-for-databricks)]

### `spark` fixture
Get a Databricks Connect Spark session. Requires the `databricks-connect` package to be installed.

Usage:
```python
def test_databricks_connect(spark):
    rows = spark.sql("SELECT 1").collect()
    assert rows[0][0] == 1
```

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `sql_backend` fixture
Create and provide a SQL backend for executing statements.

Requires the environment variable `DATABRICKS_WAREHOUSE_ID` to be set.
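
A minimal sketch combining it with [`make_schema`](#make_schema-fixture), assuming the backend exposes
`execute` and `fetch` methods in the style of [Databricks Labs LSQL](https://github.com/databrickslabs/lsql):

```python
def test_sql_backend(sql_backend, make_schema):
    schema = make_schema(catalog_name="hive_metastore")
    # `execute` runs a statement without returning results; `fetch` yields rows
    sql_backend.execute(f"CREATE TABLE {schema.full_name}.numbers AS SELECT 1 AS x")
    rows = list(sql_backend.fetch(f"SELECT * FROM {schema.full_name}.numbers"))
    assert len(rows) == 1
```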

See also [`make_schema`](#make_schema-fixture), [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`sql_exec`](#sql_exec-fixture), [`sql_fetch_all`](#sql_fetch_all-fixture), [`ws`](#ws-fixture), [`env_or_skip`](#env_or_skip-fixture).


[[back to top](#python-testing-for-databricks)]

### `sql_exec` fixture
Execute a SQL statement without returning any results.
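
A minimal sketch, assuming the fixture is called with a single SQL statement string:

```python
def test_sql_exec(sql_exec, make_schema):
    schema = make_schema(catalog_name="hive_metastore")
    # fire-and-forget DDL: no result set is returned
    sql_exec(f"CREATE TABLE {schema.full_name}.numbers AS SELECT 1 AS x")
```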

See also [`sql_backend`](#sql_backend-fixture).


[[back to top](#python-testing-for-databricks)]

### `sql_fetch_all` fixture
Fetch all rows from a SQL statement.
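
A minimal sketch, assuming the fixture is called with a SQL string and yields rows that support
attribute access (like the SDK's `Row`):

```python
def test_sql_fetch_all(sql_fetch_all):
    rows = list(sql_fetch_all("SELECT 1 AS x"))
    assert rows[0].x == 1
```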

See also [`sql_backend`](#sql_backend-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_random` fixture
Fixture to generate random strings.

This fixture provides a function to generate random strings of a specified length.
The generated strings are created using a character set consisting of uppercase letters,
lowercase letters, and digits.

To generate a random string with the default length of 16 characters:

```python
random_string = make_random()
assert len(random_string) == 16
```

To generate a random string with a specified length:

```python
random_string = make_random(k=8)
assert len(random_string) == 8
```

See also [`make_acc_group`](#make_acc_group-fixture), [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_directory`](#make_directory-fixture), [`make_experiment`](#make_experiment-fixture), [`make_feature_table`](#make_feature_table-fixture), [`make_group`](#make_group-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_job`](#make_job-fixture), [`make_model`](#make_model-fixture), [`make_notebook`](#make_notebook-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_query`](#make_query-fixture), [`make_repo`](#make_repo-fixture), [`make_schema`](#make_schema-fixture), [`make_secret_scope`](#make_secret_scope-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`make_user`](#make_user-fixture), [`make_warehouse`](#make_warehouse-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_instance_pool` fixture
Create a Databricks instance pool and clean it up after the test. Returns a function to create instance pools.
Use the `instance_pool_id` attribute of the returned object to get the ID of the pool.

Keyword Arguments:
* `instance_pool_name` (str, optional): The name of the instance pool. If not provided, a random name will be generated.
* `node_type_id` (str, optional): The node type ID of the instance pool. If not provided, a node type with local disk and 16GB memory will be used.
* other arguments are passed to `WorkspaceClient.instance_pools.create` method.

Usage:
```python
def test_instance_pool(make_instance_pool):
    logger.info(f"created {make_instance_pool()}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_instance_pool_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_job` fixture
Create a Databricks job and clean it up after the test. Returns a function to create jobs; each call returns
a [`Job`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/jobs.html#databricks.sdk.service.jobs.Job) instance.

Keyword Arguments:
* `notebook_path` (str, optional): The path to the notebook. If not provided, a random notebook will be created.
* `name` (str, optional): The name of the job. If not provided, a random name will be generated.
* `spark_conf` (dict, optional): The Spark configuration of the job.
* `libraries` (list, optional): The list of libraries to install on the job.
* other arguments are passed to `WorkspaceClient.jobs.create` method.

If no task argument is provided, a single notebook task will be created, along with a disposable notebook.
The latest Spark version and a single-worker cluster will be used to run this ephemeral job.

Usage:
```python
def test_job(make_job):
    logger.info(f"created {make_job()}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_notebook`](#make_notebook-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_job_permissions` fixture
_No description yet._
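
Although there is no description yet, this fixture follows the same calling convention as the other
`make_*_permissions` fixtures in this README. A hedged sketch, mirroring the
[`make_pipeline` fixture](#make_pipeline-fixture) example (the signature and permission level are assumptions):

```python
from databricks.sdk.service.iam import PermissionLevel

def test_job_permissions(make_job, make_group, make_job_permissions):
    group = make_group()
    job = make_job()
    # assumed signature, consistent with the documented permission fixtures
    make_job_permissions(
        object_id=job.job_id,
        permission_level=PermissionLevel.CAN_MANAGE_RUN,
        group_name=group.display_name,
    )
```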

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_cluster` fixture
Create a Databricks cluster, wait for it to start, and clean it up after the test.
Returns a function to create clusters. Use the `cluster_id` attribute of the returned object to get the cluster ID.

Keyword Arguments:
* `single_node` (bool, optional): Whether to create a single-node cluster. Defaults to False.
* `cluster_name` (str, optional): The name of the cluster. If not provided, a random name will be generated.
* `spark_version` (str, optional): The Spark version of the cluster. If not provided, the latest version will be used.
* `autotermination_minutes` (int, optional): The number of minutes before the cluster is automatically terminated. Defaults to 10.

Usage:
```python
def test_cluster(make_cluster):
    logger.info(f"created {make_cluster(single_node=True)}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_cluster_permissions` fixture
_No description yet._
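
Although there is no description yet, this fixture follows the same calling convention as the other
`make_*_permissions` fixtures in this README. A hedged sketch, combining it with the
[`make_cluster` fixture](#make_cluster-fixture) (the signature and permission level are assumptions):

```python
from databricks.sdk.service.iam import PermissionLevel

def test_cluster_permissions(make_cluster, make_group, make_cluster_permissions):
    group = make_group()
    cluster = make_cluster(single_node=True)
    # assumed signature, consistent with the documented permission fixtures
    make_cluster_permissions(
        object_id=cluster.cluster_id,
        permission_level=PermissionLevel.CAN_RESTART,
        group_name=group.display_name,
    )
```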

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_cluster_policy` fixture
Create a Databricks cluster policy and clean it up after the test. Returns a function to create cluster policies,
which returns a [`CreatePolicyResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/compute.html#databricks.sdk.service.compute.CreatePolicyResponse) instance.

Keyword Arguments:
* `name` (str, optional): The name of the cluster policy. If not provided, a random name will be generated.

Usage:
```python
def test_cluster_policy(make_cluster_policy):
    logger.info(f"created {make_cluster_policy()}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_cluster_policy_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_pipeline` fixture
Create a Delta Live Tables pipeline and clean it up after the test. Returns a function to create pipelines;
each call returns a [`CreatePipelineResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/pipelines.html#databricks.sdk.service.pipelines.CreatePipelineResponse) instance.

Keyword Arguments:
* `name` (str, optional): The name of the pipeline. If not provided, a random name will be generated.
* `libraries` (list, optional): The list of libraries to install on the pipeline. If not provided, a random disposable notebook will be created.
* `clusters` (list, optional): The list of clusters to use for the pipeline. If not provided, a single node cluster will be created with 16GB memory and local disk.

Usage:
```python
from databricks.sdk.service.iam import PermissionLevel

def test_pipeline(make_pipeline, make_pipeline_permissions, make_group):
    group = make_group()
    pipeline = make_pipeline()
    make_pipeline_permissions(
        object_id=pipeline.pipeline_id,
        permission_level=PermissionLevel.CAN_MANAGE,
        group_name=group.display_name,
    )
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_notebook`](#make_notebook-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_warehouse` fixture
Create a Databricks warehouse and clean it up after the test. Returns a function to create warehouses.

Keyword Arguments:
* `warehouse_name` (str, optional): The name of the warehouse. If not provided, a random name will be generated.
* `warehouse_type` (CreateWarehouseRequestWarehouseType, optional): The type of the warehouse. Defaults to `PRO`.
* `cluster_size` (str, optional): The size of the cluster. Defaults to `2X-Small`.

Usage:
```python
def test_warehouse_has_remove_after_tag(ws, make_warehouse):
    new_warehouse = make_warehouse()
    created_warehouse = ws.warehouses.get(new_warehouse.response.id)
    warehouse_tags = created_warehouse.tags.as_dict()
    assert warehouse_tags["custom_tags"][0]["key"] == "RemoveAfter"
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_group` fixture
This fixture provides a function to manage Databricks workspace groups. Groups can be created with specified
members and roles, and they will be deleted after the test is complete. Deals with eventual consistency issues by
retrying the creation process for 30 seconds and then waiting for up to 3 minutes for the group to be provisioned.
Returns an instance of [`Group`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/iam.html#databricks.sdk.service.iam.Group).

Keyword arguments:
* `members` (list of strings): A list of user IDs to add to the group.
* `roles` (list of strings): A list of roles to assign to the group.
* `display_name` (str): The display name of the group.
* `entitlements` (list of strings): A list of entitlements to assign to the group.

The following example creates a group with a single member and independently verifies that the group was created:

```python
def test_new_group(make_group, make_user, ws):
    user = make_user()
    group = make_group(members=[user.id])
    loaded = ws.groups.get(group.id)
    assert group.display_name == loaded.display_name
    assert group.members == loaded.members
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_acc_group` fixture
This fixture provides a function to manage Databricks account groups. Groups can be created with
specified members and roles, and they will be deleted after the test is complete.

Has the same arguments and behavior as [`make_group` fixture](#make_group-fixture) but uses the account
client instead of the workspace client.

Example usage:
```python
def test_new_account_group(make_acc_group, acc):
    group = make_acc_group()
    loaded = acc.groups.get(group.id)
    assert group.display_name == loaded.display_name
```

See also [`acc`](#acc-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_user` fixture
This fixture returns a function that creates a Databricks workspace user
and removes it after the test is complete. In case of random naming conflicts,
the fixture will retry the creation process for 30 seconds. Returns an instance
of [`User`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/iam.html#databricks.sdk.service.iam.User). Usage:

```python
from databricks.sdk.service.workspace import ObjectType

def test_new_user(make_user, ws):
    new_user = make_user()
    home_dir = ws.workspace.get_status(f"/Users/{new_user.user_name}")
    assert home_dir.object_type == ObjectType.DIRECTORY
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_pipeline_permissions` fixture
_No description yet; see the usage example in the [`make_pipeline` fixture](#make_pipeline-fixture)._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_notebook` fixture
Returns a function to create Databricks Notebooks and clean them up after the test.
The function returns an [`os.PathLike` object](https://github.com/databrickslabs/blueprint?tab=readme-ov-file#python-native-pathlibpath-like-interfaces).

Keyword arguments:
* `path` (str, optional): The path of the notebook. Defaults to a `dummy-*` notebook in the current user's home folder.
* `content` (typing.BinaryIO, optional): The content of the notebook. Defaults to `print(1)`.
* `language` ([`Language`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.Language), optional): The language of the notebook. Defaults to `Language.PYTHON`.
* `format` ([`ImportFormat`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.ImportFormat), optional): The format of the notebook. Defaults to `ImportFormat.SOURCE`.
* `overwrite` (bool, optional): Whether to overwrite the notebook if it already exists. Defaults to `False`.

This example creates a notebook and verifies that `print(1)` is in the content:
```python
def test_creates_some_notebook(make_notebook):
    notebook = make_notebook()
    assert "print(1)" in notebook.read_text()
```

See also [`make_job`](#make_job-fixture), [`make_pipeline`](#make_pipeline-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_notebook_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_directory` fixture
Returns a function to create Databricks Workspace Folders and clean them up after the test.
The function returns an [`os.PathLike` object](https://github.com/databrickslabs/blueprint?tab=readme-ov-file#python-native-pathlibpath-like-interfaces).

Keyword arguments:
* `path` (str, optional): The path of the folder. Defaults to a `dummy-*` folder in the current user's home folder.

This example creates a folder and verifies that it contains a notebook:
```python
def test_creates_some_folder_with_a_notebook(make_directory, make_notebook):
    folder = make_directory()
    notebook = make_notebook(path=folder / 'foo.py')
    files = [_.name for _ in folder.iterdir()]
    assert ['foo.py'] == files
    assert notebook.parent == folder
```

See also [`make_experiment`](#make_experiment-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_directory_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_repo` fixture
Returns a function to create Databricks Repos and clean them up after the test.
The function returns a [`RepoInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.RepoInfo) object.

Keyword arguments:
* `url` (str, optional): The URL of the repository.
* `provider` (str, optional): The provider of the repository.
* `path` (str, optional): The path of the repository. Defaults to `/Repos/{current_user}/sdk-{random}-{purge_suffix}`.

Usage:
```python
def test_repo(make_repo):
    logger.info(f"created {make_repo()}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_repo_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_workspace_file_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_workspace_file_path_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_secret_scope` fixture
This fixture provides a function to create secret scopes. The created secret scope will be
deleted after the test is complete. Returns the name of the secret scope.

To create a secret scope and use it within a test function:

```python
def test_secret_scope_creation(make_secret_scope):
    secret_scope_name = make_secret_scope()
    assert secret_scope_name.startswith("dummy-")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_secret_scope_acl` fixture
This fixture provides a function to manage access control lists (ACLs) for secret scopes.
ACLs define permissions for principals (users or groups) on specific secret scopes.

Arguments:
- `scope`: The name of the secret scope.
- `principal`: The name of the principal (user or group).
- `permission`: The permission level for the principal on the secret scope.

Returns a tuple containing the secret scope name and the principal name.

To manage secret scope ACLs using the `make_secret_scope_acl` fixture:

```python
from databricks.sdk.service.workspace import AclPermission

def test_secret_scope_acl_management(make_user, make_secret_scope, make_secret_scope_acl):
    scope_name = make_secret_scope()
    principal_name = make_user().display_name
    permission = AclPermission.READ

    acl_info = make_secret_scope_acl(
        scope=scope_name,
        principal=principal_name,
        permission=permission,
    )
    assert acl_info == (scope_name, principal_name)
```

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_authorization_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_udf` fixture
Create a UDF and return its info. Remove it after the test. Returns instance of [`FunctionInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.FunctionInfo).

Keyword Arguments:
* `catalog_name` (str): The name of the catalog where the UDF will be created. Default is `hive_metastore`.
* `schema_name` (str): The name of the schema where the UDF will be created. Default is a random string.
* `name` (str): The name of the UDF. Default is a random string.
* `hive_udf` (bool): If `True`, the UDF will be created as a Hive UDF. Default is `False`.

Usage:
```python
def test_make_some_udfs(make_schema, make_udf):
    schema_a = make_schema(catalog_name="hive_metastore")
    make_udf(schema_name=schema_a.name)
    make_udf(schema_name=schema_a.name, hive_udf=True)
```

See also [`ws`](#ws-fixture), [`env_or_skip`](#env_or_skip-fixture), [`sql_backend`](#sql_backend-fixture), [`make_schema`](#make_schema-fixture), [`make_random`](#make_random-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_catalog` fixture
Create a catalog and return its info. Remove it after the test.
Returns instance of [`CatalogInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.CatalogInfo).

Keyword Arguments:
* `name` (str): The name of the catalog. Default is a random string.

Usage:
```python
def test_catalog_fixture(make_catalog, make_schema, make_table):
    from_catalog = make_catalog()
    from_schema = make_schema(catalog_name=from_catalog.name)
    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)
    logger.info(f"Created new schema: {from_table_1}")
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_schema` fixture
Create a schema and return its info. Remove it after the test. Returns instance of [`SchemaInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.SchemaInfo).

Keyword Arguments:
* `catalog_name` (str): The name of the catalog where the schema will be created. Default is `hive_metastore`.
* `name` (str): The name of the schema. Default is a random string.

Usage:
```python
def test_catalog_fixture(make_catalog, make_schema, make_table):
    from_catalog = make_catalog()
    from_schema = make_schema(catalog_name=from_catalog.name)
    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)
    logger.info(f"Created new schema: {from_table_1}")
```

See also [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`sql_backend`](#sql_backend-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_table` fixture
Create a table and return its info. Remove it after the test. Returns instance of [`TableInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.TableInfo).

Keyword Arguments:
* `catalog_name` (str): The name of the catalog where the table will be created. Default is `hive_metastore`.
* `schema_name` (str): The name of the schema where the table will be created. Default is a random string.
* `name` (str): The name of the table. Default is a random string.
* `ctas` (str): The CTAS statement to create the table. Default is `None`.
* `non_delta` (bool): If `True`, the table will be created as a non-delta table. Default is `False`.
* `external` (bool): If `True`, the table will be created as an external table. Default is `False`.
* `external_csv` (str): The location of the external CSV table. Default is `None`.
* `external_delta` (str): The location of the external Delta table. Default is `None`.
* `view` (bool): If `True`, the table will be created as a view. Default is `False`.
* `tbl_properties` (dict): The table properties. Default is `None`.
* `hiveserde_ddl` (str): The DDL statement to create the table. Default is `None`.
* `storage_override` (str): The storage location override. Default is `None`.
* `columns` (list): The list of columns. Default is `None`.

Usage:
```python
def test_catalog_fixture(make_catalog, make_schema, make_table):
    from_catalog = make_catalog()
    from_schema = make_schema(catalog_name=from_catalog.name)
    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)
    logger.info(f"Created new schema: {from_table_1}")
```

See also [`make_query`](#make_query-fixture), [`sql_backend`](#sql_backend-fixture), [`make_schema`](#make_schema-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_storage_credential` fixture
Create a storage credential and return its info. Remove it after the test. Returns instance of [`StorageCredentialInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.StorageCredentialInfo).

Keyword Arguments:
* `credential_name` (str): The name of the storage credential. Default is a random string.
* `application_id` (str): The application ID for the Azure service principal. Default is an empty string.
* `client_secret` (str): The client secret for the Azure service principal. Default is an empty string.
* `directory_id` (str): The directory ID for the Azure service principal. Default is an empty string.
* `aws_iam_role_arn` (str): The ARN of the AWS IAM role. Default is an empty string.
* `read_only` (bool): If `True`, the storage credential will be read-only. Default is `False`.

Usage:
```python
def test_storage_credential(env_or_skip, make_storage_credential, make_random):
    random = make_random(6).lower()
    credential_name = f"dummy-{random}"
    make_storage_credential(
        credential_name=credential_name,
        aws_iam_role_arn=env_or_skip("TEST_UBER_ROLE_ID"),
    )
```

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `product_info` fixture
_No description yet._

See also [`acc`](#acc-fixture), [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_model` fixture
Returns a function to create Databricks Models and clean them up after the test.
The function returns a [`GetModelResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/ml.html#databricks.sdk.service.ml.GetModelResponse) object.

Keyword arguments:
* `model_name` (str, optional): The name of the model. Defaults to `dummy-*`.

Usage:
```python
from databricks.sdk.service.iam import PermissionLevel

def test_models(make_group, make_model, make_registered_model_permissions):
    group = make_group()
    model = make_model()
    make_registered_model_permissions(
        object_id=model.id,
        permission_level=PermissionLevel.CAN_MANAGE,
        group_name=group.display_name,
    )
```

See also [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_experiment` fixture
Returns a function to create Databricks Experiments and clean them up after the test.
The function returns a [`CreateExperimentResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/ml.html#databricks.sdk.service.ml.CreateExperimentResponse) object.

Keyword arguments:
* `path` (str, optional): The path of the experiment. Defaults to `dummy-*` experiment in current user's home folder.
* `experiment_name` (str, optional): The name of the experiment. Defaults to `dummy-*`.

Usage:
```python
from databricks.sdk.service.iam import PermissionLevel

def test_experiments(make_group, make_experiment, make_experiment_permissions):
    group = make_group()
    experiment = make_experiment()
    make_experiment_permissions(
        object_id=experiment.experiment_id,
        permission_level=PermissionLevel.CAN_MANAGE,
        group_name=group.display_name,
    )
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_directory`](#make_directory-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_experiment_permissions` fixture
_No description yet; see the usage example in the [`make_experiment` fixture](#make_experiment-fixture)._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_warehouse_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_lakeview_dashboard_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `log_workspace_link` fixture
Returns a function to log a workspace link.
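
A minimal sketch, assuming the function takes a display label and a workspace path (the exact
signature is an assumption):

```python
def test_logs_a_link(log_workspace_link, make_user):
    user = make_user()
    # hypothetical call: a label plus a workspace path to link to
    log_workspace_link("user home", f"/Users/{user.user_name}")
```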

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_dashboard_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_alert_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_query` fixture
Create a query and remove it after the test is done. Returns the [`LegacyQuery`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/sql.html#databricks.sdk.service.sql.LegacyQuery) object.

Keyword Arguments:
- `query`: The query to be stored. Default is `SELECT * FROM <newly created random table>`.

Usage:
```python
from databricks.sdk.service.sql import PermissionLevel

def test_permissions_for_redash(
    make_user,
    make_query,
    make_query_permissions,
):
    user = make_user()
    query = make_query()
    make_query_permissions(
        object_id=query.id,
        permission_level=PermissionLevel.CAN_EDIT,
        user_name=user.display_name,
    )
```

See also [`ws`](#ws-fixture), [`make_table`](#make_table-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_query_permissions` fixture
_No description yet; see the usage example in the [`make_query` fixture](#make_query-fixture)._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_registered_model_permissions` fixture
_No description yet; see the usage example in the [`make_model` fixture](#make_model-fixture)._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_serving_endpoint` fixture
Returns a function to create Databricks Serving Endpoints and clean them up after the test.
The function returns a [`ServingEndpointDetailed`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/serving.html#databricks.sdk.service.serving.ServingEndpointDetailed) object.

Under the covers, this fixture also creates a model to serve on a small workload size.

Usage:
```python
from databricks.sdk.service.iam import PermissionLevel

def test_endpoints(make_group, make_serving_endpoint, make_serving_endpoint_permissions):
    group = make_group()
    endpoint = make_serving_endpoint()
    make_serving_endpoint_permissions(
        object_id=endpoint.response.id,
        permission_level=PermissionLevel.CAN_QUERY,
        group_name=group.display_name,
    )
```

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_model`](#make_model-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_serving_endpoint_permissions` fixture
_No description yet; see the usage example in the [`make_serving_endpoint` fixture](#make_serving_endpoint-fixture)._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_feature_table` fixture
_No description yet._

See also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture).


[[back to top](#python-testing-for-databricks)]

### `make_feature_table_permissions` fixture
_No description yet._

See also [`ws`](#ws-fixture).


[[back to top](#python-testing-for-databricks)]

### `watchdog_remove_after` fixture
Purge time for test objects, representing the (UTC-based) hour from which objects may be purged.

See also [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_job`](#make_job-fixture), [`make_model`](#make_model-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_query`](#make_query-fixture), [`make_schema`](#make_schema-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_table`](#make_table-fixture), [`make_warehouse`](#make_warehouse-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).


[[back to top](#python-testing-for-databricks)]

### `watchdog_purge_suffix` fixture
HEX-encoded purge time suffix for test objects.

See also [`make_acc_group`](#make_acc_group-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_directory`](#make_directory-fixture), [`make_experiment`](#make_experiment-fixture), [`make_group`](#make_group-fixture), [`make_notebook`](#make_notebook-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_repo`](#make_repo-fixture), [`make_user`](#make_user-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).


[[back to top](#python-testing-for-databricks)]

### `is_in_debug` fixture
Returns `True` if the test is running from a debugger in an IDE, otherwise `False`.

The following IDEs are supported: IntelliJ IDEA (including Community Edition),
PyCharm (including Community Edition), and Visual Studio Code.
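
For example, you can guard logic that should only run while debugging locally (a minimal sketch):

```python
import pytest

def test_ide_only_behavior(is_in_debug):
    if not is_in_debug:
        pytest.skip("only verified when running under an IDE debugger")
```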

See also [`debug_env`](#debug_env-fixture), [`env_or_skip`](#env_or_skip-fixture).


[[back to top](#python-testing-for-databricks)]

<!-- END FIXTURES -->

# Project Support

Please note that this project is provided for your exploration only and is not 
formally supported by Databricks with Service Level Agreements (SLAs). It is 
provided AS-IS, and we do not make any guarantees of any kind. Please do not 
submit a support ticket relating to any issues arising from the use of this project.

Any issues discovered through the use of this project should be filed as GitHub 
[Issues on this repository](https://github.com/databrickslabs/pytester/issues). 
They will be reviewed as time permits, but no formal SLAs for support exist.

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "databricks-labs-pytester",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": "Serge Smertin <serge.smertin@databricks.com>",
    "keywords": "Databricks, pytest",
    "author": null,
    "author_email": "Serge Smertin <serge.smertin@databricks.com>, Vuong Nguyen <vuong.nguyen@databricks.com>, Marcin Wojtyczka <marcin.wojtyczka@databricks.com>, Ziyuan Qin <ziyuan.qin@databricks.com>, William Conti <william.conti@databricks.com>, Lars George <lars.george@databricks.com>, Cor Zuurmond <cor.zuurmond@databricks.com>, Andrew Snare <andrew.snare@databricks.com>, Liran Bareket <liran.bareket@databricks.com>, Pritish Pai <pritish.pai@databricks.com>",
    "download_url": "https://files.pythonhosted.org/packages/b8/5d/8d8cdb992231f5d587c43dc5b7bd0ce7fa1363c8b4ce20d82106a19fa5be/databricks_labs_pytester-0.2.4.tar.gz",
    "platform": null,
    "description": "# Python Testing for Databricks\n\n[![python](https://img.shields.io/badge/python-3.10,%203.11,%203.12-green)](https://github.com/databrickslabs/pytester/actions/workflows/push.yml)\n[![lines of code](https://tokei.rs/b1/github/databrickslabs/pytester)]([https://github.com/databrickslabs/pytester](https://github.com/databrickslabs/pytester))\n\n\n<!-- TOC -->\n* [Python Testing for Databricks](#python-testing-for-databricks)\n  * [Ecosystem](#ecosystem)\n  * [PyTest Fixtures](#pytest-fixtures)\n    * [Logging](#logging)\n    * [Installation](#installation)\n    * [`debug_env_name` fixture](#debug_env_name-fixture)\n    * [`debug_env` fixture](#debug_env-fixture)\n    * [`env_or_skip` fixture](#env_or_skip-fixture)\n    * [`ws` fixture](#ws-fixture)\n    * [`make_random` fixture](#make_random-fixture)\n    * [`make_instance_pool` fixture](#make_instance_pool-fixture)\n    * [`make_instance_pool_permissions` fixture](#make_instance_pool_permissions-fixture)\n    * [`make_job` fixture](#make_job-fixture)\n    * [`make_job_permissions` fixture](#make_job_permissions-fixture)\n    * [`make_cluster` fixture](#make_cluster-fixture)\n    * [`make_cluster_permissions` fixture](#make_cluster_permissions-fixture)\n    * [`make_cluster_policy` fixture](#make_cluster_policy-fixture)\n    * [`make_cluster_policy_permissions` fixture](#make_cluster_policy_permissions-fixture)\n    * [`make_pipeline` fixture](#make_pipeline-fixture)\n    * [`make_warehouse` fixture](#make_warehouse-fixture)\n    * [`make_group` fixture](#make_group-fixture)\n    * [`make_user` fixture](#make_user-fixture)\n    * [`make_pipeline_permissions` fixture](#make_pipeline_permissions-fixture)\n    * [`make_notebook` fixture](#make_notebook-fixture)\n    * [`make_notebook_permissions` fixture](#make_notebook_permissions-fixture)\n    * [`make_directory` fixture](#make_directory-fixture)\n    * [`make_directory_permissions` fixture](#make_directory_permissions-fixture)\n    * [`make_repo` fixture](#make_repo-fixture)\n    * [`make_repo_permissions` fixture](#make_repo_permissions-fixture)\n    * [`make_workspace_file_permissions` fixture](#make_workspace_file_permissions-fixture)\n    * [`make_workspace_file_path_permissions` fixture](#make_workspace_file_path_permissions-fixture)\n    * [`make_secret_scope` fixture](#make_secret_scope-fixture)\n    * [`make_secret_scope_acl` fixture](#make_secret_scope_acl-fixture)\n    * [`make_authorization_permissions` fixture](#make_authorization_permissions-fixture)\n    * [`make_udf` fixture](#make_udf-fixture)\n    * [`make_catalog` fixture](#make_catalog-fixture)\n    * [`make_schema` fixture](#make_schema-fixture)\n    * [`make_table` fixture](#make_table-fixture)\n    * [`make_storage_credential` fixture](#make_storage_credential-fixture)\n    * [`product_info` fixture](#product_info-fixture)\n    * [`sql_backend` fixture](#sql_backend-fixture)\n    * [`sql_exec` fixture](#sql_exec-fixture)\n    * [`sql_fetch_all` fixture](#sql_fetch_all-fixture)\n    * [`make_model` fixture](#make_model-fixture)\n    * [`make_experiment` fixture](#make_experiment-fixture)\n    * [`make_experiment_permissions` fixture](#make_experiment_permissions-fixture)\n    * [`make_warehouse_permissions` fixture](#make_warehouse_permissions-fixture)\n    * [`make_lakeview_dashboard_permissions` fixture](#make_lakeview_dashboard_permissions-fixture)\n    * [`workspace_library` fixture](#workspace_library-fixture)\n    * [`log_workspace_link` fixture](#log_workspace_link-fixture)\n    * 
[`make_dashboard_permissions` fixture](#make_dashboard_permissions-fixture)\n    * [`make_alert_permissions` fixture](#make_alert_permissions-fixture)\n    * [`make_query` fixture](#make_query-fixture)\n    * [`make_query_permissions` fixture](#make_query_permissions-fixture)\n    * [`make_registered_model_permissions` fixture](#make_registered_model_permissions-fixture)\n    * [`make_serving_endpoint` fixture](#make_serving_endpoint-fixture)\n    * [`make_serving_endpoint_permissions` fixture](#make_serving_endpoint_permissions-fixture)\n    * [`make_feature_table_permissions` fixture](#make_feature_table_permissions-fixture)\n* [Project Support](#project-support)\n<!-- TOC -->\n\n## Installation\n\nAdd a `databricks-labs-pytester` dependency to your `pyproject.toml` file (or legacy `requirements.txt` file). You can\nalso install it directly from the command line:\n\n```shell\npip install databricks-labs-pytester\n```\n\nIf you use `hatch` as a build system, make sure to add `databricks-labs-pytester` as \na test-time dependency and not as a compile-time dependency, otherwise your wheels will\ntransitively depend on `pytest`, which is not usually something you need.\n\n```toml\n[project]\nname = \"name-of-your-project\"\n# ...\ndependencies = [\n  \"databricks-sdk~=0.30\",\n  # ... dependencies required for your code to execute\n]\n\n[tool.hatch.envs.default]\ndependencies = [\n  # ... dependencies required to test/validate/format your code:\n    \"black~=24.3.0\",\n    \"coverage[toml]~=7.4.4\",\n    \"mypy~=1.9.0\",\n    \"pylint~=3.2.2\",\n    \"pylint-pytest==2.0.0a0\",\n    \"databricks-labs-pylint~=0.4.0\",\n    \"databricks-labs-pytester~=0.2\", # <= this library \n    \"pytest~=8.3.3\",\n    \"pytest-cov~=4.1.0\",\n    \"pytest-mock~=3.14.0\",\n    \"pytest-timeout~=2.3.1\",\n    \"pytest-xdist~=3.5.0\",\n    \"python-lsp-server>=1.9.0\",\n    \"ruff~=0.3.4\",\n    \"types-PyYAML~=6.0.12\",\n    \"types-requests~=2.31.0\",\n]\n```\n\n[[back to top](#python-testing-for-databricks)]\n\n## Ecosystem\n\nBuilt on top of [Databricks SDK for Python](https://github.com/databricks/databricks-sdk-py), this library is part of the Databricks Labs Python ecosystem, which includes the following projects:\n* [PyLint Plugin for Databricks](https://github.com/databrickslabs/pylint-plugin) for static code analysis and early bug detection.\n* [Blueprint](https://github.com/databrickslabs/blueprint) for \n  [Python-native pathlib.Path-like interfaces](https://github.com/databrickslabs/blueprint#python-native-pathlibpath-like-interfaces),\n  [Managing Python App installations within Databricks Workspaces](https://github.com/databrickslabs/blueprint#application-and-installation-state),\n  [Application Migrations](https://github.com/databrickslabs/blueprint#application-state-migrations), and\n  [Building Wheels](https://github.com/databrickslabs/blueprint#building-wheels).\n* [LSQL](https://github.com/databrickslabs/lsql) for lightweight SQL handling and dashboards-as-code.\n* [UCX](https://github.com/databrickslabs/ucx) for automated migrations into Unity Catalog and LSP plugin for static code analysis for UC compatibility.\n\nSee [this video](https://www.youtube.com/watch?v=CNypO79IATc) for a quick overview of the Databricks Labs Python ecosystem.\n\n[[back to top](#python-testing-for-databricks)]\n\n## PyTest Fixtures\n\n[PyTest Fixtures](https://docs.pytest.org/en/latest/explanation/fixtures.html) are a powerful way to manage test setup and teardown in Python. 
This library provides\na set of fixtures to help you write integration tests for Databricks. These fixtures were incubated\nwithin the [Unity Catalog Automated Migrations project](https://github.com/databrickslabs/ucx/blame/df7f1d7647251fb8f0f23c56a548b99092484a7c/src/databricks/labs/ucx/mixins/fixtures.py)\nfor more than a year and are now available for other projects to simplify integration testing with Databricks.\n\n[[back to top](#python-testing-for-databricks)]\n\n### Logging\n\nThis library is built on years of debugging integration tests for Databricks and its ecosystem.\n\nThat's why it comes with a built-in logger that traces the creation and deletion of dummy entities through links in\nthe Databricks Workspace UI. If you run the following code:\n\n```python\nfrom databricks.sdk.service.workspace import ObjectType\n\ndef test_new_user(make_user, ws):\n    new_user = make_user()\n    home_dir = ws.workspace.get_status(f\"/Users/{new_user.user_name}\")\n    assert home_dir.object_type == ObjectType.DIRECTORY\n```\n\nyou will see the following output, where the first line is clickable and will take you to the user's profile in the Databricks Workspace UI:\n\n```text\n12:30:53  INFO [d.l.p.fixtures.baseline] Created dummy-xwuq-...@example.com: https://.....azuredatabricks.net/#settings/workspace/identity-and-access/users/735...\n12:30:53 DEBUG [d.l.p.fixtures.baseline] added workspace user fixture: User(active=True, display_name='dummy-xwuq-...@example.com', ...)\n12:30:58 DEBUG [d.l.p.fixtures.baseline] clearing 1 workspace user fixtures\n12:30:58 DEBUG [d.l.p.fixtures.baseline] removing workspace user fixture: User(active=True, display_name='dummy-xwuq-...@example.com', ...)\n```\n\nYou may need to add the following to your `conftest.py` file to enable this:\n\n```python\nimport logging\n\nfrom databricks.labs.blueprint.logger import install_logger\n\ninstall_logger()\n\nlogging.getLogger('databricks.labs.pytester').setLevel(logging.DEBUG)\n```\n\n[[back to top](#python-testing-for-databricks)]\n\n<!-- FIXTURES -->\n### `debug_env_name` fixture\nSpecify the name of the debug environment. By default, it is set to `.env`,\nwhich will try to find a [file named `.env`](https://www.dotenv.org/docs/security/env)\nin any of the parent directories of the current working directory and load\nthe environment variables from it via the [`debug_env` fixture](#debug_env-fixture).\n\nAlternatively, if you are concerned about the\n[risk of `.env` files getting checked into version control](https://thehackernews.com/2024/08/attackers-exploit-public-env-files-to.html),\nwe recommend using the `~/.databricks/debug-env.json` file to store different sets of environment variables.\nThe file cannot be checked into version control by design, because it is stored in the user's home directory.\n\nThis file is used for local debugging and integration tests in IDEs like PyCharm, VSCode, and IntelliJ IDEA\nwhile developing the Databricks Platform Automation Stack, which includes Databricks SDKs for Python, Go, and Java,\nas well as the Databricks Terraform Provider and the Databricks CLI. 
This file enables multi-environment and multi-cloud\ntesting with a single set of integration tests.\n\nThe file is typically structured as follows:\n\n```shell\n$ cat ~/.databricks/debug-env.json\n{\n   \"ws\": {\n     \"CLOUD_ENV\": \"azure\",\n     \"DATABRICKS_HOST\": \"....azuredatabricks.net\",\n     \"DATABRICKS_CLUSTER_ID\": \"0708-200540-...\",\n     \"DATABRICKS_WAREHOUSE_ID\": \"33aef...\",\n     ...\n   },\n   \"acc\": {\n     \"CLOUD_ENV\": \"aws\",\n     \"DATABRICKS_HOST\": \"accounts.cloud.databricks.net\",\n     \"DATABRICKS_CLIENT_ID\": \"....\",\n     \"DATABRICKS_CLIENT_SECRET\": \"....\",\n     ...\n   }\n}\n```\n\nYou can load it in your `conftest.py` file as follows:\n\n```python\nimport pytest\n\n@pytest.fixture\ndef debug_env_name():\n    return \"ws\"\n```\n\nThis will load the `ws` environment from the `~/.databricks/debug-env.json` file.\n\nIf any of the environment variables are not found, the [`env_or_skip` fixture](#env_or_skip-fixture)\nwill gracefully skip the execution of tests.\n\nSee also [`debug_env`](#debug_env-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `debug_env` fixture\nLoads the environment variables specified in the [`debug_env_name` fixture](#debug_env_name-fixture) from a file\nfor local debugging in IDEs, otherwise allowing the tests to run with the default environment variables\nspecified in the CI/CD pipeline.
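\n\nUsage (a sketch, not from the upstream docs: it assumes the fixture yields the loaded variables as a plain mapping, so verify this against your setup before relying on it):\n\n```python\ndef test_debug_env_is_loaded(debug_env):\n    # hypothetical assertion: when ~/.databricks/debug-env.json (or .env) is found,\n    # the loaded variables should include the workspace host\n    assert \"DATABRICKS_HOST\" in debug_env\n```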
\n\nSee also [`acc`](#acc-fixture), [`env_or_skip`](#env_or_skip-fixture), [`ws`](#ws-fixture), [`debug_env_name`](#debug_env_name-fixture), [`is_in_debug`](#is_in_debug-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `env_or_skip` fixture\nFixture to get environment variables or skip tests.\n\nIt is extremely useful for skipping tests when the required environment variables are not set.\n\nIn the following example, `test_something` would only run if the environment variable\n`SOME_EXTERNAL_SERVICE_TOKEN` is set:\n\n```python\ndef test_something(env_or_skip):\n    token = env_or_skip(\"SOME_EXTERNAL_SERVICE_TOKEN\")\n    assert token is not None\n```\n\nSee also [`acc`](#acc-fixture), [`make_udf`](#make_udf-fixture), [`sql_backend`](#sql_backend-fixture), [`debug_env`](#debug_env-fixture), [`is_in_debug`](#is_in_debug-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `ws` fixture\nCreate and provide a Databricks WorkspaceClient object.\n\nThis fixture initializes a Databricks WorkspaceClient object, which can be used\nto interact with the Databricks workspace API. The created instance of WorkspaceClient\nis shared across all test functions within the test session.\n\nSee [detailed documentation](https://databricks-sdk-py.readthedocs.io/en/latest/authentication.html) for the list\nof environment variables that can be used to authenticate the WorkspaceClient.\n\nIn your test functions, include this fixture as an argument to use the WorkspaceClient:\n\n```python\ndef test_workspace_operations(ws):\n    clusters = list(ws.clusters.list())\n    assert len(clusters) >= 0\n```\n\nSee also [`log_workspace_link`](#log_workspace_link-fixture), [`make_alert_permissions`](#make_alert_permissions-fixture), [`make_authorization_permissions`](#make_authorization_permissions-fixture), [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_cluster_permissions`](#make_cluster_permissions-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_cluster_policy_permissions`](#make_cluster_policy_permissions-fixture), [`make_dashboard_permissions`](#make_dashboard_permissions-fixture), [`make_directory`](#make_directory-fixture), [`make_directory_permissions`](#make_directory_permissions-fixture), [`make_experiment`](#make_experiment-fixture), [`make_experiment_permissions`](#make_experiment_permissions-fixture), [`make_feature_table`](#make_feature_table-fixture), [`make_feature_table_permissions`](#make_feature_table_permissions-fixture), [`make_group`](#make_group-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_instance_pool_permissions`](#make_instance_pool_permissions-fixture), [`make_job`](#make_job-fixture), [`make_job_permissions`](#make_job_permissions-fixture), [`make_lakeview_dashboard_permissions`](#make_lakeview_dashboard_permissions-fixture), [`make_model`](#make_model-fixture), [`make_notebook`](#make_notebook-fixture), [`make_notebook_permissions`](#make_notebook_permissions-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_pipeline_permissions`](#make_pipeline_permissions-fixture), [`make_query`](#make_query-fixture), [`make_query_permissions`](#make_query_permissions-fixture), [`make_registered_model_permissions`](#make_registered_model_permissions-fixture), [`make_repo`](#make_repo-fixture), [`make_repo_permissions`](#make_repo_permissions-fixture), [`make_secret_scope`](#make_secret_scope-fixture), [`make_secret_scope_acl`](#make_secret_scope_acl-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_serving_endpoint_permissions`](#make_serving_endpoint_permissions-fixture), [`make_storage_credential`](#make_storage_credential-fixture), [`make_udf`](#make_udf-fixture), [`make_user`](#make_user-fixture), [`make_warehouse`](#make_warehouse-fixture), [`make_warehouse_permissions`](#make_warehouse_permissions-fixture), [`make_workspace_file_path_permissions`](#make_workspace_file_path_permissions-fixture), [`make_workspace_file_permissions`](#make_workspace_file_permissions-fixture), [`spark`](#spark-fixture), [`sql_backend`](#sql_backend-fixture), [`debug_env`](#debug_env-fixture), [`product_info`](#product_info-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `acc` fixture\nCreate and provide a Databricks AccountClient object.\n\nThis fixture initializes a Databricks AccountClient object, which can be used\nto interact with the Databricks account API. The created instance of AccountClient\nis shared across all test functions within the test session.\n\nRequires the `DATABRICKS_ACCOUNT_ID` environment variable to be set. 
If `DATABRICKS_HOST`\npoints to a workspace host, the fixture will automatically determine the account host\nfrom it.\n\nSee [detailed documentation](https://databricks-sdk-py.readthedocs.io/en/latest/authentication.html) for the list\nof environment variables that can be used to authenticate the AccountClient.\n\nIn your test functions, include this fixture as an argument to use the AccountClient:\n\n```python\ndef test_listing_workspaces(acc):\n    workspaces = list(acc.workspaces.list())\n    assert len(workspaces) >= 1\n```\n\nSee also [`make_acc_group`](#make_acc_group-fixture), [`debug_env`](#debug_env-fixture), [`product_info`](#product_info-fixture), [`env_or_skip`](#env_or_skip-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `spark` fixture\nGet a Databricks Connect Spark session. Requires the `databricks-connect` package to be installed.\n\nUsage:\n```python\ndef test_databricks_connect(spark):\n    rows = spark.sql(\"SELECT 1\").collect()\n    assert rows[0][0] == 1\n```\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `sql_backend` fixture\nCreate and provide a SQL backend for executing statements.\n\nRequires the environment variable `DATABRICKS_WAREHOUSE_ID` to be set.\n\nSee also [`make_schema`](#make_schema-fixture), [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`sql_exec`](#sql_exec-fixture), [`sql_fetch_all`](#sql_fetch_all-fixture), [`ws`](#ws-fixture), [`env_or_skip`](#env_or_skip-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `sql_exec` fixture\nExecute a SQL statement without returning any results.\n\nSee also [`sql_backend`](#sql_backend-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `sql_fetch_all` fixture\nFetch all rows from a SQL statement.
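\n\nUsage (a sketch rather than official documentation: it assumes `sql_exec` takes a single SQL string, and that `sql_fetch_all` takes a SQL string and returns rows whose columns are accessible as attributes; double-check both assumptions against your environment):\n\n```python\ndef test_sql_roundtrip(sql_exec, sql_fetch_all, make_schema):\n    # a disposable schema keeps the dummy table out of shared namespaces\n    schema = make_schema(catalog_name=\"hive_metastore\")\n    sql_exec(f\"CREATE TABLE {schema.full_name}.numbers AS SELECT 1 AS x\")\n    rows = list(sql_fetch_all(f\"SELECT x FROM {schema.full_name}.numbers\"))\n    assert rows[0].x == 1\n```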
\n\nSee also [`sql_backend`](#sql_backend-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_random` fixture\nFixture to generate random strings.\n\nThis fixture provides a function to generate random strings of a specified length.\nThe generated strings are created using a character set consisting of uppercase letters,\nlowercase letters, and digits.\n\nTo generate a random string with the default length of 16 characters:\n\n```python\nrandom_string = make_random()\nassert len(random_string) == 16\n```\n\nTo generate a random string with a specified length:\n\n```python\nrandom_string = make_random(k=8)\nassert len(random_string) == 8\n```\n\nSee also [`make_acc_group`](#make_acc_group-fixture), [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_directory`](#make_directory-fixture), [`make_experiment`](#make_experiment-fixture), [`make_feature_table`](#make_feature_table-fixture), [`make_group`](#make_group-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_job`](#make_job-fixture), [`make_model`](#make_model-fixture), [`make_notebook`](#make_notebook-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_query`](#make_query-fixture), [`make_repo`](#make_repo-fixture), [`make_schema`](#make_schema-fixture), [`make_secret_scope`](#make_secret_scope-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`make_user`](#make_user-fixture), [`make_warehouse`](#make_warehouse-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_instance_pool` fixture\nCreate a Databricks instance pool and clean it up after the test. Returns a function to create instance pools.\nUse the `instance_pool_id` attribute of the returned object to get the ID of the pool.\n\nKeyword Arguments:\n* `instance_pool_name` (str, optional): The name of the instance pool. If not provided, a random name will be generated.\n* `node_type_id` (str, optional): The node type ID of the instance pool. If not provided, a node type with local disk and 16GB memory will be used.\n* other arguments are passed to the `WorkspaceClient.instance_pools.create` method.\n\nUsage:\n```python\ndef test_instance_pool(make_instance_pool):\n    logger.info(f\"created {make_instance_pool()}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_instance_pool_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_job` fixture\nCreate a Databricks job and clean it up after the test. Returns a function to create jobs that returns\na [`Job`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/jobs.html#databricks.sdk.service.jobs.Job) instance.\n\nKeyword Arguments:\n* `notebook_path` (str, optional): The path to the notebook. If not provided, a random notebook will be created.\n* `name` (str, optional): The name of the job. If not provided, a random name will be generated.\n* `spark_conf` (dict, optional): The Spark configuration of the job.\n* `libraries` (list, optional): The list of libraries to install on the job.\n* other arguments are passed to the `WorkspaceClient.jobs.create` method.\n\nIf no task argument is provided, a single notebook task will be created, along with a disposable notebook.\nThe latest Spark version and a single-worker cluster will be used to run this ephemeral job.\n\nUsage:\n```python\ndef test_job(make_job):\n    logger.info(f\"created {make_job()}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_notebook`](#make_notebook-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_job_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_cluster` fixture\nCreate a Databricks cluster, wait for it to start, and clean it up after the test.\nReturns a function to create clusters. You can get the `cluster_id` attribute from the returned object.\n\nKeyword Arguments:\n* `single_node` (bool, optional): Whether to create a single-node cluster. Defaults to False.\n* `cluster_name` (str, optional): The name of the cluster. If not provided, a random name will be generated.\n* `spark_version` (str, optional): The Spark version of the cluster. If not provided, the latest version will be used.\n* `autotermination_minutes` (int, optional): The number of minutes before the cluster is automatically terminated. 
Defaults to 10.\n\nUsage:\n```python\ndef test_cluster(make_cluster):\n    logger.info(f\"created {make_cluster(single_node=True)}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_cluster_permissions` fixture\n_No description yet._
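\n\nWhile no description is published yet, here is a usage sketch, assuming the fixture follows the same calling convention as `make_pipeline_permissions` and `make_experiment_permissions` elsewhere in this document (an `object_id`, a `permission_level`, and a `group_name` or `user_name`):\n\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_cluster_permissions(make_cluster, make_group, make_cluster_permissions):\n    group = make_group()\n    cluster = make_cluster(single_node=True)\n    # assumed signature, mirroring the other make_*_permissions fixtures\n    make_cluster_permissions(\n        object_id=cluster.cluster_id,\n        permission_level=PermissionLevel.CAN_MANAGE,\n        group_name=group.display_name,\n    )\n```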
\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_cluster_policy` fixture\nCreate a Databricks cluster policy and clean it up after the test. Returns a function to create cluster policies,\nwhich returns a [`CreatePolicyResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/compute.html#databricks.sdk.service.compute.CreatePolicyResponse) instance.\n\nKeyword Arguments:\n* `name` (str, optional): The name of the cluster policy. If not provided, a random name will be generated.\n\nUsage:\n```python\ndef test_cluster_policy(make_cluster_policy):\n    logger.info(f\"created {make_cluster_policy()}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_cluster_policy_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_pipeline` fixture\nCreate a Delta Live Tables pipeline and clean it up after the test. Returns a function to create pipelines.\nResults in a [`CreatePipelineResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/pipelines.html#databricks.sdk.service.pipelines.CreatePipelineResponse) instance.\n\nKeyword Arguments:\n* `name` (str, optional): The name of the pipeline. If not provided, a random name will be generated.\n* `libraries` (list, optional): The list of libraries to install on the pipeline. If not provided, a random disposable notebook will be created.\n* `clusters` (list, optional): The list of clusters to use for the pipeline. If not provided, a single node cluster will be created with 16GB memory and local disk.\n\nUsage:\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_pipeline(make_pipeline, make_pipeline_permissions, make_group):\n    group = make_group()\n    pipeline = make_pipeline()\n    make_pipeline_permissions(\n        object_id=pipeline.pipeline_id,\n        permission_level=PermissionLevel.CAN_MANAGE,\n        group_name=group.display_name,\n    )\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_notebook`](#make_notebook-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_warehouse` fixture\nCreate a Databricks warehouse and clean it up after the test. Returns a function to create warehouses.\n\nKeyword Arguments:\n* `warehouse_name` (str, optional): The name of the warehouse. If not provided, a random name will be generated.\n* `warehouse_type` (CreateWarehouseRequestWarehouseType, optional): The type of the warehouse. Defaults to `PRO`.\n* `cluster_size` (str, optional): The size of the cluster. Defaults to `2X-Small`.\n\nUsage:\n```python\ndef test_warehouse_has_remove_after_tag(ws, make_warehouse):\n    new_warehouse = make_warehouse()\n    created_warehouse = ws.warehouses.get(new_warehouse.response.id)\n    warehouse_tags = created_warehouse.tags.as_dict()\n    assert warehouse_tags[\"custom_tags\"][0][\"key\"] == \"RemoveAfter\"\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_group` fixture\nThis fixture provides a function to manage Databricks workspace groups. Groups can be created with specified\nmembers and roles, and they will be deleted after the test is complete. Deals with eventual consistency issues by\nretrying the creation process for 30 seconds and then waiting for up to 3 minutes for the group to be provisioned.\nReturns an instance of [`Group`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/iam.html#databricks.sdk.service.iam.Group).\n\nKeyword arguments:\n* `members` (list of strings): A list of user IDs to add to the group.\n* `roles` (list of strings): A list of roles to assign to the group.\n* `display_name` (str): The display name of the group.\n* `entitlements` (list of strings): A list of entitlements to assign to the group.\n\nThe following example creates a group with a single member and independently verifies that the group was created:\n\n```python\ndef test_new_group(make_group, make_user, ws):\n    user = make_user()\n    group = make_group(members=[user.id])\n    loaded = ws.groups.get(group.id)\n    assert group.display_name == loaded.display_name\n    assert group.members == loaded.members\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_acc_group` fixture\nThis fixture provides a function to manage Databricks account groups. Groups can be created with\nspecified members and roles, and they will be deleted after the test is complete.\n\nHas the same arguments and behavior as the [`make_group` fixture](#make_group-fixture) but uses the account\nclient instead of the workspace client.\n\nExample usage:\n```python\ndef test_new_account_group(make_acc_group, acc):\n    group = make_acc_group()\n    loaded = acc.groups.get(group.id)\n    assert group.display_name == loaded.display_name\n```\n\nSee also [`acc`](#acc-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_user` fixture\nThis fixture returns a function that creates a Databricks workspace user\nand removes it after the test is complete. In case of random naming conflicts,\nthe fixture will retry the creation process for 30 seconds. Returns an instance\nof [`User`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/iam.html#databricks.sdk.service.iam.User). 
Usage:\n\n```python\nfrom databricks.sdk.service.workspace import ObjectType\n\ndef test_new_user(make_user, ws):\n    new_user = make_user()\n    home_dir = ws.workspace.get_status(f\"/Users/{new_user.user_name}\")\n    assert home_dir.object_type == ObjectType.DIRECTORY\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_pipeline_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_notebook` fixture\nReturns a function to create Databricks Notebooks and clean them up after the test.\nThe function returns an [`os.PathLike` object](https://github.com/databrickslabs/blueprint?tab=readme-ov-file#python-native-pathlibpath-like-interfaces).\n\nKeyword arguments:\n* `path` (str, optional): The path of the notebook. Defaults to a `dummy-*` notebook in the current user's home folder.\n* `content` (typing.BinaryIO, optional): The content of the notebook. Defaults to `print(1)`.\n* `language` ([`Language`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.Language), optional): The language of the notebook. Defaults to `Language.PYTHON`.\n* `format` ([`ImportFormat`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.ImportFormat), optional): The format of the notebook. Defaults to `ImportFormat.SOURCE`.\n* `overwrite` (bool, optional): Whether to overwrite the notebook if it already exists. Defaults to `False`.\n\nThis example creates a notebook and verifies that `print(1)` is in the content:\n```python\ndef test_creates_some_notebook(make_notebook):\n    notebook = make_notebook()\n    assert \"print(1)\" in notebook.read_text()\n```\n\nSee also [`make_job`](#make_job-fixture), [`make_pipeline`](#make_pipeline-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_notebook_permissions` fixture\n_No description yet._
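\n\nWhile no description is published yet, here is a usage sketch, assuming the same calling convention as the other `make_*_permissions` fixtures; the `object_id` lookup via `ws.workspace.get_status` is likewise an assumption:\n\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_notebook_permissions(ws, make_notebook, make_group, make_notebook_permissions):\n    group = make_group()\n    notebook = make_notebook()\n    # resolve the numeric workspace object id behind the new notebook\n    object_id = ws.workspace.get_status(str(notebook)).object_id\n    # assumed signature, mirroring the other make_*_permissions fixtures\n    make_notebook_permissions(\n        object_id=object_id,\n        permission_level=PermissionLevel.CAN_READ,\n        group_name=group.display_name,\n    )\n```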
\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_directory` fixture\nReturns a function to create Databricks Workspace Folders and clean them up after the test.\nThe function returns an [`os.PathLike` object](https://github.com/databrickslabs/blueprint?tab=readme-ov-file#python-native-pathlibpath-like-interfaces).\n\nKeyword arguments:\n* `path` (str, optional): The path of the directory. Defaults to a `dummy-*` folder in the current user's home folder.\n\nThis example creates a folder and verifies that it contains a notebook:\n```python\ndef test_creates_some_folder_with_a_notebook(make_directory, make_notebook):\n    folder = make_directory()\n    notebook = make_notebook(path=folder / 'foo.py')\n    files = [_.name for _ in folder.iterdir()]\n    assert ['foo.py'] == files\n    assert notebook.parent == folder\n```\n\nSee also [`make_experiment`](#make_experiment-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_directory_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_repo` fixture\nReturns a function to create Databricks Repos and clean them up after the test.\nThe function returns a [`RepoInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/workspace.html#databricks.sdk.service.workspace.RepoInfo) object.\n\nKeyword arguments:\n* `url` (str, optional): The URL of the repository.\n* `provider` (str, optional): The provider of the repository.\n* `path` (str, optional): The path of the repository. Defaults to `/Repos/{current_user}/sdk-{random}-{purge_suffix}`.\n\nUsage:\n```python\ndef test_repo(make_repo):\n    logger.info(f\"created {make_repo()}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_repo_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_workspace_file_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_workspace_file_path_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_secret_scope` fixture\nThis fixture provides a function to create secret scopes. The created secret scope will be\ndeleted after the test is complete. 
Returns the name of the secret scope.\n\nTo create a secret scope and use it within a test function:\n\n```python\ndef test_secret_scope_creation(make_secret_scope):\n    secret_scope_name = make_secret_scope()\n    assert secret_scope_name.startswith(\"dummy-\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_secret_scope_acl` fixture\nThis fixture provides a function to manage access control lists (ACLs) for secret scopes.\nACLs define permissions for principals (users or groups) on specific secret scopes.\n\nArguments:\n- `scope`: The name of the secret scope.\n- `principal`: The name of the principal (user or group).\n- `permission`: The permission level for the principal on the secret scope.\n\nReturns a tuple containing the secret scope name and the principal name.\n\nTo manage secret scope ACLs using the make_secret_scope_acl fixture:\n\n```python\nfrom databricks.sdk.service.workspace import AclPermission\n\ndef test_secret_scope_acl_management(make_user, make_secret_scope, make_secret_scope_acl):\n    scope_name = make_secret_scope()\n    principal_name = make_user().display_name\n    permission = AclPermission.READ\n\n    acl_info = make_secret_scope_acl(\n        scope=scope_name,\n        principal=principal_name,\n        permission=permission,\n    )\n    assert acl_info == (scope_name, principal_name)\n```\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_authorization_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_udf` fixture\nCreate a UDF and return its info. Remove it after the test. Returns instance of [`FunctionInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.FunctionInfo).\n\nKeyword Arguments:\n* `catalog_name` (str): The name of the catalog where the UDF will be created. Default is `hive_metastore`.\n* `schema_name` (str): The name of the schema where the UDF will be created. Default is a random string.\n* `name` (str): The name of the UDF. Default is a random string.\n* `hive_udf` (bool): If `True`, the UDF will be created as a Hive UDF. Default is `False`.\n\nUsage:\n```python\ndef test_make_some_udfs(make_schema, make_udf):\n    schema_a = make_schema(catalog_name=\"hive_metastore\")\n    make_udf(schema_name=schema_a.name)\n    make_udf(schema_name=schema_a.name, hive_udf=True)\n```\n\nSee also [`ws`](#ws-fixture), [`env_or_skip`](#env_or_skip-fixture), [`sql_backend`](#sql_backend-fixture), [`make_schema`](#make_schema-fixture), [`make_random`](#make_random-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_catalog` fixture\nCreate a catalog and return its info. Remove it after the test.\nReturns instance of [`CatalogInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.CatalogInfo).\n\nKeyword Arguments:\n* `name` (str): The name of the catalog. 
Default is a random string.\n\nUsage:\n```python\ndef test_catalog_fixture(make_catalog, make_schema, make_table):\n    from_catalog = make_catalog()\n    from_schema = make_schema(catalog_name=from_catalog.name)\n    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)\n    logger.info(f\"Created new schema: {from_table_1}\")\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_schema` fixture\nCreate a schema and return its info. Remove it after the test. Returns instance of [`SchemaInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.SchemaInfo).\n\nKeyword Arguments:\n* `catalog_name` (str): The name of the catalog where the schema will be created. Default is `hive_metastore`.\n* `name` (str): The name of the schema. Default is a random string.\n\nUsage:\n```python\ndef test_catalog_fixture(make_catalog, make_schema, make_table):\n    from_catalog = make_catalog()\n    from_schema = make_schema(catalog_name=from_catalog.name)\n    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)\n    logger.info(f\"Created new schema: {from_table_1}\")\n```\n\nSee also [`make_table`](#make_table-fixture), [`make_udf`](#make_udf-fixture), [`sql_backend`](#sql_backend-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_table` fixture\nCreate a table and return its info. Remove it after the test. Returns instance of [`TableInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.TableInfo).\n\nKeyword Arguments:\n* `catalog_name` (str): The name of the catalog where the table will be created. Default is `hive_metastore`.\n* `schema_name` (str): The name of the schema where the table will be created. Default is a random string.\n* `name` (str): The name of the table. Default is a random string.\n* `ctas` (str): The CTAS statement to create the table. Default is `None`.\n* `non_delta` (bool): If `True`, the table will be created as a non-delta table. Default is `False`.\n* `external` (bool): If `True`, the table will be created as an external table. Default is `False`.\n* `external_csv` (str): The location of the external CSV table. Default is `None`.\n* `external_delta` (str): The location of the external Delta table. Default is `None`.\n* `view` (bool): If `True`, the table will be created as a view. Default is `False`.\n* `tbl_properties` (dict): The table properties. Default is `None`.\n* `hiveserde_ddl` (str): The DDL statement to create the table. Default is `None`.\n* `storage_override` (str): The storage location override. Default is `None`.\n* `columns` (list): The list of columns. 
Default is `None`.\n\nUsage:\n```python\ndef test_catalog_fixture(make_catalog, make_schema, make_table):\n    from_catalog = make_catalog()\n    from_schema = make_schema(catalog_name=from_catalog.name)\n    from_table_1 = make_table(catalog_name=from_catalog.name, schema_name=from_schema.name)\n    logger.info(f\"Created new schema: {from_table_1}\")\n```\n\nSee also [`make_query`](#make_query-fixture), [`sql_backend`](#sql_backend-fixture), [`make_schema`](#make_schema-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_storage_credential` fixture\nCreate a storage credential and return its info. Remove it after the test. Returns instance of [`StorageCredentialInfo`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/catalog.html#databricks.sdk.service.catalog.StorageCredentialInfo).\n\nKeyword Arguments:\n* `credential_name` (str): The name of the storage credential. Default is a random string.\n* `application_id` (str): The application ID for the Azure service principal. Default is an empty string.\n* `client_secret` (str): The client secret for the Azure service principal. Default is an empty string.\n* `directory_id` (str): The directory ID for the Azure service principal. Default is an empty string.\n* `aws_iam_role_arn` (str): The ARN of the AWS IAM role. Default is an empty string.\n* `read_only` (bool): If `True`, the storage credential will be read-only. Default is `False`.\n\nUsage:\n```python\ndef test_storage_credential(env_or_skip, make_storage_credential, make_random):\n    random = make_random(6).lower()\n    credential_name = f\"dummy-{random}\"\n    make_storage_credential(\n        credential_name=credential_name,\n        aws_iam_role_arn=env_or_skip(\"TEST_UBER_ROLE_ID\"),\n    )\n```\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `product_info` fixture\n_No description yet._\n\nSee also [`acc`](#acc-fixture), [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_model` fixture\nReturns a function to create Databricks Models and clean them up after the test.\nThe function returns a [`GetModelResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/ml.html#databricks.sdk.service.ml.GetModelResponse) object.\n\nKeyword arguments:\n* `model_name` (str, optional): The name of the model. Defaults to `dummy-*`.\n\nUsage:\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_models(make_group, make_model, make_registered_model_permissions):\n    group = make_group()\n    model = make_model()\n    make_registered_model_permissions(\n        object_id=model.id,\n        permission_level=PermissionLevel.CAN_MANAGE,\n        group_name=group.display_name,\n    )\n```\n\nSee also [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_experiment` fixture\nReturns a function to create Databricks Experiments and clean them up after the test.\nThe function returns a [`CreateExperimentResponse`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/ml.html#databricks.sdk.service.ml.CreateExperimentResponse) object.\n\nKeyword arguments:\n* `path` (str, optional): The path of the experiment. 
Defaults to a `dummy-*` experiment in the current user's home folder.\n* `experiment_name` (str, optional): The name of the experiment. Defaults to `dummy-*`.\n\nUsage:\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_experiments(make_group, make_experiment, make_experiment_permissions):\n    group = make_group()\n    experiment = make_experiment()\n    make_experiment_permissions(\n        object_id=experiment.experiment_id,\n        permission_level=PermissionLevel.CAN_MANAGE,\n        group_name=group.display_name,\n    )\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_directory`](#make_directory-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_experiment_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_warehouse_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_lakeview_dashboard_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `log_workspace_link` fixture\nReturns a function to log a workspace link.\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_dashboard_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_alert_permissions` fixture\n_No description yet._
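\n\nWhile no description is published yet, here is a usage sketch, assuming the same calling convention as the other `make_*_permissions` fixtures; since no alert-creating fixture is documented here, the `TEST_ALERT_ID` environment variable is a hypothetical stand-in for a pre-existing alert:\n\n```python\nfrom databricks.sdk.service.sql import PermissionLevel\n\ndef test_alert_permissions(env_or_skip, make_group, make_alert_permissions):\n    group = make_group()\n    alert_id = env_or_skip(\"TEST_ALERT_ID\")  # hypothetical variable\n    # assumed signature, mirroring the other make_*_permissions fixtures\n    make_alert_permissions(\n        object_id=alert_id,\n        permission_level=PermissionLevel.CAN_RUN,\n        group_name=group.display_name,\n    )\n```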
\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_query` fixture\nCreate a query and remove it after the test is done. Returns the [`LegacyQuery`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/sql.html#databricks.sdk.service.sql.LegacyQuery) object.\n\nKeyword Arguments:\n- `query`: The query to be stored. Default is `SELECT * FROM <newly created random table>`.\n\nUsage:\n```python\nfrom databricks.sdk.service.sql import PermissionLevel\n\ndef test_permissions_for_redash(\n    make_user,\n    make_query,\n    make_query_permissions,\n):\n    user = make_user()\n    query = make_query()\n    make_query_permissions(\n        object_id=query.id,\n        permission_level=PermissionLevel.CAN_EDIT,\n        user_name=user.display_name,\n    )\n```\n\nSee also [`ws`](#ws-fixture), [`make_table`](#make_table-fixture), [`make_random`](#make_random-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_query_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_registered_model_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_serving_endpoint` fixture\nReturns a function to create Databricks Serving Endpoints and clean them up after the test.\nThe function returns a [`ServingEndpointDetailed`](https://databricks-sdk-py.readthedocs.io/en/latest/dbdataclasses/serving.html#databricks.sdk.service.serving.ServingEndpointDetailed) object.\n\nUnder the covers, this fixture also creates a model to serve on a small workload size.\n\nUsage:\n```python\nfrom databricks.sdk.service.iam import PermissionLevel\n\ndef test_endpoints(make_group, make_serving_endpoint, make_serving_endpoint_permissions):\n    group = make_group()\n    endpoint = make_serving_endpoint()\n    make_serving_endpoint_permissions(\n        object_id=endpoint.response.id,\n        permission_level=PermissionLevel.CAN_QUERY,\n        group_name=group.display_name,\n    )\n```\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture), [`make_model`](#make_model-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_serving_endpoint_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_feature_table` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture), [`make_random`](#make_random-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `make_feature_table_permissions` fixture\n_No description yet._\n\nSee also [`ws`](#ws-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `watchdog_remove_after` fixture\nPurge time for test objects, representing the (UTC-based) hour from which objects may be purged.\n\nSee also [`make_catalog`](#make_catalog-fixture), [`make_cluster`](#make_cluster-fixture), [`make_instance_pool`](#make_instance_pool-fixture), [`make_job`](#make_job-fixture), [`make_model`](#make_model-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_query`](#make_query-fixture), [`make_schema`](#make_schema-fixture), [`make_serving_endpoint`](#make_serving_endpoint-fixture), [`make_table`](#make_table-fixture), [`make_warehouse`](#make_warehouse-fixture), [`watchdog_purge_suffix`](#watchdog_purge_suffix-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `watchdog_purge_suffix` fixture\nHEX-encoded purge time suffix for test objects.\n\nSee also [`make_acc_group`](#make_acc_group-fixture), [`make_cluster_policy`](#make_cluster_policy-fixture), [`make_directory`](#make_directory-fixture), [`make_experiment`](#make_experiment-fixture), [`make_group`](#make_group-fixture), 
[`make_notebook`](#make_notebook-fixture), [`make_pipeline`](#make_pipeline-fixture), [`make_repo`](#make_repo-fixture), [`make_user`](#make_user-fixture), [`watchdog_remove_after`](#watchdog_remove_after-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n### `is_in_debug` fixture\nReturns `True` if the test is running from a debugger in an IDE, otherwise `False`.\n\nThe following IDEs are supported: IntelliJ IDEA (including Community Edition),\nPyCharm (including Community Edition), and Visual Studio Code.\n\nSee also [`debug_env`](#debug_env-fixture), [`env_or_skip`](#env_or_skip-fixture).\n\n\n[[back to top](#python-testing-for-databricks)]\n\n<!-- END FIXTURES -->\n\n# Project Support\n\nPlease note that this project is provided for your exploration only and is not\nformally supported by Databricks with Service Level Agreements (SLAs). It is\nprovided AS-IS, and we do not make any guarantees of any kind. Please do not\nsubmit a support ticket relating to any issues arising from the use of this project.\n\nAny issues discovered through the use of this project should be filed as GitHub\n[Issues on this repository](https://github.com/databrickslabs/pytester/issues).\nThey will be reviewed as time permits, but no formal SLAs for support exist.\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "Python Testing for Databricks",
    "version": "0.2.4",
    "project_urls": {
        "Issues": "https://github.com/databrickslabs/pytester/issues",
        "Source": "https://github.com/databrickslabs/pytester"
    },
    "split_keywords": [
        "databricks",
        " pytest"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b4740733cc26cc4fc155d8b47a517686775bd73fc4cb1a8f90ff7dc1c272ebe3",
                "md5": "d24c163b239c12e1241c5a26bdeefdda",
                "sha256": "e91d4d0af21d9e95ec5a4f5e7d8db386b8acbc4f02911445d7a23caf3db15a50"
            },
            "downloads": -1,
            "filename": "databricks_labs_pytester-0.2.4-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "d24c163b239c12e1241c5a26bdeefdda",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 43520,
            "upload_time": "2024-09-24T10:14:30",
            "upload_time_iso_8601": "2024-09-24T10:14:30.988801Z",
            "url": "https://files.pythonhosted.org/packages/b4/74/0733cc26cc4fc155d8b47a517686775bd73fc4cb1a8f90ff7dc1c272ebe3/databricks_labs_pytester-0.2.4-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b85d8d8cdb992231f5d587c43dc5b7bd0ce7fa1363c8b4ce20d82106a19fa5be",
                "md5": "bfeb36717abd21d62a308e80fe0ae7ab",
                "sha256": "70098dbb339856ef18eae3c6e4015603805ac04069574cf20148dcbce2cf59e2"
            },
            "downloads": -1,
            "filename": "databricks_labs_pytester-0.2.4.tar.gz",
            "has_sig": false,
            "md5_digest": "bfeb36717abd21d62a308e80fe0ae7ab",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 51043,
            "upload_time": "2024-09-24T10:14:32",
            "upload_time_iso_8601": "2024-09-24T10:14:32.890985Z",
            "url": "https://files.pythonhosted.org/packages/b8/5d/8d8cdb992231f5d587c43dc5b7bd0ce7fa1363c8b4ce20d82106a19fa5be/databricks_labs_pytester-0.2.4.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-09-24 10:14:32",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "databrickslabs",
    "github_project": "pytester",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "databricks-labs-pytester"
}
        