Name: databricks-api
Version: 0.9.0
Home page: https://github.com/crflynn/databricks-api
Summary: Databricks API client auto-generated from the official databricks-cli package
Upload time: 2023-06-08 16:37:33
Author: Christopher Flynn
Requires Python: >=3.6,<4.0
License: MIT
Keywords: databricks, api, client
databricks-api
==============

**Please switch to the official Databricks SDK for Python (https://github.com/databricks/databricks-sdk-py) by running the following command:**

.. code-block:: bash

    pip install databricks-sdk
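
For reference, the replacement SDK exposes a single ``WorkspaceClient``
rather than per-service classes. A minimal migration sketch (based on the
``databricks-sdk`` package; consult its documentation for the current API):

.. code-block:: python

    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient(
        host="https://example.cloud.databricks.com",
        token="dapi123...",
    )
    # Services hang off the one client, e.g. listing clusters:
    for cluster in w.clusters.list():
        print(cluster.cluster_name)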


|pypi| |pyversions|

.. |pypi| image:: https://img.shields.io/pypi/v/databricks-api.svg
    :target: https://pypi.python.org/pypi/databricks-api

.. |pyversions| image:: https://img.shields.io/pypi/pyversions/databricks-api.svg
    :target: https://pypi.python.org/pypi/databricks-api

*[This documentation is auto-generated]*

This package provides a simplified interface for the Databricks REST API.
The interface is autogenerated on instantiation using the underlying client
library from the official ``databricks-cli`` Python package.
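
Conceptually, the autogeneration just wraps a single ``ApiClient`` and
attaches an instance of each ``databricks-cli`` service class to it. A
minimal sketch of that pattern (illustrative only, not the package's actual
source; the real class wires up every service listed below and forwards all
constructor options):

.. code-block:: python

    # Illustrative sketch -- not the actual implementation.
    from databricks_cli.sdk.api_client import ApiClient
    from databricks_cli.sdk.service import ClusterService, JobsService

    class MiniDatabricksAPI:
        def __init__(self, **kwargs):
            # One shared HTTP client...
            self.client = ApiClient(**kwargs)
            # ...handed to each service wrapper.
            self.jobs = JobsService(self.client)
            self.cluster = ClusterService(self.client)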

Install using:

.. code-block:: bash

    pip install databricks-api
    

The docs here describe the interface for version **0.17.0** of
the ``databricks-cli`` package for API version **2.0**.

The ``databricks-api`` package contains a ``DatabricksAPI`` class which exposes
the ``databricks-cli`` ``ApiClient``, as well as each of the available service
instances, as instance attributes. The attributes of a ``DatabricksAPI`` instance are:

* DatabricksAPI.client *<databricks_cli.sdk.api_client.ApiClient>*
* DatabricksAPI.jobs *<databricks_cli.sdk.service.JobsService>*
* DatabricksAPI.cluster *<databricks_cli.sdk.service.ClusterService>*
* DatabricksAPI.policy *<databricks_cli.sdk.service.PolicyService>*
* DatabricksAPI.managed_library *<databricks_cli.sdk.service.ManagedLibraryService>*
* DatabricksAPI.dbfs *<databricks_cli.sdk.service.DbfsService>*
* DatabricksAPI.workspace *<databricks_cli.sdk.service.WorkspaceService>*
* DatabricksAPI.secret *<databricks_cli.sdk.service.SecretService>*
* DatabricksAPI.groups *<databricks_cli.sdk.service.GroupsService>*
* DatabricksAPI.token *<databricks_cli.sdk.service.TokenService>*
* DatabricksAPI.instance_pool *<databricks_cli.sdk.service.InstancePoolService>*
* DatabricksAPI.delta_pipelines *<databricks_cli.sdk.service.DeltaPipelinesService>*
* DatabricksAPI.repos *<databricks_cli.sdk.service.ReposService>*

To instantiate the client, provide the Databricks host and either a token or
a user and password. The full signature of the underlying
``ApiClient.__init__`` is also shown below.

.. code-block:: python

    from databricks_api import DatabricksAPI

    # Provide a host and token
    db = DatabricksAPI(
        host="example.cloud.databricks.com",
        token="dpapi123..."
    )

    # OR a host and user and password
    db = DatabricksAPI(
        host="example.cloud.databricks.com",
        user="me@example.com",
        password="password"
    )

    # Full __init__ signature
    db = DatabricksAPI(
        user=None,
        password=None,
        host=None,
        token=None,
        api_version='2.0',
        default_headers={},
        verify=True,
        command_name='',
        jobs_api_version=None
    )
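
Methods on the service instances return the parsed JSON response as a
Python ``dict``. For example, listing clusters (assuming ``db`` was
instantiated with valid credentials):

.. code-block:: python

    clusters = db.cluster.list_clusters()
    for cluster in clusters.get("clusters", []):
        print(cluster["cluster_id"], cluster["state"])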

Refer to the `official documentation <https://docs.databricks.com/api/index.html>`_
for the functionality and required arguments of each method below.

Each of the service instance attributes provides the following public methods:

DatabricksAPI.jobs
------------------

.. code-block:: python

    db.jobs.cancel_run(
        run_id,
        headers=None,
        version=None,
    )

    db.jobs.create_job(
        name=None,
        existing_cluster_id=None,
        new_cluster=None,
        libraries=None,
        email_notifications=None,
        timeout_seconds=None,
        max_retries=None,
        min_retry_interval_millis=None,
        retry_on_timeout=None,
        schedule=None,
        notebook_task=None,
        spark_jar_task=None,
        spark_python_task=None,
        spark_submit_task=None,
        max_concurrent_runs=None,
        tasks=None,
        headers=None,
        version=None,
    )

    db.jobs.delete_job(
        job_id,
        headers=None,
        version=None,
    )

    db.jobs.delete_run(
        run_id=None,
        headers=None,
        version=None,
    )

    db.jobs.export_run(
        run_id,
        views_to_export=None,
        headers=None,
        version=None,
    )

    db.jobs.get_job(
        job_id,
        headers=None,
        version=None,
    )

    db.jobs.get_run(
        run_id=None,
        headers=None,
        version=None,
    )

    db.jobs.get_run_output(
        run_id,
        headers=None,
        version=None,
    )

    db.jobs.list_jobs(
        job_type=None,
        expand_tasks=None,
        limit=None,
        offset=None,
        headers=None,
        version=None,
    )

    db.jobs.list_runs(
        job_id=None,
        active_only=None,
        completed_only=None,
        offset=None,
        limit=None,
        headers=None,
        version=None,
    )

    db.jobs.reset_job(
        job_id,
        new_settings,
        headers=None,
        version=None,
    )

    db.jobs.run_now(
        job_id=None,
        jar_params=None,
        notebook_params=None,
        python_params=None,
        spark_submit_params=None,
        python_named_params=None,
        idempotency_token=None,
        headers=None,
        version=None,
    )

    db.jobs.submit_run(
        run_name=None,
        existing_cluster_id=None,
        new_cluster=None,
        libraries=None,
        notebook_task=None,
        spark_jar_task=None,
        spark_python_task=None,
        spark_submit_task=None,
        timeout_seconds=None,
        tasks=None,
        headers=None,
        version=None,
    )
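
For example, triggering an existing notebook job and polling its run
(the job id and parameter names are placeholders; ``notebook_params`` keys
must match the notebook's widgets):

.. code-block:: python

    run = db.jobs.run_now(
        job_id=42,                               # placeholder job id
        notebook_params={"date": "2023-06-08"},
    )
    status = db.jobs.get_run(run_id=run["run_id"])
    print(status["state"])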


DatabricksAPI.cluster
---------------------

.. code-block:: python

    db.cluster.create_cluster(
        num_workers=None,
        autoscale=None,
        cluster_name=None,
        spark_version=None,
        spark_conf=None,
        aws_attributes=None,
        node_type_id=None,
        driver_node_type_id=None,
        ssh_public_keys=None,
        custom_tags=None,
        cluster_log_conf=None,
        spark_env_vars=None,
        autotermination_minutes=None,
        enable_elastic_disk=None,
        cluster_source=None,
        instance_pool_id=None,
        headers=None,
    )

    db.cluster.delete_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.edit_cluster(
        cluster_id,
        num_workers=None,
        autoscale=None,
        cluster_name=None,
        spark_version=None,
        spark_conf=None,
        aws_attributes=None,
        node_type_id=None,
        driver_node_type_id=None,
        ssh_public_keys=None,
        custom_tags=None,
        cluster_log_conf=None,
        spark_env_vars=None,
        autotermination_minutes=None,
        enable_elastic_disk=None,
        cluster_source=None,
        instance_pool_id=None,
        headers=None,
    )

    db.cluster.get_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.get_events(
        cluster_id,
        start_time=None,
        end_time=None,
        order=None,
        event_types=None,
        offset=None,
        limit=None,
        headers=None,
    )

    db.cluster.list_available_zones(headers=None)

    db.cluster.list_clusters(headers=None)

    db.cluster.list_node_types(headers=None)

    db.cluster.list_spark_versions(headers=None)

    db.cluster.permanent_delete_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.pin_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.resize_cluster(
        cluster_id,
        num_workers=None,
        autoscale=None,
        headers=None,
    )

    db.cluster.restart_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.start_cluster(
        cluster_id,
        headers=None,
    )

    db.cluster.unpin_cluster(
        cluster_id,
        headers=None,
    )
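
For example, creating an autoscaling cluster (the node type and Spark
version are placeholders; valid values come from ``list_node_types()`` and
``list_spark_versions()``):

.. code-block:: python

    cluster = db.cluster.create_cluster(
        cluster_name="example-cluster",
        spark_version="7.3.x-scala2.12",    # placeholder
        node_type_id="i3.xlarge",           # placeholder (AWS)
        autoscale={"min_workers": 1, "max_workers": 4},
        autotermination_minutes=30,
    )
    print(cluster["cluster_id"])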


DatabricksAPI.policy
--------------------

.. code-block:: python

    db.policy.create_policy(
        policy_name,
        definition,
        headers=None,
    )

    db.policy.delete_policy(
        policy_id,
        headers=None,
    )

    db.policy.edit_policy(
        policy_id,
        policy_name,
        definition,
        headers=None,
    )

    db.policy.get_policy(
        policy_id,
        headers=None,
    )

    db.policy.list_policies(headers=None)
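
Policy ``definition`` documents are passed as JSON strings; for example, a
policy that pins the Spark version (values are placeholders):

.. code-block:: python

    import json

    policy = db.policy.create_policy(
        policy_name="fixed-spark-version",
        definition=json.dumps(
            {"spark_version": {"type": "fixed", "value": "7.3.x-scala2.12"}}
        ),
    )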


DatabricksAPI.managed_library
-----------------------------

.. code-block:: python

    db.managed_library.all_cluster_statuses(headers=None)

    db.managed_library.cluster_status(
        cluster_id,
        headers=None,
    )

    db.managed_library.install_libraries(
        cluster_id,
        libraries=None,
        headers=None,
    )

    db.managed_library.uninstall_libraries(
        cluster_id,
        libraries=None,
        headers=None,
    )
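
Library specifications follow the Libraries API format; for example,
installing a PyPI package on a running cluster (the cluster id is a
placeholder):

.. code-block:: python

    db.managed_library.install_libraries(
        cluster_id="1234-567890-abcde123",  # placeholder
        libraries=[{"pypi": {"package": "simplejson==3.18.0"}}],
    )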


DatabricksAPI.dbfs
------------------

.. code-block:: python

    db.dbfs.add_block(
        handle,
        data,
        headers=None,
    )

    db.dbfs.add_block_test(
        handle,
        data,
        headers=None,
    )

    db.dbfs.close(
        handle,
        headers=None,
    )

    db.dbfs.close_test(
        handle,
        headers=None,
    )

    db.dbfs.create(
        path,
        overwrite=None,
        headers=None,
    )

    db.dbfs.create_test(
        path,
        overwrite=None,
        headers=None,
    )

    db.dbfs.delete(
        path,
        recursive=None,
        headers=None,
    )

    db.dbfs.delete_test(
        path,
        recursive=None,
        headers=None,
    )

    db.dbfs.get_status(
        path,
        headers=None,
    )

    db.dbfs.get_status_test(
        path,
        headers=None,
    )

    db.dbfs.list(
        path,
        headers=None,
    )

    db.dbfs.list_test(
        path,
        headers=None,
    )

    db.dbfs.mkdirs(
        path,
        headers=None,
    )

    db.dbfs.mkdirs_test(
        path,
        headers=None,
    )

    db.dbfs.move(
        source_path,
        destination_path,
        headers=None,
    )

    db.dbfs.move_test(
        source_path,
        destination_path,
        headers=None,
    )

    db.dbfs.put(
        path,
        contents=None,
        overwrite=None,
        headers=None,
        src_path=None,
    )

    db.dbfs.put_test(
        path,
        contents=None,
        overwrite=None,
        headers=None,
        src_path=None,
    )

    db.dbfs.read(
        path,
        offset=None,
        length=None,
        headers=None,
    )

    db.dbfs.read_test(
        path,
        offset=None,
        length=None,
        headers=None,
    )
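
Note that ``put`` takes ``contents`` (and ``add_block`` takes ``data``) as
base64-encoded strings, and ``read`` returns base64-encoded data. A small
round trip (the path is a placeholder):

.. code-block:: python

    import base64

    payload = base64.b64encode(b"hello, dbfs").decode("utf-8")
    db.dbfs.put(
        "/tmp/example.txt",                 # placeholder path
        contents=payload,
        overwrite=True,
    )

    chunk = db.dbfs.read("/tmp/example.txt")
    print(base64.b64decode(chunk["data"]))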


DatabricksAPI.workspace
-----------------------

.. code-block:: python

    db.workspace.delete(
        path,
        recursive=None,
        headers=None,
    )

    db.workspace.export_workspace(
        path,
        format=None,
        direct_download=None,
        headers=None,
    )

    db.workspace.get_status(
        path,
        headers=None,
    )

    db.workspace.import_workspace(
        path,
        format=None,
        language=None,
        content=None,
        overwrite=None,
        headers=None,
    )

    db.workspace.list(
        path,
        headers=None,
    )

    db.workspace.mkdirs(
        path,
        headers=None,
    )
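
``import_workspace`` also takes base64-encoded ``content``; for example,
importing a one-line Python notebook (the workspace path is a placeholder):

.. code-block:: python

    import base64

    source = b"print('hello from a notebook')"
    db.workspace.import_workspace(
        "/Users/me@example.com/example",    # placeholder path
        format="SOURCE",
        language="PYTHON",
        content=base64.b64encode(source).decode("utf-8"),
        overwrite=True,
    )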


DatabricksAPI.secret
--------------------

.. code-block:: python

    db.secret.create_scope(
        scope,
        initial_manage_principal=None,
        scope_backend_type=None,
        backend_azure_keyvault=None,
        headers=None,
    )

    db.secret.delete_acl(
        scope,
        principal,
        headers=None,
    )

    db.secret.delete_scope(
        scope,
        headers=None,
    )

    db.secret.delete_secret(
        scope,
        key,
        headers=None,
    )

    db.secret.get_acl(
        scope,
        principal,
        headers=None,
    )

    db.secret.list_acls(
        scope,
        headers=None,
    )

    db.secret.list_scopes(headers=None)

    db.secret.list_secrets(
        scope,
        headers=None,
    )

    db.secret.put_acl(
        scope,
        principal,
        permission,
        headers=None,
    )

    db.secret.put_secret(
        scope,
        key,
        string_value=None,
        bytes_value=None,
        headers=None,
    )
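
For example, creating a scope and writing a secret into it (names and
values are placeholders):

.. code-block:: python

    db.secret.create_scope(
        "example-scope",
        initial_manage_principal="users",
    )
    db.secret.put_secret(
        "example-scope",
        "db-password",
        string_value="hunter2",             # placeholder secret
    )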


DatabricksAPI.groups
--------------------

.. code-block:: python

    db.groups.add_to_group(
        parent_name,
        user_name=None,
        group_name=None,
        headers=None,
    )

    db.groups.create_group(
        group_name,
        headers=None,
    )

    db.groups.get_group_members(
        group_name,
        headers=None,
    )

    db.groups.get_groups(headers=None)

    db.groups.get_groups_for_principal(
        user_name=None,
        group_name=None,
        headers=None,
    )

    db.groups.remove_from_group(
        parent_name,
        user_name=None,
        group_name=None,
        headers=None,
    )

    db.groups.remove_group(
        group_name,
        headers=None,
    )
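
For example, creating a group and adding a user to it (names are
placeholders):

.. code-block:: python

    db.groups.create_group("data-engineers")
    db.groups.add_to_group(
        parent_name="data-engineers",
        user_name="me@example.com",
    )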


DatabricksAPI.token
-------------------

.. code-block:: python

    db.token.create_token(
        lifetime_seconds=None,
        comment=None,
        headers=None,
    )

    db.token.list_tokens(headers=None)

    db.token.revoke_token(
        token_id,
        headers=None,
    )
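
For example, minting a short-lived token (the response includes both the
token value and its metadata):

.. code-block:: python

    result = db.token.create_token(
        lifetime_seconds=3600,
        comment="temporary automation token",
    )
    print(result["token_value"], result["token_info"]["token_id"])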


DatabricksAPI.instance_pool
---------------------------

.. code-block:: python

    db.instance_pool.create_instance_pool(
        instance_pool_name=None,
        min_idle_instances=None,
        max_capacity=None,
        aws_attributes=None,
        node_type_id=None,
        custom_tags=None,
        idle_instance_autotermination_minutes=None,
        enable_elastic_disk=None,
        disk_spec=None,
        preloaded_spark_versions=None,
        headers=None,
    )

    db.instance_pool.delete_instance_pool(
        instance_pool_id=None,
        headers=None,
    )

    db.instance_pool.edit_instance_pool(
        instance_pool_id,
        instance_pool_name=None,
        min_idle_instances=None,
        max_capacity=None,
        aws_attributes=None,
        node_type_id=None,
        custom_tags=None,
        idle_instance_autotermination_minutes=None,
        enable_elastic_disk=None,
        disk_spec=None,
        preloaded_spark_versions=None,
        headers=None,
    )

    db.instance_pool.get_instance_pool(
        instance_pool_id=None,
        headers=None,
    )

    db.instance_pool.list_instance_pools(headers=None)
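
For example, a small pool of idle instances (the node type is an AWS-style
placeholder):

.. code-block:: python

    pool = db.instance_pool.create_instance_pool(
        instance_pool_name="example-pool",
        node_type_id="i3.xlarge",           # placeholder
        min_idle_instances=1,
        max_capacity=10,
        idle_instance_autotermination_minutes=15,
    )
    print(pool["instance_pool_id"])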


DatabricksAPI.delta_pipelines
-----------------------------

.. code-block:: python

    db.delta_pipelines.create(
        id=None,
        name=None,
        storage=None,
        configuration=None,
        clusters=None,
        libraries=None,
        trigger=None,
        filters=None,
        allow_duplicate_names=None,
        headers=None,
    )

    db.delta_pipelines.delete(
        pipeline_id=None,
        headers=None,
    )

    db.delta_pipelines.deploy(
        pipeline_id=None,
        id=None,
        name=None,
        storage=None,
        configuration=None,
        clusters=None,
        libraries=None,
        trigger=None,
        filters=None,
        allow_duplicate_names=None,
        headers=None,
    )

    db.delta_pipelines.get(
        pipeline_id=None,
        headers=None,
    )

    db.delta_pipelines.list(
        pagination=None,
        headers=None,
    )

    db.delta_pipelines.reset(
        pipeline_id=None,
        headers=None,
    )

    db.delta_pipelines.run(
        pipeline_id=None,
        headers=None,
    )

    db.delta_pipelines.start_update(
        pipeline_id=None,
        full_refresh=None,
        headers=None,
    )

    db.delta_pipelines.stop(
        pipeline_id=None,
        headers=None,
    )
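
For example, kicking off a full-refresh update of an existing pipeline
(the pipeline id is a placeholder):

.. code-block:: python

    db.delta_pipelines.start_update(
        pipeline_id="0123456789abcdef",     # placeholder
        full_refresh=True,
    )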


DatabricksAPI.repos
-------------------

.. code-block:: python

    db.repos.create_repo(
        url,
        provider,
        path=None,
        headers=None,
    )

    db.repos.delete_repo(
        id,
        headers=None,
    )

    db.repos.get_repo(
        id,
        headers=None,
    )

    db.repos.list_repos(
        path_prefix=None,
        next_page_token=None,
        headers=None,
    )

    db.repos.update_repo(
        id,
        branch=None,
        tag=None,
        headers=None,
    )
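
For example, cloning a repository into the workspace and switching it to a
branch (the URL, path, and branch are placeholders):

.. code-block:: python

    repo = db.repos.create_repo(
        url="https://github.com/example/project.git",  # placeholder
        provider="gitHub",
        path="/Repos/me@example.com/project",          # placeholder
    )
    db.repos.update_repo(repo["id"], branch="main")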



            
