# databricks-jobs
The Jobs API allows you to create, edit, and delete jobs.
You should never hard-code secrets or store them in plain text. Use the [Secrets API](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/secrets) to manage secrets in the [Databricks CLI](https://docs.microsoft.com/azure/databricks/dev-tools/cli/index). Use the [Secrets utility](https://docs.microsoft.com/azure/databricks/dev-tools/databricks-utils#dbutils-secrets) to reference secrets in notebooks and jobs.
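For example, inside a Databricks notebook or job (where the `dbutils` object is predefined), a token stored in a secret scope can be fetched at runtime instead of being written into code; the scope and key names in this sketch are placeholders:
```python
# Works only inside Databricks notebooks and jobs, where `dbutils` is predefined.
# "jobs-client" and "api-token" are placeholder names for a secret scope and key.
api_token = dbutils.secrets.get(scope="jobs-client", key="api-token")
```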
This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:
- API version: 2.1
- Package version: 1.0.2
- Build package: org.openapitools.codegen.languages.PythonNextgenClientCodegen
## Requirements
Python 3.8+
## Installation & Usage
### pip install
The package is published on PyPI as `databricks-jobs` (`pip install databricks-jobs`). You can also install it directly from the GitHub repository:
```sh
pip install git+https://github.com/judahrand/databricks-jobs.git
```
(you may need to run `pip` with root permission: `sudo pip install git+https://github.com/judahrand/databricks-jobs.git`)
Then import the package:
```python
import databricks_jobs
```
### Setuptools
Install via [Setuptools](http://pypi.python.org/pypi/setuptools).
```sh
python setup.py install --user
```
(or `sudo python setup.py install` to install the package for all users)
Then import the package:
```python
import databricks_jobs
```
## Getting Started
Please follow the [installation procedure](#installation--usage) and then run the following:
```python
import os
from pprint import pprint

import databricks_jobs
from databricks_jobs.rest import ApiException

# Defining the host is optional and defaults to https://<databricks-instance>/api
# See configuration.py for a list of all supported configuration parameters.
configuration = databricks_jobs.Configuration(
    host="https://<databricks-instance>/api"
)

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below; use the example that
# satisfies your auth use case.

# Configure Bearer authorization (api_token): bearerAuth
configuration.access_token = os.environ["BEARER_TOKEN"]

# Enter a context with an instance of the API client
with databricks_jobs.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = databricks_jobs.DefaultApi(api_client)
    jobs_create_request = databricks_jobs.JobsCreateRequest()  # JobsCreateRequest

    try:
        # Create a new job
        api_response = api_instance.jobs_create(jobs_create_request)
        print("The response of DefaultApi->jobs_create:\n")
        pprint(api_response)
    except ApiException as e:
        print("Exception when calling DefaultApi->jobs_create: %s\n" % e)
```
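The empty `JobsCreateRequest()` above is only a placeholder. As an illustrative sketch (not the package's documented usage), the snippet below fills it with a minimal single-task notebook job, reusing the `configuration` object created above. It assumes the generated request model exposes a `from_dict` helper, as OpenAPI Generator Python clients typically do, and the notebook path, Spark version, and node type are placeholders to replace with values valid in your workspace.
```python
# Continues the example above: `databricks_jobs` is imported and `configuration`
# holds the host and access token. All field values below are placeholders.
job_spec = {
    "name": "example-notebook-job",
    "max_concurrent_runs": 1,
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Users/someone@example.com/example"},
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 1,
            },
        }
    ],
}

# Assumes JobsCreateRequest.from_dict exists (typical for OpenAPI Generator
# Python clients); otherwise construct the nested model objects directly.
jobs_create_request = databricks_jobs.JobsCreateRequest.from_dict(job_spec)

with databricks_jobs.ApiClient(configuration) as api_client:
    api_instance = databricks_jobs.DefaultApi(api_client)
    created = api_instance.jobs_create(jobs_create_request)
    print("Created job with job_id:", created.job_id)
```
The create call returns a `JobsCreate200Response` carrying the new `job_id`, which the other endpoints (run-now, update, delete) take as input.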
## Documentation for API Endpoints
All URIs are relative to *https://<databricks-instance>/api*
Class | Method | HTTP request | Description
------------ | ------------- | ------------- | -------------
*DefaultApi* | [**jobs_create**](docs/DefaultApi.md#jobs_create) | **POST** /2.1/jobs/create | Create a new job
*DefaultApi* | [**jobs_delete**](docs/DefaultApi.md#jobs_delete) | **POST** /2.1/jobs/delete | Delete a job
*DefaultApi* | [**jobs_get**](docs/DefaultApi.md#jobs_get) | **GET** /2.1/jobs/get | Get a single job
*DefaultApi* | [**jobs_list**](docs/DefaultApi.md#jobs_list) | **GET** /2.1/jobs/list | List all jobs
*DefaultApi* | [**jobs_reset**](docs/DefaultApi.md#jobs_reset) | **POST** /2.1/jobs/reset | Overwrites all settings for a job
*DefaultApi* | [**jobs_run_now**](docs/DefaultApi.md#jobs_run_now) | **POST** /2.1/jobs/run-now | Trigger a new job run
*DefaultApi* | [**jobs_runs_cancel**](docs/DefaultApi.md#jobs_runs_cancel) | **POST** /2.1/jobs/runs/cancel | Cancel a job run
*DefaultApi* | [**jobs_runs_cancel_all**](docs/DefaultApi.md#jobs_runs_cancel_all) | **POST** /2.1/jobs/runs/cancel-all | Cancel all runs of a job
*DefaultApi* | [**jobs_runs_delete**](docs/DefaultApi.md#jobs_runs_delete) | **POST** /2.1/jobs/runs/delete | Delete a job run
*DefaultApi* | [**jobs_runs_export**](docs/DefaultApi.md#jobs_runs_export) | **GET** /2.0/jobs/runs/export | Export and retrieve a job run
*DefaultApi* | [**jobs_runs_get**](docs/DefaultApi.md#jobs_runs_get) | **GET** /2.1/jobs/runs/get | Get a single job run
*DefaultApi* | [**jobs_runs_get_output**](docs/DefaultApi.md#jobs_runs_get_output) | **GET** /2.1/jobs/runs/get-output | Get the output for a single run
*DefaultApi* | [**jobs_runs_list**](docs/DefaultApi.md#jobs_runs_list) | **GET** /2.1/jobs/runs/list | List runs for a job
*DefaultApi* | [**jobs_runs_repair**](docs/DefaultApi.md#jobs_runs_repair) | **POST** /2.1/jobs/runs/repair | Repair a job run
*DefaultApi* | [**jobs_runs_submit**](docs/DefaultApi.md#jobs_runs_submit) | **POST** /2.1/jobs/runs/submit | Create and trigger a one-time run
*DefaultApi* | [**jobs_update**](docs/DefaultApi.md#jobs_update) | **POST** /2.1/jobs/update | Partially updates a job
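As a worked example of chaining these endpoints, the sketch below lists the jobs in a workspace and then triggers one by `job_id`. It assumes the generated response models mirror the Jobs API 2.1 payloads (`jobs`, `settings.name`, `run_id`) and that request models provide `from_dict`; the environment variable name and the `job_id` value are placeholders.
```python
import os

import databricks_jobs

configuration = databricks_jobs.Configuration(host="https://<databricks-instance>/api")
configuration.access_token = os.environ["DATABRICKS_TOKEN"]

with databricks_jobs.ApiClient(configuration) as api_client:
    api_instance = databricks_jobs.DefaultApi(api_client)

    # GET /2.1/jobs/list -- enumerate the jobs in the workspace
    listing = api_instance.jobs_list()
    for job in listing.jobs or []:
        print(job.job_id, job.settings.name)

    # POST /2.1/jobs/run-now -- trigger a run of an existing job (placeholder job_id)
    run_now_request = databricks_jobs.JobsRunNowRequest.from_dict({"job_id": 1234})
    run = api_instance.jobs_run_now(run_now_request)
    print("Started run_id:", run.run_id)
```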
## Documentation For Models
- [AccessControlList](docs/AccessControlList.md)
- [AccessControlRequest](docs/AccessControlRequest.md)
- [AccessControlRequestForGroup](docs/AccessControlRequestForGroup.md)
- [AccessControlRequestForServicePrincipal](docs/AccessControlRequestForServicePrincipal.md)
- [AccessControlRequestForUser](docs/AccessControlRequestForUser.md)
- [Adlsgen2Info](docs/Adlsgen2Info.md)
- [AutoScale](docs/AutoScale.md)
- [AwsAttributes](docs/AwsAttributes.md)
- [AzureAttributes](docs/AzureAttributes.md)
- [CanManage](docs/CanManage.md)
- [CanManageRun](docs/CanManageRun.md)
- [CanView](docs/CanView.md)
- [ClusterAttributes](docs/ClusterAttributes.md)
- [ClusterCloudProviderNodeInfo](docs/ClusterCloudProviderNodeInfo.md)
- [ClusterCloudProviderNodeStatus](docs/ClusterCloudProviderNodeStatus.md)
- [ClusterEvent](docs/ClusterEvent.md)
- [ClusterEventType](docs/ClusterEventType.md)
- [ClusterInfo](docs/ClusterInfo.md)
- [ClusterInstance](docs/ClusterInstance.md)
- [ClusterLibraryStatuses](docs/ClusterLibraryStatuses.md)
- [ClusterLogConf](docs/ClusterLogConf.md)
- [ClusterSize](docs/ClusterSize.md)
- [ClusterSource](docs/ClusterSource.md)
- [ClusterSpec](docs/ClusterSpec.md)
- [ClusterState](docs/ClusterState.md)
- [CronSchedule](docs/CronSchedule.md)
- [DbfsStorageInfo](docs/DbfsStorageInfo.md)
- [DbtOutput](docs/DbtOutput.md)
- [DbtTask](docs/DbtTask.md)
- [DockerBasicAuth](docs/DockerBasicAuth.md)
- [DockerImage](docs/DockerImage.md)
- [Error](docs/Error.md)
- [EventDetails](docs/EventDetails.md)
- [FileStorageInfo](docs/FileStorageInfo.md)
- [GcpAttributes](docs/GcpAttributes.md)
- [GitBranchSource](docs/GitBranchSource.md)
- [GitCommitSource](docs/GitCommitSource.md)
- [GitProvider](docs/GitProvider.md)
- [GitSnapshot](docs/GitSnapshot.md)
- [GitSource](docs/GitSource.md)
- [GitTagSource](docs/GitTagSource.md)
- [InitScriptInfo](docs/InitScriptInfo.md)
- [IsOwner](docs/IsOwner.md)
- [Job](docs/Job.md)
- [JobCluster](docs/JobCluster.md)
- [JobEmailNotifications](docs/JobEmailNotifications.md)
- [JobSettings](docs/JobSettings.md)
- [JobTask](docs/JobTask.md)
- [JobTaskSettings](docs/JobTaskSettings.md)
- [JobsCreate200Response](docs/JobsCreate200Response.md)
- [JobsCreateRequest](docs/JobsCreateRequest.md)
- [JobsDeleteRequest](docs/JobsDeleteRequest.md)
- [JobsGet200Response](docs/JobsGet200Response.md)
- [JobsList200Response](docs/JobsList200Response.md)
- [JobsResetRequest](docs/JobsResetRequest.md)
- [JobsRunNow200Response](docs/JobsRunNow200Response.md)
- [JobsRunNowRequest](docs/JobsRunNowRequest.md)
- [JobsRunsCancelAllRequest](docs/JobsRunsCancelAllRequest.md)
- [JobsRunsCancelRequest](docs/JobsRunsCancelRequest.md)
- [JobsRunsDeleteRequest](docs/JobsRunsDeleteRequest.md)
- [JobsRunsExport200Response](docs/JobsRunsExport200Response.md)
- [JobsRunsGet200Response](docs/JobsRunsGet200Response.md)
- [JobsRunsGetOutput200Response](docs/JobsRunsGetOutput200Response.md)
- [JobsRunsList200Response](docs/JobsRunsList200Response.md)
- [JobsRunsRepair200Response](docs/JobsRunsRepair200Response.md)
- [JobsRunsRepairRequest](docs/JobsRunsRepairRequest.md)
- [JobsRunsSubmit200Response](docs/JobsRunsSubmit200Response.md)
- [JobsRunsSubmitRequest](docs/JobsRunsSubmitRequest.md)
- [JobsUpdateRequest](docs/JobsUpdateRequest.md)
- [Library](docs/Library.md)
- [LibraryFullStatus](docs/LibraryFullStatus.md)
- [LibraryInstallStatus](docs/LibraryInstallStatus.md)
- [ListOrder](docs/ListOrder.md)
- [LogSyncStatus](docs/LogSyncStatus.md)
- [MavenLibrary](docs/MavenLibrary.md)
- [NewCluster](docs/NewCluster.md)
- [NewTaskCluster](docs/NewTaskCluster.md)
- [NodeType](docs/NodeType.md)
- [NotebookOutput](docs/NotebookOutput.md)
- [NotebookTask](docs/NotebookTask.md)
- [PermissionLevel](docs/PermissionLevel.md)
- [PermissionLevelForGroup](docs/PermissionLevelForGroup.md)
- [PipelineTask](docs/PipelineTask.md)
- [PoolClusterTerminationCode](docs/PoolClusterTerminationCode.md)
- [PythonPyPiLibrary](docs/PythonPyPiLibrary.md)
- [PythonWheelTask](docs/PythonWheelTask.md)
- [RCranLibrary](docs/RCranLibrary.md)
- [RepairHistory](docs/RepairHistory.md)
- [RepairHistoryItem](docs/RepairHistoryItem.md)
- [RepairRunInput](docs/RepairRunInput.md)
- [ResizeCause](docs/ResizeCause.md)
- [Run](docs/Run.md)
- [RunLifeCycleState](docs/RunLifeCycleState.md)
- [RunNowInput](docs/RunNowInput.md)
- [RunParameters](docs/RunParameters.md)
- [RunParametersPipelineParams](docs/RunParametersPipelineParams.md)
- [RunResultState](docs/RunResultState.md)
- [RunState](docs/RunState.md)
- [RunSubmitSettings](docs/RunSubmitSettings.md)
- [RunSubmitTaskSettings](docs/RunSubmitTaskSettings.md)
- [RunTask](docs/RunTask.md)
- [RunType](docs/RunType.md)
- [S3StorageInfo](docs/S3StorageInfo.md)
- [SparkJarTask](docs/SparkJarTask.md)
- [SparkNode](docs/SparkNode.md)
- [SparkNodeAwsAttributes](docs/SparkNodeAwsAttributes.md)
- [SparkPythonTask](docs/SparkPythonTask.md)
- [SparkSubmitTask](docs/SparkSubmitTask.md)
- [SparkVersion](docs/SparkVersion.md)
- [SqlAlertOutput](docs/SqlAlertOutput.md)
- [SqlDashboardOutput](docs/SqlDashboardOutput.md)
- [SqlDashboardWidgetOutput](docs/SqlDashboardWidgetOutput.md)
- [SqlOutput](docs/SqlOutput.md)
- [SqlOutputError](docs/SqlOutputError.md)
- [SqlQueryOutput](docs/SqlQueryOutput.md)
- [SqlStatementOutput](docs/SqlStatementOutput.md)
- [SqlTask](docs/SqlTask.md)
- [SqlTaskAlert](docs/SqlTaskAlert.md)
- [SqlTaskDashboard](docs/SqlTaskDashboard.md)
- [SqlTaskQuery](docs/SqlTaskQuery.md)
- [TaskDependenciesInner](docs/TaskDependenciesInner.md)
- [TaskSparkSubmitTask](docs/TaskSparkSubmitTask.md)
- [TerminationCode](docs/TerminationCode.md)
- [TerminationParameter](docs/TerminationParameter.md)
- [TerminationReason](docs/TerminationReason.md)
- [TerminationType](docs/TerminationType.md)
- [TriggerType](docs/TriggerType.md)
- [ViewItem](docs/ViewItem.md)
- [ViewType](docs/ViewType.md)
- [ViewsToExport](docs/ViewsToExport.md)
- [WebhookNotifications](docs/WebhookNotifications.md)
- [WebhookNotificationsOnStartInner](docs/WebhookNotificationsOnStartInner.md)
## Documentation For Authorization
### bearerAuth
- **Type**: Bearer authentication (api_token)
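Following the constructor arguments used in the Getting Started example, here is a minimal sketch of configuring bearer authentication with a Databricks personal access token read from an environment variable (the variable name is arbitrary); the client sends the token in the `Authorization: Bearer` header:
```python
import os

import databricks_jobs

# DATABRICKS_TOKEN is an arbitrary environment variable name chosen for this sketch.
configuration = databricks_jobs.Configuration(
    host="https://<databricks-instance>/api",
    access_token=os.environ["DATABRICKS_TOKEN"],
)
```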
## Author
Judah Rand

## Raw data
```json
{
"_id": null,
"home_page": "https://github.com/judahrand/databricks-jobs",
"name": "databricks-jobs",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "",
"keywords": "OpenAPI,OpenAPI-Generator,Databricks,Jobs API 2.1",
"author": "Judah Rand",
"author_email": "17158624+judahrand@users.noreply.github.com",
"download_url": "https://files.pythonhosted.org/packages/72/4e/af35cbe9e084cbc25cc535bb0cc14b76fcadf37dc8e117133ef0aba51aa7/databricks-jobs-1.0.2.tar.gz",
"platform": null,
"description": "# databricks-jobs\nThe Jobs API allows you to create, edit, and delete jobs.\nYou should never hard code secrets or store them in plain text. Use the [Secrets API](https://docs.microsoft.com/azure/databricks/dev-tools/api/latest/secrets) to manage secrets in the [Databricks CLI](https://docs.microsoft.com/azure/databricks/dev-tools/cli/index). Use the [Secrets utility](https://docs.microsoft.com/azure/databricks/dev-tools/databricks-utils#dbutils-secrets) to reference secrets in notebooks and jobs.\n\nThis Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:\n\n- API version: 2.1\n- Package version: 1.0.0\n- Build package: org.openapitools.codegen.languages.PythonNextgenClientCodegen\n\n## Requirements.\n\nPython 3.7+\n\n## Installation & Usage\n### pip install\n\nIf the python package is hosted on a repository, you can install directly using:\n\n```sh\npip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git\n```\n(you may need to run `pip` with root permission: `sudo pip install git+https://github.com/GIT_USER_ID/GIT_REPO_ID.git`)\n\nThen import the package:\n```python\nimport databricks_jobs\n```\n\n### Setuptools\n\nInstall via [Setuptools](http://pypi.python.org/pypi/setuptools).\n\n```sh\npython setup.py install --user\n```\n(or `sudo python setup.py install` to install the package for all users)\n\nThen import the package:\n```python\nimport databricks_jobs\n```\n\n## Getting Started\n\nPlease follow the [installation procedure](#installation--usage) and then run the following:\n\n```python\nfrom __future__ import print_function\n\nimport time\nimport databricks_jobs\nfrom databricks_jobs.rest import ApiException\nfrom pprint import pprint\n\n# Defining the host is optional and defaults to https://<databricks-instance>/api\n# See configuration.py for a list of all supported configuration parameters.\nconfiguration = databricks_jobs.Configuration(\n host = \"https://<databricks-instance>/api\"\n)\n\n# The client must configure the authentication and authorization parameters\n# in accordance with the API server security policy.\n# Examples for each auth method are provided below, use the example that\n# satisfies your auth use case.\n\n# Configure Bearer authorization (api_token): bearerAuth\nconfiguration = databricks_jobs.Configuration(\n access_token = os.environ[\"BEARER_TOKEN\"]\n)\n\n\n# Enter a context with an instance of the API client\nwith databricks_jobs.ApiClient(configuration) as api_client:\n # Create an instance of the API class\n api_instance = databricks_jobs.DefaultApi(api_client)\n jobs_create_request = databricks_jobs.JobsCreateRequest() # JobsCreateRequest | \n\n try:\n # Create a new job\n api_response = api_instance.jobs_create(jobs_create_request)\n print(\"The response of DefaultApi->jobs_create:\\n\")\n pprint(api_response)\n except ApiException as e:\n print(\"Exception when calling DefaultApi->jobs_create: %s\\n\" % e)\n\n```\n\n## Documentation for API Endpoints\n\nAll URIs are relative to *https://<databricks-instance>/api*\n\nClass | Method | HTTP request | Description\n------------ | ------------- | ------------- | -------------\n*DefaultApi* | [**jobs_create**](docs/DefaultApi.md#jobs_create) | **POST** /2.1/jobs/create | Create a new job\n*DefaultApi* | [**jobs_delete**](docs/DefaultApi.md#jobs_delete) | **POST** /2.1/jobs/delete | Delete a job\n*DefaultApi* | [**jobs_get**](docs/DefaultApi.md#jobs_get) | **GET** /2.1/jobs/get | Get a single job\n*DefaultApi* | 
[**jobs_list**](docs/DefaultApi.md#jobs_list) | **GET** /2.1/jobs/list | List all jobs\n*DefaultApi* | [**jobs_reset**](docs/DefaultApi.md#jobs_reset) | **POST** /2.1/jobs/reset | Overwrites all settings for a job\n*DefaultApi* | [**jobs_run_now**](docs/DefaultApi.md#jobs_run_now) | **POST** /2.1/jobs/run-now | Trigger a new job run\n*DefaultApi* | [**jobs_runs_cancel**](docs/DefaultApi.md#jobs_runs_cancel) | **POST** /2.1/jobs/runs/cancel | Cancel a job run\n*DefaultApi* | [**jobs_runs_cancel_all**](docs/DefaultApi.md#jobs_runs_cancel_all) | **POST** /2.1/jobs/runs/cancel-all | Cancel all runs of a job\n*DefaultApi* | [**jobs_runs_delete**](docs/DefaultApi.md#jobs_runs_delete) | **POST** /2.1/jobs/runs/delete | Delete a job run\n*DefaultApi* | [**jobs_runs_export**](docs/DefaultApi.md#jobs_runs_export) | **GET** /2.0/jobs/runs/export | Export and retrieve a job run\n*DefaultApi* | [**jobs_runs_get**](docs/DefaultApi.md#jobs_runs_get) | **GET** /2.1/jobs/runs/get | Get a single job run\n*DefaultApi* | [**jobs_runs_get_output**](docs/DefaultApi.md#jobs_runs_get_output) | **GET** /2.1/jobs/runs/get-output | Get the output for a single run\n*DefaultApi* | [**jobs_runs_list**](docs/DefaultApi.md#jobs_runs_list) | **GET** /2.1/jobs/runs/list | List runs for a job\n*DefaultApi* | [**jobs_runs_repair**](docs/DefaultApi.md#jobs_runs_repair) | **POST** /2.1/jobs/runs/repair | Repair a job run\n*DefaultApi* | [**jobs_runs_submit**](docs/DefaultApi.md#jobs_runs_submit) | **POST** /2.1/jobs/runs/submit | Create and trigger a one-time run\n*DefaultApi* | [**jobs_update**](docs/DefaultApi.md#jobs_update) | **POST** /2.1/jobs/update | Partially updates a job\n\n\n## Documentation For Models\n\n - [AccessControlList](docs/AccessControlList.md)\n - [AccessControlRequest](docs/AccessControlRequest.md)\n - [AccessControlRequestForGroup](docs/AccessControlRequestForGroup.md)\n - [AccessControlRequestForServicePrincipal](docs/AccessControlRequestForServicePrincipal.md)\n - [AccessControlRequestForUser](docs/AccessControlRequestForUser.md)\n - [Adlsgen2Info](docs/Adlsgen2Info.md)\n - [AutoScale](docs/AutoScale.md)\n - [AwsAttributes](docs/AwsAttributes.md)\n - [AzureAttributes](docs/AzureAttributes.md)\n - [CanManage](docs/CanManage.md)\n - [CanManageRun](docs/CanManageRun.md)\n - [CanView](docs/CanView.md)\n - [ClusterAttributes](docs/ClusterAttributes.md)\n - [ClusterCloudProviderNodeInfo](docs/ClusterCloudProviderNodeInfo.md)\n - [ClusterCloudProviderNodeStatus](docs/ClusterCloudProviderNodeStatus.md)\n - [ClusterEvent](docs/ClusterEvent.md)\n - [ClusterEventType](docs/ClusterEventType.md)\n - [ClusterInfo](docs/ClusterInfo.md)\n - [ClusterInstance](docs/ClusterInstance.md)\n - [ClusterLibraryStatuses](docs/ClusterLibraryStatuses.md)\n - [ClusterLogConf](docs/ClusterLogConf.md)\n - [ClusterSize](docs/ClusterSize.md)\n - [ClusterSource](docs/ClusterSource.md)\n - [ClusterSpec](docs/ClusterSpec.md)\n - [ClusterState](docs/ClusterState.md)\n - [CronSchedule](docs/CronSchedule.md)\n - [DbfsStorageInfo](docs/DbfsStorageInfo.md)\n - [DbtOutput](docs/DbtOutput.md)\n - [DbtTask](docs/DbtTask.md)\n - [DockerBasicAuth](docs/DockerBasicAuth.md)\n - [DockerImage](docs/DockerImage.md)\n - [Error](docs/Error.md)\n - [EventDetails](docs/EventDetails.md)\n - [FileStorageInfo](docs/FileStorageInfo.md)\n - [GcpAttributes](docs/GcpAttributes.md)\n - [GitBranchSource](docs/GitBranchSource.md)\n - [GitCommitSource](docs/GitCommitSource.md)\n - [GitProvider](docs/GitProvider.md)\n - [GitSnapshot](docs/GitSnapshot.md)\n - 
[GitSource](docs/GitSource.md)\n - [GitTagSource](docs/GitTagSource.md)\n - [InitScriptInfo](docs/InitScriptInfo.md)\n - [IsOwner](docs/IsOwner.md)\n - [Job](docs/Job.md)\n - [JobCluster](docs/JobCluster.md)\n - [JobEmailNotifications](docs/JobEmailNotifications.md)\n - [JobSettings](docs/JobSettings.md)\n - [JobTask](docs/JobTask.md)\n - [JobTaskSettings](docs/JobTaskSettings.md)\n - [JobsCreate200Response](docs/JobsCreate200Response.md)\n - [JobsCreateRequest](docs/JobsCreateRequest.md)\n - [JobsDeleteRequest](docs/JobsDeleteRequest.md)\n - [JobsGet200Response](docs/JobsGet200Response.md)\n - [JobsList200Response](docs/JobsList200Response.md)\n - [JobsResetRequest](docs/JobsResetRequest.md)\n - [JobsRunNow200Response](docs/JobsRunNow200Response.md)\n - [JobsRunNowRequest](docs/JobsRunNowRequest.md)\n - [JobsRunsCancelAllRequest](docs/JobsRunsCancelAllRequest.md)\n - [JobsRunsCancelRequest](docs/JobsRunsCancelRequest.md)\n - [JobsRunsDeleteRequest](docs/JobsRunsDeleteRequest.md)\n - [JobsRunsExport200Response](docs/JobsRunsExport200Response.md)\n - [JobsRunsGet200Response](docs/JobsRunsGet200Response.md)\n - [JobsRunsGetOutput200Response](docs/JobsRunsGetOutput200Response.md)\n - [JobsRunsList200Response](docs/JobsRunsList200Response.md)\n - [JobsRunsRepair200Response](docs/JobsRunsRepair200Response.md)\n - [JobsRunsRepairRequest](docs/JobsRunsRepairRequest.md)\n - [JobsRunsSubmit200Response](docs/JobsRunsSubmit200Response.md)\n - [JobsRunsSubmitRequest](docs/JobsRunsSubmitRequest.md)\n - [JobsUpdateRequest](docs/JobsUpdateRequest.md)\n - [Library](docs/Library.md)\n - [LibraryFullStatus](docs/LibraryFullStatus.md)\n - [LibraryInstallStatus](docs/LibraryInstallStatus.md)\n - [ListOrder](docs/ListOrder.md)\n - [LogSyncStatus](docs/LogSyncStatus.md)\n - [MavenLibrary](docs/MavenLibrary.md)\n - [NewCluster](docs/NewCluster.md)\n - [NewTaskCluster](docs/NewTaskCluster.md)\n - [NodeType](docs/NodeType.md)\n - [NotebookOutput](docs/NotebookOutput.md)\n - [NotebookTask](docs/NotebookTask.md)\n - [PermissionLevel](docs/PermissionLevel.md)\n - [PermissionLevelForGroup](docs/PermissionLevelForGroup.md)\n - [PipelineTask](docs/PipelineTask.md)\n - [PoolClusterTerminationCode](docs/PoolClusterTerminationCode.md)\n - [PythonPyPiLibrary](docs/PythonPyPiLibrary.md)\n - [PythonWheelTask](docs/PythonWheelTask.md)\n - [RCranLibrary](docs/RCranLibrary.md)\n - [RepairHistory](docs/RepairHistory.md)\n - [RepairHistoryItem](docs/RepairHistoryItem.md)\n - [RepairRunInput](docs/RepairRunInput.md)\n - [ResizeCause](docs/ResizeCause.md)\n - [Run](docs/Run.md)\n - [RunLifeCycleState](docs/RunLifeCycleState.md)\n - [RunNowInput](docs/RunNowInput.md)\n - [RunParameters](docs/RunParameters.md)\n - [RunParametersPipelineParams](docs/RunParametersPipelineParams.md)\n - [RunResultState](docs/RunResultState.md)\n - [RunState](docs/RunState.md)\n - [RunSubmitSettings](docs/RunSubmitSettings.md)\n - [RunSubmitTaskSettings](docs/RunSubmitTaskSettings.md)\n - [RunTask](docs/RunTask.md)\n - [RunType](docs/RunType.md)\n - [S3StorageInfo](docs/S3StorageInfo.md)\n - [SparkJarTask](docs/SparkJarTask.md)\n - [SparkNode](docs/SparkNode.md)\n - [SparkNodeAwsAttributes](docs/SparkNodeAwsAttributes.md)\n - [SparkPythonTask](docs/SparkPythonTask.md)\n - [SparkSubmitTask](docs/SparkSubmitTask.md)\n - [SparkVersion](docs/SparkVersion.md)\n - [SqlAlertOutput](docs/SqlAlertOutput.md)\n - [SqlDashboardOutput](docs/SqlDashboardOutput.md)\n - [SqlDashboardWidgetOutput](docs/SqlDashboardWidgetOutput.md)\n - [SqlOutput](docs/SqlOutput.md)\n - 
[SqlOutputError](docs/SqlOutputError.md)\n - [SqlQueryOutput](docs/SqlQueryOutput.md)\n - [SqlStatementOutput](docs/SqlStatementOutput.md)\n - [SqlTask](docs/SqlTask.md)\n - [SqlTaskAlert](docs/SqlTaskAlert.md)\n - [SqlTaskDashboard](docs/SqlTaskDashboard.md)\n - [SqlTaskQuery](docs/SqlTaskQuery.md)\n - [TaskDependenciesInner](docs/TaskDependenciesInner.md)\n - [TaskSparkSubmitTask](docs/TaskSparkSubmitTask.md)\n - [TerminationCode](docs/TerminationCode.md)\n - [TerminationParameter](docs/TerminationParameter.md)\n - [TerminationReason](docs/TerminationReason.md)\n - [TerminationType](docs/TerminationType.md)\n - [TriggerType](docs/TriggerType.md)\n - [ViewItem](docs/ViewItem.md)\n - [ViewType](docs/ViewType.md)\n - [ViewsToExport](docs/ViewsToExport.md)\n - [WebhookNotifications](docs/WebhookNotifications.md)\n - [WebhookNotificationsOnStartInner](docs/WebhookNotificationsOnStartInner.md)\n\n\n## Documentation For Authorization\n\n\n## bearerAuth\n\n- **Type**: Bearer authentication (api_token)\n\n\n## Author\n\n\n\n\n",
"bugtrack_url": null,
"license": "",
"summary": "Databricks Jobs API 2.1 Client",
"version": "1.0.2",
"split_keywords": [
"openapi",
"openapi-generator",
"databricks",
"jobs api 2.1"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "0a09220d55df92dc69d6e5252c9cb8729a2d49a1571b194a54eea60e49dd4506",
"md5": "3159b487e59fd8253f530921abc85549",
"sha256": "6ed0a2deec5f0a9c13908f3e199dfbe843fdb53755de5645f5bd9963401be525"
},
"downloads": -1,
"filename": "databricks_jobs-1.0.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "3159b487e59fd8253f530921abc85549",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 227188,
"upload_time": "2023-03-20T14:49:33",
"upload_time_iso_8601": "2023-03-20T14:49:33.722360Z",
"url": "https://files.pythonhosted.org/packages/0a/09/220d55df92dc69d6e5252c9cb8729a2d49a1571b194a54eea60e49dd4506/databricks_jobs-1.0.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "724eaf35cbe9e084cbc25cc535bb0cc14b76fcadf37dc8e117133ef0aba51aa7",
"md5": "4aad6bc3e7aaf0c3f78d47019982c72a",
"sha256": "f9b4cd7089ce18a6710220383878048d5bd10bd70e98bf5318f5ba619a134b9f"
},
"downloads": -1,
"filename": "databricks-jobs-1.0.2.tar.gz",
"has_sig": false,
"md5_digest": "4aad6bc3e7aaf0c3f78d47019982c72a",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 88261,
"upload_time": "2023-03-20T14:49:35",
"upload_time_iso_8601": "2023-03-20T14:49:35.626081Z",
"url": "https://files.pythonhosted.org/packages/72/4e/af35cbe9e084cbc25cc535bb0cc14b76fcadf37dc8e117133ef0aba51aa7/databricks-jobs-1.0.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-03-20 14:49:35",
"github": true,
"gitlab": false,
"bitbucket": false,
"github_user": "judahrand",
"github_project": "databricks-jobs",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"requirements": [],
"tox": true,
"lcname": "databricks-jobs"
}
```