apache-airflow-providers-databricks

Name: apache-airflow-providers-databricks
Version: 6.13.0
Summary: Provider package apache-airflow-providers-databricks for Apache Airflow
Upload time: 2024-11-18 08:57:46
Requires Python: ~=3.9
Keywords: airflow-provider, databricks, airflow, integration

.. Licensed to the Apache Software Foundation (ASF) under one
   or more contributor license agreements.  See the NOTICE file
   distributed with this work for additional information
   regarding copyright ownership.  The ASF licenses this file
   to you under the Apache License, Version 2.0 (the
   "License"); you may not use this file except in compliance
   with the License.  You may obtain a copy of the License at

..   http://www.apache.org/licenses/LICENSE-2.0

.. Unless required by applicable law or agreed to in writing,
   software distributed under the License is distributed on an
   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
   KIND, either express or implied.  See the License for the
   specific language governing permissions and limitations
   under the License.


 .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE
    OVERWRITTEN WHEN PREPARING PACKAGES.

 .. IF YOU WANT TO MODIFY TEMPLATE FOR THIS FILE, YOU SHOULD MODIFY THE TEMPLATE
    `PROVIDER_README_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY


Package ``apache-airflow-providers-databricks``

Release: ``6.13.0``


`Databricks <https://databricks.com/>`__


Provider package
----------------

This is the provider package for the ``databricks`` provider. All classes for this provider
are in the ``airflow.providers.databricks`` Python package.

You can find package information and changelog for the provider
in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-databricks/6.13.0/>`_.

Installation
------------

You can install this package on top of an existing Airflow 2 installation (see ``Requirements`` below
for the minimum Airflow version supported) via
``pip install apache-airflow-providers-databricks``

The package supports the following Python versions: 3.9, 3.10, 3.11, 3.12
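
For reproducible deployments you may want to pin the provider to a specific release, optionally
pulling in the ``common.sql`` extra in the same step. The version below is this release's and is
shown only as an example:

.. code-block:: bash

    # Pin this provider to a specific release (version shown as an example)
    pip install "apache-airflow-providers-databricks==6.13.0"

    # Install together with the common.sql cross-provider extra
    pip install "apache-airflow-providers-databricks[common.sql]==6.13.0"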

Requirements
------------

=======================================  =========================================
PIP package                              Version required
=======================================  =========================================
``apache-airflow``                       ``>=2.8.0``
``apache-airflow-providers-common-sql``  ``>=1.20.0``
``requests``                             ``>=2.27.0,<3``
``databricks-sql-connector``             ``>=2.0.0,!=2.9.0,<3.0.0``
``aiohttp``                              ``>=3.9.2,<4``
``mergedeep``                            ``>=1.3.4``
``pandas``                               ``>=2.1.2,<2.2; python_version >= "3.9"``
``pandas``                               ``>=1.5.3,<2.2; python_version < "3.9"``
``pyarrow``                              ``>=14.0.1``
=======================================  =========================================
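
To verify at runtime that the dependencies above are present, you can query installed
distribution versions with the standard library. A minimal stdlib-only sketch, with the
distribution names taken from the table above:

.. code-block:: python

    from importlib.metadata import PackageNotFoundError, version
    from typing import Optional

    def installed_version(dist_name: str) -> Optional[str]:
        """Return the installed version of a distribution, or None if it is absent."""
        try:
            return version(dist_name)
        except PackageNotFoundError:
            return None

    # Distribution names from the requirements table above.
    for name in ("apache-airflow-providers-databricks", "requests", "pyarrow"):
        print(name, installed_version(name))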

Cross provider package dependencies
-----------------------------------

These are dependencies that may be needed in order to use all the features of the package.
You need to install the listed provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

.. code-block:: bash

    pip install apache-airflow-providers-databricks[common.sql]


============================================================================================================  ==============
Dependent package                                                                                             Extra
============================================================================================================  ==============
`apache-airflow-providers-common-sql <https://airflow.apache.org/docs/apache-airflow-providers-common-sql>`_  ``common.sql``
============================================================================================================  ==============

The changelog for the provider package can be found in the
`changelog <https://airflow.apache.org/docs/apache-airflow-providers-databricks/6.13.0/changelog.html>`_.
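
Operators from this package, for example ``DatabricksSubmitRunOperator`` in
``airflow.providers.databricks.operators.databricks``, accept a ``json`` payload that mirrors
the Databricks Runs Submit API. The sketch below shows only the payload shape; the cluster
spec, run name, and notebook path are hypothetical values, not defaults from this provider:

.. code-block:: python

    # Shape of a Runs Submit-style payload, as passed to the ``json``
    # argument of DatabricksSubmitRunOperator (all values hypothetical).
    payload = {
        "run_name": "example-databricks-run",
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",  # hypothetical Databricks runtime
            "node_type_id": "i3.xlarge",          # hypothetical node type
            "num_workers": 2,
        },
        "notebook_task": {
            "notebook_path": "/Users/example/example-notebook",
        },
    }

In a DAG such a payload would be passed as
``DatabricksSubmitRunOperator(task_id="submit_run", json=payload)``, with credentials resolved
through an Airflow connection (``databricks_default`` by default).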

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "apache-airflow-providers-databricks",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "~=3.9",
    "maintainer_email": "Apache Software Foundation <dev@airflow.apache.org>",
    "keywords": "airflow-provider, databricks, airflow, integration",
    "author": null,
    "author_email": "Apache Software Foundation <dev@airflow.apache.org>",
    "download_url": "https://files.pythonhosted.org/packages/21/de/5c2c6b4c98df4a2d2c1dc64d54e9eb80cda503fd8635e11ad81ee0c757a3/apache_airflow_providers_databricks-6.13.0.tar.gz",
    "platform": null,
    "bugtrack_url": null,
    "license": null,
    "summary": "Provider package apache-airflow-providers-databricks for Apache Airflow",
    "version": "6.13.0",
    "project_urls": {
        "Bug Tracker": "https://github.com/apache/airflow/issues",
        "Changelog": "https://airflow.apache.org/docs/apache-airflow-providers-databricks/6.13.0/changelog.html",
        "Documentation": "https://airflow.apache.org/docs/apache-airflow-providers-databricks/6.13.0",
        "Slack Chat": "https://s.apache.org/airflow-slack",
        "Source Code": "https://github.com/apache/airflow",
        "Twitter": "https://twitter.com/ApacheAirflow",
        "YouTube": "https://www.youtube.com/channel/UCSXwxpWZQ7XZ1WL3wqevChA/"
    },
    "split_keywords": [
        "airflow-provider",
        " databricks",
        " airflow",
        " integration"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "bb3604f38f313eaf48a21e29c0a4e5ebfe596b7b468961c215986663311e3f23",
                "md5": "7d148bfa3ad28f73ea2d5bebf537ba8f",
                "sha256": "dfe78f1a6efcb7abe5e32671eeb78a1a478f9625091497df586cf759456a4edc"
            },
            "downloads": -1,
            "filename": "apache_airflow_providers_databricks-6.13.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "7d148bfa3ad28f73ea2d5bebf537ba8f",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "~=3.9",
            "size": 71086,
            "upload_time": "2024-11-18T08:56:29",
            "upload_time_iso_8601": "2024-11-18T08:56:29.358374Z",
            "url": "https://files.pythonhosted.org/packages/bb/36/04f38f313eaf48a21e29c0a4e5ebfe596b7b468961c215986663311e3f23/apache_airflow_providers_databricks-6.13.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "21de5c2c6b4c98df4a2d2c1dc64d54e9eb80cda503fd8635e11ad81ee0c757a3",
                "md5": "b20613b706812880ac71272a156f84fe",
                "sha256": "e410ba5e5e1cc00aaa3304ff328ba96beebbfd9e85e1a596c9e2b643092baba9"
            },
            "downloads": -1,
            "filename": "apache_airflow_providers_databricks-6.13.0.tar.gz",
            "has_sig": false,
            "md5_digest": "b20613b706812880ac71272a156f84fe",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "~=3.9",
            "size": 52690,
            "upload_time": "2024-11-18T08:57:46",
            "upload_time_iso_8601": "2024-11-18T08:57:46.884768Z",
            "url": "https://files.pythonhosted.org/packages/21/de/5c2c6b4c98df4a2d2c1dc64d54e9eb80cda503fd8635e11ad81ee0c757a3/apache_airflow_providers_databricks-6.13.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-11-18 08:57:46",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "apache",
    "github_project": "airflow",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "apache-airflow-providers-databricks"
}
        