# PySpark Connectors - by: Eleflow BigData
[![Build and Publish on PyPI](https://github.com/eleflow/pyspark-connectors/actions/workflows/python-publish.yml/badge.svg)](https://github.com/eleflow/pyspark-connectors/actions/workflows/python-publish.yml)
This library provides connectors and integrations for PySpark with a variety of external data sources.
## Index
- [PySpark Connectors - by: Eleflow BigData](#pyspark-connectors---by-eleflow-bigdata)
  - [Index](#index)
  - [Installing](#installing)
  - [Development environment](#development-environment)
    - [Packaging project in a .whl lib](#packaging-project-in-a-whl-lib)
  - [Library development status](#library-development-status)
    - [Connectors](#connectors)
    - [Helpers and Utils](#helpers-and-utils)
  - [Version history](#version-history)
## Installing
```bash
pip install pyspark-connectors
```
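After installing, you can confirm which version is present by querying the distribution metadata with the standard library (a minimal sketch; `pyspark-connectors` is the PyPI distribution name):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(dist_name="pyspark-connectors"):
    """Return the installed version string, or None if the distribution is absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

print(installed_version() or "pyspark-connectors is not installed")
```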
## Development environment
To develop, you must have Python (3.8 or higher) and Spark (3.1.2 or higher) installed. Once the minimum Python development environment is ready, proceed with these steps:
```bash
# Cloning the project
$ git clone git@github.com:eleflow/pyspark-connectors.git

# Inside the project root folder
$ python -m venv .env

# On Windows (PowerShell)
$ .\.env\Scripts\Activate.ps1
# On a Linux distribution
$ source .env/bin/activate

# Installing the required libraries
(.env) $ pip install -r requirements.txt
```
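Before creating the virtual environment, a quick interpreter check against the documented minimum (Python 3.8 or higher) can save a failed setup later. A minimal sketch:

```python
import sys

MIN_PYTHON = (3, 8)

def meets_minimum(version_info=sys.version_info, minimum=MIN_PYTHON):
    """Return True when the interpreter is at or above the required version."""
    return tuple(version_info[:2]) >= minimum

if not meets_minimum():
    raise SystemExit(f"Python {MIN_PYTHON[0]}.{MIN_PYTHON[1]}+ is required")
```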
### Packaging project in a .whl lib
```bash
# Installing the wheel package
(.env) $ pip install wheel
# Installing check-wheel-contents
(.env) $ pip install check-wheel-contents
# Building and packaging the project into a .whl
(.env) $ python setup.py bdist_wheel
```
## Library development status
### Connectors
- [x] Google Sheets
- [x] Rest API
- [x] SQL Database
- [x] CosmosDB
- [ ] Elasticsearch
- [x] PipeDrive
- [ ] ActiveCampaign
- [ ] ReclameAqui
- [ ] Jira
### Helpers and Utils
- [ ] AWS Secrets Manager
## Version history
| Version | Date | Changes | Notes | Approved by |
| --- | --- | --- | --- | --- |
| 0.0.1a2 | 2022-05-08 | Initial development release | N/A | [@caiodearaujo](https://github.com/caiodearaujo) |
| 0.1.0 | 2022-06-01 | Initial release | N/A | [@caiodearaujo](https://github.com/caiodearaujo) |
| 0.2.0 | 2022-07-28 | New release with stable connectors | N/A | [@caiodearaujo](https://github.com/caiodearaujo) |
| 0.3.0 | 2024-06-10 | Fix - Pipedrive Person Service | N/A | [@cehira](https://github.com/cehira) |