sparksnake

Name: sparksnake
Version: 0.2.2
Home page: https://github.com/ThiagoPanini/sparksnake
Summary: Improving the development of Spark applications deployed as jobs on AWS services like Glue and EMR
Upload time: 2023-07-15 19:17:17
Docs URL: None
Author: Thiago Panini
Requires Python: >=3.0.0
License: MIT
Keywords: cloud, aws, python, spark, pyspark
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.

<div align="center">
    <br><img src="https://github.com/ThiagoPanini/snakespark/blob/main/docs/assets/imgs/header-readme.png?raw=true" alt="snakespark-logo">
</div>

<div align="center">  
  <br>
  
  [![PyPI](https://img.shields.io/pypi/v/sparksnake?color=purple)](https://pypi.org/project/sparksnake/)
  ![PyPI - Downloads](https://img.shields.io/pypi/dm/sparksnake?color=purple)
  ![PyPI - Status](https://img.shields.io/pypi/status/sparksnake?color=purple)
  <br>

  ![CI workflow](https://img.shields.io/github/actions/workflow/status/ThiagoPanini/sparksnake/ci-main.yml?label=ci)
  [![Documentation Status](https://readthedocs.org/projects/sparksnake/badge/?version=latest)](https://sparksnake.readthedocs.io/en/latest/?badge=latest)
  [![codecov](https://codecov.io/gh/ThiagoPanini/sparksnake/branch/main/graph/badge.svg?token=zSdFO9jkD8)](https://codecov.io/gh/ThiagoPanini/sparksnake)

</div>

## Table of contents

- [Table of contents](#table-of-contents)
- [What is the sparksnake library?](#what-is-the-sparksnake-library)
- [Features](#features)
- [Contact me](#contact-me)
- [References](#references)


## What is the sparksnake library?

The *sparksnake* library provides an easy, fast, and efficient way to use Spark features inside AWS analytics services. With *sparksnake*, you can use classes and methods built with PySpark that simplify, as much as possible, the journey of building Spark applications anywhere!

> **Note**
> The *sparksnake* library now has official documentation on Read the Docs! Visit the [following link](https://sparksnake.readthedocs.io/en/latest/) to check out technical usage details, hands-on demos, and more!
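
To make this concrete, the snippet below shows the kind of plain-PySpark boilerplate that a helper library like *sparksnake* aims to reduce. It is not the *sparksnake* API itself (see the documentation linked above for that); the app name, bucket path, table, and column names are purely illustrative.

```python
# Illustration only: plain-PySpark setup code that tends to be repeated
# across Spark jobs. This is NOT the sparksnake API; all names below are
# hypothetical placeholders.
from pyspark.sql import SparkSession

# Manually creating the session for every job
spark = (
    SparkSession.builder
    .appName("my-etl-job")  # illustrative job name
    .getOrCreate()
)

# Hypothetical source path; replace with your own data location
df_orders = spark.read.parquet("s3://my-bucket/raw/orders/")
df_orders.createOrReplaceTempView("orders")

# A simple transformation expressed as Spark SQL
result = spark.sql(
    "SELECT order_id, order_value FROM orders WHERE order_value > 100"
)
result.show(5)
```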


## Features

- 🤖 Apply common Spark operations with just a few lines of code
- 💻 Develop your Spark applications anywhere, whether in "default" mode on your own machine or on any AWS service that runs Spark
- ⏳ Stop spending time setting up the boring parts of your Spark applications
- 💡 Structure your application code following best practices
- 👁️‍🗨️ Improve your application's observability with detailed log messages on CloudWatch and exception handlers (see the sketch after this list)
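
As a rough illustration of the observability point above, here is a generic logging and exception-handling pattern for a Spark job whose standard output ends up in CloudWatch (as it does on Glue and EMR). This is not *sparksnake*'s built-in logger; the logger name, function, and path are assumptions made for the sketch.

```python
# Generic logging/exception-handling pattern for a Spark job whose stdout is
# shipped to CloudWatch. NOT sparksnake's built-in logger; names are illustrative.
import logging
import sys

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s | %(levelname)s | %(name)s: %(message)s",
    stream=sys.stdout,
)
logger = logging.getLogger("my-spark-job")  # hypothetical logger name


def read_source(spark, path: str):
    """Reads a Parquet source, logging a detailed message on failure."""
    logger.info("Reading source data from %s", path)
    try:
        return spark.read.parquet(path)
    except Exception:
        # logger.exception also records the traceback for CloudWatch
        logger.exception("Failed to read source data from %s", path)
        raise
```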

___

## Contact me

- [Thiago Panini - LinkedIn](https://www.linkedin.com/in/thiago-panini/)
- [paninitechlab @ hashnode](https://panini.hashnode.dev/)

___

## References

**Python**

- [Python - Packaging Python Projects](https://packaging.python.org/en/latest/tutorials/packaging-projects/)

**Docs**

- [Eduardo Mendes - Live de Python 189 - MkDocs](https://www.youtube.com/watch?v=GW6nAJ1NHUQ&t=2s&ab_channel=EduardoMendes)
- [MkDocs](https://www.mkdocs.org/)
- [PyMdown Extensions](https://facelessuser.github.io/pymdown-extensions/)
- [GitHub - MkDocs Themes](https://github.com/mkdocs/mkdocs/wiki/MkDocs-Themes)
- [GitHub - Material Theme for MkDocs](https://github.com/squidfunk/mkdocs-material)
- [Material for MkDocs - Setup](https://squidfunk.github.io/mkdocs-material/setup/changing-the-colors/)

**GitHub**

- [GitHub Actions - pypa/gh-action-pypi-publish](https://github.com/marketplace/actions/pypi-publish)
- [Medium - Major, Minor and Patch](https://medium.com/fiverr-engineering/major-minor-patch-a5298e2e1798)
- [Medium - Automate PyPI Releases with GitHub Actions](https://medium.com/@VersuS_/automate-pypi-releases-with-github-actions-4c5a9cfe947d)

**Tests**

- [Codecov - Setting Threshold](https://github.com/codecov/codecov-action/issues/554#issuecomment-1261250304)
- [Codecov - About the Codecov YAML](https://docs.codecov.com/docs/codecov-yaml)
- [Codecov - Status Checks](https://docs.codecov.com/docs/commit-status)
- [Codecov - codecov.yml Reference](https://docs.codecov.com/docs/codecovyml-reference)
- [Codecov - Ignore Paths](https://docs.codecov.com/docs/ignoring-paths)

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/ThiagoPanini/sparksnake",
    "name": "sparksnake",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.0.0",
    "maintainer_email": "",
    "keywords": "Cloud,AWS,Python,Spark,pyspark",
    "author": "Thiago Panini",
    "author_email": "panini.development@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/f0/74/a51aca65fcfde4336e601059637c7a82a623cb1081dcf75ec5392384160e/sparksnake-0.2.2.tar.gz",
    "platform": null,
    "description": "<div align=\"center\">\n    <br><img src=\"https://github.com/ThiagoPanini/snakespark/blob/main/docs/assets/imgs/header-readme.png?raw=true\" alt=\"snakespark-logo\">\n</div>\n\n<div align=\"center\">  \n  <br>\n  \n  [![PyPI](https://img.shields.io/pypi/v/sparksnake?color=purple)](https://pypi.org/project/sparksnake/)\n  ![PyPI - Downloads](https://img.shields.io/pypi/dm/sparksnake?color=purple)\n  ![PyPI - Status](https://img.shields.io/pypi/status/sparksnake?color=purple)\n  <br>\n\n  ![CI workflow](https://img.shields.io/github/actions/workflow/status/ThiagoPanini/sparksnake/ci-main.yml?label=ci)\n  [![Documentation Status](https://readthedocs.org/projects/sparksnake/badge/?version=latest)](https://sparksnake.readthedocs.io/en/latest/?badge=latest)\n  [![codecov](https://codecov.io/gh/ThiagoPanini/sparksnake/branch/main/graph/badge.svg?token=zSdFO9jkD8)](https://codecov.io/gh/ThiagoPanini/sparksnake)\n\n</div>\n\n## Table of content\n\n- [Table of content](#table-of-content)\n- [What is the sparksnake library?](#what-is-the-sparksnake-library)\n- [Features](#features)\n- [Contact me](#contact-me)\n- [References](#references)\n\n\n## What is the sparksnake library?\n\nThe *sparksnake* library provides an easy, fast, and efficient way to use Spark features inside analytics services on AWS. With *sparksnake*, it is possible to use classes, methods and functions developed in pyspark to simplify, as much as possible, the journey of building Spark applications anywhere!\n\n> **Note**\n>  Now the *sparksnake* library has an official documentation in readthedocs! Visit the [following link](https://sparksnake.readthedocs.io/en/latest/) and check out usability technical details, hands on demos and more!\n\n\n## Features\n\n- \ud83e\udd16 Apply common Spark operations using few lines of code\n- \ud83d\udcbb Start developing your Spark applications anywhere using the \"default\" mode or in any AWS services that uses Spark\n- \u23f3 Stop spending time setting up the boring stuff of your Spark applications\n- \ud83d\udca1 Apply the best practices on your application by structuring your code following the best practices\n- \ud83d\udc41\ufe0f\u200d\ud83d\udde8\ufe0f Improve your aplication's observability by using detailed log messages on CloudWatch and exception handlers\n\n___\n\n## Contact me\n\n- [Thiago Panini - LinkedIn](https://www.linkedin.com/in/thiago-panini/)\n- [paninitechlab @ hashnode](https://panini.hashnode.dev/)\n\n___\n\n## References\n\n**Python**\n\n- [Python - Packaging Python Projects](https://packaging.python.org/en/latest/tutorials/packaging-projects/)\n\n**Docs**\n\n- [Eduardo Mendes - Live de Python 189 - MkDocs](https://www.youtube.com/watch?v=GW6nAJ1NHUQ&t=2s&ab_channel=EduardoMendes)\n- [MkDocs](https://www.mkdocs.org/)\n- [pmdown-extensions](https://facelessuser.github.io/pymdown-extensions/)\n- [GitHub - MkDocs Themes](https://github.com/mkdocs/mkdocs/wiki/MkDocs-Themes)\n- [GitHub - Material Theme for MkDocs](https://github.com/squidfunk/mkdocs-material)\n- [Material for MkDocs - Setup](https://squidfunk.github.io/mkdocs-material/setup/changing-the-colors/)\n\n**Github**\n\n- [GitHub Actions - pypa/gh-action-pypi-publish](https://github.com/marketplace/actions/pypi-publish)\n- [Medium - Major, Minor and Patch](https://medium.com/fiverr-engineering/major-minor-patch-a5298e2e1798)\n- [Medium - Automate PyPI Releases with GitHub Actions](https://medium.com/@VersuS_/automate-pypi-releases-with-github-actions-4c5a9cfe947d)\n\n**Tests**\n\n- [Codecov - 
Setting Threshold](https://github.com/codecov/codecov-action/issues/554#issuecomment-1261250304)\n- [Codecov - About the Codecov YAML](https://docs.codecov.com/docs/codecov-yaml)\n- [Codecov - Status Checks](https://docs.codecov.com/docs/commit-status)\n- [Codecov - codecov.yml Reference](https://docs.codecov.com/docs/codecovyml-reference)\n- [Codecov - Ignore Paths](https://docs.codecov.com/docs/ignoring-paths)\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Improving the development of Spark applications deployed as jobs on AWS services like Glue and EMR",
    "version": "0.2.2",
    "project_urls": {
        "Homepage": "https://github.com/ThiagoPanini/sparksnake"
    },
    "split_keywords": [
        "cloud",
        "aws",
        "python",
        "spark",
        "pyspark"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f762c1f268f05c44d6def885da70de4fc269ce8de097f93ea90df175c056d5e6",
                "md5": "668a801bb7f3f4ff210f5eeb118593b1",
                "sha256": "2aa5e0dd534e7ae34257929a72c1b33e0297a66b5c15d2bc1dc418b2708d82bb"
            },
            "downloads": -1,
            "filename": "sparksnake-0.2.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "668a801bb7f3f4ff210f5eeb118593b1",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.0.0",
            "size": 40517,
            "upload_time": "2023-07-15T19:17:15",
            "upload_time_iso_8601": "2023-07-15T19:17:15.651669Z",
            "url": "https://files.pythonhosted.org/packages/f7/62/c1f268f05c44d6def885da70de4fc269ce8de097f93ea90df175c056d5e6/sparksnake-0.2.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f074a51aca65fcfde4336e601059637c7a82a623cb1081dcf75ec5392384160e",
                "md5": "869985fecdf811d28ba502a532aefa7f",
                "sha256": "5210824025e47e3a3256b345f636a373901c0c82ea1a0ca8731e164101b94d0f"
            },
            "downloads": -1,
            "filename": "sparksnake-0.2.2.tar.gz",
            "has_sig": false,
            "md5_digest": "869985fecdf811d28ba502a532aefa7f",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.0.0",
            "size": 37813,
            "upload_time": "2023-07-15T19:17:17",
            "upload_time_iso_8601": "2023-07-15T19:17:17.135319Z",
            "url": "https://files.pythonhosted.org/packages/f0/74/a51aca65fcfde4336e601059637c7a82a623cb1081dcf75ec5392384160e/sparksnake-0.2.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-07-15 19:17:17",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ThiagoPanini",
    "github_project": "sparksnake",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "sparksnake"
}
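
As a side note, here is a small sketch of how one might check a downloaded release file against the SHA256 digest published in the metadata above. The local file name and path are assumptions; download the sdist from the URL listed in the metadata first.

```python
# Hedged example: verify a downloaded artifact against the published SHA256
# digest. Assumes sparksnake-0.2.2.tar.gz was downloaded to the current directory.
import hashlib
import os

SDIST = "sparksnake-0.2.2.tar.gz"  # assumed local file name
EXPECTED_SHA256 = "5210824025e47e3a3256b345f636a373901c0c82ea1a0ca8731e164101b94d0f"


def sha256_of(path: str) -> str:
    """Streams the file in chunks and returns its hex SHA256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


if os.path.exists(SDIST):
    print("match" if sha256_of(SDIST) == EXPECTED_SHA256 else "digest mismatch")
else:
    print(f"{SDIST} not found; download it from the URL above first")
```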
        