typedspark

Name: typedspark
Version: 1.4.1
Home page: https://github.com/kaiko-ai/typedspark
Summary: Column-wise type annotations for pyspark DataFrames
Upload time: 2024-04-13 08:03:24
Author: Nanne Aben
Requires Python: >=3.9.0
License: Apache-2.0
Keywords: pyspark, spark, typing, type checking, annotations
# Typedspark: column-wise type annotations for pyspark DataFrames

We love Spark! But in production code we're wary when we see:

```python
from pyspark.sql import DataFrame

def foo(df: DataFrame) -> DataFrame:
    # do stuff
    return df
```

Because… How do we know which columns are supposed to be in ``df``?

Using ``typedspark``, we can be more explicit about what these data should look like.

```python
from typedspark import Column, DataSet, Schema
from pyspark.sql.types import LongType, StringType

class Person(Schema):
    id: Column[LongType]
    name: Column[StringType]
    age: Column[LongType]

def foo(df: DataSet[Person]) -> DataSet[Person]:
    # do stuff
    return df
```
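
To get a feel for the runtime checking, here is a minimal sketch that builds on the ``Person`` schema and ``foo`` above. The ``SparkSession`` setup and the sample data are assumptions for illustration; per the typedspark documentation, constructing ``DataSet[Person]`` from a plain DataFrame validates the schema at runtime.

```python
from pyspark.sql import SparkSession

from typedspark import DataSet

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data that matches the Person schema above.
df = spark.createDataFrame(
    [(1, "Alice", 42)],
    schema="id bigint, name string, age bigint",
)

# Wrapping a plain DataFrame in DataSet[Person] checks the columns and types
# at runtime; a mismatch raises an exception.
persons = DataSet[Person](df)
persons = foo(persons)
```
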
The advantages include:

* Improved readability of the code
* Typechecking, both during runtime and linting
* Auto-complete of column names
* Easy refactoring of column names
* Easier unit testing through the generation of empty ``DataSets`` based on their schemas (see the sketch after this list)
* Improved documentation of tables
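
As a sketch of the unit-testing point above, the snippet below uses typedspark's ``create_empty_dataset`` and ``create_partially_filled_dataset`` helpers; the ``SparkSession`` setup and sample values are assumptions for illustration.

```python
from pyspark.sql import SparkSession

from typedspark import create_empty_dataset, create_partially_filled_dataset

spark = SparkSession.builder.getOrCreate()

# An empty DataSet[Person] with the right columns and types, handy as a fixture.
empty_persons = create_empty_dataset(spark, Person)

# Or fill only the columns a test cares about.
some_persons = create_partially_filled_dataset(
    spark,
    Person,
    {Person.id: [1, 2], Person.name: ["Alice", "Bob"]},
)
```
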

## Documentation
Please see our documentation on [readthedocs](https://typedspark.readthedocs.io/en/latest/index.html).

## Installation

You can install ``typedspark`` from [PyPI](https://pypi.org/project/typedspark/) by running:

```bash
pip install typedspark
```
By default, ``typedspark`` does not list ``pyspark`` as a dependency, since many platforms (e.g. Databricks) come with ``pyspark`` preinstalled. If you want to install ``typedspark`` with ``pyspark``, you can run:

```bash
pip install "typedspark[pyspark]"
```

## Demo videos

### IDE demo

https://github.com/kaiko-ai/typedspark/assets/47976799/e6f7fa9c-6d14-4f68-baba-fe3c22f75b67

You can find the corresponding code [here](docs/videos/ide.ipynb).

### Jupyter / Databricks notebooks demo

https://github.com/kaiko-ai/typedspark/assets/47976799/39e157c3-6db0-436a-9e72-44b2062df808

You can find the corresponding code [here](docs/videos/notebook.ipynb).

## FAQ

**I found a bug! What should I do?**<br/>
Great! Please open an issue and we'll look into it.

**I have a great idea to improve typedspark! How can we make this work?**<br/>
Awesome, please open an issue and let us know!

            
