snowpark-checkpoints-hypothesis

Version: 0.1.3
Summary: Hypothesis extension for generating Snowpark DataFrames
Upload time: 2025-02-07 20:19:34
Requires Python: >=3.9, <3.12
License: Apache License, Version 2.0
Keywords: snowflake, snowpark, analytics, cloud, database, db
# snowpark-checkpoints-hypothesis

---
##### This package is on Public Preview.
---

**snowpark-checkpoints-hypothesis** is a [Hypothesis](https://hypothesis.readthedocs.io/en/latest/) extension for generating Snowpark DataFrames. This project provides strategies to facilitate testing and data generation for Snowpark DataFrames using the Hypothesis library.

## Installation

You can install this package using either **pip**:

```shell
pip install snowpark-checkpoints-hypothesis
```

or **conda**:

```shell
conda install snowpark-checkpoints-hypothesis
```

## Usage

The typical workflow for using the Hypothesis library to generate Snowpark DataFrames is as follows:

1. Create a standard Python test function with the assertions or conditions your code should satisfy for all inputs.
2. Add the Hypothesis `@given` decorator to your test function and pass the strategy returned by `dataframe_strategy` as an argument.
3. Run the test. When the test is executed, Hypothesis will automatically provide the generated DataFrames as arguments to the test.

### Example 1: Generate Snowpark DataFrames from a JSON schema file

You can use the `dataframe_strategy` function to create Snowpark DataFrames from a JSON schema file generated by the `collect_dataframe_checkpoint` function of the [snowpark-checkpoints-collectors](https://pypi.org/project/snowpark-checkpoints-collectors/) package:

```python
from hypothesis import given
from snowflake.hypothesis_snowpark import dataframe_strategy
from snowflake.snowpark import DataFrame, Session


@given(
    df=dataframe_strategy(
        schema="path/to/schema.json",
        session=Session.builder.getOrCreate(),
        size=10,
    )
)
def test_my_function(df: DataFrame):
    # Test your function here
    ...
```

### Example 2: Generate Snowpark DataFrames from a Pandera DataFrameSchema object

You can also use the `dataframe_strategy` function to create Snowpark DataFrames from a Pandera DataFrameSchema object:

```python
import pandera as pa
from hypothesis import given
from snowflake.hypothesis_snowpark import dataframe_strategy
from snowflake.snowpark import DataFrame, Session

@given(
    df=dataframe_strategy(
        schema=pa.DataFrameSchema(
            {
                "A": pa.Column(pa.Int, checks=pa.Check.in_range(0, 10)),
                "B": pa.Column(pa.Bool),
            }
        ),
        session=Session.builder.getOrCreate(),
        size=10,
    )
)
def test_my_function(df: DataFrame):
    # Test your function here
    ...
```
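The `pa.Check.in_range(0, 10)` check above constrains what Hypothesis is allowed to generate for column `A`. To make the idea concrete, here is a minimal plain-Python stand-in for that kind of per-column check (the helper names are hypothetical; the real validation logic lives inside Pandera):

```python
def check_in_range(lo, hi):
    """Return a predicate that is true when a value lies in [lo, hi]."""
    return lambda v: lo <= v <= hi


def validate_columns(rows, checks):
    """Validate a list of row dicts against per-column predicates.

    Returns a list of (row_index, column_name) pairs that failed a check.
    """
    failures = []
    for i, row in enumerate(rows):
        for col, predicate in checks.items():
            if not predicate(row[col]):
                failures.append((i, col))
    return failures


rows = [{"A": 3, "B": True}, {"A": 11, "B": False}]
checks = {"A": check_in_range(0, 10), "B": lambda v: isinstance(v, bool)}
bad = validate_columns(rows, checks)  # row 1 violates the in-range check on "A"
```

A strategy built from such a schema only ever produces rows for which `validate_columns` would return no failures.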

## Development

### Set up a development environment

To set up a development environment, follow the steps below:

1. Create a virtual environment using **venv** or **conda**. Replace \<env-name\> with the name of your environment.

    Using **venv**:

    ```shell
    python3.11 -m venv <env-name>
    source <env-name>/bin/activate
    ```

    Using **conda**:

    ```shell
    conda create -n <env-name> python=3.11
    conda activate <env-name>
    ```

2. Configure your IDE to use the previously created virtual environment:

    * [Configuring a Python interpreter in PyCharm](https://www.jetbrains.com/help/pycharm/configuring-python-interpreter.html)
    * [Configuring a Python interpreter in VS Code](https://code.visualstudio.com/docs/python/environments#_manually-specify-an-interpreter)

3. Install the project dependencies:

    ```shell
    pip install hatch
    pip install -e .
    ```

### Running Tests

To run the tests, execute the following command:

```shell
hatch run test:check
```

------

            
